As artificial intelligence reaches into ever more areas of human knowledge and endeavour, promising ease and access, it has repeatedly faced questions of uncertainty. Eyebrows have been raised in an attempt to ensure the human race doesn’t run itself into a ditch. The promise of speed and unmatched intelligence seems too good to be true, especially in delicate fields. One such field is psychotherapy, the profession tasked with safeguarding human mental well-being.
Psychotherapy is a delicate profession; its complexities are nearly unmatched. With AI now on the scene, what’s to come? Recent reports of severe behavioural changes and social isolation following interactions with AI therapists are troubling; one case even ended in suicide. In this discussion, which I believe warrants a sequel, we will weigh the concerns about AI therapists against the undeniable benefits they bring.
Where AI Therapists Excel
One of AI’s strongest suits is memory. Large language models (LLMs) continue to expand their context lengths, and their capacity to record the smallest details that escape the human mind is a major advantage. An AI therapist can detect subtle behaviour patterns, word choices, verbal affectations, and facial gestures that clients themselves overlook. It can recall this data on demand as a reference for anyone at any time, and through inference it can draw on what it has learned from similar clients. This ability of therapist bots to dig up the dirt that litters the soul is their greatest strength, but is it enough?
While researching this topic, I read about a person who turned to an AI therapist because the conversations he had with people were opinionated and closed-minded. He pointed out that people were too stuck in their beliefs to be factual, readily drawing conclusions from assumptions and zero knowledge. AI, on the other hand, couldn’t be judgemental even if it wanted to be. It considers facts and communicates simply, giving plain, unvarnished inferences and feedback. However, he stressed that it failed to comprehend sensitive conversations and to provide the responses they required, unlike humans, who can drop their guard entirely and recognise when a person is spiralling or on the edge.
What It Lacks
Many people are led to believe that access to vast amounts of knowledge should make AI wise, but this isn’t the case. At best, artificial intelligence as it stands can observe, analyse, and summarise findings from the patterns it assesses. The problem is that wisdom takes intuition, the ability to think outside the box; intuition is cognition without evident rational thought. Merriam-Webster defines ‘wise’ as characterised by wisdom: marked by deep understanding, keen discernment, and a capacity for sound judgement.
What It Needs
For AI to be an effective therapist, it must first be able to evaluate, which it excels at. It must also discern; on paper it would seem capable of this, but its inability to read between the lines, rooted in its lack of real-life experience, says it all. Finally, it would have to give sound judgement. Its ability to examine matters without bias helps here, but holistic judgement requires empathy along with moral and societal responsibility.
Lastly, University of California, Berkeley School of Public Health professor Jodi Halpern stated in an interview her biggest concern with therapy bots: “My concern is with marketing bots as therapists and trusted companions to people who are depressed or otherwise highly vulnerable.” She emphasised that companies shouldn’t market these bots as “able to care,” since they assume responsibility for people’s vulnerability without facing consequences. She added that regulators and developers had failed to establish clear accountability for these bots, which lack proper supervision and real human support for addressing severe issues. Many users have also reported addictive behaviour that endangered their relationships.
AI therapists are still too underdeveloped to take on real psychotherapy. People’s vulnerability, born of loneliness and fragile mental states, shouldn’t be taken for granted. Although this technology has its pros, they are not yet enough to tip the scale in its favour. We’ll continue from here next time.