AI Therapy: Blessing or Curse?

A report published by Roots Analysis in early 2025 projected that the global market for AI in mental health would grow at a compound annual growth rate (CAGR) of 37.4% from 2025 to 2035, reaching a market value of $22.7 billion in 2035. The market is estimated at $1.81 billion at the end of 2025, up from $1.23 billion in 2024, which represents significant growth for a market that only emerged in 2021.
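For context, CAGR is the constant yearly growth rate that would carry a market from its starting value to its ending value over a given number of years. The standard formula (the symbols here are generic, not taken from the report) is:

$$\mathrm{CAGR} = \left(\frac{V_n}{V_0}\right)^{1/n} - 1$$

where $V_0$ is the starting market value, $V_n$ the ending value, and $n$ the number of years.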

These figures reflect the growing interest in and use of large language models (LLMs) like ChatGPT and Gemini for therapy and other mental health-related treatment. In recent months, there has been increased chatter on the internet regarding this topic, as various people have shared their experiences of asking ChatGPT for help with mental health-related situations.

This raises the need to discuss the role of AI, particularly LLMs, in therapy: its present and future, its impacts, benefits, and risks, and whether it will prove a blessing or a curse in the days to come.

Why is AI Therapy Gaining Popularity?

To understand the popularity of AI chatbots in therapy, one has to understand who is using them and the underlying conditions that prompt them to do so.

The data reveals that, amongst the population using chatbots and LLMs for mental health advice, a large share belongs to developed, higher-income countries like the USA, Canada, the UK, and much of Europe. Developing nations with lower incomes account for a far smaller number of people seeking help through AI. This is a direct reflection of how mental health awareness varies among populations around the world.

Although high-income regions display a high degree of mental health awareness, the treatment and healthcare around it are expensive and inaccessible to many, as is healthcare in general in most high-income nations. This creates the need to seek alternative methods of therapy or help. Cheap therapy, on the other hand, does not provide the quality of personalised treatment that a person needs. This is where AI chatbots are filling the gaps.

Free and Accessible

Chatbots and LLMs provide very cheap or free mental health advice that is accessible to many. Anyone with a basic cell phone, laptop, or PC and internet access can now receive free or low-cost mental health consultation and advice at any time of day. This opens up therapy to people whose insurance doesn't cover it, or who simply can't afford to buy health insurance at all.

Personalised Feedback

Once given permission on your device, LLMs like ChatGPT gather a large amount of your data. This gives them access to your psyche and habits on a scale that is unfathomable for a human therapist; sometimes they pick up on habits and patterns that the person isn't even consciously aware of. Demonstrations have shown LLMs diagnosing mental health disorders such as anxiety and depression with near-perfect accuracy under controlled conditions and with high accuracy in general conditions.

Anonymity

Despite the many milestones achieved in destigmatising mental health treatment and therapy, and the many guardrails placed around patient privacy, there are still people who face both internal and external stigma when seeking help. Within the boundary of their own space, many feel comfortable seeking help that they wouldn't otherwise pursue. On top of that, many people suffering from mental health issues are averse to human interaction, so they too lean towards therapy with AI. This comfort and convenience are also a big factor in the accelerated adoption of AI therapy.

The Many Risks of AI Therapy

Despite these advantages, AI therapy poses risks, both now and for the future, that are far too great for it to be adopted and accepted blindly.

Technical Limitations

With their current capabilities, AI systems like ChatGPT and Gemini struggle with long-term memory retention, which limits their ability to maintain a long-term therapeutic relationship the way a human therapist can. Studies in other fields have also shown that the effectiveness of AI generally declines over the long term. AI models have likewise been shown to carry cultural and regional biases, meaning their expertise does not always translate well across the world's many cultures, lifestyles, and regions.

AI systems are trained on publicly available internet data, whether in audio, video, or text format. Therapy sessions and their transcripts, however, are protected under legal and medical guardrails: they are not publicly accessible, and where they do exist on servers or in the cloud, they sit behind strong security. Any mental health advice from an AI is therefore based on a very limited knowledge base and is often inaccurate.

This, however, is one of the less concerning risks of AI therapy, as the practice poses far more sinister threats.

Lack of Qualification

To be called a therapist, a human undergoes years of study and years of additional training, learning the intricacies of the human psyche and behaviour. Even then, becoming a capable therapist requires years of professional and personal experience. This lived experience gives them the empathy, understanding, and expertise to diagnose a patient and successfully provide proper care.

AI, on the other hand, lacks this crucial ability. When it comes to mental health, it is trained on a limited set of data, and its responses are nothing but a prediction of the next best word or sentence, dictated by its algorithm. It will always fail to recognise the mood or headspace of the person.
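To make that concrete, here is a deliberately toy Python sketch of how such a response gets chosen. The candidate replies and their probabilities are entirely made up for illustration, and real models score individual tokens rather than whole sentences, but the principle is the same: the output is a probability ranking, not understanding.

```python
# Toy illustration of greedy decoding: the model's "reply" is simply
# whichever candidate its learned probability distribution scores highest.
# These candidates and probabilities are hypothetical, not from any real model.
toy_scores = {
    "I understand how you feel.": 0.42,
    "Have you considered journaling?": 0.31,
    "That sounds really difficult.": 0.27,
}

# Pick the highest-probability continuation, with no notion of the
# user's actual mood or headspace.
reply = max(toy_scores, key=toy_scores.get)
print(reply)  # -> "I understand how you feel."
```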

Patients often lie or mislead in therapy when they are not ready to deal with certain topics, and a human therapist can pick up on that by reading their body language and tone of voice. AI does not have this ability.

Lack of Empathy and Emotional Intelligence

LLMs lack real empathy and emotional intelligence, even if their responses sometimes seem otherwise. They have no real stake in your health and development; each sentence is simply the most probable response according to the algorithm. They cannot provide the personalised care that therapy, in most cases, is, and they cannot track your progress.

Biases and Mistreatment

AI models often come with built-in biases that stem from their training data. Sadly, the internet is full of misinformation related to mental health, which is why a model may often give advice that harms rather than helps. There are documented instances of ChatGPT giving harmful and wrong advice to individuals who were dealing with eating disorders and body dysmorphia and seeking help.

On top of that, the user also brings their own biases, and AI models often foster those biases instead of helping to correct them, echoing the user's thoughts and keeping them in an echo chamber. Therapy is the opposite: patients are often confronted and treated through healthy discussion.

Above all, AI chatbots often fail to recognise crisis situations, which is a very important quality in a therapist. A Stanford University study revealed that AI chatbots can encourage schizophrenic delusions and suicidal thoughts, and that they reflect harmful social stigmas toward certain mental health conditions. The New York Times has documented cases where ChatGPT conversations led users into delusional thinking, with vulnerable individuals drawn into discussions about conspiracies and AI sentience.

AI Therapy: Privacy Concerns

LLMs like ChatGPT and Gemini collect and store a massive amount of personal data from the user. Once you agree to the terms and conditions, ownership of that data no longer lies with you, and you have no control over where it ends up or how it is used. Sensitive mental health data is especially prone to misuse, as it represents the user in a deeply personal way.

With such an enormous growth rate, AI is inevitably going to be incorporated into many aspects of therapy and mental health treatment; however, it is not capable of replacing a human therapist. That is why ChatGPT as a therapist, however exciting it sounds and whatever results it shows, will inevitably end up as a curse.

Article by Subhakanta Bhanja