A white chatbot screen with the question 'can you be my therapist' typed into the prompt

AI cannot provide the depth, empathy, and personalized care that human therapists offer

Artificial Intelligence (AI) is quickly becoming an integral part of daily life for teens and adults. While this emerging technology offers valuable help across many professions, the prospect of AI replacing human therapists has now entered the mental health space.

A 2024 study indexed by the National Library of Medicine found that approximately 28% of people surveyed have used AI for “quick support and as a personal therapist.”

While some may find temporary relief through instant AI support, Cranston Warren, Clinical Therapist at Loma Linda University Behavioral Health, cautions that relying on AI for ongoing mental health care presents serious challenges and often delivers only limited, superficial support.

"The effectiveness of AI really depends on the person using it, what questions they're asking, and how well they understand how to prompt the chatbot," Warren explains. "AI can offer some direction, but it doesn’t have the insight, experience, or ability to read body language the way a trained clinician can."

Although AI is trained to provide instant answers, it falls significantly short of delivering the meaningful, long-term care needed to replace therapy.

AI’s lack of human touch

A fundamental difference between AI and human therapists is AI’s lack of empathy. While AI can offer general advice and even help structure a therapy session, it lacks the emotional insight and genuine compassion needed to truly support someone through their struggles.

“AI doesn’t know when to push, when to back off, or when to simply hold space for someone,” Warren says.

Entering prompts into a chatbot does not allow for reading body language, asking meaningful follow-up questions, or digging into deeper emotional layers the way a therapist can.

Furthermore, when it comes to clinically diagnosable conditions, Warren urges extreme caution in relying on AI.

“In the mental health field, there are a variety of approaches to treating a patient, from solution-based therapies like Cognitive Behavioral Therapy and Dialectical Behavior Therapy, to talk therapies like Psychotherapy, Person-Centered Therapy, and many others,” Warren explains. “AI doesn’t have the capability to make a clinical judgment to know what a patient truly needs.”

Misdiagnosis and a false sense of security

Since AI cannot provide a medical diagnosis, using it as a substitute for therapy in cases of serious mental health disorders can be dangerous. For individuals with disorders like schizophrenia or bipolar disorder, AI can’t provide the necessary clinical oversight, particularly when it comes to medication compliance.

Even for those with mild symptoms, Warren says there's a risk of over-relying on AI. 

“For someone struggling with distorted thinking, such as catastrophizing or minimizing their struggles, AI may reinforce that perspective rather than correct it,” Warren warns. 

Unlike a therapist, AI cannot assess whether a person’s view of reality is accurate, which can give the person a false sense of security.

And in moments of crisis? AI doesn’t know when someone needs a higher level of care.

“It can’t call for help, alert emergency services, or ensure your safety in a critical moment. That human layer of protection just isn’t there,” Warren says.

A concern for privacy

One major concern when using AI for therapy support is privacy. 

“Your interaction with AI is not guaranteed to be private,” Warren says. “Everything you feed into the model is being analyzed for data.” 

Unlike licensed therapists, who are bound by strict privacy laws such as HIPAA, AI developers are not held to the same standards, and AI models are not required to protect your information.

Data shared with AI tools can be used in ways that are not yet regulated, potentially leading to privacy breaches, misuse of sensitive information, or other unintended consequences for people seeking mental health support.

When AI might help

AI can be useful in supporting people through minor mental health struggles, especially when someone needs quick advice or help managing stress and emotions.

Warren acknowledges that AI can potentially provide temporary, surface-level help for those dealing with mild depression, anxiety, or mood fluctuations. For example, AI can:

  • Suggest coping skills

  • Track behavior or emotions over time

  • Offer structured exercises or journaling prompts

  • Provide general emotional support in between therapy sessions

For those who lack access to a therapist — whether due to insurance limitations, financial challenges, time constraints, or hesitation to seek professional help — AI may appear to be an appealing, low-cost alternative to traditional therapy.

However, AI’s usefulness depends entirely on the prompts provided. If a person tells a chatbot they are having a bad day, it might offer some helpful advice for the immediate moment, but it cannot treat the root of the issue.
 
While it might be tempting to turn to AI for advice, a chatbot can't substitute for real mental health care. For true treatment, there is no replacement for a mental health professional’s empathy and training.

If you or a loved one needs professional mental health care, visit our website or call us at 909-558-9275.