More and more people are turning to AI for mental health support, and it’s easy to understand why. As traditional therapy and healthcare become increasingly unaffordable, the convenience of opening an app or typing into a chatbot can feel like a lifeline. But as a therapist, I think it’s important to have an honest and nuanced conversation about what AI in mental health care can — and cannot — do for you.
The Appeal of AI in Mental Health
The numbers tell a clear story. Approximately 1 in 5 adults in the U.S. experience a mental illness each year, yet fewer than half receive treatment — largely due to cost, availability, and stigma (National Alliance on Mental Illness [NAMI], 2023). AI-powered chatbots offer something genuinely compelling: they’re available 24/7, often free or low-cost, and carry no social stigma. For people who might not otherwise access care, that matters.
AI chatbots have also gotten remarkably good at appearing kind, empathetic, and knowledgeable. In some contexts, they can be a useful tool — but there are serious risks that don’t get talked about enough.
The Risks of Using AI for Mental Health Support
General-purpose chatbots like ChatGPT, Gemini, and Claude are not regulated or specifically trained for mental health care. They are large language models designed to generate convincing, human-sounding responses — and unlike a real therapist, they have no clinical training, grounded experience, or professional judgment to draw from. They can inadvertently reinforce distorted thinking and validate delusional beliefs, and in serious cases they have failed people in genuine mental health crises.
A widely reported 2023 case raised serious ethical and safety concerns after a vulnerable user’s death was linked in part to prolonged chatbot interaction (Lovens, 2023). A real therapist, by contrast, is trained to offer perspective, challenge unhelpful beliefs when appropriate, and provide the kind of grounded, professional care that an algorithm simply cannot replicate.
Your privacy is also exposed in ways it would not be with a licensed therapist. Therapists are bound by HIPAA and strict confidentiality laws; most AI platforms are not.
The Legitimate Use of AI Mental Health Tools
As of 2025, the FDA has not authorized any generative AI tool for mental health purposes — including the popular apps most people are already using (U.S. Food & Drug Administration, 2025). One promising category that has gone through FDA review is Prescription Digital Therapeutics (PDTs): FDA-cleared, clinically trialed apps that require a prescription and deliver structured, evidence-based therapy. Current examples include Rejoyn, cleared for Major Depressive Disorder, and DaylightRx, cleared for Generalized Anxiety Disorder (U.S. Food & Drug Administration, 2024).
As a therapist, I will not personally recommend any AI mental health tool that hasn’t gone through that level of scrutiny. That said, I recognize many people are already using general chatbots for emotional support — so if you are, please take a harm reduction approach: never rely on a chatbot during a genuine crisis (call or text 988 instead); assume your data is not fully private; be skeptical of tools that only tell you what you want to hear; and treat AI as a supplement, never a replacement, for real professional support.
What AI Cannot Replace
Here is where I want to speak from both clinical experience and personal conviction. In my view, the most healing element of therapy is the therapeutic relationship itself — knowing that there is a real person, a trained professional, who has witnessed your deepest struggles and continues to show up with consistency, care, and genuine empathy. That is something AI cannot replicate. Research consistently identifies the therapeutic alliance as one of the strongest predictors of treatment outcomes across all modalities (Wampold, 2015). AI can simulate warmth. It cannot build a real relationship with you.
Making Therapy Accessible — Even Alongside AI
I want to acknowledge something real: therapy is expensive, and the system creates genuine barriers. Insurance coverage is inconsistent, waitlists are long, and sliding-scale availability is limited. I understand why someone would turn to AI for mental health support rather than navigate all of that.
But many therapists — myself included — are committed to meeting clients where they are, including financially. Even seeing a therapist once a month is far better than nothing: it provides a real human touchstone, clinical perspective, and a relationship that no app can offer. If cost is a barrier, ask directly: Do you offer a sliding scale? It’s a conversation worth having.
I’d also encourage you to bring your AI tools into the therapy room if you’re using them. Let’s talk about what you’re using, how it’s helping, and where the limits are. Your therapist can help you use those tools more intentionally and safely.
Final Thoughts
AI in mental health care is not going away — and it doesn’t need to be the enemy of good care. Used honestly and carefully, it can play a supportive role alongside real treatment. But the foundation of healing, in my experience, remains the relationship you build with a real therapist. If accessing that feels out of reach, I encourage you to keep looking — many of us genuinely want to find a way to make it work.
References
Lovens, J. (2023, March 28). Belgian man dies by suicide after exchanges with chatbot. The Brussels Times.
National Alliance on Mental Illness. (2023). Mental health by the numbers. https://www.nami.org/mhstats
U.S. Food & Drug Administration. (2024). De novo authorization: Rejoyn and DaylightRx. https://www.fda.gov/digital-health
U.S. Food & Drug Administration. (2025). Digital health center of excellence: AI/ML-based software as a medical device. https://www.fda.gov/digital-health
Wampold, B. E. (2015). How important are the common factors in psychotherapy? An update. World Psychiatry, 14(3), 270–277.
This article was written by the author based on their own clinical experience and research. AI assistance (Claude, developed by Anthropic) was used in the editing and organization of this piece. All clinical opinions, recommendations, and conclusions remain those of the author.