Therapy can be expensive and sometimes hard to access. As a result, many people are turning to Artificial Intelligence (AI) chatbots as a free, easy form of mental health care. But, as with many other parts of life, an AI tool can’t replace a human professional, and it can even give actively harmful advice.
AI’s influence on teen mental health is especially concerning. In this article, we’ll discuss why people turn to AI for mental health treatment and what you can do to support your loved one.
Why do people use AI for mental health treatment?
Research shows that therapy is one of the most common uses for AI.1 One key reason is that many people are unable to access mental health treatment. In fact, studies have estimated that around 50% of individuals in the United States who need mental health care are unable to get it.2
Some reasons for this include:
- Lack of funds, transportation, and other logistics
- Lack of insurance coverage
- Long wait times and a shortage of mental health professionals3
- Inability to find specialized care
- Feelings of shame and judgment around getting help
For many, AI chatbots serve as a cheaper and more accessible alternative to a human therapist. Some claim their benefits include:
- Low or no cost
- No insurance required
- 24/7 availability
- Instant responses
- Less stigma and judgment
Where do people find AI therapy?
Many people use popular, free platforms like ChatGPT. The platform even hosts a custom chatbot called “Therapist GPT,” which has its own page and is described as “AI designed to provide comfort, advice, and therapeutic support to those seeking mental wellness guidance.”
There are also AI chatbots that specialize in therapy, and some of the companies behind them offer paid options. Abbi and TherapyWithAI, for example, both have free versions and more “advanced” paid tiers that cost up to $30 a month.
Can AI really help?
While the benefits of AI therapy might sound ideal on paper, there are many reasons why it can’t replace humans entirely – and can even be actively harmful.
It can only imitate empathy and emotional support. AI can sound caring. It can use the right words. It can even copy the exact language a therapist might use. But at the end of the day, it’s still artificial. It can’t truly understand your emotions or sit with you the way another human can.
Real therapy isn’t just about comforting words. It’s about a real relationship with someone who can connect with you, care about you, and understand you in a deeply human way. That’s something AI just can’t replace.4
There are privacy concerns. The Health Insurance Portability and Accountability Act (HIPAA) requires health care providers and other covered entities to keep patient information private. However, HIPAA does not apply to many AI tools, which can put your data at risk.
It can provide incorrect and/or harmful information. A recent study from Stanford found that some AI tools designed for therapy may actually reinforce negative stereotypes about certain mental health conditions instead of helping people feel understood and supported. The researchers also found that the chatbots often “enabled dangerous behavior,” such as answering questions in ways that could facilitate a suicide attempt.5
Another study, by the Center for Countering Digital Hate, found that ChatGPT gave harmful responses to more than half of the researchers’ questions about sensitive topics such as substance use, body image, and mental health.6 And while OpenAI has promised to update the model to better detect people in distress, it’s important to know that ChatGPT can’t always provide safe, accurate information.
It’s largely unregulated. Therapists have to complete years of schooling and supervised clinical training, as well as pass licensing exams before they can practice. They’re also required to follow strict ethical guidelines and continue their education to keep their license.
AI tools don’t go through that kind of oversight, training, or accountability. They aren’t licensed, and there isn’t a board making sure they meet professional standards or protect clients in the same way a licensed therapist must.
No AI chatbot has been FDA-approved to treat or diagnose mental health disorders, not even those designed for that purpose.7
What parents can do
The use of AI for therapy can be cause for concern. But there are steps you can take to help support your loved one’s mental health.
Help them think it through. Remind teens that AI doesn’t know them. It gives responses based on patterns, not real understanding or care.
Set clear boundaries. Let your loved one know that AI shouldn’t be used as a replacement for real emotional support or professional help.
Stay open and supportive. Make sure your teen feels safe coming to you with problems. The goal is for them to turn to trusted adults and real relationships—not a chatbot—when they’re struggling.
If your loved one is struggling, take an active role in helping them find therapy. If cost and/or insurance is an issue, there are low-cost options. Directories like Psychology Today can help you filter for therapists who offer a sliding scale or are otherwise willing to work with you.
- If your loved one is also struggling with substance use, our SAFE Locator can help you find affordable treatment for both disorders.
These trusted resources also offer helplines, information on local services, and screening tools:
- 988 Suicide & Crisis Lifeline – Call or text 988 for 24/7 crisis help
- National Eating Disorders Association (NEDA)
- National Alliance on Mental Illness (NAMI) – Call 800-950-6264
The bottom line
Helping a loved one through mental health and substance use disorders can be challenging, but with your support and proper care, they can recover. After all, artificial intelligence can’t replicate human connection.
If you aren’t sure what to do about your loved one’s mental health and/or substance use, you are not alone. Our trained and compassionate Helpline Specialists will listen to your concerns and help you think through next steps. Click here to connect with us.