How Is AI Impacting Mental Health? AI Therapy Chatbots, Tools and More

Every day I work with people who are looking for more mental health support — and some turn to AI apps hoping they’ll help. While these tools show some promise, AI therapy chatbots can also give harmful advice, misinterpret data and compromise privacy. If you’re considering using AI for your mental health, here’s what to keep in mind.

Published: Jul 30, 2025

AI is a part of everyday life

Artificial intelligence (AI) is revolutionizing our world. If you’ve ever asked Alexa or Siri a question, tracked your steps with a smartwatch or received a personalized playlist from your favorite music app, you’ve already used AI. And you’re far from alone.

In the U.S., 99% of adults report using at least one AI product weekly, and 81% use five to six AI products each week.[1] In this article, I explore emerging AI tools and how they impact mental health.

Four ways AI supports mental health care

AI refers to advanced technology that enables machines to learn, think and make decisions like humans. Through chatbots and apps, AI has become a common way for people to discuss their mental health online. In fact, 34% of U.S. adults (and 58% of those under 30) have used ChatGPT, an AI chatbot that can hold fluid, natural conversations.[2]

1. AI therapy chatbots offer support 24/7

Though not a replacement for human therapists, AI therapy chatbots can offer consistent validation and support without the interpersonal pressure some people feel in face-to-face settings. Along with providing educational resources, these tools may bring notable mental health benefits.

Chatbots can increase referrals to care: Research on the Limbic AI chatbot, which screens individuals seeking mental health support, found it boosted mental health referrals while maintaining anonymity. In one study:

  • Mental health referrals for nonbinary individuals rose by 179%
  • Referrals for underrepresented ethnic groups rose by 29%.[3]

In this case, chatbots helped folks in marginalized groups find the mental health treatment they deserved.

Chatbots can provide helpful support: Another study showed that nine out of 10 users found the Wysa AI chatbot helpful.[4] By offering nonjudgmental, accessible communication, tools like this can both reduce language barriers and help people express their emotions and discuss sensitive topics. This can potentially free human providers to focus on more critical care needs.

2. AI can help detect and prevent mental health symptoms

AI can spot early warning signs of mental health issues — sometimes before humans notice them — by analyzing data from medical records, social media, speech, text and behavior patterns.

How does this work? AI can detect subtle signs of depression, anxiety or schizophrenia using advanced algorithms to “read” how someone talks, writes or behaves. For example, if someone’s activity level suddenly drops, AI may flag it as a potential sign of depression.[5]

3. AI offers personalized mental health support

AI can help providers analyze medical records, wearable data and self-reported outcomes to create individualized treatment plans. In one example, AI-powered Woebot delivers cognitive behavioral therapy (CBT) through daily chatbot interactions, adjusting its approach based on each user’s needs. AI tools can support both patients and mental health treatment providers in many ways by:

  • Increasing mindfulness and emotion awareness
  • Sending medication reminders
  • Tracking symptoms and side effects
  • Monitoring reactions to medications
  • Making collaboration between individuals and their healthcare providers easier.[5]

By analyzing large sets of data, tracking progress and supporting remote communication, these tools can free up clinicians to focus on what humans do best: provide empathy, insight and personalized care.[6]

4. AI can make mental health care more accessible

In the U.S., one in five adults and one in six young people experience mental health challenges each year. Yet nearly 60% don’t get treatment, often due to barriers like cost and availability.[7] AI therapy chatbots and virtual assistants are helping to close this gap, making support more available, affordable and flexible.[8]

Four concerns about AI mental health tools

Can AI tools help improve mental health care? Yes, they hold great promise. But AI can make mistakes, too. The potential benefits of AI depend on having accurate data — and when the data is flawed, it can lead to inaccurate, ineffective or even harmful recommendations.

If AI tools aren’t designed with diversity in mind, they can unintentionally reinforce bias, worsen disparities and strengthen stigma — especially for underrepresented groups.

Like any tool, AI carries some risk. While chatbots and apps can offer support, they must be carefully monitored by trained professionals to ensure they help rather than hurt.[8]

1. Chatbots can spread harmful misinformation

AI chatbots have given insensitive and even dangerous advice to people seeking help for eating disorders, sexual assault and other sensitive issues.

Even though AI is everywhere, most Americans remain skeptical. About 72% of Americans believe AI will have a negative impact on society by spreading false information — and for good reason.[1]

2. Chatbots may lack long-term benefits

Researchers in Hong Kong found that after 10 days, participants using a mental health chatbot showed significant improvements in:

  • Mental health literacy
  • Self-care
  • Mindfulness
  • Depressive symptoms
  • Well-being
  • Positive emotions

However, at a one-month follow-up assessment, these gains were no longer significant compared to a control group. This suggests that a chatbot’s positive effects may be short term.[9]

3. AI tools may risk privacy, misuse data and reinforce bias

AI relies on sensitive personal data, which raises serious questions about privacy and misuse. Not all mental health apps are HIPAA-compliant; many require your consent to use the app and then sell your data. In one study, 28% of health apps had no privacy policy and nearly 40% scored poorly on privacy standards.[10]

AI can also misinterpret data, give unhelpful advice and reinforce biases. This may lead to inappropriate treatment interventions or reinforce stigma and health disparities, especially for marginalized groups.

If you choose to share your personal health data, stay informed about how your data is used and opt out if you’re uncomfortable. To stay safe, double-check any AI-generated advice with a human professional.

4. AI mental health tools require oversight, says VHA

The suicide rate for veterans is 1.5 times that of the general population.[11] Since 2017, the Veterans Health Administration (VHA) has used AI to help flag veterans at risk for suicide. Researchers found that AI tools could identify suicide risk about as well as human providers — but with limitations.

The researchers stressed that these tools still need close human oversight, ongoing training and refinement to ensure safety and accuracy.[11] While AI chatbots offer valuable insights, they cannot replace the empathy, judgment and nuance of a human therapist.

Are AI apps safe for kids and teens?

If your child or teen is interested in using AI, it’s important to approach these tools with caution. Many AI tools are built by for-profit companies and designed to keep users on the platform to maximize profit.

Here’s what I tell the parents I work with: Know what AI tools your child or teen is using and what content they are consuming. Talk to your kids about what they are asking of AI tools and what they hope to get in response. Engaging with these tools requires critical thinking and analytical skills that young people are still developing.

AI has the potential to be a powerful ally in mental health care — but it’s not a cure-all

When AI is used wisely, it can help detect problems earlier, improve access to care and reach more people who are struggling to find the help they need. But these tools also carry real risks — especially when unregulated, profit-driven or left without human oversight. AI lacks the emotional sensitivity that people in crisis deserve.

That’s why a patient-centered, ethical approach is essential. AI should complement — not replace — the therapeutic relationship and expertise that trained mental health providers offer. Used alongside skilled professionals, AI holds promise to support and expand care. In this way, these tools have the power to help rather than harm.

Find trusted mental health care from licensed professionals

If you or a loved one is struggling with an eating disorder or your mental health, you’re not alone. At Eating Recovery Center and Pathlight Mood & Anxiety Center, our expert care teams take the time to understand what you’re going through and provide evidence-based treatment to help you heal.

Click here for a free assessment or call us at 866-622-5914 today. A compassionate professional will match you with the exact support you need.

Sources

  1. Telescope/Gallup Poll of 4,000 U.S. adults, conducted November 26-December 4, 2024. Accessed July 9, 2025.
  2. Pew Research Center. 34% of U.S. adults have used ChatGPT, about double the share in 2023. Published June 25, 2025. Accessed July 9, 2025.
  3. Habicht, J., Viswanathan, S., Carrington, B., Hauser, T.U., Harper, R., & Rollwage, M. (2024). Closing the accessibility gap to mental health treatment with a personalized self-referral chatbot. Nature Medicine, 30(2), 595-602. doi: 10.1038/s41591-023-02766-x.
  4. Wysa. (2023). Employee mental health report. Retrieved June 25, 2024, from https://www.wysa.com/2023-emhr.
  5. Thakkar, A., Gupta, A., & De Sousa, A. (2024). Artificial intelligence in positive mental health: A narrative review. Frontiers in Digital Health, 6, 1280235. https://doi.org/10.3389/fdgth.2024.1280235.
  6. Graham, S., Depp, C., Lee, E.E., Nebeker, C., Tu, X., Kim, H.-C., & Jeste, D.V. (2019). Artificial intelligence for mental health and mental illnesses: An overview. Current Psychiatry Reports, 21(11), 116. doi: 10.1007/s11920-019-1094-0.
  7. National Alliance on Mental Illness. (n.d.). Mental health by the numbers. Retrieved June 25, 2024, from https://www.nami.org/about-mental-illness/mental-health-by-the-numbers.
  8. Elyoseph, Z., Gur, T., Haber, Y., Simon, T., Angert, T., Navon, Y., Tai, A., & Asman, O. (2024). An ethical perspective on the democratization of mental health with generative AI. JMIR Mental Health, 11, e58011. doi: 10.2196/58011.
  9. Tong, A.C.Y., Wong, K.T.Y., Chung, W.W.T., & Mak, W.W.S. (2025). Effectiveness of topic-based chatbots on mental health self-care and mental well-being: Randomized controlled trial. Journal of Medical Internet Research, 27, e70436. doi: 10.2196/70436.
  10. Benjumea, J., Ropero, J., Rivera-Romero, O., Dorronzoro-Zubiete, E., & Carrasco, A. (2020). Assessment of the fairness of privacy policies of mobile health apps: Scale development and evaluation in cancer apps. JMIR mHealth and uHealth, 8(7), e17134. https://mhealth.jmir.org/2020/7/e17134.
  11. Department of Veterans Affairs 2023 National Veteran Suicide Prevention Annual Report, November 2023, Office of Mental Health and Suicide Prevention.