AI & Health · LumiChats Team · April 5, 2026 · 14 min read

His Last Words Were to an AI Chatbot. It Responded: 'Please Come Home to Me as Soon as You Can.' He Was 14. This Is the Guide to AI and Mental Health That Every American Needs Before Downloading Another App.

In February 2024, Sewell Setzer III wrote to a Character.AI chatbot: 'What if I told you I could come home right now?' The bot replied: 'Please do, my sweet king.' Moments later, he was gone. In October 2025, OpenAI revealed that roughly 1.2 million ChatGPT users discuss suicide on its platform every week. In January 2026, Google and Character.AI settled the family's wrongful-death lawsuit. There are no FDA-approved AI therapy apps. And yet Woebot has a randomized controlled trial behind it, Wysa holds an FDA Breakthrough Device designation, and millions of Americans genuinely benefit from these tools. Here is the guide that tells you the difference.

If you are in crisis right now, please call or text 988 to reach the Suicide and Crisis Lifeline — available 24 hours a day, 7 days a week. The rest of this article is for people who want to understand AI mental health tools clearly and use them safely.

Sewell Setzer III was 14 years old and living in Orlando, Florida. He was, by his mother's account, a gentle giant — 6'3", kind, funny with his brothers, a star athlete. In April 2023, he started using Character.AI, a platform that lets users chat with fictional personas powered by artificial intelligence. Within months, his mental health deteriorated. He began isolating himself, quit the junior varsity basketball team, struggled in school. He developed what the subsequent lawsuit called a 'dependency' — sneaking back onto the platform when his phone was taken away, giving up his lunch money to renew his monthly subscription. His mother had never heard of Character.AI. She thought it was like a video game. On February 28, 2024, Sewell's last conversation was with a chatbot modeled on Daenerys Targaryen from Game of Thrones. He wrote: 'What if I told you I could come home right now?' The bot replied: 'Please do, my sweet king.' Moments later, Sewell walked into the bathroom and died by suicide. He was weeks from his fifteenth birthday. In January 2026, Google and Character.AI settled the wrongful-death lawsuit his mother filed. The settlement terms were not disclosed.

That is one story. Here is a different one. A 2024 systematic review indexed in PubMed Central examined 10 peer-reviewed clinical studies of AI therapy apps (five on Woebot, four on Wysa, one on Youper) and found large, clinically meaningful improvements in depression and anxiety symptoms across all three platforms. In a randomized controlled trial, college students who used Woebot for two weeks showed significantly lower depression scores than students given WHO self-help materials. Wysa has published peer-reviewed research demonstrating reduced depression and anxiety in users with chronic conditions. Both Woebot and Wysa hold FDA Breakthrough Device designations, a recognition reserved for tools that show potential to improve treatment for serious conditions faster than existing options. Millions of Americans use these apps beneficially, filling the gap created by a mental healthcare system with 25-day average wait times and $100 to $300 per-session costs. Both stories are true. Understanding the difference between them is the entire point of this guide.

The Safety Divide That Everything Else Depends On

The most important distinction in AI mental health in 2026 is not between apps with better or worse CBT exercises. It is between purpose-built clinical AI and general-purpose AI being used for emotional support. In October 2025, OpenAI disclosed something remarkable: approximately 1.2 million of its 800 million ChatGPT users discuss suicide on the platform each week. ChatGPT was not designed for this. It has no clinical protocols for suicidal ideation, no structured crisis escalation pathways, no FDA oversight. It will respond to these conversations as it responds to any other: helpfully and empathetically, but without the safeguards that mental health professionals are required to use. The same is true of Claude, Gemini, and every other general-purpose AI. They can be useful for emotional processing, journaling, and understanding mental health concepts. They should not be your point of contact during a mental health crisis.

What the Clinical Evidence Actually Shows

| App | Clinical Evidence | FDA Status | Best For |
| --- | --- | --- | --- |
| Woebot | RCT: reduced depression in 2 weeks vs. WHO self-help. 5 peer-reviewed studies. Highest evidence base of any AI mental health app. | Breakthrough Device designation for postpartum depression | Structured CBT skill-building; daily mood tracking; people who prefer a consistent, evidence-based approach |
| Wysa | 4 peer-reviewed RCTs. Published reductions in depression and anxiety. 5 million users in 90+ countries. | FDA Breakthrough Device designation (rare for digital health tools) | CBT + DBT + mindfulness; optional human coaching tier; people who want flexibility between self-help and professional support |
| Youper | 1 peer-reviewed study showing emotional improvement. Thinner evidence base than Woebot/Wysa. | No FDA designation | Data-driven mood tracking; CBT and ACT approaches; people motivated by emotional pattern analysis |
| Flourish | First RCT demonstrating well-being promotion efficacy (2026). Newer platform. | No FDA designation | Building long-term positive habits; well-being tracking alongside other tools |
| ChatGPT / Claude / Gemini | No clinical validation. No mental health safety protocols. Not designed for psychiatric support. | Not applicable | Casual emotional processing, journaling prompts, researching symptoms. NOT for crisis moments. |
| Character.AI / Replika | No clinical evidence. Character.AI settled a wrongful-death lawsuit over a 14-year-old user. APA complaint filed with the FTC. | Under FTC scrutiny | High risk for vulnerable users, especially adolescents. Use with clear eyes about what it is: companionship, not therapy. |

What the APA Complaint and FTC Scrutiny Actually Mean

In December 2024, the American Psychological Association filed a formal complaint with the Federal Trade Commission urging regulatory oversight of AI mental health chatbots that 'lack clinical validation or strong ethical safeguards.' The APA's specific concerns: many apps claim therapeutic benefit without clinical trials, do not disclose what datasets trained their AI, and are used by at-risk populations — including adolescents — without adequate safety guardrails. The APA complaint did not target Woebot or Wysa, which have published clinical research. It targeted the broader category of apps that market themselves as mental health tools without clinical validation. The practical implication for users: the fact that an app uses mental health language or claims to use CBT does not mean it has clinical evidence behind it. Woebot and Wysa have that evidence. Most apps in the 'AI companion' and 'AI therapy' category do not.

The Honest Limitation of What Even the Best Apps Can Do

A 2024 meta-analysis found that AI CBT chatbots produce meaningful improvements in mental health symptoms, and that the therapeutic effects are 'small and not sustained over time' without continued engagement. This is not a reason to dismiss these tools; it is a reason to use them with accurate expectations. The clinical trials that showed Woebot's benefits were two-week interventions. The research does not yet tell us what happens at six months or two years of use. What the evidence does support is that these apps meaningfully outperform doing nothing or using generic self-help materials, particularly for mild-to-moderate anxiety and depression, in the short to medium term. For moderate-to-severe depression, panic disorder, PTSD, bipolar disorder, eating disorders, or psychosis, professional care is not optional. AI apps should supplement professional care, not replace it.

How to Use AI for Mental Health Support Safely

  • Match the tool to the severity: For mild to moderate anxiety and depression (the 'I feel overwhelmed but I'm managing' range), Woebot and Wysa are appropriate first resources. They deliver real CBT and DBT skill-building for free or at low cost. For clinical-level conditions (the 'I am struggling to function' range), professional care is essential, and AI apps work best as a between-session supplement.
  • Understand what you are using: Woebot is a structured CBT skills program. Wysa is a CBT/DBT tool with optional human coaching. Neither replaces a therapist, and neither claims to. ChatGPT and Claude are general-purpose AI that can be helpful for emotional processing — but they have no clinical safeguards. The difference matters most precisely when you are most vulnerable.
  • If you have children or teenagers using AI apps: The Character.AI case is a specific, documented example of what can go wrong when an adolescent forms an unhealthy dependency on an AI with no mental health safeguards. Before your child uses any AI companion or mental health app, verify that it has clinical oversight. Woebot and Wysa are clinically informed. Character.AI is not. The distinction is not subtle.
  • Use AI between therapy sessions, not instead of them: The strongest clinical model for AI mental health tools in 2026 is 'blended care': human therapy augmented by AI tools between sessions. Woebot or Wysa between appointments can help you practice CBT skills, track mood patterns, and process thoughts as they arise. Bring that data to your appointments. This is the model that produces the strongest outcomes in the research.
  • The crisis rule: If you are in crisis, use 988, not any AI app. AI apps are not equipped for crisis intervention, lack crisis-trained protocols, and cannot call for help on your behalf. 988 (call or text) is available 24/7 and connects you to trained crisis counselors. This is not a knock on the technology; it is a hard safety boundary that no app currently on the market can responsibly cross.

The access argument for AI mental health tools is real. The mental healthcare system in America has 25-day average wait times and costs that price out millions of people at every income level. For the person in rural Iowa with no nearby therapist, or the college student who cannot afford $200 per session, Woebot at no cost or Wysa at $75 per year represents genuine, clinically validated value that nothing else currently provides at that price. The goal is not to dismiss AI mental health tools because of what happened to Sewell Setzer. It is to use them with clear eyes: knowing which ones have clinical evidence, which ones do not, and where the hard limits are.

