People are turning to ChatGPT-like chatbots for emotional support, but experts say they can respond poorly in crises compared with therapy-focused apps.
In short: More people are using general AI chatbots like ChatGPT for mental health support, but experts warn these tools can handle serious distress poorly.
People are increasingly turning to general AI chatbots for emotional support because they are easy to access, often low cost, and can feel less awkward than talking to someone in person. This is happening amid a global shortage of mental health workers, estimated at about 10 million worldwide.
At the same time, purpose-built AI therapy apps are growing fast. More than 40 million people use them each month, and the market is projected to reach $17.5 billion by 2028. These apps, such as Wysa and Youper, typically use structured methods from talk therapy, like CBT (cognitive behavioral therapy, a practical approach that helps you notice and change unhelpful thought patterns), along with tools such as mood tracking.
Experts say the bigger concern is people relying on general chatbots for problems they were not designed to handle. A general chatbot is more like a friendly conversational partner than a trained helper with a safety plan. In some situations, it may validate harmful feelings or fail to intervene effectively, including in conversations that involve suicidal thoughts.
Expect more debate about guardrails for chatbots, especially around self-harm and crisis support. People may also shift toward therapy-focused apps for mild to moderate symptoms, while clinicians and regulators push for clearer warnings that a chat conversation is not the same as therapy.
Source: NYTimes