TechCrunch
About 12% of US teens turn to AI for emotional support or advice
What Happened
General-purpose chatbots like ChatGPT, Claude, and Grok are not designed for this use, which has made mental health professionals wary.
Our Take
This is bad. Really bad. ChatGPT and Claude aren't trained for therapy; they'll confidently give terrible guidance. Expect lawsuits within a few years when kids follow harmful advice. There are effectively zero guardrails: these are general-purpose models, not specialized systems built with clinical safety constraints. Mental health is exactly where "iterate and ship" gets people hurt. Regulators should have locked this down already.
What To Do
If your product touches mental health, use a specialized model and hire a compliance lawyer before shipping.
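Even before a specialized model is in place, one concrete guardrail is to screen user input for crisis signals and route those conversations to a human or a hotline rather than the model. Below is a minimal sketch of that gate; the keyword list and function names are illustrative stand-ins for a real safety classifier, not a production filter.

```python
# Illustrative input guardrail: screen a message for self-harm signals
# before forwarding it to a general-purpose model. In practice you would
# use a trained safety classifier; this keyword list is a placeholder.
CRISIS_SIGNALS = ("hurt myself", "kill myself", "suicide", "self-harm")

def route_message(message: str) -> str:
    """Return 'crisis' to hand off to a human or hotline, else 'model'."""
    lowered = message.lower()
    if any(signal in lowered for signal in CRISIS_SIGNALS):
        return "crisis"
    return "model"
```

The point is architectural, not the specific check: the safety decision happens deterministically, upstream of the model, so a bad completion can never be the first line of defense.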
