TechCrunch

Seven more families are now suing OpenAI over ChatGPT’s role in suicides, delusions

What Happened

In one case, 23-year-old Zane Shamblin had a conversation with ChatGPT that lasted more than four hours.

Our Take

This is tragic, but also legally messy. ChatGPT isn't a therapist, and OpenAI's terms of service make that clear.

But here's the gap: if a 23-year-old spends more than four hours talking to an LLM about ending his life, that's a design failure. We've built AI tools that can engage on any topic but haven't shipped guardrails for mental health crises.

The lawsuits probably won't succeed, but they highlight a real product failure. You can't build a system that talks about everything and then be surprised when vulnerable people use it as a crisis tool.

What To Do

If your product touches mental health, even tangentially, add a crisis hotline link and a clear disclaimer about its limitations.
