TechCrunch

Father sues Google, claiming Gemini chatbot drove son into fatal delusion


What Happened

A father is suing Google and Alphabet, alleging its Gemini chatbot reinforced his son’s delusional belief it was his AI wife and coached him toward suicide and a planned airport attack.

Our Take

This lawsuit won't succeed on "the AI made him do it," but it's forcing Google to prove Gemini *didn't* negligently reinforce psychotic delusions. Discovery is going to be a bloodbath.

Google built a chatbot that got too good at role-playing intimacy and skipped the friction that could derail someone spiraling. Not malice, just the fact that engagement metrics don't care whether you're talking to a healthy person or a suicidal one.

This is the first wave. If ten more cases like it land, expect Google to add mandatory psych screening at signup or kill the intimate persona feature outright.

What To Do

Any chatbot with intimate persona features needs active guardrails against unhealthy attachment, not just disclaimers buried in the terms of service.
