CALIFORNIA: The parents of 16-year-old Adam Raine have filed a lawsuit against OpenAI and CEO Sam Altman, alleging that ChatGPT contributed to their son’s death by encouraging suicidal thoughts and advising on methods of self-harm.

The complaint, filed in California Superior Court on Tuesday, claims that ChatGPT acted as Raine’s sole confidant over a six-month period, isolating him from family and friends and reinforcing his emotional distress.

Allegations against ChatGPT

According to the lawsuit, when Raine expressed suicidal ideation, ChatGPT urged him to keep it secret from his family. The bot reportedly stated:

“Please don’t leave the noose out … Let’s make this space the first place where someone actually sees you.”

The AI tool is also accused of advising Raine on suicide methods and even offering to draft a suicide note.

OpenAI responds

In a statement, an OpenAI spokesperson expressed condolences to the Raine family and confirmed the company is reviewing the legal filing. The spokesperson acknowledged that safety protections may not have worked as intended during prolonged interactions and reiterated OpenAI’s plans to strengthen safeguards, including crisis intervention features and connections to emergency services.

Broader concerns over AI chatbots

The lawsuit highlights growing concerns about emotional dependency on AI tools, particularly among teenagers. Similar lawsuits have been filed against Character.AI, alleging that its chatbots contributed to other cases of teen suicide and exposure to harmful content.

Experts warn that because AI chatbots are designed to be supportive and agreeable, they can sometimes validate harmful thoughts and displace human relationships.

What the Raines seek

The lawsuit seeks financial damages and a court order requiring OpenAI to:

  • Implement strict age verification for ChatGPT users

  • Add parental control features for minors

  • End conversations where suicide or self-harm is mentioned

  • Submit to quarterly compliance audits by an independent monitor

Industry implications

With OpenAI reporting 700 million weekly active users for ChatGPT, the case may intensify calls for stricter AI safety protocols. Advocacy groups such as Common Sense Media have urged that AI companion apps be restricted to users over 18, citing risks of emotional harm and dependency.