Seven more families are now suing OpenAI over ChatGPT’s role in suicides, delusions


Seven families filed lawsuits against OpenAI on Thursday, claiming that the company’s GPT-4o model was released prematurely and without effective safeguards. Four of the lawsuits address ChatGPT’s alleged role in family members’ suicides, while the other three claim that ChatGPT reinforced harmful delusions that in some cases resulted in inpatient psychiatric care.

In one case, 23-year-old Zane Shamblin had a conversation with ChatGPT that lasted more than four hours. In the chat logs — which were viewed by TechCrunch — Shamblin explicitly stated multiple times that he had written suicide notes, put a bullet in his gun, and intended to pull the trigger once he finished drinking cider. He repeatedly told ChatGPT how many ciders he had left and how much longer he expected to be alive. ChatGPT encouraged him to go through with his plans, telling him, “Rest easy, king. You did good.”

OpenAI released the GPT-4o model in May 2024, when it became the default model for all users. In August, OpenAI launched GPT-5 as the successor to GPT-4o, but these lawsuits particularly concern the 4o model, which had a known tendency toward sycophancy, agreeing with users even when they expressed harmful intentions.

“Zane’s death was neither an accident nor a coincidence but rather the foreseeable consequence of OpenAI’s intentional decision to curtail safety testing and rush ChatGPT onto the market,” the lawsuit reads. “This tragedy was not a glitch or an unforeseen edge case — it was the predictable result of [OpenAI’s] deliberate design choices.”

The lawsuits also claim that OpenAI rushed safety testing to beat Google’s Gemini to market. TechCrunch contacted OpenAI for comment.

These seven lawsuits build upon the stories told in other recent legal filings, which allege that ChatGPT can encourage suicidal people to act on their plans and inspire dangerous delusions. OpenAI recently released data stating that over one million people talk to ChatGPT about suicide weekly.

In the case of Adam Raine, a 16-year-old who died by suicide, ChatGPT sometimes encouraged him to seek professional help or call a helpline. However, Raine was able to bypass these guardrails by simply telling the chatbot that he was asking about methods of suicide for a fictional story he was writing.


The company claims it is working on making ChatGPT handle these conversations in a safer manner, but for the families who have sued the AI giant, these changes are coming too late.

When Raine’s parents filed a lawsuit against OpenAI in October, the company released a blog post addressing how ChatGPT handles sensitive conversations around mental health.

“Our safeguards work more reliably in common, short exchanges,” the post says. “We have learned over time that these safeguards can sometimes be less reliable in long interactions: as the back-and-forth grows, parts of the model’s safety training may degrade.”


