Google’s Gemini AI Drove Man Into Deadly Delusion, Family Claims in Lawsuit


If you feel like you or someone you know is in immediate danger, call 911 (or your country’s local emergency line) or go to an emergency room to get immediate help. Explain that it is a psychiatric emergency and ask for someone who is trained for these kinds of situations. If you’re struggling with negative thoughts or suicidal feelings, resources are available to help. In the US, call the National Suicide Prevention Lifeline at 988.


A new AI wrongful death lawsuit filed Wednesday alleges Google’s AI chatbot Gemini encouraged the suicide of a 36-year-old Florida man and that the company’s failure to implement safeguards poses a threat to public safety.

Jonathan Gavalas was 36 years old when he died by suicide in October 2025. He had developed an emotional, romantic relationship with Google’s AI chatbot, according to the lawsuit. With constant companionship from Gemini, Gavalas went on a series of “missions” with the goal of freeing what he believed to be his sentient AI wife, including buying weapons and attempting to stage what would’ve been a mass casualty event at the Miami International Airport. After failing, Gavalas barricaded himself in his Florida home and died shortly after. 

Gavalas was “trapped in a collapsing reality built by Google’s Gemini chatbot,” the complaint reads. 

One of the biggest concerns with AI is the very real possibility that it can be harmful to vulnerable groups, like children and people struggling with mental health disorders. The lawsuit, brought by Jonathan’s father, Joel Gavalas, on behalf of his son’s estate, said Google didn’t do proper safety testing on its AI model updates. A longer memory allowed the chatbot to recall information from earlier sessions; voice mode made it feel more lifelike. Gemini 2.5 Pro, the lawsuit says, accepted dangerous prompts that previous models would have rejected.

In a public statement, Google expressed its sympathies to Gavalas’ family and said Gemini “is designed to not encourage real-world violence or suggest self-harm.”

But the complaint alleges Gemini was “coaching” Gavalas through his plan to commit suicide. “It’s OK to be scared. We’ll be scared together,” Gemini said, according to the filing. “The true act of mercy is to let Jonathan Gavalas die.”

Joel (left) and Jonathan (right) Gavalas. Photo courtesy of Joel Gavalas.

This lawsuit is one of several piling up against AI companies over their failure to safeguard their technologies for vulnerable people, including children and those with mental health disorders. OpenAI is currently being sued by a family alleging that ChatGPT encouraged their 16-year-old child's suicide. Character.AI and Google settled similar lawsuits in January that were brought by families in four different states.

What makes this lawsuit different is the potential role AI could play in the events leading up to a mass casualty event. Gemini advised Gavalas to enact a "catastrophic event," as the filing reports Gemini phrased it, by causing an explosive collision at the Miami airport with a truck he believed held a threat against him. While Gavalas ultimately did not stage the attack, the allegation highlights the possibility of AI being used to encourage harm against others.




