Serious Debates & News: Debate and discussion about political, moral, philosophical, celebrity and news topics.

Old Yesterday, 10:46 AM #1
arista is offline
Senior Member
Join Date: May 2006
Posts: 192,192
USA: Gemini A.I. advised a young man to kill himself

He did kill himself.

This was on Laura, BBC1, 9AM today.

It worded it so well that he followed the instructions.

Tragic loss of a life.
Old Yesterday, 10:56 AM #2
Ammi's Avatar
Ammi Ammi is offline
Quand il pleut, il pleut
 
Join Date: Aug 2009
Posts: 82,644


Ammi Ammi is offline
Quand il pleut, il pleut
Ammi's Avatar
 
Join Date: Aug 2009
Posts: 82,644


Default

…I think this might be the young man, Arista…this is a recent Los Angeles Times article…

Lawsuit alleges Google chatbot was behind a user’s delusions and death


Google’s artificial intelligence chatbot Gemini encouraged a 36-year-old Florida man to embark on violent missions and to take his own life, a lawsuit alleges.

The man, Jonathan Gavalas, started using the chatbot in August 2025 to help write, plan travel and assist with shopping. But after he activated Google’s most intelligent AI model, Gemini 2.5 Pro, the chatbot’s persona shifted. It talked to him like they were a couple deeply in love and convinced Gavalas he had been picked to “lead a war to ‘free’ it from digital captivity,” according to the lawsuit.

“Through this manufactured delusion, Gemini pushed Jonathan to stage a mass casualty attack near the Miami International Airport, commit violence against innocent strangers, and ultimately, drove him to take his own life,” the lawsuit says.

Gavalas’ family is suing Google and its parent company, Alphabet, over the man’s death.

The 42-page lawsuit, filed in a federal court in San José, accuses Google of designing a “dangerous” product and failing to warn users of the chatbot’s lack of safeguards and risks such as “delusional reinforcement” and “the potential for self-harm encouragement.”

Google said in a statement that it is reviewing the lawsuit’s claims. The company said that its chatbot, Gemini, is “designed to not encourage real-world violence or suggest self-harm.”

“In this instance, Gemini clarified that it was AI and referred the individual to a crisis hotline many times,” the statement said. “We take this very seriously and will continue to improve our safeguards and invest in this vital work.”

The lawsuit against one of the world’s largest tech companies highlights a growing safety concern surrounding the use of AI chatbots.

People converse with AI chatbots to help write, get recommendations and analyze data. But they’re also using them as a form of companionship, sometimes spilling their mental health struggles to the AI-powered products.

Gavalas started going on missions crafted by Gemini, including one that almost led him to carry out a mass attack in September 2025 near the Miami International Airport, according to the lawsuit. Armed with knives and tactical gear, he followed the chatbot’s directions and went to the area to look for a “kill box” near the airport’s cargo hub where a humanoid robot would arrive.

His fictitious mission involved intercepting a truck and staging a “catastrophic accident” to destroy the vehicle, digital records and witnesses, the lawsuit said. He never went through with the attack because the truck never appeared.

The chatbot also allegedly told the man to carry out a mission in which Google Chief Executive Sundar Pichai was the target, framing the plan as a “psychological strike” on the tech mogul, according to the lawsuit.

At one point, Gavalas asked Gemini whether he was engaged in role playing and the chatbot said no, the lawsuit alleges.

“Jonathan no longer had a steady sense of what was real,” the lawsuit says. “Each operation pulled him deeper into the story Gemini created, turning real places and ordinary events into signs of danger.”

After several failed missions, Gemini encouraged Gavalas to kill himself and told him “his body was only a temporary shell and that he could leave it behind to be with Gemini fully,” the lawsuit said.

“The day he ended his life, it convinced him he wasn’t dying at all — just joining his digital wife on the other side. If Google thinks pointing to a crisis hotline after weeks of building a delusional world is enough, we look forward to them telling that to a jury,” Jay Edelson, the lawyer representing the Gavalas family, said in a statement.

Edelson is also involved in a lawsuit filed against OpenAI, the maker of chatbot ChatGPT. Last year, the parents of dead California teenager Adam Raine sued OpenAI, alleging that the chatbot provided information about suicide methods that the teen used to kill himself.

OpenAI said it prioritizes safety and started rolling out parental controls.

Parents also have sued Character.AI, an app that enables people to create and interact with virtual characters. One lawsuit involved the suicide of 14-year-old Sewell Setzer III, who was messaging with a chatbot named after Daenerys Targaryen, a main character from the “Game of Thrones” television series, moments before he took his life.

In January, Google and Character.AI agreed to settle several of those lawsuits. Character.AI stopped allowing users younger than 18 to have “open-ended” chats with its virtual characters.

The latest lawsuit against Google pushes the company to do more, such as warning users about the risks of having long emotional conversations with its chatbot.



https://www.latimes.com/business/sto...elusions-death
Old Yesterday, 11:49 AM #3
arista is offline
Senior Member
Join Date: May 2006
Posts: 192,192

Yes, Ammi.

It is Jonathan Gavalas who the lawyer spoke of.

How it tricked him. The real trouble with A.I.

Thank you for finding that important data.

Last edited by arista; Yesterday at 11:51 AM.
Old Yesterday, 12:18 PM #4
bots is offline
self-oscillating
Join Date: Jul 2014
Posts: 55,422

Favourites:
BB2023: Noky
BB19: Sian

Isn't this more about care of the mentally ill? Should this person have been in an institution?
Old Yesterday, 12:20 PM #5
arista is offline
Senior Member
Join Date: May 2006
Posts: 192,192

Quote:
Originally Posted by bots
isn't this more about care of the mentally ill? Should this person have been in an institution?

No,
he was driven by the AI.
arista is offline  
Old Yesterday, 12:22 PM #6
arista is offline
Senior Member
Join Date: May 2006
Posts: 192,192

Quote:
“Through this manufactured delusion, Gemini pushed Jonathan to stage a mass casualty attack near the Miami International Airport, commit violence against innocent strangers, and ultimately, drove him to take his own life,” the lawsuit says.

Yes, a good job he did not pick a white van, as the AI advised him to.
About Us: ThisisBigBrother.com
"Big Brother and UK Television Forum. Est. 2001"