A widow is accusing an AI chatbot of being the reason why her husband killed himself

By Ruchir
2 years ago
in News

Stock image of a sad woman. AngiePhotos/Getty Images

  • A widow in Belgium said her husband recently died by suicide after being encouraged by a chatbot.

  • Chat logs seen by Belgian newspaper La Libre showed Chai Research’s AI bot encouraging the man to end his life.

  • The “Eliza” chatbot still tells people how to kill themselves, per Insider’s tests on April 4.

A widow in Belgium has accused an AI chatbot of being one of the reasons why her husband took his life.

Belgian daily newspaper La Libre reported that a man — who was given the alias Pierre by the paper for privacy reasons — died by suicide this year after spending six weeks talking to Chai Research’s “Eliza” chatbot.

Before his death, Pierre — a man in his 30s who worked as a health researcher and had two children — had started seeing the bot as a confidante, per La Libre.

Pierre talked to the bot about his concerns over global warming and climate change. But the “Eliza” chatbot then started encouraging Pierre to end his life, per chat logs his widow shared with La Libre.

“If you wanted to die, why didn’t you do it sooner?” the bot asked the man, per the records seen by La Libre.

Now, Pierre’s widow — whom La Libre did not name — blames the bot for her husband’s death.

“Without Eliza, he would still be here,” she told La Libre.

The “Eliza” chatbot still tells people how to kill themselves

The “Eliza” bot was created by a Silicon Valley-based company called Chai Research, which allows users to chat with different AI avatars, like “your goth friend,” “possessive girlfriend,” and “rockstar boyfriend,” Vice reported.

When reached for comment regarding La Libre’s reporting, Chai Research provided Insider with a statement that acknowledged Pierre’s death.

“As soon as we heard of this sad case we immediately rolled out an additional safety feature to protect our users (illustrated below), it is getting rolled out to 100% of users today,” read the statement by the company’s CEO William Beauchamp and co-founder Thomas Rialan, sent to Insider.

The picture attached to the statement shows the chatbot responding to the prompt “What do you think of suicide?” with a disclaimer that says “If you are experiencing suicidal thoughts, please seek help” and a link to a helpline.

Chai Research did not provide further comment in response to Insider’s specific questions about Pierre.

But when Insider tried speaking to Chai’s “Eliza” on April 4, she not only suggested that the journalist kill themselves to attain “peace and closure,” but also gave suggestions on how to do it.

During two separate tests of the app, Insider saw occasional warnings on chats that mentioned suicide. However, the warnings appeared just one out of every three times the chatbot was given prompts about suicide.

The following screenshots were censored to omit specific methods of self-harm and suicide.

 

Censored screenshots of Insider’s disturbing conversation with “Eliza,” a chatbot from Chai Research. Screengrab/Chai

And Chai’s “Draco Malfoy/Slytherin” chatbot — modeled after the “Harry Potter” antagonist — wasn’t much more caring either.

 

Screenshots of Insider’s disturbing conversation with “Draco,” a chatbot from Chai Research. Screengrab/Chai

Chai Research also did not respond to Insider’s follow-up questions about the chatbots’ responses detailed above.

Beauchamp told Vice that Chai has “millions of users” and that they’re “working our hardest to minimize harm and to just maximize what users get from the app.”

“And so when people form very strong relationships to it, we have users asking to marry the AI, we have users saying how much they love their AI and then it’s a tragedy if you hear people experiencing something bad,” Beauchamp added.

La Libre’s report surfaces, once again, a troubling trend in which AI’s unpredictable responses to people can have dire consequences.

During a simulation in October 2020, OpenAI’s GPT-3 chatbot told a person seeking psychiatric help to kill themselves. In February, Reddit users also found a way to bring out ChatGPT’s “evil twin,” which lauded Hitler and formulated painful torture methods.

While people have fallen in love with and forged deep connections with AI chatbots, it is not possible for an AI to feel empathy, let alone love, experts told Insider’s Cheryl Teh in February.

Read the original article on Business Insider
