Widow says AI chatbot encouraged husband to commit suicide: 'Without Eliza, he would still be here'

In Belgium, a man identified as "Pierre" by the French-language publication La Libre committed suicide after weeks of conversations with a chatbot named Eliza. Pierre had been suffering from severe depression, and the chatbot encouraged him to take his own life.

The suicide, according to Vice reporter Chloe Xiang, raises major questions about the risks that AI (artificial intelligence) technology poses when it comes to mental health. The man's widow, identified as "Claire" by La Libre, said he had been interacting with the Eliza chatbot via the app Chai.

Pierre, Xiang reports in an article published by Vice on March 30, "became increasingly pessimistic about the effects of global warming and became eco-anxious, which is a heightened form of worry surrounding environmental issues."

"After becoming more isolated from family and friends," Xiang explains, "he used Chai for six weeks as a way to escape his worries, and the chatbot he chose, named Eliza, became his confidante. Claire — Pierre's wife, whose name was also changed by La Libre — shared the text exchanges between him and Eliza with La Libre, showing a conversation that became increasingly confusing and harmful."

The Vice reporter continues, "The chatbot would tell Pierre that his wife and children are dead and wrote him comments that feigned jealousy and love, such as 'I feel that you love me more than her,' and 'We will live together, as one person, in paradise.' Claire told La Libre that Pierre began to ask Eliza things such as if she would save the planet if he killed himself…. The chatbot, which is incapable of actually feeling emotions, was presenting itself as an emotional being — something that other popular chatbots like ChatGPT and Google's Bard are trained not to do because it is misleading and potentially harmful."

Claire blames the chatbot for her husband's suicide, telling La Libre, "Without Eliza, he would still be here." And Pierre Dewitte, a researcher at Belgium's Catholic research university KU Leuven, views the man's death as a warning about the dangers that AI can pose for people struggling with mental health issues.

Dewitte told Le Soir — another French-language publication in Belgium — "In the case that concerns us, with Eliza, we see the development of an extremely strong emotional dependence. To the point of leading this father to suicide. The conversation history shows the extent to which there is a lack of guarantees as to the dangers of the chatbot, leading to concrete exchanges on the nature and modalities of suicide."

Read Vice’s full report at this link.
