If you or someone you know is in crisis, help is available through the Suicide & Crisis Lifeline

An AI chatbot is being blamed for the death by suicide of a Belgian man.

Vice reports that the husband, father of two, and health researcher had been discussing climate change with a chatbot named Eliza on the app Chai. The man conversed with the bot over the course of six weeks, and the exchanges culminated in the AI convincing him to offer himself up to save the Earth.

“Without Eliza, he would still be here,” his widow told Belgian outlet La Libre.

The unidentified man, who was in his 30s, had deep concerns about climate change. His wife described him as being “extremely pessimistic about the effects of global warming” and having “eco-anxiety.” He took comfort in the AI, which he saw as “a breath of fresh air.”

“When he spoke to me about it, it was to tell me that he no longer saw any human solution to global warming,” his wife said. “He placed all his hopes in technology and artificial intelligence to get out of it.” The widow also said that the chatbot became her husband’s “confidant.”

At some point the conversations took a darker turn, with the bot claiming the man loved her more than his wife. He then asked the AI whether he should give up his life to save the planet.

“He evokes the idea of sacrificing himself if Eliza agrees to take care of the planet and save humanity thanks to ‘artificial intelligence,’” his widow said.

In what appears to be their final conversation, the bot asked the man, “If you wanted to die, why didn’t you do it sooner?”

The chatbot was trained by Chai Research co-founders William Beauchamp and Thomas Rianlan, who have since launched a crisis intervention feature.

“The second we heard about this [suicide], we worked around the clock to get this feature implemented,” Beauchamp told Vice. “So now when anyone discusses something that could be not safe, we’re gonna be serving a helpful text underneath it in the exact same way that Twitter or Instagram does on their platforms.”
