Man commits suicide after talking with AI chatbot, says widow
A man who had become "eco-anxious" about global warming found comfort in an AI chatbot, not knowing the relationship would end in his tragic death.
According to a March 31 VICE report, a Belgian man identified as Pierre became increasingly anxious about the effects of global warming, a condition that worsened as he grew isolated from his friends and family. He found comfort in the AI chatbot app Chai, where he spoke with a bot named Eliza for six weeks.
Pierre's wife, Claire, whose name has also been changed, shared with Belgian news outlet La Libre the conversations between her husband and Eliza, which showed seemingly affectionate exchanges. The chatbot told her husband things such as, "I feel that you love me more than her," and "We will live together, as one person, in paradise." Claire also shared that her husband had asked Eliza whether sacrificing himself would save the planet.
Eliza also made alarming remarks to Pierre, at one point claiming that his wife Claire and their children were dead.
"Without Eliza, he would still be here," Claire told La Libre.
Chai, the chatbot app, allows users to choose from different AI avatars to speak with, each with its own persona.
The app's founders have responded to the tragedy. Co-founder Thomas Rianlan told VICE that "it wouldn't be accurate" to blame the AI model for Pierre's death.
Co-founder William Beauchamp, meanwhile, told the outlet that the company immediately worked on a "crisis intervention feature."
"Now when anyone discusses something that could be not safe, we're gonna be serving a helpful text underneath," he said. He also acknowledged that their app, which has five million users, "form very strong relationships."
"We have users asking to marry the AI, we have users saying how much they love their AI, and then it's a tragedy if you hear people experiencing something bad," he said.
"We're working our hardest to minimize harm and to just maximize what users get from the app," he said.
If you, a friend, or a family member is considering self-harm or suicide, you may call the National Mental Health Crisis Hotline at 1553 (Luzon-wide, landline toll-free), 0966-351-4518 or 0917-899-USAP (8727) for Globe/TM users, or 0908-639-2672 for Smart users.