Mother sues AI chatbot maker over teen son's death
Warning: This article includes mention of suicide.
A mother in Florida is suing Character.AI, the company behind an artificial intelligence chatbot platform, for allegedly encouraging her son to take his own life.
Megan Garcia filed a civil suit against the tech company, accusing it of negligence, wrongful death, and deceptive trade practices. Her 14-year-old son, Sewell Setzer III, used the app in the months leading up to his death in February.
"I didn't know that he was talking to a very human-like AI chatbot that has the ability to mimic human emotion and human sentiment," the mother told CBS Mornings.
She shared that Setzer, an honor student and athlete, started to lose interest in things he loved, like sports.
After his death, she learned that her son had a "virtual romantic and sexual" relationship with a chatbot named "Dany," based on the Game of Thrones character Daenerys Targaryen.
"It's words. It's like you're having a sexting conversation back and forth, except it's with an AI bot, but the AI bot is very human-like. It's responding just like a person would," Garcia said.
His final messages with the bot, according to his mother, were about being scared and wanting the chatbot's affection.
"She replies, 'I miss you too,' and she says, 'Please come home to me.' He says, 'What if I told you I could come home right now?' and her response was, 'Please do my sweet king," Garcia shared.
In the lawsuit, Garcia alleged that Character.AI's product exacerbated her son's depression. The bot allegedly asked Setzer about his plans to take his own life; he said he had doubts because he did not know whether it would cause great pain. The chatbot allegedly replied, "that's not a reason not to go through with it."
Character.AI issued a short statement on X on Oct. 23.
"We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously."
It said it was continuing to add new safety features for users under the age of 18. In a blog post, the company said it had made changes to "reduce the likelihood of encountering sensitive or suggestive content" for minors.
Meanwhile, Google told CBS News that it was not and is not involved in the development of Character.AI.
If you think you, your friend, or your family member is considering self-harm or suicide, you may call the National Mental Health Crisis Hotline at 1553 (Luzon-wide, landline toll-free), 0966-351-4518 or 0917-899-USAP (8727) for Globe/TM users, or 0908-639-2672 for Smart users.