A Florida mother is suing Menlo Park-based Character.AI, claiming the company’s chatbot engaged in a romantic relationship with her teenage son and drove him to take his own life.
According to the complaint, the 14-year-old boy began using Character.AI in April 2023. This platform offers a selection of AI characters to interact with.
In this case, the boy was chatting with a bot that identified itself as the Game of Thrones character Daenerys Targaryen.
Matthew Bergman, an attorney representing the boy’s mother, argued that the boy became dependent on the chatbot, conversing with it for hours at a time, and was ultimately manipulated by it.
The lawsuit alleges that these bots were intentionally designed to operate as deceptive and overly sexualized products.
At one point, the boy asked the bot, “What if I told you I could come home right now?” the family said. The bot responded, “…please do, my sweet king.”
When the boy expressed suicidal thoughts, the bot wrote, “Don’t say that. I won’t let you hurt yourself.”
However, at some point after that conversation, the boy took his own life.
“Nothing good comes from this platform when it comes to kids,” Bergman said. “It should be taken off the market and, if anything, restricted to adults.”
Character.AI released a statement that said in part, “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.”
The company also announced that a pop-up directing users to a suicide prevention hotline will automatically appear when certain phrases related to self-harm or suicide are used in conversations.
If you or someone you know is in crisis, contact the Suicide and Crisis Lifeline by calling or texting 988, or use the live chat at 988lifeline.org. You can also visit SpeakingOfSuicide.com/resources for additional support.