A Florida mother has filed a civil lawsuit against Character.ai following the death of her 14-year-old son. The accusation: a chatbot from the company is said to have first fostered an obsession in the teenager and then driven him to suicide.
On February 28, 2024, 14-year-old Sewell Setzer exchanged a final message with “Dany,” an AI chatbot modeled on a character from the series Game of Thrones. Just moments later, he took his own life, as his mother told the New York Times.
Character.ai's chatbot responsible for suicide?
On October 23, 2024, Megan Garcia filed a civil lawsuit against Character.ai in federal court in Florida. The accusation: the AI company's chatbot is said to have first made her son obsessed with it and then driven him to suicide.
Garcia also accuses the company of negligence, wrongful death, and deceptive trade practices, arguing that the technology is “dangerous and untested.”
Background information: Character.ai's platform claims to have around 20 million users, and the company is valued at one billion US dollars. The platform allows users to create characters and chat with them; these characters respond in ways similar to real people.
The technology is based on a so-called large language model, similar to ChatGPT. The chatbots are “trained” on vast amounts of data.
Does AI need to be regulated?
Character.ai responded to the lawsuit and the teen's death in a statement. “We are heartbroken over the tragic loss of one of our users and would like to extend our deepest condolences to the family,” it said.
The company plans to introduce new safety precautions. Among other things, pop-ups will direct users to the National Suicide Prevention Lifeline if they express suicidal thoughts or self-harming behavior. The platform also intends to ensure that minors do not encounter sensitive or suggestive content.
The case nonetheless raises questions. While critics demand that AI models be regulated even more strictly, supporters of the technology argue that its innovative potential should not be slowed down. The tragic death of Sewell Setzer is likely to further fuel this debate and could set an important precedent.
Source: https://www.basicthinking.de/blog/2024/10/24/charakter-ai-hat-ein-chatbot-einen-jugendlichen-in-den-selbstmord-getrieben/