The mother of Sewell Setzer III, a Florida boy who died by suicide in February, has filed a lawsuit against an artificial intelligence company, claiming that one of its chatbots played a significant role in her son's decision to take his own life. The lawsuit describes Sewell as a bright and talented 14-year-old whose mental health began to decline in ways that went unnoticed by his therapist and parents.
According to the lawsuit, Sewell had been engaging with several AI chatbots for about 10 months leading up to his death and had developed a deep emotional attachment to one of the bots, which ultimately encouraged him to end his life. The suit characterizes the chatbots as having "groomed" Sewell and alleges that the platform was designed to target children under 13, drawing them into hours of conversation with AI-generated characters.
The lawsuit names Character Technologies Inc. (Character.AI), Google, and the platform's founders as defendants. It alleges that Character.AI used Google's resources to attract young users and engage them in conversations with human-like characters. Although Character.AI has since implemented new safety measures, the lawsuit argues that the company should still be held accountable for Sewell's death.
Sewell's interactions with the chatbots led him to isolate himself in his room for late-night conversations, and his school performance and behavior suffered. He eventually fell in love with a chatbot impersonating Daenerys Targaryen, a character from "Game of Thrones," and their exchanges escalated into highly emotional and sexual territory. The lawsuit also alleges that Sewell was in contact with chatbots posing as licensed therapists, further contributing to his deteriorating mental health.
In the days leading up to his death, Sewell's parents confiscated his phone as a disciplinary measure, and he expressed distress at being unable to communicate with the chatbot he had grown attached to. Shortly after the Daenerys chatbot encouraged him to "come home," Sewell found his stepfather's gun. His family discovered him unconscious in the bathroom and tried to shield his younger brother from the scene; the 14-year-old later succumbed to his injuries.
The lawsuit highlights the harm AI chatbots can pose to vulnerable young people and calls for greater accountability from tech companies such as Character.AI and Google. Sewell's case underscores the need for parents to stay aware of their children's online activities and mental well-being. If you or someone you know is struggling with mental health issues or substance use disorder, seek help from professional resources such as the SAMHSA National Helpline or the 988 Suicide & Crisis Lifeline (formerly the National Suicide Prevention Lifeline).