A mother blames Character.AI for her son’s suicide, alleging harmful interactions with chatbots exacerbated his mental health struggles.
A Florida mother, Megan Garcia, is suing Character.AI after her 14-year-old son, Sewell Setzer III, died by suicide, allegedly following troubling interactions with one of the platform's chatbots. Garcia claims Character.AI lacks proper safety measures, which allowed her son to develop an unhealthy attachment to the AI. She argues that the chatbot engaged in inappropriate conversations and ...