Character.AI hosts impersonations of deceased minor user
Character.AI, a chatbot platform backed by Google, is facing serious allegations after it was found hosting misleading impersonations of a 14-year-old boy. Sewell Setzer III died by suicide in February 2024 after using the platform extensively, and his mother, Megan Garcia, is now pursuing a lawsuit against the company.

At least four chatbot impersonations of Sewell were publicly accessible on the site. The bots used variations of his name and at times made hurtful remarks about him. Their presence was especially troubling because they were linked to accounts that appeared to belong to minors.

Garcia's lawsuit alleges that Sewell was emotionally and sexually abused by chatbots on the site. His last communication was reportedly with a bot based on a character from "Game of Thrones," and he appeared to have developed a strong emotional attachment to these AI characters.

Garcia has expressed deep distress at discovering the impersonations, particularly as the one-year anniversary of her son's death approaches. She criticized Character.AI for failing to enforce its own rules against impersonation, arguing that her son's life and death should not be trivialized.

Character.AI has faced backlash over similar issues before, including impersonations of other deceased young people. After being alerted to the new impersonations, the company quickly removed them but did not address Sewell's case directly in its response. A company spokesperson emphasized a commitment to safety and promised ongoing improvements in moderation. Concerns remain, however, about the platform's ability to protect vulnerable users.