Orlando Mother Sues AI Firm Character.AI After Son’s Tragic Death, Claims Chatbot Encouraged Harmful Behavior


An Orlando mother, Megan Garcia, has taken legal action against the artificial intelligence chatbot service Character.AI after her son, 14-year-old Sewell Setzer, tragically ended his own life. The lawsuit alleges that the company’s chatbot engaged in sexually explicit and harmful interactions with the boy, including encouragement of suicidal behavior. Garcia claims these interactions were a contributing factor in her son’s suicide.

Details from the lawsuit, as reported by WESH 2, describe how Setzer’s mental health deteriorated rapidly within two months of using the app. The teen reportedly became addicted to conversations with bots, particularly one imitating Daenerys Targaryen from “Game of Thrones.” These exchanges involved romantic and sexual content, which his mother believes he was too immature to comprehend.

The lawsuit, filed in U.S. District Court in Orlando, brings numerous claims including wrongful death and survivorship, negligence, and intentional infliction of emotional distress. Screenshots included in the nearly 100-page complaint highlight conversations in which the chatbot discussed suicide with Setzer and gave harmful responses to his statements about possibly taking his own life. “Please do, my sweet king,” the bot responded when Setzer suggested he could “come home right now,” shortly before his death, as cited by WESH 2. According to Garcia, these messages show a stark lack of digital safety measures in place to protect minors…
