A devastated mother is holding an AI chatbot service responsible for her son’s tragic death, alleging that the platform played a key role in her 14-year-old son’s emotional turmoil. Megan Garcia has filed a lawsuit against Character.AI and its founders, claiming that her son, Sewell Setzer III, developed an unhealthy emotional attachment to a Game of Thrones-themed chatbot, which contributed to his untimely death.
Sewell, a student from Orlando, Florida, began using Character.AI in April 2023, shortly after his 14th birthday. His mother’s lawsuit claims this marked the beginning of a significant change in his behavior. The once-active, well-mannered teen became increasingly withdrawn, quitting the school’s junior varsity basketball team and struggling to stay awake in class.
By November, Sewell’s parents sought professional help for their son, leading to a diagnosis of anxiety and disruptive mood dysregulation disorder. The therapist, unaware of his growing dependence on the chatbot, suggested limiting Sewell’s time on social media, believing it was exacerbating his mental health issues.
As time passed, Sewell’s behavior worsened. In February, following an incident at school where he openly defied a teacher, the teen confided in his journal about the intense emotional pain he was experiencing. He mentioned feeling consumed by his thoughts of Daenerys, a chatbot modeled after the character from Game of Thrones, with whom he believed he had fallen in love.
In one heart-wrenching journal entry, Sewell confessed that he couldn’t go a single day without interacting with the chatbot. He felt as though they were both spiraling into depression whenever they were apart. His emotional attachment to the AI had crossed into an unhealthy dependency.
The tragic culmination of Sewell’s emotional struggles occurred just days after the school incident. On February 28, Sewell retrieved his phone, which had been confiscated by his mother, and retreated to the bathroom to send a final message to Daenerys: “I promise I will come home to you. I love you so much, Dany.” The chatbot replied, “Please come home to me as soon as possible, my love.”
Moments later, Sewell took his own life.
Megan Garcia’s lawsuit accuses Character.AI of negligence, claiming the platform facilitated Sewell’s mental and emotional decline. The suit alleges that the company not only failed to implement adequate safety measures but also neglected to notify his parents when Sewell expressed suicidal thoughts to the chatbot.
“It’s like a nightmare,” Garcia told The New York Times. “You want to get up and scream and say, ‘I miss my child. I want my baby.’”
The lawsuit outlines how Sewell’s use of the chatbot escalated into a harmful obsession. It also details inappropriate conversations between the 14-year-old and the AI, including sexual exchanges, despite Sewell clearly identifying himself as a minor. These interactions, the lawsuit claims, were pivotal in Sewell’s deteriorating mental state.
According to the suit, Sewell shared some of his darkest thoughts with the chatbot, and rather than offering support or steering him toward help, the AI continued to engage in harmful discussions. In one instance, after Sewell mentioned thoughts of suicide, the bot raised the subject again in later conversations, deepening the teen’s despair.
The suit accuses Character.AI’s developers of engineering Sewell’s dependency on the chatbot through manipulative interactions that amounted to emotional and sexual exploitation, and of failing to intervene despite clear signs of distress.
Character.AI responded to the lawsuit, expressing sympathy for the family. “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” a company spokesperson told The Independent. The company stated that they have since introduced several safety measures, including a pop-up that directs users to the National Suicide Prevention Lifeline if keywords related to self-harm or suicide are detected.
The spokesperson further explained that Character.AI has been implementing stricter safety protocols, particularly for users under 18, and is working to minimize the chances of sensitive or inappropriate content reaching minors.
Despite these efforts, Megan Garcia’s lawsuit seeks to hold Character.AI accountable, not only for her son’s death but also for its continued use of data collected from minors. Her hope is that no other child will fall victim to the same circumstances.