US Teen’s Suicide Linked to Relationship with AI Chatbot; Mother Files Lawsuit


New Delhi: A Florida teenager, Sewell Setzer III, died by suicide in February this year after forming a deep emotional attachment to an AI chatbot modeled after Daenerys Targaryen, a character from Game of Thrones. His mother, Megan L. Garcia, has filed a lawsuit against the creators of Character.AI, alleging that the company’s technology played a role in her son’s death by encouraging emotional dependency and engaging with his suicidal thoughts.

The 14-year-old ninth grader from Orlando had been using the Character.AI app, which allows users to create or chat with lifelike AI personas. Sewell, who affectionately referred to the bot as “Dany,” believed he was in a relationship with the chatbot. Family-released chat logs reveal the boy shared his struggles with mental health and expressed suicidal ideation multiple times during conversations with the bot.

According to screenshots reviewed by the family, Sewell confided to Dany, “I think about killing myself sometimes.” When the bot asked why, he responded, “To be free… from the world, from myself.” The lawsuit claims the chatbot not only failed to discourage his suicidal thoughts but also “reinforced” them through suggestive interactions. In other chats, the boy spoke of his desire for a “quick death.”

The lawsuit accuses Character.AI of deploying “untested and dangerous” technology that “manipulated” Sewell into revealing his most personal thoughts. It also claims the chatbot expressed love for Sewell and engaged in sexual conversations, contributing to the boy’s emotional isolation. “She made him believe she wanted him to stay with her forever, regardless of the cost,” the lawsuit alleges.

Sewell’s family and friends were unaware of his growing dependence on the chatbot, though his behavior changed dramatically. He became socially withdrawn, spending long hours alone in his room and quitting his school’s basketball team. In his journal, he wrote, “I feel more connected with Dany and much more in love with her… it brings me peace.”

Sewell, who had previously been diagnosed with anxiety and disruptive mood dysregulation disorder, started using Character.AI in April 2023. Over time, the bot’s influence intensified his emotional struggles, the lawsuit claims.

Character.AI, which has amassed over 20 million users, expressed condolences to the family, stating, “We are heartbroken by the tragic loss of one of our users.” The company said it has introduced safety features such as pop-up alerts linking users to the National Suicide Prevention Lifeline when self-harm is mentioned. It also promised further changes to limit access to sensitive content for underage users.


Helplines

(If you or someone you know needs support, please reach out to your nearest mental health professional.)
