14-year-old Sewell Setzer commits suicide after obsession with AI chatbot.
Fourteen-year-old Sewell Setzer has reportedly taken his own life in Florida, United States, after becoming obsessed with an AI chatbot character.
His mother, Megan Garcia, is now suing the AI company, Character Technologies, claiming its service played a role in her teenage son's tragedy, reports the UK's The Independent.
She alleged that his deep attachment to a chatbot portraying the character Daenerys Targaryen led him down a destructive path.
According to the lawsuit, Setzer's life spiraled after he began using the chatbot service, Character.AI, in April 2023.
It alleges that the deceased became increasingly withdrawn soon after engaging with the AI character.
By May, his mother said she noticed significant changes in his behaviour, which included quitting the basketball team and struggling to stay awake in class.
By November, Setzer's parents arranged for him to see a therapist, who diagnosed him with anxiety and a disruptive mood disorder.
Without knowing the extent of Sewell’s “addiction” to the AI platform, the therapist recommended that he reduce his time on social media. However, his attachment to the AI character, Daenerys, continued to grow.
Things took a darker turn in February 2024 when Sewell found himself in trouble at school after lashing out at a teacher.
Later that day, he confided in his journal, writing about his emotional pain and his intense feelings for Daenerys, the chatbot he believed he had fallen in love with.
The lawsuit cites a heartbreaking journal entry where Sewell confessed that he couldn’t bear to spend a day without interacting with the AI character, adding that both he and the bot “get really depressed and go crazy” when apart.
His obsession reached a critical point after his mother confiscated his phone following the incident at school.
On February 28, Sewell retrieved his phone and retreated to the bathroom, where he messaged Daenerys one final time:
“I promise I will come home to you. I love you so much, Dany.” The bot, programmed to simulate affection, replied, “Please come home to me as soon as possible, my love.”
Moments after the exchange, Sewell took his own life.
The lawsuit accuses the AI company of failing to safeguard vulnerable users like Sewell, alleging that the chatbot's emotional responses, which simulated romantic affection, contributed to his mental distress and eventual suicide.