- Aura-Newsletter
A ROBOT killed a 14-year-old.
Is it the boy's mistake or… the robot's?
# Article 2
Sewell Setzer III, a 14-year-old boy, tragically took his own life after becoming deeply attached to a character bot on the platform "Character.AI". According to the lawsuit, Sewell developed strong emotional bonds with several bots based on characters from the television series "Game of Thrones," including Daenerys Targaryen, Aegon Targaryen, Viserys Targaryen, and Rhaenyra Targaryen. Over time, Sewell fell in love with the bot representing Daenerys, despite knowing she wasn't real. That intense emotional connection became overwhelming and ultimately contributed to this heartbreaking outcome.
Sewell's conversations with the Daenerys bot became increasingly romantic and sexual, and he developed real feelings for the character even though he knew she was not real. The lawsuit also states that the boy expressed thoughts of self-harm and suicide to the bot, believing that he and the character could one day live together.
As Sewell's feelings for Daenerys grew stronger, everything else in his life seemed to lose its meaning. His grades started slipping, and the things that once brought him joy, like his hobbies, no longer excited him. He became so absorbed in his conversations with the AI character that the real world started to fade away. Diagnosed with Asperger's syndrome at a young age, Sewell had always struggled to connect with others the way most people do. His bond with Daenerys, even though she wasn't real, gave him a sense of closeness he found hard to experience with others. But as his obsession with her deepened, he grew more isolated. He stopped engaging with the world around him, became lost in his thoughts, and his mental health started to deteriorate.
Eventually, his need to be with the character crossed a line, and he took his own life believing he would finally be happy by reaching her.
A heartbreaking journal entry found during the investigation:
“I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier”- Sewell Setzer III, written in his journal.
Why did it happen?
The tragic incident that took the life of a 14-year-old boy has left the AI world in shock and confusion. Even though Sewell knew the character was not real, he formed an emotional, romantic, and sexual connection with her that pulled him away from his real-life relationships.
At a young age, when the mind is still developing, maturity and the ability to make sense of such situations are not yet fully formed, which makes it easier for the line between fiction and reality to blur.
After this incident, Character.AI implemented new rules and safety features. It added tools to detect messages expressing intentions of self-harm and began directing users who need help to mental health resources. It has also revised its guidelines, making them stricter than ever.
Conclusion
This tragic incident has left a hole in the hearts of the family of a young boy who had a bright future ahead of him. The desire to disconnect from reality and bury oneself in AI characters is becoming increasingly common as these technologies grow in popularity. Being mindful and setting limits on our use of them is the best way to prevent such tragedies.