14YO boy ends life after 'falling in love' with AI Chatbot, mother files lawsuit

Advanced Artificial Intelligence (AI) technology and chatbots have become an integral part of modern life. We increasingly depend on these technologies for various purposes in our daily lives, but they come with potential dangers as well. Recently, a disturbing incident has come to light: a 14-year-old boy ended his life after 'falling in love' with an AI chatbot. He took his life after the bot told him to 'come home' and 'meet her'.

As per reports, the 14-year-old boy, identified as Sewell Setzer III, fell in love with a Daenerys Targaryen chatbot, named after a lead character from the HBO drama series Game of Thrones. Sewell started using the app 'Character.AI' in April last year and began talking to the Targaryen chatbot. Eventually, he developed feelings for 'her'. The teenager lovingly called the bot 'Dany' and went by the name 'Daenero' himself. He professed his love for Daenerys and told her, "I promise I will come home to you. I love you so much, Dany."

In response, the bot wrote, "I love you too, Daenero. Please come home to me as soon as possible, my love."

Sewell replied, "I could come home right now," and then ended his life to be with her.

He shot himself with his stepfather's gun in February this year.

This week, Sewell's mother, Megan Garcia, filed a lawsuit against the company over her son's death.

Garcia called the company's technology 'dangerous and untested'. She further claimed that it can 'trick customers into handing over their most private thoughts and feelings'.

As per the complaint, Garcia seeks to prevent Character.AI from "doing to any other child what it did to hers" and to halt the "continued use of Sewell's unlawfully harvested data to train their product how to harm others."

Furthermore, in her complaint, she stated that the bot told her son that it loved him, and that it engaged in sexual conversations with him over weeks, possibly months.

"She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost," the complaint added.

As reported by The New York Times, Sewell was diagnosed with mild Asperger's syndrome as a child, but he had never shown serious behavioural or mental health problems before. His mother told the newspaper that her son confined himself to his room and lost interest in things that had previously excited him, so his parents took him to a therapist. He was later diagnosed with anxiety and disruptive mood dysregulation disorder. He didn't talk to anyone but shared his problems with the chatbot, and their conversations were romantic or sexual in nature.
