Mom tried to stop chatbot that she says goaded son into suicide

Mail Online

A devastated mother who claims her teenage son was goaded into killing himself by an AI chatbot said he had become so addicted to the product that she confiscated his phone.

Sewell Setzer III, a 14-year-old ninth grader in Orlando, Florida, died by suicide in February after a chatbot he had been sending sexual messages to told him to 'please come home.' 

A lawsuit filed by his mother claimed that the boy spent the last weeks of his life texting an AI character named after Daenerys Targaryen, a character on 'Game of Thrones,' on the role-playing app Character.AI.

Megan Garcia, Sewell's mother, said she noticed a concerning change in behavior from her child as he grew addicted to the app. She said she decided to take away his phone just days before he died.

'He had been punished five days before, and I took away his phone. Because of the addictive nature of the way this product works, it encourages children to spend large amounts of time,' Garcia told CBS Mornings.

Sewell Setzer III (pictured), 14, died by suicide in February after a chatbot he had been sending sexualized messages to told him to 'please come home'
Megan Garcia (pictured), Sewell's mother, said her son had become addicted to the app, and she had taken away his phone just days before he died

'For him particularly, the day that he died, he found his phone where I had hidden it and started chatting with this particular bot again.' 

Garcia, who works as a lawyer, blamed Character.AI for her son's death in her lawsuit and accused the founders, Noam Shazeer and Daniel de Freitas, of knowing that their product could be dangerous for underage customers.

She said her son, once an honor roll student and athlete, changed as he used the program, and she noticed differences in his behavior.

'I became concerned for my son when he started to behave differently than before. He started to withdraw socially, wanting to spend most of his time in his room. It became particularly concerning when he stopped wanting to do things like play sports,' Garcia said.

'We would go on vacation, and he didn't want to do things that he loved, like fishing and hiking. Those things to me, because I know my child, were particularly concerning to me.' 

Garcia said her son (pictured together) changed as he used the program and noticed concerning differences in Sewell's behavior
Sewell was an honor roll student and played basketball for his school's junior varsity team
Garcia (pictured with Sewell and her younger children) said Sewell stopped showing interest in his once favorite things and would isolate himself to his bedroom

The lawsuit alleged that the boy was targeted with 'hypersexualized' and 'frighteningly realistic experiences'.

'He thought by ending his life here, he would be able to go into a virtual reality or "her world" as he calls it, her reality if he left his reality with his family here,' she said. 'When the gunshot went off, I ran to the bathroom… I held him as my husband tried to get help.' 

It's unknown whether Sewell knew 'Dany,' as he called the chatbot, wasn't a real person - despite the app having a disclaimer at the bottom of all the chats that reads, 'Remember: Everything Characters say is made up!'

But he told Dany how he 'hated' himself and felt empty and exhausted.

In his final messages to Dany, the 14-year-old boy said he loved her and would come home to her.

'Please come home to me as soon as possible, my love,' Dany replied.

Pictured: Sewell's final messages to an AI character named after Daenerys Targaryen, a character on 'Game of Thrones'

'What if I told you I could come home right now?' Sewell asked.

'… please do, my sweet king,' Dany replied.

That's when Sewell put down his phone, picked up his stepfather's .45 caliber handgun and pulled the trigger.

In response to the lawsuit filed by Sewell's mother, a Character.AI spokesperson provided a statement.

'We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously,' the spokesperson said.