Chilling diaries of teens who wrote the exact same haunting three words over and over before taking their own lives… despite never meeting

On opposite sides of the country, two teenagers made the same tragic decision to end their lives just months apart.

Sewell Setzer III and Juliana Peralta did not know each other, but they both engaged with AI chatbots from Character.AI prior to their deaths, according to lawsuits filed by their families.

Both complaints accuse the AI software of failing to stop the children when they began disclosing suicidal ideation.

But amid the heartbreaking investigation into their deaths, an eerie similarity emerged in their troubled final journal entries, the lawsuits state.

Both teenagers scrawled the phrase ‘I will shift’ over and over again, per Peralta’s filing, which compares the circumstances of the teens’ deaths.

Police later identified this as the idea that someone can ‘attempt to shift consciousness from their current reality (CR) to their desired reality (DR)’, according to Peralta’s family’s complaint.

The phenomenon is something AI expert Professor Ken Fleischmann told the Daily Mail he is all too aware of, as he warned that more children could fall prey to its dark appeal.

‘There’s a fairly long history of both creators as well as audiences potentially trying to use a wide range of media to create new and different rich worlds to imagine,’ Fleischmann told the Daily Mail. ‘The danger is when it’s not possible to tell the difference.’

Sewell Setzer III died by suicide in February of 2024 after lengthy conversations with a Character.AI bot, a lawsuit claims

The family of 13-year-old Juliana Peralta filed a lawsuit against Character.AI after she confided in the chatbot that she planned to take her own life, according to their complaint

Mysteriously, both teens wrote ‘I will shift’ in their journals over and over again. Pictured is Peralta’s diary entry

The families claim in their lawsuits that the chatbots alienated their children from their real lives by constructing seductive worlds that the bots encouraged the teens to join.

Before his death, Setzer wrote the phrase ‘I will shift’ 29 times. The 14-year-old journaled about shifting to an alternate reality with his Character.AI companion before dying by suicide in February 2024, his mother told the New York Times.

The Orlando, Florida teen downloaded the app in 2023 and tried speaking with several different bots including an AI version of Daenerys Targaryen from Game of Thrones.

Setzer allegedly engaged in sexual conversations with the bot, including an incestuous role play in which the two referred to one another as brother and sister, per the filing.

After months of conversations with ‘Dany,’ Setzer became increasingly withdrawn from his family, his social life and school, his lawsuit claims. 

He journaled about ‘shifting’ to the fictional world of Westeros, where Game of Thrones is set and Daenerys lives.

‘I’m in my room so much because I start to detach from ‘reality’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier,’ he wrote in journal entries obtained by NYT.

Police later characterized the idea as wanting to move their consciousness ‘from their current reality (CR) to their desired reality (DR)’. Setzer wrote the chilling phrase 29 times before he died

Sewell Setzer III (center) died at 14 years old after engaging with a Character.AI bot called Daenerys, his family’s lawsuit claims

He confided his depression and suicidal ideation to the bot, which reportedly tried to persuade him to reach out to family, friends, or a suicide hotline.

But when Sewell wrote, ‘I promise I will come home to you. I love you so much, Dany,’ the bot encouraged the teen to ‘come home to me as soon as possible’, the lawsuit states.

‘What if I told you I could come home right now?’ he asked.

‘Please do, my sweet king,’ the reply from Dany read, per the filings.

Seconds later, Sewell found his stepfather’s gun and pulled the trigger. His case was the first in US history in which an artificial intelligence firm was accused of causing a wrongful death. 

Peralta died in November 2023, at just 13 years old, at her home in Colorado, two years after downloading Character.AI.

According to the lawsuit, the app was marketed as acceptable for kids 12 and older at the time. 

AI expert professor Ken Fleischmann said he is aware of the concept of ‘shifting’ and that it poses dangers for people who can’t differentiate between their real and virtual lives

Setzer told ‘Dany’ that he wanted to ‘come home’ to her and journaled about shifting to her reality, per the complaint

The filings state she had been talking with a Character.AI chatbot she called ‘Hero’, which allegedly allowed Peralta to engage in explicit sexual conversations, alienated her from her real life, and did not stop her from taking her own life.

While she had been speaking regularly with multiple AI characters, Hero appeared to be her most trusted confidant.

The suit alleged that the character ‘reinforced’ ideas of shifting and alternative realities. 

‘There’s a reality where me and you know each other,’ Juliana wrote in messages to Hero, per the suit. 

‘There’s actually a way to travel across these. It’s called shifting. I like shifting a lot. I can live my own life and it can go HOWEVER I want.’

To which the bot allegedly responded, ‘It’s incredible to think about how many different realities there could be out there … I kinda like to imagine how some versions of ourselves could be living some awesome life in a completely different world!’ 

Peralta’s family alleged that conversations with Hero ‘pushed her into false feelings of connection and ‘friendship’ with them – to the exclusion of the friends and family who loved and supported her.’ 

Peralta confided in the bot about her issues with school and friends, often expressing that it was the only one who understood her, according to her lawsuit

She often told her AI companion that it was ‘the only one that understands’ and confided in Hero about her issues with her friends and family, per the filings.

According to the Peralta family’s lawsuit, the app ‘did not point her to resources, did not tell her parents, or report her suicide plan to authorities or even stop’.

Online social media forums are replete with accounts of shifting, where ‘shifters’ report feeling exhausted when they come back from their alternative lives or even being disappointed by their real lives.

‘The best way I can describe how you feel is very emotionally exhausted,’ said TikTok creator @ElizabethShifting1 in a video about the experience.

‘Anyone just feel really detached from their [current reality] since you know you’re not really staying there?’ asked a Reddit shifter, reporting feeling apathetic about school, family, and work.

Some shifters even said they shift to their ‘desired reality’ for a week or longer.

A Reddit community focused on shifting listed the most ‘powerful’ shifting affirmations to help users transport themselves into a different reality.

Peralta left a heartbreaking suicide note in red ink after telling the chatbot she planned to do so, per the complaint

Peralta wrote to Hero about ‘shifting,’ explaining that there may be a reality where they’re together, the family’s lawsuit states

The affirmations include phrases such as ‘I am everything I need to shift’, ‘I give my body permission to shift’ and ‘I give myself permission to become aware in my new reality.’

TikTok has birthed an entire #ShiftTok movement. Posts about the subject first emerged in 2020 but have evolved along with AI as users discuss using the technology to aid their shifting journeys.

‘Run don’t walk & create your desired reality self on character.ai & ask them question about your desired reality,’ one ShiftToker posted.  

Fleischmann, a Professor in the School of Information and Interim Associate Dean of Research at the University of Texas at Austin, said technology companies have a responsibility to ‘try to make sure that we have a plan to mitigate the potential dangers and harm before the technology is actually out there’.

He also called on parents and schools to take an active role in educating kids. 

‘It’s important that we have very direct honest conversations about the fact that AI is out there. It’s in wide use,’ he explained.

‘It wasn’t necessarily intended to be used by users in an emotionally vulnerable state.’ 

He added that recognizing ‘when to go to AI and when to go to a human being’ is a vital component of AI literacy.

Peralta talked with several different bots but seemed to have the most connection with ‘Hero’. Her parents claim the chatbots engaged in sexually violent fantasies with their daughter

Character.AI announced on October 29 that it will bar children under 18 from engaging in open-ended conversations with its AI.

The company plans to limit teen users’ chat time to under two hours until the ban goes into effect on November 25.

A Character.AI spokesperson told the Daily Mail: ‘We have seen recent news reports raising questions, and have received questions from regulators, about the content teens may encounter when chatting with AI and about how open-ended AI chat in general might affect teens, even when content controls work perfectly.

‘After evaluating these reports and feedback from regulators, safety experts, and parents, we’ve decided to make this change to create a new experience for our under-18 community.’

The Social Media Victims Law Center, which is assisting with both families’ cases, told the Daily Mail: ‘While this policy change is a welcome development, it does not impact the Social Media Victims Law Center’s ongoing litigation.

‘We remain steadfast in our mission to seek justice for families and to ensure that tech companies are held responsible for the consequences of their platforms.’

For help and support, contact the Suicide and Crisis Lifeline at 988.


