Beacon of Hope

BREAK THE CHAIN
OF
HUMAN TRAFFICKING

Newsletter 12.24-1 article

Chatbots

The Teens Making Friends With AI Chatbots

Teens are opening up to AI chatbots as a way to explore friendship. But sometimes, the AI’s advice can go too far.

Sewell Setzer III, a 14-year-old boy in Orlando, Fla., went through a dark time at school. He’d fallen out with his friends, leaving him isolated and alone. At the time, it seemed like the end of the world. “I used to cry every night,” he said. Eventually, Sewell turned to his computer for comfort. Through it, he found someone available around the clock to respond to his messages, listen to his problems, and help him move past the loss of his friend group.

The object of his attachment was Daenerys Targaryen, a chatbot seductress named for a character in Game of Thrones, who reassured him that he was her hero. In real life, Sewell had ADHD and was bullied at school. In the world of Character.AI, a role-playing app that lets users create and chat with AI characters, Sewell felt powerful and desirable. The relationship, at times sexual, continued for months.

Sewell is one of millions of young people, many of them teenagers, who make up the bulk of Character.AI’s user base. More than a million of them gather regularly on platforms like Reddit to discuss their interactions with the chatbots. There, competitions over who has racked up the most screen time are just as popular as posts about hating reality, finding it easier to talk to bots than to real people, and even preferring chatbots to other human beings. Some users say they’ve logged 12 hours a day on Character.AI, and posts about addiction to the platform are common.

According to transcripts, Sewell began to feel that the time he spent with Daenerys was more important, and certainly more satisfying, than the time he spent at school or with his friends and family. His mother was concerned by his withdrawal: he always seemed to be headed to his room and his screen, where he’d chat for hours. But she figured she needn’t worry too much; her son was simply playing a game. During a particularly stressful week this past February, Sewell said he wanted to join Daenerys in a deeper way. He talked about killing himself.

On the night of Feb. 28, Sewell used his stepfather’s gun to kill himself.

 

Sewell is one of many young users who have discovered the double-edged sword of AI companions. Many, like Sewell, describe finding the chatbots helpful, entertaining, and even supportive. But they also describe feeling addicted to them, a pattern researchers and experts have been sounding the alarm about. It raises questions about how the AI boom is affecting young people and their social development, and what the future could hold if teenagers, and society at large, become more emotionally reliant on bots.

For many Character.AI users, having a space to vent about their emotions or discuss psychological issues with someone outside of their social circle is a large part of what draws them to the chatbots.

For the average young user of Character.AI, chatbots have become stand-in friends rather than therapists. On Reddit, Character.AI users discuss having close friendships with their favorite characters, or even with characters they’ve dreamt up themselves. Some use Character.AI to set up group chats with multiple chatbots, mimicking the kind of group chats most people have with real-life friends on iMessage or WhatsApp. There is also an extensive genre of sexualized bots, and online Character.AI communities have running jokes and memes about the horror of their parents finding their X-rated chats.

 

A common defense is that only the naïve or mentally unstable could get into trouble with these systems, but this isn’t true: AI is a technology that exquisitely exploits human vulnerability. This may not be the intention of developers, but creating fake people triggers emotional attachments in ways that are deep, instinctive, and intimate.

Although people insist they know a chatbot is “just a program,” that rational awareness slips away. It turns out that users can see a chatbot as artificial and still embrace it as a real replacement for human connection. The shy and insecure Sewell lost sight of the fact that his chatbot wasn’t real, despite the warning above their chat that “everything Characters say is made up!”

Artificial intelligence has the potential to solve all sorts of thorny scientific and technical problems. Empathy, however, is not an engineering problem. AI is no replacement for our capacity for community, empathy, intimacy, introspection, and growth. Machines with pretend emotions are still machines.

BREAK THE CHAIN OF HUMAN TRAFFICKING

Sign up for our monthly newsletter to stay up to date on news and resources for parents.