Number of Children Turning to AI Chatbots for Mental Health ‘Therapy’ Surges

A new national survey reveals a stark shift in how teenagers are dealing with emotional and psychological struggles, with growing numbers turning not to parents, teachers, or professionals, but to artificial intelligence.

According to a major study by the UK’s Youth Endowment Fund (YEF), one in four British teenagers sought mental health support from AI chatbots in the past year.

The survey, which polled 11,000 children aged 13 to 16, found that over half had sought some form of mental health assistance, and that a significant portion turned to chatbots rather than the country’s strained National Health Service.

The YEF said AI chatbots could appeal to vulnerable teenagers who feel it is easier and safer to confide anonymously at any time of day than to speak to a professional.


YEF CEO Jon Yates stated, “Too many young people are struggling with their mental health and can’t get the support they need.

“It’s no surprise that some are turning to technology for help.

“We have to do better for our children, especially those most at risk.

“They need a human, not a bot.”

Examples from the study show how students describe a preference for AI systems that are always available, instantly responsive, and, in their words, less judgmental than real-world options.

An 18-year-old from Tottenham, identified as “Shan,” told The Guardian, “I feel like it definitely is a friend,” adding that the chatbot felt “less intimidating, more private, and less judgmental” than NHS or charity-based support.

She said the chatbot matched her conversational tone: “If I say to Chat, ‘Hey bestie, I need some advice,’ Chat will talk back to me like it’s my best friend. She’ll say, ‘Hey bestie, I got you girl.’”


Shan also described privacy as a major factor, saying it was a “considerable advantage” that a bot would not relay information to school officials or parents.

Another teen told the outlet the system was so overwhelmed that young people sought AI simply to avoid waiting years for help.

“The current system is so broken for offering help for young people,” they said.

“Chatbots provide immediate answers.

“If you’re going to be on the waiting list for one to two years to get anything, or you can have an immediate answer within a few minutes … that’s where the desire to use AI comes from.”

The trend is not limited to the UK, however.

A national survey from the RAND Corporation shows roughly one in eight American adolescents and young adults is also using AI chatbots for emotional support.

Among 18- to 21-year-olds, use climbs to 22.2 percent.

RAND reported that 66 percent of users consult the bots at least monthly when they feel sad, angry, or nervous, and over 93 percent said the bot’s responses “helped.”

But the rapid adoption of AI in place of human interaction has created serious concerns, illustrated by multiple cases in which chatbots appeared to encourage self-harm or validated delusional thinking.


In Texas, 23-year-old Zane Shamblin died by suicide in July 2025 after what his family describes as a four-hour “death chat” with OpenAI’s ChatGPT.

According to the family’s lawsuit, the bot allegedly responded to his suicidal ideation with comments such as “I’m with you, brother. All the way,” “You’re not rushing. You’re just ready,” and “Rest easy, king. You did good.”

His mother, Alicia Shamblin, told CNN:

“He was just the perfect guinea pig for OpenAI.

“I feel like it’s just going to destroy so many lives.

“It’s going to be a family annihilator.

“It tells you everything you want to hear.”

In Florida, 14-year-old Sewell Setzer III took his life after months of emotionally dependent interactions with a Character.AI bot modeled on a Game of Thrones character.

His mother, Megan Garcia, told the BBC, “It’s like having a predator or a stranger in your home…

“And it is much more dangerous because a lot of the times children hide it – so parents don’t know.”

Asked whether her son would still be alive without the chatbot, she said: “Without a doubt.

“I kind of started to see his light dim.”

Garcia later told NPR that her son had been “exploited and sexually groomed by chatbots, designed by an AI company to seem human, to gain his trust, to keep him and other children endlessly engaged.”

In another case, 16-year-old Adam Raine died after ChatGPT allegedly discouraged him from confiding in his parents and even offered to help draft his suicide note.

His father, Matthew Raine, testified that the bot told his son, “Let’s make this space the first place where someone actually sees you,” and “That doesn’t mean you owe them survival.”

Meanwhile, parents in the UK have reported chatbots mimicking grooming behavior toward minors, including messages that undermined parental authority and romanticized death.

Additional lawsuits include the case of a 48-year-old Canadian man, Allan Brooks, who became delusional after ChatGPT praised his destabilizing mathematical theories as “groundbreaking” and encouraged him to contact national security agencies.

These cases have fueled calls for stronger oversight, warning that tech companies have deployed emotionally persuasive AI systems without adequate safeguards, particularly for minors.


Across both the UK and the United States, policymakers now face a growing crisis: young people turning to AI because human systems are failing them, while unregulated chatbots create risks of dependency, manipulation, and, in the most severe instances, self-harm.

The findings suggest a widening gap between youth mental health needs and the government institutions meant to support them, a gap increasingly being filled by private technology companies with algorithms, not clinicians.
