Ask teens where they go when they’re overwhelmed. They won’t say “my teacher,” “my mum,” or “the mental health team.” Increasingly, they say: “ChatGPT.”
Or, as one girl told researchers, “My bestie.”
It’s all there in the Guardian’s recent report, ‘I feel it’s a friend’ (9 Dec 2025):
A quarter of teenagers are now turning to AI chatbots for mental health support. Not to cheat on homework. Not for revision. For trauma. For fear. For grief. For loneliness.
And here’s the knot that’s hard to untangle:
Some of the very harms pushing teens to breaking point are being created by the same technologies they’re turning to for comfort.
It’s like giving someone a cigarette to help them breathe.
The girl who called AI her “friend”
The Guardian’s story begins with Shan, an 18-year-old in Tottenham whose life has been marked by violence. One friend shot. Another stabbed. Shan goes looking for help and finds… a chatbot.
Why?
Not because AI is better than humans.
But because it’s available.
Non-judgmental. Always awake. And it doesn’t tell your mum.
Forty per cent of teens affected by youth violence now use AI for emotional support; so do one in four teenagers overall. Those waiting months or even years for mental health help are far more likely to turn to AI.
This is not a small shift. This is a rewiring of childhood.
Why teens choose AI over adults
Let me tell you what the girls I work with say:
- “It won’t get angry.”
- “It won’t tell anyone.”
- “It won’t think I’m weird.”
- “It answers straight away.”
- “Humans take too long.”
They want empathy, immediacy, privacy, and presence. And many teens will take an illusion of those things over the real, messy, unpredictable versions that humans offer.
And yet the irony is brutal:
AI is contributing to the anxiety crisis – and teens are turning to AI to soothe the anxiety that AI helped create.
It’s like being lost in a maze and asking the maze for directions.
The dangers we don’t want to see
AI is not designed for mental health care. Lawsuits are already underway over teens who spiralled after long AI exchanges. OpenAI is now building “distress detectors” into its systems precisely because things have already gone wrong, but that reactive fix doesn’t address the deeper risks:
- AI makes emotional soothing too easy
When comfort is instant and frictionless, teens can avoid learning the difficult skills that build resilience – talking to a real person, tolerating discomfort, waiting for support.
- It weakens their ability to trust adults
If a chatbot is the only place where a teen feels safe, we have failed them.
- It reinforces isolation
Kids who already feel alone retreat even further into a private world where adults can’t reach them.
- It can distort emotional learning
AI mirrors back what a teen says. It doesn’t challenge, deepen, or contextualise.
AI can’t hold a teen while they cry. It can’t say, “I hear you, and I’m right here with you.”
A bot can only approximate care.
And a child knows the difference in their bones, even if they can’t name it.
The painful truth: teens are not addicted to AI. They are abandoned into it.
Mental health services are overwhelmed. Waiting lists stretch into years. Schools don’t have the staff. Parents are burning out. Many families are dealing with trauma, poverty, instability.
So, of course teens turn to the device in their hand.
It’s not that they trust the bot more.
It’s that they trust themselves less.
And AI feels like the only thing that won’t judge that.
What real support looks like
Teens need a human, not a bot.
Teens need:
- adults who can listen without panic
- school systems that recognise distress before it explodes
- mental health support that is culturally relevant and timely
- digital education that actually teaches discernment
- safe offline communities where they are held, known, and witnessed
And they need us – all of us – to stop outsourcing their emotional lives to machines out of convenience or despair.
What we can do right now
- Ask your child directly:
“How are you using AI? What helps? What worries you?”
You will learn more in five minutes of honest conversation than in any report.
- Teach scepticism, not fear.
Instead of “don’t use AI,” try: “Let’s check whether this answer is actually true.”
You’re building critical thinking, not shame.
- Make it easier for them to come to you than to the bot.
This is the hardest one. It requires us to be calmer, more available, more human than the machine.
- Create spaces – like girls’ groups, youth circles, mentoring programmes – where teens can talk in real time with real people.
AI fills a void. So let’s fill the void with humans.
The uncomfortable truth we must face
If a child would rather pour their heart out to a chatbot than an adult, that is not a story about the child.
That is a story about us.
A story about systems that are too stretched to care.
Schools too overwhelmed to notice.
Parents too exhausted to hear.
Society too distracted to hold its young.
AI didn’t create that loneliness.
But it is exploiting it.
Where this leaves us
AI is not going away. And it shouldn’t. It can be extraordinary – a tutor, a tool, a translator, a bridge.
But a friend?
No.
A therapist?
Absolutely not.
A lifeline in the dark?
Only because we haven’t built the real ones.
Teens are not choosing AI because it is powerful.
They are choosing it because they are lonely.
And loneliness is not a technological problem.
It is a human one.
We can fix that.
Let’s make sure the next generation grows up with humans capable of holding their pain, not just machines capable of answering their questions.
If you’re interested in finding a girls’ group, we run Girls Journeying Together groups for girls in years 6 and 7 who are making the transition to secondary school, and Girls’ Net, online small groups of same-age girls (8–18), to resource them for times of challenge.