When I was a child, we had BBC1, BBC2 and ITV. Three channels. That was it.
I wasn’t allowed to watch ITV.
All the programmes everyone talked about at school were on ITV, so of course I felt left out. My parents said they were protecting me from the persuasive effects of advertising. I thought they were ruining my life.
Parents and teachers today are wrestling with the same conundrum: how to protect our children when there’s so much pressure to let them join in. Gaming on computers. Social media on mobile phones. And now AI. Each time, children are handed something powerful before we fully understand what it is doing to them. Each time, adults reassure themselves that this is simply progress. Each time, anyone who sounds a note of caution is at risk of being labelled old-fashioned, anxious, anti-technology.
I am not anti-technology. I used AI to help me prepare a recent talk on AI in Education – I asked it for a literature review and current research. But then I read the research myself. I watched documentaries, read articles, listened to podcasts, talked to people, and walked in the woods working out what I thought. AI helped me gather information, but it did not do my thinking for me.
However, by welcoming AI into our classrooms and our children’s bedrooms, we risk our children losing the ability to think, to struggle with a problem, to make mistakes and learn from them.
The evidence on AI in education is still emerging. Even the Department for Education says the evidence is still developing and that it is continuing to understand “effective and safe use cases” for pupils and students.
AI may well be helpful for teachers. A 2024 Education Endowment Foundation trial found that teachers using ChatGPT, alongside guidance, reduced lesson planning time by 31%. Given the pressure teachers are under, that is beneficial.
But the answer for children is different.
UNESCO’s guidance on generative AI in education calls for a human-centred approach, designed to protect human agency and genuinely benefit learners.
Learning is not just about producing an answer. Learning is what happens inside the child while they are working towards one.
I think of a small child asking, “What’s this?” and a device answering before her mother can. Accurate. Efficient. Impressive. But something is lost. The shared look. The pause. The mother leaning in and saying, “What do you think it is?” That tiny dance of curiosity and connection.
Learning becomes transactional instead of relational.
Then I think of the nine-year-old whose homework is perfect because AI helped him produce it. No spelling mistakes. No frustration. No getting stuck. No procrastination or meltdowns. Everyone is relieved. Until he is asked a question in class and freezes. Not because he is incapable, but because he has not practised thinking.
The homework has been done. Just not by him.
And I think of a teenage girl from one of my groups, bright and thoughtful, who said, “We have to use AI. It’s what we’ll need for the future.” She is right, of course. But then she added, “It helps me know what to think.”
That reliance on AI troubles me because adolescence is not just a time when young people learn more about the world. It is the time when they begin to work out what they think about it. Their brains are developing the capacity to question, reflect, disagree, and form their own views. That is not a side effect of education; it is the developmental work of this age.
Teenagers need space to think. To not know. To follow a messy line of thought to its end. To discover that they can form a mind of their own. If AI steps in too early, we risk producing young people who are very good at generating answers, but less confident in knowing what they actually think. So many young people are using tools they do not yet know how to question.
A much-discussed MIT Media Lab preprint found lower neural engagement among people using AI tools for essay-writing tasks, though it is important to say that this study is small and not yet peer-reviewed. Still, the worry stands: AI cheats our learners of the effort and rigour of using their own brains, and risks shrinking their capacity for original ideation and critical analysis.
The OECD’s Students, Computers and Learning report found that students who use computers very frequently at school “do a lot worse in most learning outcomes”, even after accounting for social background and demographics. It also found no appreciable improvement in reading, maths or science in countries that had invested heavily in ICT for education.
Sweden has now begun scaling back digital-first schooling and reintroducing printed books, handwriting and teacher-led learning, in response to concerns about literacy and attention. Reporting on this shift cited concerns from the Karolinska Institute that digital tools may impair rather than enhance student learning when overused.
Parents are told to limit screens at home, while schools may put children in front of screens for hours under the respectable banner of educational technology. iPads in nurseries, digital platforms in primary schools, Google Classroom in secondary. And now we have AI waiting at the door, promising speed, efficiency, productivity.
We need to ask: who has proved that this is what children need?
Behind it all sits a global ed-tech industry worth hundreds of billions.
This prompts the question: Is this being driven by what children need, or by what can be sold to make a profit for ed-tech giants?
The brain develops through effort. Through sustained attention. Through struggle. Through making mistakes and learning from them. Through sitting with something difficult long enough for understanding to form.
When learning becomes too easy, too immediate, too frictionless, something vital is lost. It is like feeding purée to a child who is ready to chew. They may swallow it, but they do not develop strength.
Reading a book, writing by hand, wrestling with a problem, staying with uncertainty – these are not outdated skills. They are the very processes that build the mind’s capacity to think.
Screens can be both stimulating and numbing, and AI may amplify that. Social media hijacks attention; AI threatens something even deeper: attachment, judgement, confidence, relationship.
We have known for a long time that children need to be fed more than information. Harry Harlow’s famous, and now ethically troubling, studies with baby rhesus monkeys showed that infant monkeys preferred a soft cloth surrogate “mother” over a wire one that provided food, returning to the wire mother only to feed. The infants would rather starve themselves of food than of comfort and connection.
Our children’s brains grow best in relationships where they feel safe, seen and held in mind. Most of us can remember a teacher who changed something for us. Sometimes even saved something in us. A teacher who noticed, challenged, encouraged, held a standard, or saw something in us before we could.
What happens if we slowly remove that human centre from learning?
I am not saying AI has no place. I am saying it has to know its place. Technology should be our servant, not our master. It should be intentional, specific and moderate. It should support learning, not replace the human work of it.
But rather than asking how to integrate AI into our schools, or whether to keep it out,
I believe the deeper question is: what kind of education do we need to build?
Our current education system was designed for a different world. A world of offices, factories, predictable roles, fixed pathways. A world where success often meant following instructions, getting the right answer, fitting into a system.
That world no longer exists. Our children are growing up into a world we do not fully understand ourselves. A world where information is instant, answers are everywhere, and machines can do much of what we once trained humans to do.
So the question cannot just be, what should children know? Or how should they learn it? It has to be: who are they becoming?
The word education is often linked to two Latin roots: educare, to train or mould, and educere, to draw out or lead forth. Both are necessary. Children do need knowledge and skills, and AI may be excellent at putting information in. But drawing a person out, that is the work of parents, teachers, mentors, peers, conversation, challenge, relationship, and time.
In our work at Rites for Girls, we run girls’ groups and anxiety sessions for boys and girls, especially around the transition to secondary school. Again and again, we see the same needs. Children do not just need facts, they need adults who help them discover themselves. They need experiences where they learn by doing. They need to speak, listen, disagree, make mistakes, feel awkward, try again.
Over thirty-five years of working with young people, I have witnessed a shocking rise in anxiety that is crippling our young. More fear of mistakes. More social reticence. More panic. More children frightened to try something new. This is not caused by one thing, but I cannot ignore that it has grown alongside a culture increasingly focused on performance, speed, screens, and the “right answer”.
We need an education that teaches children how to think, not just what to think.
How to question, not just answer.
How to sit with uncertainty, not rush to solve it.
How to relate, collaborate, listen and challenge each other well.
How to be human in a world that is becoming more technological.
In that kind of education, AI might have a place. Not as a shortcut. Not as a replacement for thinking. But as a tool. A tool to explore ideas, test thinking, generate possibilities that young people then challenge, refine and make their own.
So the human work comes first.
Because what children need is not just protection from technology. They need preparation for a world where it exists. That preparation is not just about skills. It is about confidence in their own thinking. Trust in their own voice. The ability to stay grounded in themselves when the world around them is changing.
So perhaps the conversation about AI in education needs to become a conversation about education itself.
And in the meantime, rather than constantly having to prove that AI and screen use are damaging childhood, perhaps we should reverse the question: Where is the evidence that this is safe for developing minds? Until we have that, we should slow down.
I propose that we keep AI out of children's education until we can prove it is safe and genuinely useful. We need to radically rethink what education is for, return childhood to our young, and focus on equipping them not only to use the tools of the future, but to remain fully human while doing so.


