Safer Internet Day – and the truth about who is really responsible

Every year for over two decades, Safer Internet Day arrives with a new slogan and a new set of well-meaning resources. This year it’s “Smart tech, safe choices – exploring the safe and responsible use of AI.”

And it’s a worthy topic: we all need to be making safer choices – adults, teenagers and children alike. We all need to understand more about how AI works, how algorithms shape what we see, how our data is used, and how easily we can be nudged, compared, outraged, polarised, or stealthily pulled into content that doesn’t actually serve us.

Many adults I know struggle with their relationship to technology. We check our phones compulsively. We scroll when we’re tired. We feel anxious, distracted, overstimulated, or oddly empty after being online. We know it’s affecting our sleep, our attention, our bodies, our parenting, our relationships.

So, when we talk about “safe and responsible use of AI”, we’re really talking about how all of us are learning to live inside digital systems that are developing far faster than our nervous systems ever could. The language of “safe choices” insidiously places the responsibility on individuals, and I would like to shift the focus onto how the environments themselves are anything but safe or healthy.

We are asking human beings – with ancient brains, emotional bodies and social wiring – to behave responsibly inside systems that are explicitly designed to capture attention, provoke emotion, maximise engagement and keep us coming back for more.

By making the focus of Safer Internet Day our “safe and responsible use”, we subtly shift attention away from where it most needs to be. Away from what is hiding in plain sight. Because this is not just a question of personal responsibility, it is about power and accountability. It is about the companies who profit from selling our attention, and the governments who are responsible for citizens’ wellbeing and who actually hold the legal authority to intervene. If we are serious about safety, then those with the most power must also carry the most responsibility, and be held to account when they don’t act in the public interest.

What happens when governments actually step in

There are now places in the world where this imbalance of power is finally being taken seriously.

In the UK, the Online Safety Act (2023) introduced something radical: a legal duty of care. Platforms are now required to protect users, and especially children, from harmful content, including material linked to self-harm, eating disorders, bullying and exploitation. Ofcom can fine companies up to £18 million or 10% of their global annual turnover – whichever is greater – if they fail. That’s not symbolic. That’s structural. That’s saying: this is no longer optional.

In the European Union, the Digital Services Act (DSA) goes even further. It gives regulators the power to demand transparency about how algorithms affect users’ physical and mental health and to fine companies up to 6% of their global annual turnover if they refuse. Meta, TikTok and others are now legally required to open their systems to public scrutiny. Not just what content is shown, but how and why it is being pushed.

In the United States, something different is happening, not through regulation, but through the courts. Hundreds of lawsuits, now consolidated in California, accuse Meta, TikTok, Snap and YouTube of knowingly designing addictive systems that damage children’s mental health. Claims include rising anxiety, depression, body dysmorphia and compulsive use. In January 2026, both Snapchat and TikTok quietly settled major cases just before landmark trials were due to begin. No public verdict. No legal precedent. Just very large sums of money changing hands. And that’s an important pattern to notice. When companies settle, they often do so under Non-Disclosure Agreements. Which means the public never gets to see the internal documents. No full story. No official record. No systemic learning. It’s a way of buying silence. Of containing reputational damage. Of paying to keep the deeper truth hidden.

In Australia, the response has been more direct. The government has introduced a world-first ban on social media for children under 16, explicitly citing concerns about mental health, doom scrolling, cyberbullying and addictive design. It’s a bold move. Not perfect. But it sends a clear signal: this environment is not neutral, and children are not just “failing to cope” with it.

And here’s the interesting thing: The internet didn’t collapse in any of these places. Innovation didn’t stop. Platforms didn’t disappear. What happened was redesign. Suddenly safety features became possible. Defaults changed. Reporting systems improved. Content moderation was taken more seriously. Child protection became a design issue, not just a PR one.

Which tells us something important: tech companies can design for wellbeing. They just usually won’t unless there is a legal or financial reason to do so.

So, when we talk about “responsible AI”, we need to focus on where responsibility really lies. It doesn’t live primarily with individual users, whether they are adults or children. It lives with the systems that shape attention, behaviour and emotional life at scale – with the companies who profit from capturing human vulnerability and with the governments who have the authority to regulate them, but often hesitate to.

That’s how every other industry that affects public health works: food, drugs, cars, buildings. We don’t rely on personal responsibility alone. Tech should not be the exception.

What this looks like at home

And then there’s the smaller version of all of this. The one that plays out in ordinary houses, with ordinary families, every day.

I once worked with a mother whose 12-year-old daughter had become increasingly anxious, irritable and withdrawn. She was constantly on her phone. Scrolling late into the night. Any attempt to limit it ended in a major meltdown of tears or fury. The mum assumed the problem was the phone, but when she took a closer look, something else emerged. The girl wasn’t just scrolling. She was watching endless videos about friendship fallouts, body “glow ups”, emotional advice, school drama. Her nervous system was living in a constant state of comparison and threat.

So instead of adding to this stress by bringing in a rule to safeguard her child, the mum did something different. She asked her daughter, “Can you show me what you like watching?” – not in a policing way, but with genuine curiosity. They lay on the bed together and watched a few clips. And the mum noticed what her daughter’s body was doing: the laughter, the excitement, and also the tension, the shallow breathing and the way she went quiet afterwards. And she named it: “I wonder if some of this makes your body feel a bit on edge.”

That one sentence changed everything. Suddenly it wasn’t about control, it was about awareness. Mother and daughter started noticing patterns together. Which content made her feel worse? Which made her laugh? Which made her compare? Which made her spiral?

From here, boundaries could come more naturally. Not dramatic ones, or perfect ones, but choices rooted in how things actually felt. And that’s emotional literacy. For children, and for adults too. Not “get off your phone” or “be careful online”, but “how does this feel in my body, and what do I want to do with that information?” That’s a skill that matters whether you’re 12 or 52.

So what can we actually do – practically?

We talk about risk. We talk about bans. We talk about harm. But people are left thinking, OK, but what do I actually do tomorrow?

Here are some things that genuinely make a difference.

1. Stop outsourcing digital safety to rules alone

Rules matter. Boundaries matter. But they only work when paired with relationship.

Instead of just:
“How long were you on your phone?”
Try:
“What did you watch today that stuck with you?”

Instead of:
“That app is banned.”
Try:
“How does that app make you feel in your body afterwards?”

Children learn regulation through connection, not control.

2. Teach children to read their own nervous systems

This is huge, and rarely taught. Help children notice:

  • When their body feels tense or calm
  • When they feel more themselves or less
  • When they feel energised or flattened after being online

This builds internal guidance, not just obedience.

3. Watch together sometimes

Not all the time. Not intrusively. But enough to understand the world they’re living in.

You don’t need to like it. You just need to be interested. You may not feel you can spare the time – but actually, you can’t afford not to. Curiosity keeps communication open. Surveillance closes it.

4. Model your own relationship with tech

This one is uncomfortable, and the greatest parenting challenge for many of us.

Children learn more from what we do than what we say. If we’re always distracted, scrolling, checking, half-present – that is the curriculum. And saying that it’s work is no excuse – to them, you’re simply online.

And we can be honest too: “I’m noticing I feel worse after being on my phone too long.” That teaches self-awareness rather than shame.

5. Advocate beyond your own home

This is where adult power really lies.

  • Support regulation of tech companies
  • Back schools that teach emotional literacy
  • Question platforms, policies, data use
  • Speak up when children’s wellbeing is being compromised

Private coping is not enough. This is a public health issue.

Why both levels matter

Safer Internet Day shouldn’t just be about educating individuals to survive digital environments. It should also be about asking whether those environments are healthy in the first place.

We need both:

  • people developing emotional and nervous system awareness,
    and
  • governments holding tech companies structurally accountable.

One without the other doesn’t work. No amount of “smart choices” can compete with systems designed to exploit human psychology. And no amount of regulation will help if we don’t also build cultures where people can notice how technology is actually affecting them: their bodies, their attention, and their relationships.

For me, and for Rites for Girls, the focus will always land most strongly on children because children don’t get to opt out. They don’t design these systems. They grow up inside them. And this is not a children’s problem, it’s an adult responsibility.

So, if Safer Internet Day is only about teaching individuals to cope inside unsafe systems, then we’ve missed the deeper question. The real question isn’t: Are people making good choices online? It’s: Why are so many of us, adults and children alike, being asked to manage environments that were never designed with human wellbeing at their core?

Why it’s essential to talk about systemic responsibility

This is what we need to remember on Safer Internet Day. We can teach children all the skills in the world but if the platforms themselves are fundamentally harmful, we’re asking children to swim in polluted water and congratulating ourselves for giving them armbands.

Real safety requires systemic change.

It requires:

  • regulation of AI-driven platforms,
  • transparency about how data is used,
  • financial incentives for companies to design for wellbeing, not addiction,
  • consequences when tech harms mental health,
  • and an honest reckoning with the business models that profit from children’s attention.

Because right now, we’re telling children to be responsible in systems that reward irresponsibility. And that’s not fair.

What I see in the girls I work with

I’ve worked with hundreds of preteen and teen girls. Girls who are bright, funny, thoughtful, sensitive, imaginative.

And increasingly:

  • overwhelmed,
  • hyper self-aware,
  • anxious about being seen,
  • terrified of getting it wrong,
  • exhausted by comparison,
  • and unsure who they are when they’re not performing.

They don’t need more “smart choices”. They need more protection. More guidance. More adult responsibility. They need adults who are willing to say:
“This world is intense, online and off, and it’s our job to make it safer for you.”
