That milestone also brought a bigger question: what rights should children have in the digital age, and what responsibilities do we, as adults, have to protect them? He’s still five years away from being considered an adult in most senses – voting, buying alcohol, signing contracts – yet so much of life now plays out online.
Today’s digital natives are navigating adult-level risks, pressures and expectations years before they’re developmentally ready.
Under the UN Convention on the Rights of the Child, articles 12 and 17 affirm the right to be heard and to access information. These rights are often at the centre of opposition to proposals like social media age restrictions. But the convention also guarantees children the right to safety, to privacy and to grow up free from exploitation. These rights don’t enforce themselves, especially online, and they’re not in competition. The challenge is how to balance them, and right now the balance is badly off.
How digitally savvy, resourced or educated your parents are shouldn’t determine the level of protection you get in the online world. Yet in practice, it does. In the real world, we’ve never been more hands-on as parents. We hover, supervise and micromanage. At our youngest’s football training, I’m struck by how much has changed in a decade. Ten years ago, parents watched from the sidelines; now, they’re on the pitch, often leading the way and doing all the moves alongside their kids.
It got so ridiculous we moved our son up a class just so he could focus on his coach instead of dodging adults performing star jumps next to him. Yet online, the same parents hand over a smartphone at 10, effectively waving their children into an unmoderated, unfiltered space filled with strangers, predators and content they can’t process.
The online world our kids inhabit is fundamentally different from the one we knew growing up. Gaming isn’t “just a game” anymore. The most popular titles are evolving platforms designed to keep players hooked indefinitely – what the industry calls Games-as-a-Service. Imagine a theme park that never closes, where rides change daily and your digital character is part of your identity. You don’t just play; you live there. And the longer you stay, the more money the park makes.
Some of these platforms, like Minecraft, Fortnite and Roblox, are household names. People think they’re harmless fun. But they can be dangerous places where kids encounter strangers, sexual content, gambling-style microtransactions, extremist propaganda and violence. On Roblox, you might stumble into simulated school shootings, sexual roleplay between avatars, or Nazi concentration camp reenactments, all scenarios that have actually occurred. Thousands of new user-generated games are added daily, far too many for moderators to review.
When these games meet smartphones, the effect is supercharged. Kids can start on a console at home, continue on a tablet in the car, and sneak in another session on their phone at midnight. The lines between gaming, socialising and scrolling blur completely. Platforms like Discord and Twitch have become the new locker rooms, except these are open to millions of strangers, with no walls, no teachers, no rules.
This week, I heard about an incident at a New Zealand school that hit me like a punch to the gut. A young student was pressured into a sexual situation with peers, while others filmed and shared it online. Within minutes, the footage was circulating on social media, spreading beyond anything the school’s staff could contain. Because those involved are children, this isn’t just a behavioural issue; it’s child sexual exploitation. The emotional and reputational fallout will be long-lasting. Unfortunately, this isn’t an isolated case; incidents like this are happening in schools every day.
Stories like this are why the conversation can’t just be about children’s rights to participate online. As the chief executive of a youth charity said to me recently, “I’m so sick of hearing about child rights. We need to talk about child protection.” That shift matters. Rights mean nothing if we fail to protect children from harm in the first place. Yes, we need to hear children’s voices, but this is also a parenting issue.
As Australian clinical psychologist Dr Danielle Einstein recently told a University of Auckland audience, the counter to the “children’s rights” argument against social media age restrictions is simple: children have a right to be protected by their parents. End of story.
The “child rights” banner is also co-opted by industry – often the same tech giants profiting from young users – and echoed by youth organisations that, in some cases, receive funding from those platforms. It’s not unlike tobacco’s playbook: cast doubt, muddy the waters and frame regulation as overreach.
That’s why I’m part of the B416 campaign, calling for a minimum age of 16 for social media access and for a dedicated regulator to enforce it. The current age limit of 13 is a joke. No one checks. Kids lie. Parents turn a blind eye. Platforms profit from the attention and data of users who are far too young to manage the risks. We regulate driving, drinking and gambling because the stakes are high. Social media, with its links to mental health harm, exploitation and addictive behaviours, should be no different.
A regulator would mean proper oversight, not just empty promises from Silicon Valley, global platforms like TikTok or industry bodies funded by the tech companies themselves. It would require platforms to prove they meet safety standards, moderate content effectively, and protect underage users from adult strangers, sexual material and harmful content. Right now, we’re relying on multinational corporations to put child safety ahead of shareholder returns. Spoiler alert: they’re not doing it.
If you think I’m exaggerating, look at Instagram’s recent feature allowing users to share their exact real-time location with friends. It’s pitched as a fun meet-up tool, but it opens the door to stalking, harassment and predatory contact. If these companies can’t get something as basic as location safety right, why should we trust them with children’s mental health and wellbeing?
I’m not anti-gaming, and as the co-chief executive of a health tech company, I’m not anti-technology. Used well, tech can be creative, social and educational. But the current business models exploit attention rather than nurture wellbeing. Free-to-play games make money through in-game purchases, advertising and virtual currencies like Robux or V-Bucks. Developers are rewarded for “stickiness”, not healthiness. Cosmetic upgrades, loot boxes with slot-machine odds and limited-time offers create artificial urgency. Once kids invest time and money into an avatar that feels like part of their identity, walking away feels like losing part of themselves.
And it’s not just about the content; it’s about the culture. The unfiltered chats. Influencers streaming borderline pornographic content alongside gameplay. Discord servers where violent or extremist material is a click away. These aren’t rare exceptions. They’re part of the everyday digital ecosystem for millions of children.
That’s why we’ve chosen differently for our son: less gaming, no social media, more real-world skills. This isn’t about denying fun; it’s about giving him time and space to build offline confidence and resilience before immersing him in an online world designed to hold him hostage.
Does he sometimes feel left out? Yes. But parenting in the digital age isn’t about keeping kids happy in the short term; it’s about keeping them safe and prepared in the long term.
So we’ll keep being the “uncool” parents. We’ll keep saying no when “everyone else” is saying yes. We’ll keep choosing a Herald subscription over a headset, and Sharesies over Roblox.
Because here’s the truth: in this new digital Wild West, our kids don’t just need rules. They need a fighting chance. And that starts with the choices we make now, the boundaries we set, and the legislation we fight for. Upholding children’s rights means protecting them first, because without safety, every other right risks being meaningless.