Documentary maker Nadia Maxwell set up an account as a 13-year-old girl and was shocked by what she was served on Instagram.
A Christchurch documentary maker is calling on the Government to take urgent action to protect young Kiwis after a “deeply unsettling” social media experiment in which a page set up for a 13-year-old girl was flooded with eating disorder and other distressing content.
Nadia Maxwell is throwing her support behind a bill aiming to ban Kiwis under 16 from accessing social media platforms in New Zealand - claiming that tech giants are doing little to keep vulnerable users safe online.
Maxwell set up pages on TikTok, Instagram and Snapchat posing as a 13-year-old girl as an experiment to better understand how the apps worked.
Christchurch documentary-maker Nadia Maxwell carried out experiments on social media apps to see what content is being recommended to young users. Photo / George Heard
Last year, the Herald revealed a similar experiment by Maxwell, which showed it took just 22 minutes for harrowing suicide-related content to appear in the “for you” feed of a TikTok account.
Maxwell said the Instagram content she found this time was “disturbing, discombobulating and gross”.
She has now shared the results of her “deeply unsettling” experiment.
As she did with TikTok, when she set up the profile, she liked and followed accounts that offered content appropriate to a 13-year-old girl - cute animals, Taylor Swift and funny kitten videos.
But her feed quickly filled with diet and beauty content - creators showing how to create a “thigh gap” or perfect nose or jawline; videos of severely thin or underweight women sharing “what I eat in a day” and posts about extreme mental health issues.
“It was deeply unsettling spending time on Instagram through the lens of a 13-year-old. I don’t think many of us would dream that there would be this level of harmful content being targeted at teenage accounts,” said Maxwell.
Maxwell set up an Instagram account posing as a 13-year-old girl. Photo / Supplied
“Teens don’t even need to go looking for it. If you linger on a piece of content, the algorithm serves you more of the same. So you don’t have to be searching for, liking or following harmful content for it to appear in your feed.”
Maxwell said she believed the experiment showed Instagram’s new settings - aimed at offering teens a safer online experience by automatically applying stricter settings - had failed.
The settings were announced in September last year and are supposed to restrict who can contact teens, limit the content they see, and encourage healthy online habits.
Any user under 16 needs parental permission to adjust these settings.
Maxwell filmed her experiment both before and after the settings were introduced, and said they did nothing to protect her feed from awful content.
“I saw firsthand that not only is harmful content still prevalent, but even when I reported it, it wasn’t taken down,” she said.
“Yet Instagram are publicly trying to reassure parents that their new built-in protections for teens offer peace of mind for parents. It’s completely disingenuous… they don’t care.”
“These companies have proven time and time again they will prioritise profit over safety. Their algorithms are designed to amplify emotionally charged content. The longer kids spend on their platforms, the more these companies make, which means there is a huge incentive to serve up this distressing content.
“I don’t know many adults who have a healthy relationship with social media, so why do we expect a 13-year-old to?”
Maxwell said it was “scary” imagining real 13-year-olds trying to navigate the “awful” content she saw.
“The algorithm is there waiting to exploit any vulnerable moment that teens may have… they don’t even need to be searching for this content,” she said.
“And to anybody that says ‘teenagers should just be able to scroll past it’ - I say, well, what did you do the last time you drove past a car accident - did you look? We don’t want to, but we all do. On some human level, we’re all wired to look, and I think the tech companies know that and they exploit it.
Maxwell supports a move to ban social media for anyone in NZ under 16. Photo / Alex Cairns
“Kids don’t stand a chance against thousands of engineers skilled in persuasive design… and you can’t out-parent an algorithm that’s been weaponised this way.
“This is why the government needs to step up and follow Australia’s lead and raise the minimum age.”
Australia's law introduces a mandatory minimum age of 16 for accounts on certain social media platforms. Parents cannot give their consent to let under-16s use these platforms.
Australia has moved to ban under 16s accessing certain social media platforms. Photo / NZME
Maxwell acknowledged social media was a huge part of everyday life, but wanted to remind people it was “an entertainment platform” - not a necessity.
“If this were a television channel, would you want your kids sitting down and watching this? I don’t think many of us would,” she said.
“We urgently need change - for the sake of the next generation, we need it now… these kids are just out there stumbling across this stuff at any time. A 13-year-old’s brain is no match for the most powerful companies in human history, using these incredibly manipulative algorithms.”
A spokesperson for Meta, the company that owns Facebook, said Maxwell’s experiment was “the experience of a test account” and was “not representative of the experience of Instagram’s broader community”.
“A manufactured account does not change the fact that tens of millions of teens now have a safer experience thanks to Instagram Teen Accounts, which offer built-in protections limiting the content they see, who can contact them, and the time they spend on Instagram,” she said.
“We began rolling out Instagram Teen Accounts in New Zealand in February 2025 to help protect teens online and will continue to work tirelessly to do just that. We also use automated technology to remove harmful content, with 99% proactively actioned before being reported to us.
“We will continue to invest in new technology and features and engage with the New Zealand government and our safety partners to build safe and inclusive online environments.”
The spokesperson said since September 2024 at least 54 million teens around the world had been placed into Teen Accounts.
“Since making these changes, 97% of teens aged 13-15 have stayed in these built-in restrictions, which we believe offer the most age-appropriate experience for younger teens.
“Teen Accounts builds on the numerous other tools, features and resources that help teens have safe, positive experiences, and give parents simple ways to set boundaries for their teens.
“We’re taking several steps, including using age verification technology, to help ensure teens on Instagram are placed in Teen Accounts.
“Meta publishes Community Standards Enforcement Reports to track our progress and demonstrate our continued commitment to making our platforms safe and inclusive.
“We remove content that encourages suicide, self-injury, and eating disorders. Between January and March 2025, we removed 6.8 million pieces of content related to suicide, self-injury and eating disorders. Of the violating content we actioned, 98.9% was found and actioned by us, and 1.1% was reported by users.”
Anna Leask is a senior journalist who covers national crime and justice. She joined the Herald in 2008 and has worked as a journalist for 19 years with a particular focus on family and gender-based violence, child abuse, sexual violence, homicides, mental health and youth crime. She writes, hosts and produces the award-winning podcast A Moment In Crime, released monthly on nzherald.co.nz