The Department of Internal Affairs says Kiwis have been caught undertaking the offending.
WARNING: This article details child abuse
Kiwis are among those procuring a disturbing form of child sexual abuse – bespoke livestreamed sex assaults – which are purchased through an offshore “receiver”, who acts out the attacks on a child thousands of kilometres away.
The tailor-made abuse is directed through everyday video chat platforms by abusers who are usually based in Western countries.
In a briefing document released last month, the International Justice Mission (IJM) said it was one of the fastest-growing and least-detected forms of child abuse globally.
New Zealand is one of the key “demand-side” countries identified by the IJM, which has been responding to cases of the abuse since it was first reported in the Philippines in 2011.
The IJM says the Philippines is a major “source country” but the livestreamed abuse has also been found in Vietnam, Thailand, Colombia and elsewhere in Latin America, as well as in Africa and Eastern Europe.
Its research found that nearly 500,000 children in the Philippines had been trafficked to produce child abuse materials.
Chief UK marketing and public engagement officer Molly Hodson said New Zealand was in the top 20 countries flagged for suspicious transactions with the Philippines.
“Offenders in the Philippines tend to be people known to the children. It’s people who are relatives or close to the children, which, sadly, includes parents.”
The agency found demand-side offenders like those in New Zealand pay as little as $34 to “direct” the personalised videos.
Department of Internal Affairs (DIA) digital child exploitation manager Tim Houston confirmed multiple purchasers of long-distance child abuse had been discovered in New Zealand.
He said the offending was often found through investigations into separate abuse.
“So it’s been detected as almost like a byproduct.
“For example, an investigation commenced into the possession or distribution of child exploitation material and then a search warrant [is] executed, digital forensic processes undertaken and as a result of that analysis, there’s been a detection of or evidence located of livestreamed abuse.”
Tim Houston, manager of the Department of Internal Affairs' digital child exploitation team, pictured in the Wellington office. Photo / Mark Mitchell
Houston said, at a high level, this offending often involves a “facilitator” who has access to a child advertising the opportunity for abuse online.
“Offenders who are wanting to consume or pay for that abuse can see that advertisement, and it might be on an adult dating website or it might be on a pornography channel or something like that.”
This, he said, leads to communication and the group will then move to a separate platform.
“Then there’ll be some form of negotiation in terms of price and the method of payment and then the facilitator will create the livestream.
“The offender who’s paying for the services will direct the abuse via a chat function while the livestream’s occurring.”
Houston said people could conduct live video offending on any video platform that had a live webcam function.
“As with other forms of offending in the space, if the platform offers encryption, it’ll probably give some more confidence to the offenders that they’re able to operate a bit more securely.”
With typical child sexual abuse material investigations, his agency would receive a referral saying on this date and time, this illegal image was uploaded to a platform.
“Then we will go and make some inquiries and ultimately execute a search warrant, and then we can conduct digital forensic processes on the seized items.”
Livestream abuse, however, presented a greater challenge, as there was a limited digital footprint created by it.
Hodson said one of the most significant challenges in combating this type of exploitation is that it is not detected by most platforms in private-messaging and video-chat applications.
“Furthermore, child sexual abuse livestreamed in a video call is more difficult for law enforcement to investigate and prosecute than the distribution of images and recorded videos, which leave a digital trace.”
She said Google and Meta already used tools to detect live online child sexual abuse in publicly broadcast services such as YouTube, Facebook Live and Instagram Live.
But Hodson said the companies still did not use such detection technology in all parts of their services, such as private messages and encrypted environments.
“IJM believes that governments should require device manufacturers and operating system developers to detect and disrupt the creation, distribution and rendering of child sexual abuse images and videos, including livestreaming, by using AI [artificial intelligence] tools that can detect such egregious abuse.”
Hodson also said the proliferation of photo-realistic AI-generated abuse material meant police might not be able to tell the difference between real and synthetic child abuse.
“The police are trying to find the child, but are potentially wasting their time, looking for someone that is AI-generated.”
The police also investigate this type of abuse. National Criminal Investigations Detective Inspector Stuart Mills said their investigations had resulted in offenders being identified, located and brought before the courts, whether the offending was occurring in New Zealand or internationally.
“A key focus of any livestreaming of child exploitation investigation is identifying the victim and ensuring, where possible, their safety and wellbeing wherever the offending is occurring,” Mills said.
Detective Inspector Stuart Mills of the police's National Organised Crime Group.
Houston told the Herald that toward the end of last year, some DIA staffers attended training, facilitated by international partners, on tools that can assist with these types of investigations.
“We’ve done quite a lot of upskilling in terms of our capability to detect the livestreaming offences and how we actually investigate them.”
There was “always” something left on a person’s device from accessing this abuse.
“We know from our experience that particularly sophisticated and more organised offenders will upskill each other. They talk to each other and they educate each other on ways to avoid detection.
“But at the same time, law enforcement is also using new technologies to hopefully make the detection of this type of offending better.”
Houston said that in his 22 years on the job, the agency had seen an escalation as technology advanced.
“When I first started, people would import DVDs into the country or even sometimes VHS or a magazine with child exploitation material.
“That has evolved to digital images and then digital movies. I think ... offenders probably want the next best thing.”
Houston said abusers wanted the most realistic form of abuse available and he believed elements of generative AI could be fuelling interest.
“Then you have an offender who now has the ability to watch a child be abused live in front of them from the safety, quote unquote, of their own home and direct that abuse and probably make a recording of their screen, that’s probably a natural progression in terms of escalation and more sexual gratification.”
His message to those considering engaging with the abuse: “It’s criminal. It’s illegal in New Zealand and between my team at DIA and our partner agencies, there’s a high likelihood that we will detect you.
“Get help.”
Katie Harris is an Auckland-based journalist who covers issues including sexual assault, workplace misconduct, media, crime and justice. She joined the Herald in 2020.