In a move rarely seen inside a courtroom - and a possible first in New Zealand - Northland police called for the Google account of a recently convicted sex offender to be destroyed.
The request has raised the question of how the privacy of personal Google accounts should be weighed against harm - and of Google's role in policing the web.
Last month a Far North man, who has name suppression, was sentenced in the Whangārei High Court to seven years and 10 months' jail for 21 charges that included indecent assault, sexual exploitation and unlawful sexual connection involving a 15-year-old girl.
The man, 36, offered the Northland teenager to paying customers for sex via an online advertisement on a website called Locanto in January last year.
Marcus Barker, 55, Owen Sigley, 66, Auckland church leader Michael Weitenberg, 54, and Calvin Fairburn, 37, were sentenced to home detention for their part in the commercial exploitation of the teenager whom they paid to have sex with.
Sexual encounters with the girl were filmed by the Far North man on his mobile phone and sent via WhatsApp to people elsewhere in New Zealand. He then deleted the videos.
A police spokesman said the man's Google account meant he could still retrieve the deleted videos. Police asked for the account to be deleted to destroy his access to the videos, which would also prevent him from distributing them.
During the man's sentencing Justice Geoffrey Venning ordered the forfeiture of his cellphone but told prosecutors a formal application was required to have the Google account destroyed.
Eleanor Parkes, director of ECPAT Child Alert, the only New Zealand-based organisation focused solely on addressing the sexual exploitation of children, said misusing Google's services was not a right.
"Being safe from sexual abuse is."
She said there was no question that all online accounts used to sexually abuse children - including streaming or storing video evidence of abuse - should be deactivated.
"Web giants like Google, Mega and Facebook need to step up their child protection efforts as it is these sites that are being used to host videos and images of children being sexually abused."
Globally, 480,769 child abuse images a week are reviewed by the National Center for Missing & Exploited Children in the United States - an annual total of 25 million images.
Every month more than 10,000 attempts to access known child abuse sites from New Zealand were stopped by the Department of Internal Affairs' digital child exploitation filtering system.
"What we cannot say is how many attempts to access this material is inadvertent and how many are deliberate."
For Parkes, the most powerful way to stop the online circulation of child sexual abuse material was to prevent the abuse occurring in the real world.
"In New Zealand, this type of commercial sexual exploitation of children is very often linked to other risk factors and types of vulnerability."
The country's rates of sexual abuse, family violence, poverty, and institutional racism needed to be addressed, she said.
Digital business and internet lawyer Rick Shera told the Advocate he had not heard a request for the deletion of a Google account made in the courts before.
He said often law enforcement agencies would directly approach a digital platform about child sexual abuse material.
"Often the platform will quarantine it so it's useful for evidential purposes later, close down the account and assist police with those sorts of activities."
Shera said there was no real argument against deleting the Far North man's Google account.
Any protections the man might claim under the Privacy Act were irrelevant because they were "subject to exceptions in terms of illegality".
A framework to help the digital industry tackle online child sexual abuse and exploitation was developed by ministers of the Five Countries security partnership in July 2019.
New Zealand, Australia, Canada, the United Kingdom, and the United States co-designed the Voluntary Principles to Counter Online Child Sexual Exploitation - which included guidance to prevent the sharing of abuse material and for reviews of existing safety processes.
Shera said the more challenging issue was where online providers drew the line between taking a proactive approach to child sexual abuse material (CSAM) and waiting for a call to action.
Currently, digital tech giants use algorithms that scan for questionable material; when it is found, they hash it and monitor for further copies. Shera said it was a challenging process for them because "in a sense that's a game of whack-a-mole".
"Any reputable platform provider as soon as they get an inkling of it - whether it's through their own scanning or whether they're notified of it – will take it down as quickly as they can."
The country's legislation dealing with objectionable material is undergoing a major change that will give the Chief Censor the power to rule more quickly on whether something is objectionable and to order its removal.
Shera said changes to the Films, Videos, and Publications Classification Act would be useful as online providers – at home and abroad – felt more comfortable with a ruling from a regulator.
In New Zealand, judges can lack jurisdiction to make rulings about objectionable material because services such as Google accounts are operated by providers based overseas.