Former Prime Minister Helen Clark's think tank is on a global mission to make the internet a safer place, but its head says New Zealand still hasn't done enough to fix its own patchy social media laws after the Christchurch shootings.

The Helen Clark Foundation has released its latest report, laying out what it's calling the "Christchurch Principles" – a set of 10 broad recommendations for what Governments and social media companies should take into account when coming up with regulations and policies.

The organisation has pitched the idea at the second annual Paris Peace Forum, as the only Australasian project selected to compete for funding at the event.

The principles include fairly wide-ranging concepts such as "a duty to protect" by states and a "responsibility to respect" by businesses. But the foundation's executive director, Kathy Errington, says it's simply trying to find ways to make the internet safer and more democratic while fixing its problems.

"Most people have come to the view that the platforms got too big, too fast and accepted too little responsibility, but the next question is: how do you do that well?" she said.

The foundation says the principles are meant to sit alongside Prime Minister Jacinda Ardern's Christchurch Call, a voluntary, multinational pact between countries and social media companies aimed at countering online extremism – and are wider-reaching, looking at all harmful content.

The report builds on earlier work by the foundation, which made a number of recommendations specifically for New Zealand, including establishing a social media watchdog.

Errington said New Zealand's current legal framework for how social media companies were governed was a "patchwork with holes cut in it", divided up between five different agencies, with exemptions and caveats throughout.

"In New Zealand we have a problem that there is a huge lack of clarity about who has jurisdiction over social media platforms," she said.

"That's where we still are. We haven't yet introduced any more robust provisions around regulating platforms."

It took three days for footage of the Christchurch mosque attack and the alleged gunman's manifesto to be deemed objectionable by the Chief Censor.
Hate-speech laws are currently being reviewed by the Government, and ministers last month announced $17 million over four years for a dedicated team within the Department of Internal Affairs aimed at tackling violent extremist content, after a significant lag in making footage of the Christchurch shooting illegal.

But to date, it has mostly been up to social media companies to take down violent content based on their own guidelines, rather than being covered by New Zealand laws.

Nor is there legislation setting dedicated timeframes for the take-down of objectionable material, or imposing a statutory duty of care on social media companies, which would make them liable for the content they host.

"At the moment, when problems occur it's very unclear what kind of remedy to apply," Errington said.

Officials say the gaps in the law are expected to be filled by next year's election.

The Australian Government passed laws cracking down on online extremist content in April.