Prime Minister Jacinda Ardern says the global community should "speak with one voice" when it comes to blocking harmful content on social media platforms.

Ardern has criticised the role of social media in the Christchurch terror attack on March 15, and she met with a group of digital media experts in Auckland on Friday to learn more about the issue.

"I wanted to make sure I had the views of those that work in the [social media] space, particularly given that questions are being raised around what role New Zealand could and should play in this debate at an international level."

She said she would be happy to say who she met with, but would seek their permission to do so first.


Australia, the UK, Ireland and Germany are all looking at measures to address harmful content, but Ardern said a global, co-ordinated approach would strengthen domestic measures.

"Different countries have asked for different things. Perhaps the power would be when we speak with one voice.

"My question here would be: Do those legislative tools answer the questions and challenges we faced through March 15? If not, what more should we ask, and should we be asking that together?

"These are global platforms. My strong view is that if we wish to establish a step-change in behaviour, we need to take a global approach."

Facebook was heavily criticised for how long it took to block the video of the gunman's attack on Christchurch mosques, which was livestreamed and shared on its platform.

Facebook said it removed about 1.5 million videos of the attack globally in the first 24 hours, with more than 1.2 million of those blocked at upload.

Twitter's head of legal, policy and trust, Vijaya Gadde, said the company had removed 20,000 tweets since the attacks, but admitted it "feels like a leaky bucket".

In an op-ed piece at the end of March, Facebook founder Mark Zuckerberg called for governments to be more active and to regulate four areas: harmful content, election integrity, privacy and data portability.


"Lawmakers often tell me we have too much power over speech, and frankly I agree."

He said regulation could set out what is allowed and require companies to build systems to minimise harmful content. A third party could then set standards based on the regulatory rules and measure companies against those standards.

Facebook has also responded to the Christchurch shootings by saying it will ban posts promoting white nationalism and white separatism.

Last week Australia's Parliament passed legislation that could imprison social media executives if their platforms streamed violent content.

In the UK, the Government is looking at making tech company bosses liable for failing to limit the distribution of harmful content, and setting up a regulator to police the rules.

Justice Minister Andrew Little has fast-tracked a review of the Human Rights Act, which would look at hate speech and possible hate crime laws.