Anti-Islam pages and clear threats to politicians draw company’s standard reply.

Kiwis trying to get "hate speech" taken down from Facebook following the Christchurch mosque attacks say the social media giant isn't listening.

And the live stream of the gunman's attack is still circulating.

Facebook chief operating officer Sheryl Sandberg had last week promised her team would take "strong steps" to tackle hate speech after a gunman livestreamed 17 minutes of an attack on two Christchurch mosques that killed 50 people.

However, Kiwi Facebook users are now taking to social media to label her words "lip service".


They say they have reported anti-Islam pages, a photo showing New Zealand's Parliament in crosshairs, and comments - such as one saying Prime Minister Jacinda Ardern should be gassed - to Facebook, with little success.

Many complaints have been met with what look like automated Facebook responses, stating the posts did not breach the company's Community Standards.

Wellington resident Denise Stephens received one such reply in which Facebook saw no reason to take down a photo posted to anti-immigration page Yellow Vests New Zealand showing gunsights on the Beehive.

"That is clearly threatening violence towards politicians, and given what's happened around the world, there has been politicians killed or injured by extremists," Stephens said.

"Something like that does count as an incitement to violence - it leaves me wondering what Facebook's standards are."

The renewed pressure on social media companies to take responsibility for what is shown on their sites comes after world leaders and citizens were left horrified by the widespread sharing of the Christchurch attack video.

Facebook had attempted to take the video down, but it was quickly shared and reshared thousands of times across the internet.

Facebook's Sandberg said in her Herald opinion piece last week that the company agreed with calls it could do more.

She said the Christchurch shooting video had spread mainly through people re-editing it in ways that made it harder for the company's systems to block.

Sheryl Sandberg, chief operating officer of Facebook, has promised to take strong steps to remove offensive posts from Facebook. Photo / Supplied

But following the attack, Facebook was now "taking three steps".

This included "strengthening the rules for using Facebook Live, taking further steps to address hate on our platforms, and supporting the New Zealand community".

As part of this, Facebook had focused its artificial intelligence tools on hunting down and removing hate groups and ensuring videos like the Christchurch gunman's were taken down faster.

This had led to the removal of groups such as the Lads Society, the United Patriots Front, the Antipodean Resistance, and National Front New Zealand, Sandberg said.

Across the Tasman, the Australian government has just passed tough new legislation threatening social media companies with fines up to 10 per cent of their revenue and their executives up to three years' jail if they fail to remove "abhorrent violent material expeditiously."

"There are platforms such as YouTube, Twitter and Facebook who do not seem to take their responsibility to not show the most abhorrently violent material seriously," Attorney-General Christian Porter told media.

Yet despite the promises and new laws, offensive posts and pages continue to proliferate.

An investigation for the Herald by Eric Feinberg, founder of the New York-based Global Intellectual Property Enforcement Center - an organisation that tracks extremist content on social media - yesterday found seven copies of the gunman's video still posted on Facebook.

Feinberg also found five copies on the Facebook-owned Instagram.

Earlier this week, the British paper The Times reported that it had also been alerted to more than a dozen versions of the attack video on Facebook-owned Instagram.

Versions have additionally been available this week on Twitter and YouTube.

Herald reporter Frances Cook this morning also reported an account named brenton_tarrant_fanpage set up on Facebook-owned Instagram, which showed a screenshot of the gunman.

Instagram later replied to say it had taken down the photo of the gunman but had not shut down the page, despite it stating the attacker had done the world a big favour.

Another Herald reader, who wished to remain anonymous, said he had unsuccessfully reported a post by controversial commentator Milo Yiannopoulos claiming the Al Noor Mosque, where 41 people were gunned down, had harboured terrorists.

But more alarming to the man was the fact a family member had shared the post.

Media commentator Russell Brown has reposted messages from Facebook users who had unsuccessfully reported a page called New Zealanders Against Islam.

Wellington resident Denise Stephens reported this photo to Facebook in the hope it would be taken down, saying it incites people to violence. Photo / Facebook

Wellington's Denise Stephens, meanwhile, suspected Facebook's artificial intelligence tools were woefully inadequate to police the social media platform.

As well as reporting the "offensive" Yellow Vests New Zealand page, she also reported a fake profile posing as the official Facebook page of retail chain Farmers.

Stephens ended up with the same message back from Facebook stating that no "Community Standards" had been breached.

"I don't know how many humans are looking at these pages because it seems a lot can get through without much scrutiny," she said.

Facebook did not respond to the Herald's request for comment, despite being in regular contact with the paper last week in the lead-up to the publication of Sandberg's opinion piece.