Facebook has broken its silence over the Christchurch terrorist attacks, outlining its plans to rein in the sort of videos posted by the alleged shooter and to clamp down on hate speech.
In a letter provided exclusively to the Weekend Herald, the social media giant's No 2, chief operating officer Sheryl Sandberg, has revealed it will restrict those who can use Facebook Live and build better technology to quickly identify versions of violent videos and images and prevent them being shared.
It will also ban a range of extremist groups from its platforms.
The shooter livestreamed video of the attacks in which 50 people died and dozens more were injured as they prayed in two Christchurch mosques on March 15.
Sandberg, who is the second-ranking executive behind Facebook creator Mark Zuckerberg, said it had found and removed 900 edited versions of the 17-minute video.
"We are exploring restrictions on who can go Live depending on factors such as prior Community Standard violations," Sandberg said.
"We are also investing in research to build better technology to quickly identify edited versions of violent videos and images and prevent people from re-sharing these versions.
"We are also using our existing artificial intelligence tools to identify and remove a range of hate groups in Australia and New Zealand, including the Lads Society, the United Patriots Front, the Antipodean Resistance, and National Front New Zealand.
"And this week we announced that we have strengthened our policies by banning praise, support and representation of white nationalism and separatism on Facebook and Instagram."
She said people were right to question how online platforms such as Facebook were used to circulate horrific videos of the attack.
"We are committed to reviewing what happened and have been working closely with the New Zealand Police to support their response."
Specifically, Facebook was:
• Exploring restrictions on who can go Live depending on factors such as prior Community Standard violations.
• Investing in research to build better technology to quickly identify edited versions of violent videos and images and prevent them being spread.
• Strengthening steps to remove hate speech from Facebook's platforms.
Facebook and Zuckerberg had been strongly criticised by politicians and commentators for the company's previous public silence.
Prime Minister Jacinda Ardern has called for an international response while also calling into question the motivation of social media giants to act.
She said this week that the moral rationale for self-regulation had existed before the Christchurch terror attacks, but it hadn't happened.
"So if what has been seen in [Christchurch] isn't enough ... I'm not sure what will be."
Facebook had been in touch, she said, and explained its efforts to combat the spread of video taken by the terrorist.
While the actions "in the aftermath" appeared to be genuine, Ardern said, "that doesn't mean that there aren't other questions that still need to be answered".
She has called for an international approach.
"We cannot, for instance, just simply allow some of the challenges that we face with social media to be dealt with on a case-by-case basis. There is an argument here to be made for us to take a united front on what is a global issue."
Former Prime Minister Helen Clark has observed that if the companies put as much effort into developing algorithms for preventing the spread of hate material as they put into targeted advertising, they could solve the problem.
Facebook was among several social media companies called into a meeting with the Australian Government this week, where the companies were told their executives could face jail time for failing to remove terrorist content.
"We need to prevent social media platforms being weaponised with terror content," Australian Prime Minister Scott Morrison said.
In the letter provided to the Weekend Herald, Sandberg said that in the immediate aftermath of the shootings, Facebook took down the alleged shooter's Facebook and Instagram accounts and removed the video of the attack.
It used artificial intelligence to proactively find and prevent related videos from being posted, but people re-sharing and re-editing the footage made it harder for its systems to block.
"We have identified more than 900 different videos showing portions of those horrifying 17 minutes," Sandberg said.
"People with bad intentions will always try to get around our security measures. That's why we must work to continually stay ahead."
Sandberg said changes had been made to Facebook's review process to improve its response time to videos like that of the mosque attacks.
Sandberg said Facebook was ready to work with the Royal Commission of Inquiry into the country's security agencies to further review the role that online services played in such attacks.
It was also ready to work with the New Zealand Government on future regulatory models for the online industry in areas such as content moderation, elections, privacy, and data portability.
Sandberg also noted the response by New Zealanders.
"The whole world has seen the compassion, unity, and resilience you have shown as a country through your grief."