Facebook is regretful but not apologetic over its failure to block the online footage of the March 15 terrorist attack, which is still available on the platform today.


The social media giant's chief operating officer, Sheryl Sandberg, standing alongside Prime Minister Jacinda Ardern in New York today, paid tribute to Ardern and pledged that Facebook would do all it could to fight terrorist and violent extremist content online.

Asked if Facebook should apologise because the March 15 video footage can still be found today on its social media platform, potentially retraumatising victims' families, Sandberg did not answer directly.


"Anytime anyone sees anything [of that footage] on Facebook, that is something we deeply regret. We have tried as hard as possible to get things down. We will continue to look for every instance to get it down.

"We can't wait until a moment like this happens again. We have to do the hard work now to establish the systems, and the protocols, the cooperation, and that's what today is about."

Earlier today Ardern announced the next stage of the Christchurch Call, including making the Global Internet Forum to Counter Terrorism an independent body to oversee a number of work programmes.

These include prevention work such as researching how algorithms affect social media users and intervening if they are being pushed towards radicalised or hateful content, and overseeing a crisis-response framework to stop terrorist content from going viral.

Facebook was heavily criticised for failing to immediately stop the livestreamed footage on March 15, and for failing to prevent its viral spread; it was uploaded 1.5 million times within the first 24 hours, and Facebook's AI automatically blocked 1.2 million of those.

"That gap between the 1.2 million and the 1.5 million is where we acknowledge we need to do better," Sandberg said.

"It's not just about preventing the last thing that happened like this, but the future things, even if we don't yet know what they are."

Sandberg said that there were many good uses for social media platforms.


"I've been working in tech for almost 20 years and I've seen incredible examples of the good technology can do, in a crisis like a tsunami, in moments of need when someone faces illness, or even in the small everyday moments like a birthday wish.

"The people who seek to do good on our platforms are many, but the people who seek to do harm are there as well."

Ardern noted the work that Facebook had done since March 15, including tighter rules on livestreaming, and redirecting users who search for white supremacy-related material towards anti-hate groups.

"No one, no government, no social media or tech company wishes to see these platforms used to cause harm, and yet the alleged terrorist in New Zealand obviously sought out those platforms for that very purpose," Ardern said.

"What happened on March 15 was horrific and should never happen again."

Prime Minister Jacinda Ardern with Facebook chief operating officer Sheryl Sandberg in New York. Photo / Supplied

The Christchurch Call is voluntary, and Ardern said regulating tech giants was difficult because the most effective action to reduce harm was to prevent such content from being uploaded in the first place.

"It's nigh on impossible to regulate in those areas. How do you regulate to force tech companies to remove things immediately, or even to direct individuals into different content to avoid them coming across or engaging with violent extremist content or even hateful content?

"To make the greatest gains, we actually have to collaborate ... so far engagement has been excellent."

New independent body to prevent terrorist and violent extremist content

Earlier today, social media giants agreed to join forces to research how their business models can lead to radicalisation, and how best to counter and disrupt online extremism.

The work will be driven by the Global Internet Forum to Counter Terrorism (GIFCT) - set up by Facebook, Twitter, Google (YouTube) and Microsoft in 2017 - which will become an independent body tasked with preventing and responding to terrorist and violent extremism online.

It will have dedicated teams focused on a set of goals, including looking at the companies' algorithms that potentially lead to radicalisation.

This is a step that tech companies have previously resisted because of the commercial sensitivity of their algorithms, but it has been a particular focus for Ardern, who has said that "every time you're on social media you could end up in a rabbit hole".

The work will form part of the GIFCT's prevention strategy, as will the use of counter narratives to steer users away from online rabbit holes that lead to extreme content, a step Facebook has already taken for users who search for anything related to white supremacy.

The GIFCT will also publish a toolkit, developed with the Institute for Strategic Dialogue, for building online campaigns that challenge extremist ideologies.

Countering online extremism was a particular request from New Zealand officials, but the move is likely to exacerbate concerns over freedom of expression and internet censorship.

The GIFCT will also be the focal point for a crisis-response framework to stop the viral spread of terrorist content through a coordinated response, as previewed in the Weekend Herald.

"In a sense, we are trying to create a civil defence style response mechanism – and in the same way we respond to natural emergencies like fires and floods, we need to be prepared," Ardern told a Christchurch Call leaders' dialogue in New York today.

"I don't want any other country to be placed in the situation New Zealand was in the minutes, hours and days after the attack in Christchurch, when we were left scrambling to respond to and remove livestreamed hate.

"I am pleased to say today that this crisis response protocol is ready to deploy."

The protocol includes a shared list of country and company contacts to ensure a swift response.

It sets out specific actions, including the sharing of hashes (digital fingerprints used to identify content so it can be removed), URLs, and keywords, as well as takedown measures.
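To illustrate how hash-sharing works in principle: a platform that flags a piece of content publishes its cryptographic fingerprint, and other platforms can then match identical re-uploads against the shared list without exchanging the content itself. The sketch below uses SHA-256 from Python's standard library; the function names and the use of exact (rather than perceptual) hashing are illustrative assumptions, not a description of the GIFCT's actual systems, which in practice also need perceptual hashing to catch edited or re-encoded copies.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 hex digest serving as the content's 'digital fingerprint'."""
    return hashlib.sha256(data).hexdigest()

def should_block(data: bytes, shared_hash_list: set) -> bool:
    """Block an upload if its fingerprint appears on the shared industry list."""
    return fingerprint(data) in shared_hash_list

# One platform flags a piece of violating content and shares only its hash...
known_bad = b"<bytes of a flagged video>"
shared_list = {fingerprint(known_bad)}

# ...and any participating platform can then match exact re-uploads.
assert should_block(known_bad, shared_list)            # identical copy: blocked
assert not should_block(b"<unrelated upload>", shared_list)  # no match: allowed
```

Note that an exact hash changes completely if even one byte of the file differs, which is one reason so many re-uploads of the March 15 footage evaded automated matching.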

One of the issues following the March 15 attack is that tech companies did not have joint protocols to properly tag content, or established people to contact to coordinate a response.

It is believed that the new framework would have made a significant difference in stopping the viral spread of the March 15 footage, which was shared on Facebook 1.5 million times in the first 24 hours.

Google will host a demonstration and testing exercise in New Zealand in December this year.

The Christchurch Call was signed at a Paris summit in May by 17 countries, the European Commission and eight online platforms. Since then, 31 more countries, including Mexico, Sri Lanka and Chile, have added their voices, along with two more companies, the Council of Europe and UNESCO.

The US, however, is still holding out from signing, though it has been involved in the ongoing work.