COMMENT:

Australia is once again ahead of the play when it comes to clamping down on terrorism.

While the Ardern Government grapples with explaining New Zealand's expectations of social media behaviour to Facebook and its peers, Australia has foreshadowed tough new criminal laws to crack down on platforms that fail to quickly remove footage of live terrorist attacks.

Australian Prime Minister Scott Morrison threatened severe penalties on social media companies if they do not voluntarily make changes to curb the spread of extremist material on their platforms.


"We need to prevent social media platforms being weaponised with terror content," he said. "If social media companies fail to demonstrate a willingness to immediately institute changes to prevent the use of their platforms, like what was filmed and shared by the perpetrators of the terrible offences in Christchurch, we will take action."

It's 10 days on from the terrorist attack where a gunman slaughtered 50 Muslims at prayer, broadcasting his murderous rampage live on Facebook.

But there has yet to be a public apology from Facebook co-founder, chairman and chief executive Mark Zuckerberg.

He has failed to even show a sense of common decency since the massacre, let alone admit some culpability for the fact it was his company's lax procedures that allowed the gunman to broadcast live 17 minutes of terror, and that his company failed to promptly remove the footage.

A company which, in my view, has become monstrous in its pursuit of profit and should be brought to heel by governments and regulators.

Zuckerberg is the man who "sets the strategy" for Facebook.

Founded in 2004, Facebook's mission is to give people the power to build community and bring the world closer together.

"People use Facebook to stay connected with friends and family, to discover what's going on in the world, and to share and express what matters to them", it trumpets on its website.


So, too, do terrorists.

It's extraordinary the Facebook board — which includes New Zealander Peter Thiel — has not required Facebook to make a public apology.

For the record, the other board members are Chief Operating Officer Sheryl Sandberg, who earlier put a call in to Ardern to offer "condolences", and independent directors Marc Andreessen, Erskine Bowles, Kenneth Chenault, Susan Desmond-Hellmann, Reed Hastings and Jeffrey Zients. Thiel has also yet to comment.

Zuckerberg said his priority in 2019 was to tackle social issues. Yet, like other social media companies, Facebook has been an enabler of them.

It has also been a platform for fake news, and it crossed the line when it shared users' data with political consulting firm Cambridge Analytica.

But while Zuckerberg is silent, Microsoft President Brad Smith has called on his tech industry peers to take action following the Christchurch massacre.

"Words alone are not enough," Smith wrote in a blog post over the weekend. "Four months ago, when our team at Microsoft first made plans for a visit to New Zealand that began yesterday, we did not expect to arrive on the heels of a violent terrorist attack that would kill innocent people, horrify a nation and shock the world.

"Like so many other people around the globe, across Microsoft we mourn the victims and our hearts go out to their families and loved ones. This includes two of the individuals killed who were part of the broader Microsoft partner community."

Smith met with Ardern and other Cabinet ministers on Monday. Ardern said their conversations were broad.

"I had a very general conversation around our perspective on what expectations we can have around social media companies essentially holding up and upholding what are their own community standards."

Microsoft, YouTube, Facebook, and Twitter created the Global Internet Forum to Counter Terrorism, a group with a shared database of terrorist content and machine learning tools to identify violent images.

It's clearly not working.

Morrison's muscular approach, where social media executives could face jail time if their platforms fail to remove terrorist content, just might.