Pressure is mounting on the world's social media giants following the Christchurch mosque shootings.
Countries already grappling with how to rein in the behaviour of platforms such as Facebook and Twitter and force them to act responsibly have redoubled their efforts in recent days, New Zealand among them.
There is also widespread agreement that nations must act together to force a change in how social media deals with harmful digital content.
Prime Minister Jacinda Ardern, senior ministers and the Privacy Commissioner have all been highly critical of social media companies, in particular Facebook, following the March 15 shootings, which were livestreamed.
Sharing the video has effectively been banned by Chief Censor David Shanks and doing so now carries a fine of up to $10,000 or up to 14 years' jail.
Privacy Commissioner John Edwards has also called on Facebook to give the names of people who distributed the video to police.
Ardern has criticised the response by Facebook and other social media companies, saying there were still questions to be answered over their actions in the aftermath of the massacre.
Cabinet is expected to discuss whether social media platforms can be forced to comply, but there is widespread acknowledgement that a cross-border approach is needed.
Minister Responsible for the SIS and GCSB Andrew Little has said the use of social media to spread hate speech and incite violence would likely be discussed by the Five Eyes intelligence-sharing network when its member nations meet in the UK later this year.
In Australia, Prime Minister Scott Morrison and Opposition Leader Bill Shorten are due to meet social media executives on Tuesday to discuss violent offences being broadcast by perpetrators on sites such as Facebook and YouTube following the mosque shootings, SBS reported.
"It's no good for the people who run the swamp...to then not take responsibility for what crawls out," Shorten told reporters yesterday.
Under sweeping changes proposed for Australia's Privacy Act, online platforms that seriously or repeatedly breached privacy laws would face fines of up to A$10 million, up from the current maximum penalty of A$2.1 million, according to SBS.
Morrison has also called for a crackdown by G20 nations on social media giants following the attacks and has reportedly written to Japanese Prime Minister Shinzo Abe asking for it to be a top agenda item when the G20 meets next in Japan.
Attorney-General David Parker has a keen interest in social media liability. He raised it at a meeting with his counterparts from the Five Eyes countries of the US, UK, Canada and Australia in August last year.
He declined to be interviewed on the issue today.
In a speech at the recent swearing-in of Justice Helen Winkelmann as Chief Justice, delivered before the mosque shootings, Parker predicted challenges for her in the area of social media.
"Where does the limit lie between freedom of speech and harmful fake news or hateful propaganda?" he asked.
"What duties are owed by those who profit from social media platforms to society, private citizens, or to the public institutions which democracy relies upon?
"These are very important and complex issues for our time and the wisdom of the courts you lead will help achieve wise outcomes."
And speaking to reporters following an assault on Green Party co-leader James Shaw just a day before the mosque shootings, Parker criticised the role of social media in spreading extreme opinions that fed political biases.
He said the British Digital, Culture, Media and Sport select committee, which looked into the issues of disinformation and fake news, was right to call out the kind of extreme opinion shared via social media which fuelled biases and caused instability among some people.
"We need to reflect upon what's going on in society that causes people to be so extreme in their reactions to things they disagree with," he said.
In its final report the committee, chaired by Tory MP Damian Collins, called for:
• A compulsory code of ethics for tech companies overseen by an independent regulator
• The regulator to be funded by a levy on tech companies operating in the UK
• Powers for the regulator to launch legal action against companies breaching the code of ethics
• Social media companies obliged to take down known sources of harmful content, including proven sources of disinformation
• Hefty fines for companies that fail in their obligations on harmful or illegal content
In releasing the report last month, Collins said: "Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised 'dark adverts' from unidentifiable sources, delivered through the major social media platforms we use every day."
He said the big tech companies were failing in the duty of care they owed to their users to act against harmful content.
"These are issues that the major tech companies are well aware of, yet continually fail to address. The guiding principle of the 'move fast and break things' culture often seems to be that it is better to apologise than ask permission," Collins said.
Little, who is also in charge of the Harmful Digital Communications Act, says there is a willingness in Government to look more closely at what is wrong with social media and what powers could be used to rein in the worst aspects of it.
He agreed social media was becoming more toxic.
"It's kind of got to the point where it's almost a parody of itself. You go on Twitter to bathe in the toxicity and putrescence of it. You don't take it seriously.
"On the other hand, I've seen stuff on Facebook which causes my toes to curl and wonder about the mentality of some people."
The Harmful Digital Communications Act was about "clear expressions of hatred and threats expressed towards individuals through the internet, social media platforms and the like".
"One of the challenges the law always has been to draw that line between upholding and protecting free speech but on the other hand clearly dealing with expressions that are calculated to cause harm or incite others to cause harm.
"The law is always trying to find that balance. I believe that legislation that we've got in place at the moment does that. If there's any call for review or change, let's have a look at that."