What's the best way to persuade Facebook to behave more responsibly?
Stay at the table and try to talk sense - like the NZ Super Fund with its large and growing shareholding in the social network - or walk out in a huff, as various big-money advertisers have over the past few days?
In March last year, following the Christchurch mosque massacres, the Super Fund joined other institutional investors around the world in a coalition calling on Big Tech to tighten up its act.
"We will be calling on Facebook, Google and Twitter to take more responsibility for what is published on their platforms. They must take action to prevent this sort of material being uploaded and shared on social media. An urgent remedy to this problem is required."
Since then, Facebook and Twitter have hung tough on livestreaming (Google disabled it for most mobile YouTube users).
And in the build-up to the Christchurch anniversary, victims' families said they were still routinely harassed on Facebook.
Throughout, our national fund has maintained substantial stakes in the Big Tech companies it's been campaigning against.
The NZ Super Fund's June 2018 equity disclosure statement said it owned $211.5 million worth of Facebook shares, shares in Google's corporate parent Alphabet worth $317.3m, and $13m worth of scrip in Twitter.
A December 2019 update had the Super Fund's Facebook holding worth $241.6m, its Alphabet holding worth $400.1m and Twitter $12.1m.
I wonder what sort of message that sends Facebook et al?
In March this year, NZ Super Fund CEO Matt Whineray conceded that not enough progress had been made.
"We have made our voice known to these companies since the attack in Christchurch," he said.
"While some positive changes have been made, more needs to be done at the executive and board level to build accountability and ensure these platforms cannot be used to spread objectionable material."
The coalition of funds wanted:
• clear lines of governance and accountability for senior executives and Board members to ensure your platforms cannot be used to promote objectionable content like the livestreaming and dissemination of the Christchurch shootings; and
• sufficient resources being dedicated to combating the livestreaming and spread of objectionable material across your platforms.
The group said in a second open letter, sent on the anniversary of the attack, that it was "dissatisfied with the response from your senior executives and boards".
"The failure to respond to these actions creates a significant business risk, beyond the harm caused to the global community. You have a duty to address that."
Since then, there has been renewed controversy about inflammatory content on Facebook in the wake of mass protests calling for police reform in the US - not least posts by US President Donald Trump.
Today, a spokesman for the Super Fund noted it had voted in favour of resolutions for an independent chair and for a human/civil rights expert to be added to the board at Facebook's annual meeting on May 27.
The fund also backed a resolution calling for board-level oversight regarding human rights issues impacting Facebook's community of global users.
"We also withheld votes for directors who are on the Audit and Risk Oversight Committee and who have been on the Facebook board for more than 12 months as it is this committee that has responsibility for how the platform may be used to facilitate harm or undermine the public interest," the Super Fund spokesman said.
All of the resolutions were knocked back. Trump's sometime dining companion, Facebook CEO Mark Zuckerberg, remains chairman of the company he founded. The key issue that blocks any major governance change, year after year, is that Zuckerberg owns a minority of the company but controls 57.7 per cent of voting rights, thanks to his large block of "Class B" shares that carry 10 votes per share.
The Super Fund is well aware of what a spokesman called Zuckerberg's "outsized voting rights", but those rights - and their ability to doom any attempt at reform through AGM resolutions to quixotic failure - are not highlighted in its PR about its social media campaigning.
A spokesman for Facebook said earlier that the social network had made a substantial effort to increase its filtering, both by AI and people. And he reiterated that as one of the Christchurch Call summit outcomes, it would collaborate with a number of top US universities on a US$7.5m (NZ$11.8m) project to research systems that could better detect harmful content.
We're still waiting to see progress on that front.
Getting Zuck's attention
Elsewhere, we've seen Facebook can be fast on its feet.
Late last week, Unilever and other big multinationals pulled advertising from the social network (and its peers) over concerns it was not doing enough to combat hate speech.
The moves saw Facebook shares crash 8.3 per cent on Friday, wiping US$56 billion from the company's market value and knocking US$7.2b off Zuckerberg's personal net worth.
The same day, the social network said it would start labelling political speech that violates its rules and take other measures to prevent voter suppression and protect minorities from abuse.
Facebook will have to go further. Over the weekend, Coca-Cola, Starbucks, Levi's and Diageo followed Unilever out the door. But the social network is now making concrete moves in the right direction.
The multinational advertiser boycotts are only for 30 days. But they all involve big money - a language Facebook understands. Starbucks spent US$94.86m on the social network. And the boycott is spreading.
So expect more Facebook action on hate content - and in days, not weeks.
And the change will come from Facebook itself, not the White House.
Despite the bluster of his recent, rambling executive order threatening to rein in Facebook and Twitter, Trump needs the social networks much more than they need him.