In the 1962 dystopian classic A Clockwork Orange, the protagonist is injected with nauseating drugs and forced to observe an onslaught of violence as part of an experimental conditioning therapy designed to change his behaviour.

In the real world, behavioural conditioning isn't quite as violent. It doesn't involve being tied to a chair or having your eyes pried open while a group of political operatives observe your progress. Instead, it encroaches insidiously, often with our consent, subtly nudging us one way or the other, becoming an everyday part of our lives.

Today, there is perhaps no better example of this than social media – and particularly Facebook.

Viewed in the moment, the individual nudges – the friend recommendation here, the push notification there – seem largely innocuous. But nudge after nudge over the past 14 years has congealed into a festering sore that has individuals, countries and even Facebook founder Mark Zuckerberg deeply concerned.


Siva Vaidhyanathan, academic and author of the book Anti-Social Media: How Facebook Disconnects Us and Undermines Democracy, doesn't laugh when asked whether we're guinea pigs in an elaborate behavioural science experiment on the impact of social media on societies.

"That's exactly right," he tells the Weekend Herald from Virginia in the United States.

"We've never had anything like Facebook. It covers 2.2 billion people, which means it has the potential to influence 2.2 billion people. And short of the ocean or the atmosphere, I can't think of anything that influences that many people."

This scale stretches into New Zealand, with the social media giant boasting over 3.2 million monthly active users on Facebook and 1.5 million on Instagram. Facebook only trails Google when it comes to the most popular websites in New Zealand.

Early in his book, Vaidhyanathan notes that Facebook's entire world is addictive by design, premised on pulling users through the door and keeping them there for as long as possible.

"Facebook, like snack foods, cigarettes, and gambling machines, is designed for stickiness," he writes.

"Every acquisition Facebook has made has been in the interest of keeping more people interacting with Facebook services in different ways, to generate more data."

Expressed as a bottomless social feed, Facebook's social environment couples a conveyor belt of algorithmically curated interests with the red-headed ping of push notifications to create a continuous reward loop. And with figures from research agency Nielsen showing Kiwis have increased their time spent per month on the platform from six and a half hours in 2013 to almost 11 hours in 2017, there's little doubt Facebook achieves what it sets out to do.

Facebook this week announced the global rollout of a new dashboard tool that shows users how much time they've spent on the app and lets them set alerts for when they exceed a self-imposed limit of what they deem an appropriate amount of time on the platform.

Steps such as these are being taken in response to growing concern and research about the impact of social media on mental wellbeing. But though it is a positive effort to mitigate harm, the initiative requires buy-in from the user – and addicts aren't often great at saying no.

But Facebook is dying, isn't it?

Last month, Facebook's stock suffered the biggest single-day loss of value in Wall Street history, shedding US$120 billion (NZ$176b) from the company's market value and dropping Zuckerberg's personal wealth by a staggering US$12b.

Looking beyond ominous headlines such as "The fall of Facebook" and "The ghost of Myspace haunting Facebook", Vaidhyanathan argues that reports of the demise of Facebook have been somewhat exaggerated.

"It's not going to go away soon because of a few scandals," he says, pointing to the company's quarterly revenue figure of US$13.2b and profit of US$5.1b.

The fact that a company with those revenue figures can be punished so harshly by investors tells us more about the nature of financial markets than it does about Facebook, he argues.

Predictions of an apocalypse for Facebook have also been attributed to the exodus of younger users from the platform. There is some truth to this, but it's also worth noting that many of the users have simply shifted their social media activities to Instagram, which was snapped up by Zuckerberg in 2012 for a billion-dollar figure that looks quite reasonable in hindsight.

Vaidhyanathan says when you look at the global strength of Facebook and Instagram alongside the fact that its other big platform, the messaging and chat app WhatsApp, hasn't even been monetised yet, there's little chance of this juggernaut sinking any time soon.

A sour taste

New Zealander Elly Strang, 24, is social media's dream. She's the youngest-ever editor of the well-respected Idealog magazine, an influencer and avid Instagrammer.

But she rethought her relationship with Facebook in the aftermath of the Cambridge Analytica scandal, which saw a data firm harvest the personal data of 87 million people, including 64,000 Kiwis.

"The Cambridge Analytica scandal left a sour taste in my mouth, and I've definitely toyed with the idea of leaving Facebook as I don't see much value in it as a social media channel any more," she says.

"I feel like Facebook definitely still holds a lot of power when it comes to the attention of the older generation, like baby boomers, who are now the most prolific users on the platform.

"My parents, their friends and other older relatives are constantly on it and share their personal thoughts and beliefs on their profiles and on public forums without much contemplation on what a company like Facebook can do with this information.

"They're less savvy, whereas I think Facebook has lost touch with people aged 30 and under, who share far less on there these days.

"However, I feel as though my generation has transferred this sharing of personal information over to Instagram, so maybe it's a tactical move and Facebook is one step ahead of us, seeing as though it owns Instagram and can monitor the younger ones over on there anyway."

Young professional Elly Strang says the Cambridge Analytica scandal left her disappointed. Photo/Supplied

And, Strang adds, she's kept Facebook for its Messenger application "to stay in touch with friends and family".

Strang's last point touches on the core of why, despite constant threats of leaving the platform, most people decide to stay. As long as there's sufficient value offered to the individual, most users are willing to accept the collective risks that come with the channel.

Asked about this tendency and what it would actually take to make people leave Facebook, Vaidhyanathan draws a parallel with the automobile industry: though we know that cars on the whole are bad for the environment, the individual benefit of being able to get where you want quickly and easily makes people reluctant to opt for an alternative mode of transport.

Which is to say that getting people off Facebook is about as easy as getting Aucklanders out of their cars.

Come into my echo chamber

The influence of social media on society was laid bare during 2016's US election and Brexit referendum, with new media platforms giving form to the debates and ultimately playing an integral role in determining the winners.

What was notable about these results was that they stunned those on the losing side, but seemed completely expected to the winners. This disconnect was engineered by the users themselves. Every like, share and post was fed into the algorithm and the Facebook newsfeed served up only what the user wanted to see. Users have essentially built themselves echo chambers, where their biases are confirmed to them over and over again.

US President Donald Trump communicates directly with his base via social media. Photo/Getty Images

Massey University's James Liu, a psychology professor who studies the impact of digital media on society, says social media makes it very easy for people to find others who share their exact views, making critical thought quite challenging.

"The internet age is like all these wormholes, where you can dive down this rabbit hole and suddenly you're in this world where everyone seems to agree with you," he says.

"This is a world where you just connect with whatever it is that turns you on, be it gaming, hating on immigrants or whatever you like. And those worlds can be more brittle than the real world because people can be harsh, they can bully, they can tell you false news and they can do all sorts of things that aren't constrained by reality."

Politicians such as Donald Trump have been masterful at identifying these enclaves of supporters, then feeding information into these spaces to reinforce their views.

Liu notes that the extremely partisan echo chambers seen in the US aren't quite as pronounced in New Zealand, largely due to the MMP political system, which forces parties to talk to each other and form alliances despite their differences.

"You've got someone like Winston Peters who could swing either way. And although this used to be a negative, in this environment it's actually a positive when you look at the US or the UK," he says.

But local politicians have also been known to push the boundaries when using social media to promote their worldview.

Earlier this year, a National Party Facebook advertisement appeared among the list of decisions arising from complaints to the Advertising Standards Authority.

The ad in question was a standard political ad, referencing a quote from an article published in the Forbes business magazine.

The post read: '"It seems likely that New Zealand will experience a recession during (PM Jacinda) Ardern's term … [and] will probably lose its status as one of the most open, free economies in the world." - Forbes business magazine.'

The complainant in the case took issue with the lack of disclosure that the quote was taken from an opinion piece, which may well have been written by someone with a clear political leaning. The complainant argued that the ad created the impression that a reputable publisher endorsed the opinion, when the Forbes website clearly stipulated that the opinions of the author did not represent the views of the magazine.

The National Party responded to the complaint, saying it was a direct quote from an article and it was clearly presented as an advocacy advertisement from the National Party Facebook page.

In siding with the National Party, the ASA set a complicated precedent in that highly partisan opinion could potentially be dressed in advertising as an endorsement for a politician from a reputable publication.

The beast roams free

There are no easy answers to moderating or regulating the content published on the sprawling online environments created by the tech giants in Silicon Valley. And they quite often play by different rules, allowing the publication of content that would not be allowed to stand in mainstream media.

It's something that became apparent when rumours began swirling about Ardern's partner Clarke Gayford. Mainstream publishers did not report the rumours but the information could still be found posted anonymously across social media.

Attempting to have these posts removed can quickly turn into an elaborate digital game of whack-a-mole, with new posts appearing as soon as others have been flagged and deleted. It's a scenario that can escalate when troll armies mobilise behind a cause.

Digital media minister Clare Curran says New Zealand is now overhauling the content classification regime in order to ensure consistency across broadcasting, online and print environments.

"I've had discussions with entities like Facebook, Amazon and Netflix around their compliance with New Zealand classification standards and content regulation," Curran tells the Weekend Herald.

Digital media minister Clare Curran faces major challenges when it comes to new media platforms. Photo/File

"If we were to push those discussions further you could end up with a consistent set of classification codes across content on those platforms; the ability for people to make complaints; and for there to be some way of having those complaints heard."

In its early years, Facebook took a somewhat laissez-faire approach to content moderation, claiming it wasn't a media company and that individual posters were responsible for what they were posting. But, as the platform is increasingly used to spread misinformation and incite hate, Facebook is taking a more active role in protecting users.

Mia Garlick, the director of policy for Australia and New Zealand, tells the Weekend Herald the company is ramping up its efforts to combat harmful content.

"The safety of our community is a top priority," she says, pointing out that the company invests heavily in AI technology, processes and people to ensure quick action on violating content and accounts.

"We're also doubling our teams working on safety and security to 20,000, and will continue working with local partners, such as Netsafe and Sticks n Stones, to develop resources and education campaigns that are local, relevant and useful."

Define hate

As part of its increased push to improve the user environment, Facebook has also made a pledge to improve its removal of hate speech from the platform through its growing global team of 7500 content reviewers, spanning 50 languages.

According to Vaidhyanathan, it doesn't even come close to being enough to solve this massive global problem.

"To effectively limit the damage, Facebook would have to hire thousands of people in more than 120 languages and train them to make very difficult decisions about constantly changing subjects and terms," he says.

These decisions are difficult enough in a developed democracy such as New Zealand; Vaidhyanathan says they're nothing compared with those faced in a nation such as Myanmar, where extremists have been using Facebook to incite hate against the Rohingya people.

This is an extreme example, but there is a more subtle debate to be had about the impact that content removal might have on free speech in the digital environment.

Vaidhyanathan warns that identifying hate speech is no longer as easy as screening for a list of racial epithets and ensuring these are removed.

"Racists have become much more sophisticated in the age of the internet," he says.

"Many of the world's leading racists have developed a veneer of sophistication and adopted a pseudoscientific tone that has allowed them to avoid the accusation of vulgar racism by saying, 'I'm merely starting a debate, why are you afraid of ideas or research?'"

A good example in the local context would be controversial Canadian YouTube star Stefan Molyneux, who was due to speak last night in Auckland alongside the equally provocative Lauren Southern before the event was cancelled.

One of the arguments for which Molyneux has attracted controversy is his claim that black people are genetically predisposed to having lower IQs than other races. If Molyneux came out and simply said he believed his race was smarter than other races, he would quickly be denounced as a white supremacist. But he layers his argument with references to widely disputed psychological studies, saying it's simply what the facts say and that he's just trying to tell truths everyone else is afraid to share.

It's an approach used in Holocaust denial and tirades on the impact of immigrants moving into nations.

"There's a rhetorical flourish to many racists that makes it extremely difficult to use blunt instruments against their content," Vaidhyanathan argues. "It also undermines the idea that Facebook can develop an artificial intelligence instrument that can screen for it."

The German lead

Looking at international examples, Vaidhyanathan says there's growing evidence that regulation will play an essential role in protecting democracies from the potential harm of social media.

Mark Zuckerberg, chief executive officer and founder of Facebook. Photo / AP

"Only when they are restricted by law in other places in the world do they institute controls," Vaidhyanathan says.

"Facebook has been very effective at filtering out anti-Semitic and Holocaust denial propaganda from its German-language services."

This is partly because Germany passed strict hate speech laws, giving the government power to impose a fine of up to 50 million euros against an individual or a company if the offensive content is not removed.

The French Government is working toward the introduction of a fake news law by the end of 2018 and the Czech Republic is establishing an anti-fake-news task force.

Locally, there are moves toward legislative changes, with Justice Minister Andrew Little working on a revamp of our privacy laws. But there are already concerns that the proposed penalties won't be enough to dissuade big tech companies from doing as they please.

Asked for his opinion on what New Zealand could do to rein in the social media beast, Vaidhyanathan points to our history in fighting nuclear proliferation, saying that you don't have to be big to make a difference.

"Perhaps a country like New Zealand could lead the way in experimenting with responses to Facebook in a way a large, diffuse country, like the United States, cannot."

The only question is whether our lawmakers have the stomach to take on the challenge and introduce meaningful change the rest of the world is likely to follow.

Facebook's biggest scandals of the year… so far

Cambridge Analytica

It's the scandal that made everyone care about their data and led to a range of legal changes on the use and storage of data in the digital world. What's worrying is that Cambridge Analytica is only one of many data firms toying with our data.

Violence in Myanmar

Earlier this year, it was revealed Facebook was being used by extremists to incite violence against the Rohingya people in Myanmar. This was particularly troubling because it provided a clear link between Facebook and physical harm.

Russia meddling and fake news

Just last week, Facebook's team discovered and deleted 32 political pages and accounts linked to Russia that were aiming to interfere in the US mid-term elections. This adds yet another layer of evidence to the case that Russia is using social media to influence US politics.

WhatsApp linked to violence

In early July, five people were killed by a mob in India after rumours were spread on WhatsApp that they were child traffickers. Facebook responded to this by limiting the number of times messages could be forwarded to other phones via the service.