They pass around your private information and manipulate your attention so they can sell it. Yet still we spend hours a day on the world's biggest social media platforms - but should we? This Herald series explains exactly what's at stake when you like your friend's Facebook post, Google symptoms when you're sick and share fake news.
Is Prime Minister Jacinda Ardern planning to move the capital of New Zealand from Wellington to Auckland?
Listen to an audio clip made public today for the first time and that's exactly the impression you could get.
In a conversation with an unnamed journalist, Ardern talks about the disparity between New Zealand's two largest cities, leaning strongly toward a move north.
"You know, I love Wellington and it's a phenomenal place, it's a beautiful place," she says.
"But there's such an obvious disparity with infrastructure and access to services."
Pressed by the journalist, Ardern raises her voice and drives home her rationale.
"Business has asked this Government that if we want to boost the environment that they want to operate in, then we need to keep moving. We need to make changes."
Surprising, right? So surprising you might be left wondering how this hasn't made front-page news and prime-time TV across every media organisation in the country already.
That's because the entire clip is fake.
Let me repeat that for those who dabble in the art of skim reading: The audio clip is fake.
It's been pilfered from existing online audio content and pieced together into a narrative that didn't actually happen in the real world.
"We didn't use any special software, we didn't buy anything specific. We literally just cut up speeches she'd done before to put that together," says Justin Mowday, the chief executive of DDB, the advertising agency behind the audio.
"It took about two hours and it gave us a fright. Someone could just have an audio programme on their laptop and do that pretty quickly. It was pretty scary."
As someone who sees the power of persuasion playing out in the media every day, Mowday has become increasingly concerned about the systematic dismantling of truth in the public sphere.
His team decided to develop this clip to give New Zealanders a relatable reminder that they need to be vigilant reading, sharing and interacting with content online in the lead-up to this year's election.
Internationally, we've seen sophisticated deep-fake videos of actors occupying the likeness of Barack Obama and Donald Trump and having them say whatever they please.
While a terrifying hint at what the future holds, these examples don't quite show what an average layman, sitting in front of a laptop and chugging a few beers, can do right now. Much like the threat of 3D-printed guns, the most sophisticated deep-fake videos show only what's possible with decent tech and plenty of technical nous.
The difference with the ad agency's audio clip is that the barrier to entry is incredibly low. On a technological spectrum it falls far closer to composing a series of romance scam emails than developing a deep-fake video.
"The implications are huge," says Mowday.
"But that said, there's always been an element of fake news around, and what we do as people is learn what to trust and what not to trust pretty quickly.
"I think we're at the beginning now of learning that video or audio that we're served may not be trustworthy at face value."
You don't need to look far to see a recent example of the dangers this presents. In May last year, a video circulated on Facebook that claimed to show Democratic House Speaker Nancy Pelosi intoxicated while speaking at an event. In reality, existing footage of the event had simply been slowed down with basic editing software to make Pelosi's speech seem slurred.
By the time anyone bothered to check, this video and the misleading story attached to it had already become a smoking gun passed between friends, family and co-workers as evidence that the outspoken Trump critic simply couldn't be trusted.
The edge of truth
The thing that makes both the Ardern and the Pelosi clips so compelling is that they leave just enough digital breadcrumbs to lead you to the desired destination – however inaccurate that might be.
Dr Chris Galloway, the head of media relations at Massey University, says it's often not the technology that determines how believable something is but rather the story woven around it.
"The most believable fake news is that which has a core of truth surrounded by lies or distortion," he says.
"Something that has a kernel of truth is going to be more believable than something that's purely a tissue of lies."
One of the best examples of this form of misinformation has also proved to be one of the most harmful we've seen.
A common rhetorical trick used in the anti-vaxxer movement is to point out the harmfulness of the individual chemicals that go into a standard vaccine. While listing the side effects of each of these chemicals makes for a compelling argument, it's also disingenuous because the presence of a chemical doesn't necessarily equate to harmfulness.
"It's easy to string together a sequence of facts, each of which is true in itself, but to do it in such a way that leaves a misleading impression," says Galloway.
"I've looked at the approach to lies of the former Nazi Propaganda Minister Joseph Goebbels and one of the things he believed was that good propaganda didn't have to lie, it just had to package the truth for the man on the street."
The problem today is that the men and women on the street don't only listen to what they're being told. They're also actively spreading their ideas, post by post, tweet by tweet, across the most powerful media channels we've ever seen.
A single post on Instagram claiming that a 9-year-old bullying victim is actually an 18-year-old man can quickly escalate and cast doubt over an entire narrative that was true only a moment ago.
And politicians now go into every election knowing they won't only be fighting against political opponents, but also against narratives bouncing around social media.
This challenge came to the fore recently when Prime Minister Jacinda Ardern shot down an Instagram critic who claimed she had ruined the economy.
"Is that a reference to the low unemployment levels we've achieved, high wage growth, the decrease in debt or the solid GDP growth at 2.7 per cent?" she snapped back at the user.
The story was picked up by the media and led to National Party leader Simon Bridges releasing a statement questioning the accuracy of Ardern's statistics.
Think about that for a second. An off-the-cuff remark posted by a random individual on Instagram was powerful enough to extract a comeback from the Prime Minister, become a talking point on the homepages of the nation's leading news publishers and spark a response from the Opposition leader.
Election at risk
Before publication of this story, the Herald made the decision to pass the fabricated audio clip on to the Labour Party campaign team for their response.
The party's campaign chair, Megan Woods, wasn't at all surprised by how easily a clip of this nature could be produced.
"In a world where we've seen recent high-profile examples of misinformation, that is something we as a campaign are taking seriously," she told the Herald.
"New technology is certainly opening new avenues for misinformation such as deep-fakes or misleading audio and video, and that puts more onus on us to get our positive, factual messages out to people."
Woods has enough self-awareness to know the public isn't likely to trust everything that comes out of a politician's mouth – and this is where she sees established local media stepping in.
"As the fourth estate, they play a really important role in calling out misinformation and ensuring people are well informed," Woods says.
"That's a really vital democratic function and the sort of issues we've seen play out overseas really underscore the importance of that role."
The most prominent examples we've seen were the 2016 United States election and the Brexit referendum in Britain, both of which saw the unprecedented distribution of information – both true and false – across social media channels.
Former US President Barack Obama may have pioneered the use of data in electioneering, but these follow-on campaigns showed the dark depths this strategy could plumb.
Studies have shown that during the Trump campaign, for instance, fake news articles distributed through social media regularly outperformed those from reputable publishers.
With the assistance of powerful data firms like Cambridge Analytica, politicians could slice up audiences and isolate an exact target group within an otherwise amorphous blob of humans.
"Cambridge Analytica was involved in both the 2016 US election and Brexit, and don't think that's a coincidence," says Mowday.
"They were able to access large troves of data and serve people messages to influence them around that election. Some of those messages were facts, some of them were twisting the facts and some of them were pure fakes. You've got to say that's had an influence."
Mowday has little doubt that the same effect could be achieved in the local market and makes the point that you don't have to influence everyone in New Zealand to shift the result of an election.
He says large groups on both the right and left have already made up their minds about which way they will vote – and the more strategic politicians won't be focusing on these.
The important segment capable of pulling a vote one way or the other, he says, could be as small as a couple of hundred thousand swing voters around the centre. Influence enough of those and you could win an election.
Rules of engagement
Social media is currently the wild west of the publishing world. The rules change from platform to platform and there's little or no oversight of what is distributed.
In the face of growing criticism, Twitter chief executive Jack Dorsey announced a stop to all political advertising on his platform last year.
"We believe political message reach should be earned, not bought," Dorsey said in a tweet.
Meanwhile, Facebook has stuck to its guns, allowing political parties to continue advertising on its platform. Company founder Mark Zuckerberg has, however, banned videos altered with the use of artificial intelligence (deep fakes). But under this policy, videos like the one featuring Pelosi would still be allowed. And Facebook has drawn additional criticism for refusing to tighten up its rules governing false claims made in political advertising.
We've also seen inconsistencies emerge in the local market, with political parties adopting different approaches to their use of social media.
Labour, the Greens and Act have for instance signed up for Facebook's transparency tool, which allows users to see how much the advertiser is spending and who it's targeting. National has intimated that it's looking into this but is yet to make a public commitment.
We've also already seen a high-profile stoush between Labour and National over attack ads featuring Parliament TV footage spliced together to cast a politician in a negative light. Labour argued that National's strategy was tantamount to taking content out of context and thus did not give an accurate reflection of the matter at hand.
In response, Speaker Trevor Mallard ordered that all political advertising featuring footage used without the consent of the politicians depicted be removed, on the grounds that it breached Parliament's Standing Orders.
National was reluctant at first but eventually agreed to take the offending material down.
The Opposition also came into the firing line of the Advertising Standards Authority, which received four complaints about an ad comparing fuel prices under this Government and its predecessor. The ad presented a series of out-of-scale graphs, deliberately giving the impression of a far bigger gap between the prices being compared.
The advertising watchdog ultimately decided that while the graphic was hyperbolic, the information provided alongside was accurate, which in turn meant the ad was not censured.
Who will be the gatekeepers?
The biggest concern with the ASA is not where it draws the line on what is and isn't misleading, but rather how long it takes for a decision to be made. By the time the ASA has decided to censure an ad and order its takedown, it may well have already run its course – reaching hundreds of thousands of people in the process.
The existing watchdogs are no longer fit for purpose. They were shaped to accommodate traditional media, which moved more slowly than social media and took responsibility for what was published on its channels.
Every ad to appear on radio, in a newspaper or on TV would run through an approval process designed to ensure all content aligned with New Zealand's rules, guidelines and regulations.
The same cannot be said for the likes of Twitter and Facebook, which don't even see themselves as media channels at all. Given the sheer volume of content published on these channels on a daily basis, it's nearly impossible to keep track of everything moving around. And by the time that content has misinformed, influenced and done its damage, it's often too late to pull it down.
There are global steps being taken – such as massive fines for hate speech in Germany – but regulators are still looking for ways to keep the social media beast under control.
There's even talk in US media about developing a separate arm called a Ministry of Truth or Department of Facts to combat the onslaught of misinformation and propaganda flooding across the internet. But where do you draw the line? How do you ensure the arbiters are objective and fast enough? And when does a Ministry of Truth cross the line into censorship?
Failing to answer these questions will leave us in the awkward limbo of not being able to separate fact from fiction, mistrusting all those who disagree with us and being sceptical of everything we see and hear.
Or, perhaps, we've always been here and we're just starting to realise it now.