Lana Hart on living in her own echo chamber, thanks to Google.

Early every morning, before the household wakes, I take three items to a chair in the living room: my reading glasses, my cellphone and a cuppa.

I click on the Google icon and my favourite three searching words immediately appear. I think: maybe today could be the day. Maybe something big happened overnight and it's all starting to crumble now, at long last. I skim through the headlines.

Yes, horrible … I can't believe he said THAT … Oh, and what did so-and-so say about him? Another posturing threat to a foreign power? What a dick.


There aren't many mornings when I'm not at least partially satisfied with my news. After all, Google knows what I like, and it gives me what I asked for, plus more. I didn't need to type in "impeachment" or "haters", because "Trump news today" provides all that I want.

I suppose I already knew that there are algorithms and location-finding calculations working in the background to bring me my morning news.

After all, when I search on "Driver's Licence Test" I get results for New Zealand, not Mozambique, which is handy, right? And when I ask Google to tell me what's on TV tonight, it doesn't take me directly to the religious channels, in which I've never shown any online interest. No, Google has done its research on me, so to speak, and tailors my searches to what I prefer.

Despite my placid awareness that my search engine is starting to think like me, if I consider this too deeply I start to squirm a little in my seat, thanks to Eli Pariser.

The filter bubble

Pariser coined the term "filter bubble" in a 2011 book arguing that the internet is not an impartial tool delivering information to us objectively: the order of its suggestions (the key determinant of what we click on) is shaped by "signals" such as our search history, how long we spent on sites, when and where we are searching, and even what type of computer we are using. With 57 signals determining which Google links appear first, the same search can return different results as these and other factors change.

In The Filter Bubble: What the Internet is Hiding From You, Pariser argues that the personalisation of our information moves us quickly "to a world where the internet is showing us what it thinks we want to see, but not necessarily what we need to see".

The effects of this categorisation and prioritisation of information lead us to believe that most people think as we do (the "majority illusion") and dull our critical thinking. With less contact with contradictory perspectives, we tend to become intellectually and politically lazy, adopt group-think and ruminate on the same views.

Pariser argues that this personalised information cultivates "a kind of invisible propaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar ... [with] less room for the chance encounters that bring insight and learning".

Being Wrong

The tendency to favour and use information that confirms our existing beliefs - confirmation bias - is well understood in research of all kinds. Avoiding it is a continual challenge: researchers must guard against reaching conclusions based solely on what they already believe while ignoring data that contradicts it.

Carl Davidson, chief social scientist at Canterbury research agency Research First, says "a trap for learning is not to fool ourselves – if you want to be able to learn and improve, you have to entertain the possibility of being wrong. But we are wired to latch on to ideas and confirm what we already think we know. Most of us don't entertain the possibility that these ideas could be wrong."

Confirmation bias, Davidson explains, is associated with another kind of thinking faux pas called the Fundamental Attribution Error. "If I make a mistake, or someone I like makes a mistake or tells a lie, I attribute it to their state of mind or their condition at the time. Maybe they were tired, or busy, or not well briefed about the matter. But if someone who I disagree with makes the same mistake, I blame it on their attributes as a person: they are liars or stupid, for example.

"So we explain our own failings in terms of states, but others who we disagree with in terms of traits. This is really important because it's how we interpret other people's behaviours. If empathy is missing, we perceive things very differently."

Tricking algorithms

I think about my online searches for yet another Trumpian lie, or discombobulation of words, or muscle-flexing false promise. Quite right … I do think the guy's very traits mean that he's a knob, and not that he's just having a bad day.

Markus Luczak-Roesch, senior lecturer at the School of Information Management at Victoria University, challenges me to widen my collection of Trump-related news by providing me with different tools for accessing information about him. "DuckDuckGo," he suggests, "is a search engine that does not personalise what it feeds you. It doesn't reduce the amount of stuff to what you're interested in, but draws from a wider range of sources."

I ask: so what if I want to get outside my personalised feeds and pretend that I am, say, a Trump supporter? How can I get different information which can help me see things from a new point of view? How long would it take Google to "unlearn" what I like?

Luczak-Roesch admits he can refer only to anecdotes rather than any research, but has heard that some people have succeeded in tricking the algorithms after only a couple of days. "If you show a pattern of regular browsing through particular sites, deliberately hanging out on them, you may be able to confuse the coding."

"But remember," he warns, "these sites are getting very smart at detecting who is a genuine visitor to a site and who isn't. Their ultimate goal is to get ads on those sites to make you consume things. If it harms the advertising business model – if you are less likely to buy the products that are advertised on those sites – then they don't want you there."

Advertisers and developers, Luczak-Roesch says, have an interest in reducing "click fraud" – reducing the number of non-genuine visitors to a site.

The dark side of my moon

So I try to disrobe my cloak of bias and have a look on the darker side of my little moon. I DuckDuckGo'd (is that a word?) my favourite three searching words and found a lot of news that I had already seen that day. But there was more. For the first time, Fox News appeared, that presidential mouthpiece and noisy purveyor of conservative American news.

I read two articles. I watched three news videos. When that voice in my head disagreed with what they had said, I told it to shut up and keep listening. I learned that US conservatives are surprised, after nearly two years with Trump in office, that the left-leaning media still haven't learned how Trump communicates, and that although his style may be unusual, it is fresh and real, and it won't change. Good point.

I heard how the most followed, quoted, and "popular" US President in history is merely venting in his tweets his legitimate frustrations at the Mueller inquiry into possible links between Russia and the Trump campaign, and that it's fair enough, given he is making profound changes across many American fronts even as the inquiry is sidetracking him from this agenda. Idiosyncratic nuances though he has, Trump is forcing much-needed transformations that have long been neglected.

I was surprised to learn that not everyone at Fox News is necessarily on Team Trump and that some journalists there are as concerned about Trump's recent attacks on the media as those in the echo chamber on the left.

Popping our filter bubbles

Luczak-Roesch suggests following different news-sharing communities that "have polarity". Wikipedia, he says, although not a perfect platform, "is a way to feel the pulse of topics and what makes its way into the discussion".

Other forums, such as Wikitribune and theconversation.com, offer investigative journalism from sometimes surprising perspectives and place more of an emphasis on fact-checking.

Ultimately, he argues, it is the responsibility of the individual to overcome the biases presented in our news and to develop the skills for critically assessing information.

"We have to have people in our workforce," Luczak-Roesch says, "who can think critically. There is so much debate these days around climate change or cancer treatment or whatever, and we need scientifically literate people to help determine what works" in order to solve the problems of modern life.

Social scientist Davidson agrees: we need to be more careful in choosing our news and how we access it. "Not so long ago, humans had sweet and fatty foods only rarely, so our brains rewarded us with dopamine and this made us feel good. Now, there are sweet and fatty foods everywhere and we could eat these all day, releasing more dopamine, if we wanted. But in a world of sweets, we need to make better choices about lifestyle. It's the same with how you get news and how you shape your own views. Make good choices."

Tomorrow morning: cuppa, and a nice browse through Breitbart News to start the day.