In today's "attention economy", it often seems that people decide what they believe not through any rational assessment of the facts, but according to the belief system of their particular political or ideological camp.
For example, long before the jury finally arrived last week at a not-guilty verdict in the trial of Kyle Rittenhouse, the American teenager charged with homicide over two killings during civil unrest last year, many seemed to have already decided what the outcome should be. To the left, Rittenhouse was a murderous white supremacist. To the right, he was a hero who was innocently and righteously defending America against mob violence and destruction. Both sides seemed to view the beliefs of the other as unconscionable. Such polarisation and vilification of those with whom we disagree can be seen in the discussion of other contentious topics such as Brexit or abortion.
It is by now pretty well established that the internet — and in particular social media algorithms that feed us more of the kind of content we have already looked at or engaged with — has made our societies more divided and resulted in the belief systems of various tribes becoming increasingly entrenched. But why is it that we are so quick to accept the material we come across in our particular filter bubbles? What is making us all so unwilling, perhaps even unable, to understand the "other side"?
An essay by Daniel Gilbert, a Harvard psychologist, may offer some insights. In a 1991 paper entitled "How mental systems believe", Gilbert argues that in order to comprehend an idea, our brains must initially accept it as true, even if only momentarily. It therefore takes more effort to reject an idea than to accept it: acceptance happens automatically, whereas to disavow an idea we must go through a process of what Gilbert calls "unbelieving" or "unaccepting".