Read the previous Problem Solving essay:
“If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. Many have written about this bias, and it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations.”
-Raymond S. Nickerson, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”
And why beholdest thou the mote that is in thy brother's eye, but considerest not the beam that is in thine own eye?
Or how wilt thou say to thy brother, Let me pull out the mote out of thine eye; and, behold, a beam is in thine own eye?
Thou hypocrite, first cast out the beam out of thine own eye; and then shalt thou see clearly to cast out the mote out of thy brother's eye.
-Matthew 7:3-5 (King James Version)
As far as people on this planet concerned
They... they know far more less than they do what they know
I mean that they... there's so much that they don't know
Until what they do know is not even important
So it comes down to that point, well, they don't really know anything
Uh, it's the same thing as it ever was
Nothing has changed yet
-Kunzite, “Novas” (Bandcamp link)
Here is a pretty solid model for solving problems, achieving your goals, or generally being successful in life:
1) Develop a model of the world - a set of linked beliefs about the world or just your problem domain's categories, structure, and relationships
2) Use the model to develop a policy for how to achieve your goal, and act on it in the world
3) Use feedback from your interaction with the world to update the world model
4) Use the updated world model to update your policy and actions towards it
5) Repeat until you have achieved your goal
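The five steps above can be sketched as a loop. This is a minimal toy in Python - all the names are my own invention, not the essay's, and the 'world' here is just a number-guessing game - but it shows the shape of model, policy, action, and feedback:

```python
def pursue_goal(world, plan, update, model, goal_reached, max_steps=100):
    """Steps 1-5: hold a model, act on it, fold feedback back into it."""
    for _ in range(max_steps):
        action = plan(model)              # step 2: policy from the world model
        feedback = world(action)          # act in the world, observe the result
        if goal_reached(feedback):        # step 5: stop once the goal is met
            return action
        model = update(model, feedback)   # step 3: honest model updating
    return None

# Toy world: a hidden number; the model is the interval it must lie in.
hidden = 37
world = lambda guess: (guess, "low" if guess < hidden
                       else "high" if guess > hidden else "hit")
plan = lambda m: (m[0] + m[1]) // 2       # policy: bisect the interval
def update(m, fb):                        # shrink the interval toward the truth
    guess, signal = fb
    lo, hi = m
    return (guess + 1, hi) if signal == "low" else (lo, guess - 1)
goal_reached = lambda fb: fb[1] == "hit"

print(pursue_goal(world, plan, update, (0, 100), goal_reached))  # prints 37
```

The loop only converges because `update` takes the feedback at face value; step 3 failing is exactly what the rest of this essay is about.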
Stated so baldly, it’s hard to believe that anyone in pursuit of a goal ever does anything but these steps. Yet, in human beings, things often go critically wrong at step 3 – the world model updating step.
For any belief of consequence there is confirming evidence and disconfirming evidence.1 That is, there is evidence that counts for the truth of the belief, and evidence that counts against it. Carpet fibres under the fingernails of a corpse that match the color and type of the carpet in the suspect's car trunk are evidence for the belief that the body was, at some point, in that trunk. Radiocarbon dating of geological sediments and fossils is strong evidence against the belief that the Earth is less than 10,000 years old. Nickelback CDs in an acquaintance's apartment are evidence that they have terrible musical taste (and that you have been transported back to the early 2000s, before ubiquitous MP3-playing electronic devices).
Presuming, for the moment, that one really does want to accomplish one’s explicit goals (we’ll come back to this point at the end of the essay…), both types of evidence should be weighted equally, without a bias for or against. You shouldn’t favor evidence for a strongly held belief more than you favor evidence against something you disbelieve strongly - the gun control study is not faked, or manipulated, or misinterpreted, or irrelevant just because it doesn’t serve your preconceptions. It may indeed be any or all of those things, but we can’t dismiss alleged evidence based on our personal tastes in ideologies and belief systems.
When you start biasing in favor of or against certain types of evidence, your world model becomes divorced from the world – your map will diverge from the territory – and your actions will not accomplish your goals except by chance or overdetermination. But people do it all the time - you and everyone you know did it multiple times this month, if not multiple times today. For an allegedly rational, goal-seeking species, this is a seriously strange phenomenon.
This ubiquitous phenomenon is confirmation bias. Indeed, if you want a thorough rundown on confirmation bias, rather than just this Substack writer’s summary, you should read Nickerson’s 1998 paper, quoted in the header of this essay (link here, or you can find it pretty easily with Google Scholar and Sci-Hub). Nickerson reviews a multitude of related psychological facts, all of them plentifully confirmed by independent, replicated studies: the overweighting of evidence that confirms a preexisting or strongly held hypothesis, and the underweighting of evidence that opposes it.
Naturally this sounds bad – and it often is. But I come not to condemn confirmation bias but to praise it. Not wholeheartedly, but with two cheers out of three.
Confirmation Bias Keeps You Happy
In his wonderful 1877 essay, “The Fixation of Belief,” my man Charles Sanders Peirce anticipates Plantinga’s Problem: why should we trust the beliefs of minds shaped by evolutionary pressures to be accurate?
“Logicality in regard to practical matters is the most useful quality an animal can possess, and might, therefore, result from the action of natural selection; but outside of these it is probably of more advantage to the animal to have his mind filled with pleasing and encouraging visions, independently of their truth; and thus, upon unpractical subjects, natural selection might occasion a fallacious tendency of thought.” [emphasis mine]
Which raises an important question: of the countless beliefs you have, both explicit and implicit, how many of those are actually keeping you alive day to day and year to year? Which beliefs are critical, and which are ornamental? Or, more precisely, which of your beliefs work to keep you happy and safe directly, like you would expect them to (cyanide = not edible, extended warranty = ripoff, stuff you drop falls down = true …), and which are doing so indirectly?
The essay where Peirce raises this interesting thought is concerned with how people ‘fix’ their beliefs. The point of a belief, Peirce reasons, is action: what it leads people to do or not to do in the real world. Because of this, genuine psychological doubt (not the kind Descartes feigns in the Meditations or, more damnably, that doubt feigned by vaccine ‘skeptics’) is deeply uncomfortable. When we don’t know what to believe, we don’t know how to act, and our minds become consumed with reducing the doubt to, if not full hearted belief, at least tepid, “good enough for government work” acceptance. Belief formation is, for Peirce, akin to itch scratching or stretching a cramped muscle – something to be gotten out of the way.
This provides a good first pass at explaining the ubiquity of confirmation bias – genuine doubt is uncomfortable, so in the same way our nervous system motivates us to action with present discomfort and the thought of relief, our minds motivate us to overweight confirming beliefs. This assuages doubt and makes action possible. And enabling action is very important. Everyone reading this has, at times, been in aching doubt between alternatives. Confirmation bias, or even trust in a source of entropy like thrown dice or a random number generator, would get them out of their stasis and doing something. While there is wisdom in Herbert Hoover’s apocryphal quote “don’t just do something, man – stand there!” often action, any action, is preferable to immobility and stasis, if only in one’s own mind and heart.
Though we can model beliefs as probability estimates, from P(x) = 0 for absolute, immovable disbelief to P(x) = 1 for unshakable faith, beliefs are felt emotionally, not measured objectively. As Pascal observes, disturbingly, even the unshakable conviction that 1+1 = 2 is at root a feeling, an emotional movement of the soul. Evidence is not perceived as numerically weighted, but felt. Given how bad it feels to be in true doubt and uncertainty about how to act in the world, it is no wonder that the nervous system/soul defaults to weighting confirming evidence.
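The beliefs-as-probabilities picture can be made concrete with a toy Bayesian update. The likelihood ratios and the discount weight below are invented for illustration; the point is only how asymmetric weighting of evidence moves the posterior away from where honest updating would leave it:

```python
def update(prior, likelihood_ratio, weight=1.0):
    """Bayes' rule in odds form: posterior odds = prior odds * LR.
    `weight` < 1 discounts the evidence; an unbiased reasoner uses 1.0."""
    odds = prior / (1 - prior)
    odds *= likelihood_ratio ** weight
    return odds / (1 + odds)

# Perfectly mixed evidence: one piece favoring the belief (LR = 4)
# and one piece against it (LR = 1/4).
p = 0.5
for lr in (4, 1/4):
    p = update(p, lr)
print(round(p, 3))  # unbiased: the evidence cancels, back to 0.5

# A confirmation-biased reasoner absorbs confirming evidence at full
# strength but heavily discounts disconfirming evidence.
p = 0.5
p = update(p, 4, weight=1.0)     # confirming: fully absorbed
p = update(p, 1/4, weight=0.25)  # disconfirming: largely dismissed
print(round(p, 3))  # drifts well above 0.5 despite balanced evidence
```

Run this a few hundred pieces of evidence at a time and the biased reasoner's P(x) crawls toward 1 on evidence that, fairly weighted, says nothing at all.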
This seems surprising when the beliefs are those that make us unhappy – the depressed often have markedly unrealistic estimations of themselves and the world – but it makes sense. If doubt inhibits action, then even painful beliefs will be reinforced because they lead to decisive action (or inaction, as the case may be).
Going back to the issue of beliefs, I would argue that a vast number of our explicit beliefs have no practical consequence or effect whatsoever. At least no direct effect – but more on that in the next section. You can be perfectly successful in life – finish school, work a career, make lifelong friends, find love and raise a family – while believing many strange or downright wrong things. Beliefs in the efficacy of crystals for ‘psychic’ healing, of Tarot cards as divinatory instruments, of the coexistence of dinosaurs and early humans, none of these things will impact your success in life. At least nowhere near so much as a belief in the excellence of cyanide as a pizza topping or the friendliness and playfulness of wild cougars.
So per Peirce’s observation, these beliefs are free – they can have any value without affecting survival chances. What then determines how we weight them? And why do so many people weight them the same way?
Confirmation Bias Keeps You Safe
There is, in human beings, despite all our foibles and biases, a sense of reality. We can’t deny that the cat is on the mat when the orange tabby cat is comfortably curled up on the mat in our field of vision, perhaps beckoning us to play with it. We know for a fact which way is up, relative to the gravity well we live near the bottom of. This sense of reality keeps our practical beliefs within bounds, especially those that have regular enough feedback for constant belief updating – Nickerson notes that weather forecasters, among all professions studied, have the closest correlation between their stated certainty and the occurrence of the weather they are forecasting. The weather forecasters are accurate in forming beliefs because they find out every single day whether they were correct or incorrect. For beliefs that have longer term consequences – the harmfulness of smoking, say, or the viability of an investment – our confidence can wander away from the objectively considered evidence.
If your object of belief is free – of no real consequence in any immediate or near-term way – then your beliefs can have any value, but in practice we find they don’t vary much within societies. This is because of the social proof instinct, the tendency to do and believe what everyone around you is doing, with ever greater confidence as more and more people do it. The greatest enemy of a conviction, after all, is not any physical evidence, but the existence of people who think and feel differently about the issue. Peirce describes this as the main block to the first way of eliminating doubt - the Method of Tenacity. Something in human nature prevents us from following a belief that no one else holds.
This is, in part, an explanation of all forms of evangelism - whether religious or not. The neighbour of yours who, a few years ago, was going on and on about how ‘good’ they felt when they dropped gluten from their diet, despite no prior evidence of celiac disease, and posting to social media about how gluten is terrible for ‘the gut,’ may in fact be feeling better - losing weight by reducing calories, a common upshot of gluten-free diets, has that effect on people - but their better health is not enough. They need others to feel and believe as they do. It is as if the belief, on its own, is not enough - they want companionship.
This, as an aside, is the conclusion I have come to after meeting Flat Earthers and people who subscribe to conspiracy theories. While a belief almost no one holds would seem to disconfirm social proof, a tenacious belief held adamantly by a small group is a more intense kind of belonging than, say, believing that liberal democracy is overall a pretty good idea. Almost everyone in our society believes that, but you’re going to get more fellowship and ‘hail brother, well met’ believing in fascism or communism.
We tend only to think of social proof and the ‘herd instinct’ when bad or foolish things happen – diet or health fads, investment bubbles, mass suicides – but it is vital to social cohesion. Outside of the practical, groups make their identities and shape their members by their opinions on matters of free belief. Take any number of the dozens of topics and controversies in the day to day news, magazines, and popular websites and forums: the practical outcome, for almost everyone debating the issue, is nil. Belonging is what is at stake, and demonstrating you belong - to yourself if no one else - by proclaiming your beliefs. This is why people join in marches, sign petitions, share #hashtags on their social media. Belonging matters.
Aristotle observed, millennia ago, that not only is man a “political animal” - meaning that unlike other social animals we deliberate with one another about what to do and why - but that a man who lived without a society was either “a god or a beast.” To be safe, to be taken care of, to have people look out for you, others need to believe that you are, in some ways, on the same wavelength as them. Associating around beliefs that are, in themselves, of no practical import is a way to signal this at relatively low cost to all involved, save foregoing the companionship of those who think differently. Confirmation bias reinforces this social proof instinct - we are inclined to believe a certain way because of our group, and we do in fact believe that way: we see the ‘proof’ all around us.2
If you want to see the effects of disconnecting from your society, take a look at your local drug-addicted, homeless population. But even there, you will find commonalities of belief and conduct: even the most outsider communities share prevailing superstitions, habits of speech, and commonalities of dress with one another.
We can and do think independently, but not as much as we in the West like to think. “The herd of independent minds” is a real phenomenon – a belief becomes associated with being a ‘free thinker’ and miraculously all the people who want to identify that way wind up believing that, and finding the proof all over. And, most of the time, this is perfectly alright. Suboptimal, definitely, but optimality in one domain can be suboptimal on the whole when it disconnects you radically from your time and place. It’s praiseworthy, I guess, to believe while living in 5th Century BC Athens that slavery was an abomination, but you’re definitely going to limit your practical effectiveness by going on and on about it. It is vitally important, for your survival, that you broadly agree with those around you – that you know how to dress, how to follow social conventions, what to say (and not say) and when to say it. When in Rome, you had better do as the Romans do. Stay long enough, and you’ll likely wind up believing what the Romans do too.
Of course, as I observed almost a year ago in my “Let’s Stop Pretending We Are Original Thinkers,” merely negating the common beliefs of your society or social set is not wisdom or knowledge. For your happiness and safety, my advice is to almost all of the time do as those around you do, especially when you can judge that their beliefs are making them successful, happy, and secure. Save your original thoughts for a few specialized domains where you can be contrarian because no one else bothers to figure out what the truth might be.
But Not Three Cheers
Of course, all of the above can go horrifically wrong and lead to mass suffering, destruction, and death. Thomas Sowell, in his Knowledge & Decisions, discusses the costs of ‘building consensus’ in a society. He asks the reader to reconsider the serious attention scholars once gave to arguments for the Divine Right of Kings. This is something moderns consider ludicrous, but anyone looking back on the enormous bloodshed caused by contests - then fairly recent - among claimants to the Imperium of Rome would see great wisdom in a system that picks the eldest son of the current monarch to rule - wisdom so great it could be argued to be the very will of God.
More darkly, and with anhydrously dry wit, Sowell asks the reader to judge more recent methods of enforcing consent: the death camps and gulags of the National Socialists and the Communists were, at base, extremely costly methods of achieving consensus. You can solve many utilitarian calculation problems by making those unhappy with your preferred arrangement dead.
Peirce, in “The Fixation of Belief,” notes a similar phenomenon: when the people in a society come to think that a belief or set of beliefs about which there is at best mixed evidence must be held one way or another for their social cohesion, there will be blood. This is even more critical when what is sought is some radical reform of society, especially against prevailing human nature. Without widespread shared belief, or at least assent, there will be cogs that grind against the other gears, which may stop the Utopia from coming to be. Consider the efforts your organization puts into building a company ‘culture.’ Thankfully, most of the time, this effort stops at getting people to mouth slogans or not stand up and scream “Heil Hitler!” during the mandatory diversity and inclusion seminar.
Conclusions
Here we come to the uncomfortable fact: confirmation bias can be highly useful on the individual level, and achieve good results at the societal level, but can also lead to disaster, both personal and international. Is there a way to sail between the Scylla and Charybdis of this ubiquitous human tendency?
We’re not going to stop having confirmation bias. It’s a built-in part of our human nature, and one we’re not going to change, ever. It can be, in limited areas, with much effort, resisted, but there’s no way to do it globally all the time. That would be exhausting and maddening.
But don’t despair that you’re not as good at solving your problems as you would like to be.
Here’s the thing:
Confirmation bias is solving problems for you,
just not the ones you think you want solved. You may genuinely want world peace, but you want (and need) fellowship and belonging that much more. You may genuinely want a solution to climate change, but you want (and need) friendship more. You may genuinely want X, but you want (and need) to be safe, to belong, to have an in-group distinguished from an out-group. So you’ll believe the things that only sort of, kind of, advance those X’s, but not so much that they’ll set you at odds with your group. If they do, sure enough you’ll find some other group to belong to, who believes Not-X.
You need to know who you are, where you belong, what you believe, so you can do what you need to do (whatever you happen to believe that is). Your brain is solving this problem for you, continuously, without you being aware of it.
And that’s a good thing. And something to be grateful for as we come to the end of this year.
What counts as evidence for any belief and why is a vexed philosophical question, but just presume whatever you understand ‘evidence’ to be for the rest of this essay.
As an aside, I believe that this is how the early Internet evangelists - John Perry Barlow, Douglas Rushkoff, and others - came to be so wrong about the reality of connecting the whole world. These thinkers were certain that, given the means to communicate freely across borders and cultures without restrictions, people were likely to find they had more in common than they thought, and peace and freedom would spread.
I call this the College Illusion. If you went to a major university, one that draws from an international population for its students, you met and made friends with people from all over the world. You probably felt amazed at how similar people could be across cultures and languages. This is because you, the college-bound, were more like those international students you befriended than they are like the people in their countries’ rural hinterlands. Or than you are like the people in your own country’s rural hinterland, for that matter.
The people who were on the early Internet, aside from the pedophiles and terrorists, were almost all from privileged, highly-educated backgrounds, living in intellectual milieus influenced by the fumes of the 1960s Flower Power era, whether in Berkeley, Vancouver, Oxford, Mexico City, or Kyoto. These were people with technical skills who could navigate the complexities of early computer networking. Confirmation bias was at play here too: each pedophile, each terrorist, each teenage hacker and shit disturber, was “not representative,” evidence to be set aside.
Two Cheers for Confirmation Bias