In the last article we talked about cognitive dissonance–the state of mental unease caused by contradictory information or perceptions. The brain tries to reduce this unease whenever possible, and uses a variety of tricks to do so. In this article we’ll talk about one of the most common tricks–the confirmation bias and the related phenomenon of belief polarization.
The confirmation bias is the tendency to see the world in the way that we expect to see it. It is the brain's impulse to filter our perceptions (usually without our conscious awareness) so that we take in information that confirms our existing biases, prejudices, and assumptions, and fail to see information that would run counter to them.
The confirmation bias has two major elements:
1. Biased assimilation is the tendency to seek and incorporate information that fits our current view.
2. Cognitive discounting is the tendency to simply not see or to lessen the importance of information that doesn’t fit.
It is easy for us to see this behavior in others. We all know people who have huge blind spots in their ability to see other points of view or be receptive to information they don’t want to hear. Unfortunately, each one of us is also hampered in our ability to see that tendency in ourselves. To be a good, rigorous thinker, it is critical that we get better at mitigating the role that confirmation bias plays in shaping our understanding of the world around us.
We tend to live in information bubbles of our own making. The flood of information available in our lives today makes the membrane of those bubbles denser and denser. We watch cable news channels and read editorial pages that share our political views and point out the obvious bias of the sources that have views different from ours. (Conservatives discount The New York Times for having a liberal editorial page; liberals discount The Wall Street Journal for having a conservative editorial page.) We gravitate toward friends who share our worldview and “unfriend” those who challenge our ideas. We may make attempts to expose ourselves to other points of view, but usually only to point out why they are wrong and further convince ourselves of why our point of view is right. Friends and rivals argue over politics and religion and any number of other issues, and no one ever seems to change their mind.
We walk away from such discussions amazed that others simply don’t get it. Confirmation bias is one of the reasons they don’t get it, and one of the reasons we don’t get it either.
Belief polarization is a related cognitive bias. It works like this: When we argue over a topic and are presented with evidence that runs counter to our assumptions, cognitive dissonance is increased. The more cognitive dissonance we feel, the more we want it to go away. This makes us hold tighter to our existing view and fight harder against the contradicting information. Thus, the more someone tries to make us see something we don’t want to see, the more convinced we are of the correctness of our original belief. The harder we work to get others to see our point of view, especially if it involves an attack on their existing beliefs, the more we cause them to resist what we have to say. Our beliefs tend to become more polarized over time, not less so.
So, as rigorous, critical thinkers, we face a dilemma: we can be convinced that we are right based on the information that we see, but we tend to only see things that confirm our initial belief. What is one to do?
There are a few simple steps we can take to combat confirmation bias and belief polarization.
- Become aware of the tendency to confirm our biases by embracing agreeable information and discounting disagreeable information. Learn to see it in others (but resist the urge to point it out to them!) and endeavor to see it in yourself.
- Resist the instinctual lunge toward certainty. Every good critical thinker focuses on falsification–the attempt to find out why their hypothesis is wrong–rather than confirmation–the attempt to find out why their hypothesis is right. Falsification is at the heart of the scientific method for a very simple reason: it is easy to find evidence that supports our conclusion, no matter what that conclusion is, but that does not mean that the conclusion is correct. Take one absurd example: using a confirmation strategy, one could confirm the hypothesis that Hitler was a kind man if one only selected the confirmatory evidence that he was kind to animals. Seeking falsifying evidence brings one to a vastly different (and much more accurate) conclusion.
- Get into the habit of seeking other points of view. Search for the merits of those points of view. Attempt to synthesize competing points of view into a blend rather than sticking to a polarizing, right/wrong point of view. I must point out that I am not advocating a post-modern, many-truths philosophy of life; some views are factually, objectively wrong and other views are factually, objectively correct. We should exercise humility before making such strong assertions, however, and we should hold our minds open to new information that challenges our assumptions.