Other Obstacles to Our Thinking (Part 2)

Awareness to Action
Aug 17, 2021

By Mario Sikora

Part of an ongoing series on Clear-Thinking Skills. These articles are excerpted from “How to Think Well, and Why: The Awareness to Action Guide to Clear Thinking,” which is available in paperback and e-book via amazon.com.

Here we continue with some of the cognitive obstacles to clear thinking we began discussing in the last article.

Leaps of Inference

An unjustified leap of inference is a conclusion that does not necessarily follow from the premises used to reach it. For example, just because I can safely jump off a curb and off a chair does not mean I can safely jump off a cliff. Assuming I could would be a leap too far.

Leaps of inference are much more common than we realize. They are often at the root of conspiracy thinking and of questionable assertions about history or science, and they are a mechanism that allows us to embrace beliefs we want to hold but for which we don’t have solid evidence.

Naïve Realism

We hold the views we do about the world because those views make sense to us. Whether we have thought about something at length or are simply following an initial gut reaction, each of us assumes we believe what any reasonable person would believe given the same set of facts.

Psychologists call this phenomenon “naïve realism,” which Thomas Gilovich and Lee Ross describe as “the seductive and compelling sense that one sees the world the way it is, not as a subjective take on the world” in their excellent book, “The Wisest One in the Room.”

Our brains manufacture certainty, convincing us that we interpret the world as any “reasonable” person would, and we don’t realize that much of what we believe is highly subjective (“the thing as we know it”) and not necessarily a match to reality (“the thing as it is”).

The importance of this simple idea cannot be overstated—we all think we are right about how we see the world and that anyone who thinks differently is either ill-informed or ill-intentioned. The greater the disparity between different points of view, and the more important a particular belief is to us, the more likely we are to attribute negative qualities to people who believe differently. The more you and I disagree on an important topic, the more likely you are to assume that I am not just stupid, but that I am a bad person as well. (And I will probably fall into the same trap…)

The Fundamental Attribution Error

The fundamental attribution error is the common tendency to view our own less-admirable actions as reasonable responses to our circumstances while seeing other people’s actions as marks of their character structure. When we misbehave, it is because we had a bad day; when others misbehave, it is because they are bad people.

Combine this with the closely related correspondence bias—our tendency to infer a broad quality of another person’s character from one or a few actions or traits—and it is no wonder that humans tend to live in a world of simplistic stereotypes.

Self-Deception and the Enigma of Reason

As much as we like to think we are rational and evidence-based, most of our arguments are actually attempts to rationalize something we intuitively feel is true and to convince others of the merit of our intuitions.

Cognitive scientists Dan Sperber and Hugo Mercier have an explanation for this phenomenon that they call “the argumentative theory of reasoning,” and we would do well to understand their ideas if we truly want to understand how our minds work.

At the root of this theory, which is gaining traction among other cognitive scientists, is the idea that the brain did not evolve as a tool for accurately understanding our world; it evolved to equip us to survive more effectively. They believe that the cognitive biases built into our minds are not glitches in the system but features of it that serve their purpose very effectively.

Survival requires getting the things we need and want from life, and we often do that more effectively when we can convince others to see the wisdom of our point of view (whether our point of view holds the actual truth or not…). Thus, our capacity to reason is not a tool for finding truth, or even for solving problems; our capacity to reason is a tool for convincing others of the rightness of our views so we can get what we want.

Sperber and Mercier also agree with the evolutionary biologist Robert Trivers, who makes the case in his book “The Folly of Fools” that humans developed the ability to deceive others in order to compete more effectively for resources. Further, the most effective way to convince others is first to fool ourselves into believing whatever story will justify our initial emotion-based intuitions. In short, we easily fool ourselves into believing convenient falsehoods that serve our selfish purposes, and then we reason skillfully in defense of what we have fooled ourselves into believing. The more skilled we are at reasoning, the more we convince ourselves that those intuitions are correct.

People who reason skillfully are often able to convince others of their “rightness,” frequently to the detriment of the one being convinced. Such people are commonly referred to as “influencers,” and they often rise to leadership positions—with both positive and negative consequences.

The key implication of the argumentative theory of reasoning is that we can’t always trust our own reasoning and need objective, external tools to help us uncover the ways we may be deceiving ourselves. Sperber and Mercier also point out that groups need objective methods to protect their members from charismatic but wrong leaders and influencers.

Simply adhering to what the boss says or following the most charismatic person in the room can be a recipe for disaster. What the great physicist Richard Feynman said about science applies to business as well: “The first principle is that you must not fool yourself, and you are the easiest person to fool.”
