By Mario Sikora
An ongoing series of articles on clear-thinking skills, excerpted from “How to Think Well, and Why: The Awareness to Action Guide to Clear Thinking” by Mario Sikora (available at www.awarenesstoactionbooks.com).
In the last few articles we discussed built-in biases that hinder clear thinking; now we provide some tools to protect us against those obstacles.
Use the Scientific Method
The scientific method relies on four steps: observing, forming a hypothesis, attempting to falsify that hypothesis, and adjusting the hypothesis accordingly.
The key step is the attempt to prove ourselves wrong rather than to prove ourselves right. This attempt at falsification is one of the things that separates science from pseudoscience or mere debate.
Closely related to the question "How do I know this to be true?", the effort to prove ourselves wrong should be at the heart of every critical thinker's toolkit. It helps to remember that every time we prove ourselves wrong, we learn something new.
Use Checklists
Doctors do it, pilots do it: when lives are on the line, people use checklists. We tend to rely on memory when we have a series of tasks to complete, but our memories are notoriously flawed and our attention spans remarkably short. Any important process benefits from a checklist.
Apply Occam's Razor
Named after the medieval philosopher William of Ockham, Occam's Razor is another name for the principle of parsimony: keep explanations as simple as possible, but no simpler. It is commonly misunderstood to mean that the simplest answer is usually the correct one; in fact, it simply encourages us not to add factors unnecessarily. If we can adequately explain a phenomenon with two factors, we shouldn't add a third.
For example, if we want to explain how ice forms, the first equation below is better than the second.
Ice = Water + Cold
Ice = Water + Cold + Ice-making fairies
Apply Hanlon's Razor
“Never attribute to malice that which can be attributed to stupidity.”
It is easy to assume bad intentions when we are overlooked or wronged. Often, however, such slights are better explained by mistakes than by ill intent. Hanlon's Razor reminds us to calm down, take a deep breath, and give people who appear to have wronged us the benefit of the doubt.
Use Inversion
Thinking about how we don't want things to go can help us plan to avoid those outcomes. This technique, known as "inversion," involves thinking backward from an envisioned result, and it helps us reach the outcomes we want while avoiding the ones we don't.
Engage in Second-Order Thinking
Our decisions often have unintended or unforeseen consequences, and we rarely take the time to consider what those consequences might be. We make a decision to resolve an immediate problem but fail to ask, "But then what?" Second-order thinking means taking the time to think through the consequences of our actions and to prepare for them.
Think in Probabilities
Very little in life is deterministic, and we can rarely make predictions with great certainty. We do better, however, when we get into the habit of thinking in probabilities (how likely is it that something will occur?) and continue to update those odds as new data become available.
Avoid Unjustified Leaps of Inference
An unjustified leap of inference is drawing a conclusion that may not necessarily flow from the premises used to arrive at that conclusion. For example, just because I can safely jump off a curb or jump off a chair does not mean I can safely jump off a cliff. Assuming I could do so would be a leap too far.
When evaluating a conclusion or a chain of reasoning, it is important to verify that each step actually follows from the one before it.
Learn to Distinguish Between Naïve Intuition and Expert Intuition
System 1 thinking inclines us to trust our intuition even when it is not accurate. There is a distinct difference between naïve intuition (uninformed "gut" feelings) and expert intuition (the kind of non-conscious expertise that results from extensive training and experience).
If you have ever seriously studied or trained in a physical, artistic, or even intellectual pursuit, you know that things that initially seemed difficult and required a lot of conscious effort became intuitive and automatic with practice. You trained your mind and your muscles to act non-consciously, but those actions were grounded in practiced expertise.
Unfortunately, the mind cannot always tell the difference between the feeling of this kind of expert competence and the illusory feeling of expertise in an area where we have no trained competence. For example, because repeated experience lets us intuitively read the emotions and moods of people we know well, we tend to falsely believe we can intuit the moods and needs of people we have just met. The same phenomenon appears in many other areas.
Intuition is a useful and critical part of our nature, but we must remember the difference between expert intuition, which is built on experience and practice, and potentially dangerous naïve intuition, which rests on nothing but a false feeling of competence.
Our series on clear thinking will continue in the next article.