Incompetent but Confident: The Dunning-Kruger Effect

Mario Sikora
Apr 30, 2012
“Ignorance more frequently begets confidence than does knowledge.”
Charles Darwin

We’ve all heard of the Peter Principle, the idea that people tend to be promoted to the level of their incompetence. Few are aware, however, of an even more dangerous phenomenon: the fact that the least competent among us are the least able to see their incompetence, a tendency known as the “Dunning-Kruger Effect.”

First described by psychologists David Dunning and Justin Kruger in 1999, the Dunning-Kruger Effect (DKE) is the phenomenon in which the people who are least competent in an area are the least able to judge their own competence and the most likely to be overconfident in their expertise. They are also the least able to recognize competence in others, so they tend to ignore or dismiss experts because they don’t actually recognize their expertise.

One of the best examples of the DKE is Stephen Colbert’s persona on The Colbert Report, in which he plays a host with little knowledge but very confident opinions about everything. Other blatant examples include:

  • The couch potato ranting at the television about how he could hit the pitches that the slumping third baseman is missing.
  • The celebrity without any understanding of science who gives medical advice to an audience on Oprah.
  • New Age mystics with no training in physics who tell others how to change their reality with the power of their minds and quantum physics.
  • Protesters with no understanding of business realities who rail against evil corporate practices.
  • Climate-change deniers with no understanding of climate science who rant about the climate-change “hoax.”

Unfortunately, the DKE affects us all. It is almost trite to say that we don’t know what we don’t know, but this fact has great implications. 

Humans are wired for certainty. Our brains don’t like loose ends or open questions, so they seek to wrap things up in neat little stories that make our existential anxieties go away. At the same time, the brain is wired for energy efficiency, meaning it will seek the simplest way to make uncertainty go away. It creates stories about the world, giving us a sense of closure and certainty but blocking out new or conflicting information. It disinclines us from seeking information that would force us to spend time and energy reconciling our beliefs with the facts. The less information we have about a topic, the more simplistic our story about that topic is and the less willing we are to put in the time and energy to become informed and competent. The more ignorant we are, the harder our brain fights against seeing our ignorance because, by the logic of the brain’s intuitive cost-benefit analysis, it makes more sense to fool us with overconfidence than to invest the energy in learning everything we would need to learn to build competence.

In fact, there is a cognitive bias called “the Backfire Effect,” which describes how the more someone tries to convince us of the wrongness of our views, the more we defend those views and resist changing them, facts be damned. (This is why arguments based on logic and reason are so ineffective against the kinds of people described above.) The Backfire Effect helps hold the Dunning-Kruger Effect in place.

As with many cognitive biases, there are ways to mitigate the impact of the Dunning-Kruger Effect in our own lives. 
  1. The first step is to simply be aware of the phenomenon and learn to see it in others (while understanding that you are not immune from it). 
  2. When voicing an opinion about something, ask yourself, How do I really know this to be true? How much do I really know about this topic? How could I learn more about it? What are the rationales for points of view that conflict with mine? 
  3. Appreciate the value of expertise. Yes, experts are not always correct, and one should always apply critical thinking. But, especially on technical issues, there is generally value in expertise, and the consensus of experts is usually correct. Using climate change as an example, the vast majority of climatologists (i.e., the experts) say that it is a real and anthropogenic problem. Those who disagree tend to be non-experts or experts in a field other than climatology. It is logical to weigh the view of the experts against that of the non-experts in the same way you would when assessing a surgeon: Would you seek treatment from a heart surgeon or a car mechanic if you needed a bypass? Expertise matters.
  4. Work to develop your expertise in areas that matter to you. Oddly enough, one of the findings of Dunning and Kruger’s research was that the higher one’s level of expertise, the more likely one was to correctly assess one’s competence. In other words, highly competent people in a particular area are better able to assess their strengths and recognize their weaknesses than people with average or below-average expertise.
