Imagine driving down the highway when suddenly a red flash cuts you off and zooms recklessly ahead, quickly disappearing into the distance. You think to yourself, “What a jerk! Thinks he doesn’t have to obey rules…he’s going to kill somebody! It figures he’s in a red Porsche…” From that point on, you conclude that people who drive expensive red sports cars are jerks. You’ve switched on your radar to look for jerk-like red car drivers, and sure enough, you start to notice them everywhere. You’ve succumbed to the Confirmation Bias Villain—the tendency to seek out only information that supports your pre-existing beliefs.
In “Making Better Decisions: 5 Practical Methods to Create More Choices,” we explored beating the Narrow Framing Villain (terminology used by Chip and Dan Heath in Decisive) with ways to broaden our point of view and create more options for our decisions. Once we have identified more choices, we need to guard against becoming too quickly attached to a favorite: establishing a belief that causes us to focus our research on what supports that belief and to dismiss anything contradictory. The following explores seven practical methods to defend against Confirmation Bias.
- Practice the Scientific Method Mindset—Try to Disprove Your Belief
- Ask Specific and Disconfirming Questions of Your Sources
- Test Your Assumptions
- Use Controlled Experimentation
- Argue for the Opposite Choice
- Create a Safe Environment for Constructive Dissent
- Use an Outside View First, Then Adjust With the Inside View

Method One: Practice the Scientific Method Mindset—Try to Disprove Your Belief
Our opening example of the person who sees only badly driven red cars illustrates our natural tendency to fall into the confirmation bias trap. We think that, if the belief we hold is right, then evidence “A” should exist and validate our theory. So we look for that confirming evidence A. The scientific method teaches us to be skeptical: if our theory is right, then evidence “B” could not exist, so we hunt for evidence B, because finding it would disprove the theory. It’s a different way of thinking about the world, and not always easy.
In fact, in the abstract, it appears difficult for humans to fully follow the logic of disconfirming evidence. Let’s illustrate with a famous experiment called the Wason Selection Task, named after psychologist Peter Cathcart Wason. Here’s the task: You are shown a set of four cards on a table, each of which has a number on one side and a colored patch on the other side. The cards show 3, 8, red, and brown. Which card or cards must you turn over to test whether the following proposal is true: If a card shows an even number on one face, then its opposite face is red.
Before I tell you the answer, let me console you that only about 4% of tested subjects got the answer completely correct…and I’ll let you “cheat” a little. I’ll put the question in a context which, in one study, increased the share of correct answers to about 72%. This time, one side of each card shows a person’s age, and the flip side shows a soda or a beer; the four cards show a beer, a soda, an age over 18, and the age 16. Which card(s) must be turned over to test the idea that if you are drinking alcohol then you must be over 18?
The answer to the first, abstract task, you may have guessed by now, is to turn over the 8 (because 8 is an even number, so if its opposite face is not red, the rule is broken) and the brown card (because if you turn it over and find an even number, you have just disproven the rule). The second version is probably easier, because it comes with a story with which most of us are familiar. Turn over the beer, because if someone’s drinking beer and the other side shows 18 or lower, the rule is broken. You also turn over the 16-year-old, because if she’s drinking beer on the other side, the bar is going to lose its license for breaking the rule.
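To make the disconfirmation logic concrete, here is a minimal Python sketch (my own illustration, not part of Wason’s experiment). The key idea: a card needs flipping only if its hidden side could falsify the rule.

```python
# A card must be flipped only if its hidden side could FALSIFY the rule:
# "if a card shows an even number, then its opposite face is red".

def must_flip(visible_face):
    if isinstance(visible_face, int):
        # An even number might hide a non-red face.
        return visible_face % 2 == 0
    # A non-red color might hide an even number; red can never falsify.
    return visible_face != "red"

cards = [3, 8, "red", "brown"]
print([card for card in cards if must_flip(card)])  # -> [8, 'brown']
```

Notice that the red card never gets flipped: whatever is on its back, the rule survives. Confirmation bias is the urge to flip it anyway.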
Now, there’s quite a bit of controversy over the ultimate meaning of Wason’s Task, but it illustrates that we have to think and look hard to find evidence that disproves a theory or belief. In the first abstract task, many people understand that turning over the even-numbered card should yield a red card, confirming the rule of “if even, then red.” However, most overlook the fact that turning over the brown card is a way to disprove the rule. In our search to disprove ourselves, we also need to dig deep into the sources informing our decision.
Method Two: Ask Specific and Disconfirming Questions of Your Sources
When researching background information to make a choice between options, we can think of ourselves as hard-nosed investigative journalists, hungry to get to the bottom of a story. Journalists get the most out of their sources by carefully preparing specific questions that challenge the superficial version of a story. As an Air Force Academy professor of history, I encouraged students to carefully evaluate their sources in at least three aspects:
- Is the source qualified for the specific area of expertise? Is it accountable for telling the truth, for example by being subject to audits or peer review?
- Is the source getting information from direct knowledge, or is it relying on second- or third-hand information?
- What is the purpose of the source in providing information? Are there any ulterior motives, like trying to sell something?
Author Cheryl Strauss Einhorn, herself a former journalist and now an expert on decision-making, emphasizes the importance of smartly questioning sources. In her book Problem Solved, she notes that questions for interviewing experts about a decision fall into four types:
- What does the subject do?
- What does the subject know?
- How does the subject feel about the information?
- How does the subject evaluate the significance of the information…what do they think?
For example, if I were a youth trying to decide whether to join the military with a goal of flying, I would seek out expertise on that career choice…but also scrutinize it to find evidence that might disconfirm my beliefs. Following the template above, I might ask the following questions:
- What is your expertise about the military? Do others consider you an expert on military flying careers?
- Are you an aviator in the military?
- Are you a recruiter, or do you gain any benefit from recruiting people into the military?
- (For a military aviator) What is a typical day for you as a military flyer?
- What are the benefits and drawbacks of flying in the military?
- Are you happy as a pilot in the military?
- Do you think joining the military to fly is a good idea for me?
Method Three: Test Your Assumptions
We had a slightly crude joke about the word “assume” in the Air Force…When we “a_s_s_u_m_e” things, it makes an “ass” out of “u” and “me.” When we face decisions, we are trying to predict the future (which we are notoriously bad at), and we inevitably must make some assumptions. Since we’re so bad at predictions, we need to recognize and test as many assumptions as we can.
In Decisive, the Heath brothers relate the story of Dr. Roger Martin’s technique of asking a single question to uncover assumptions: “What would have to be true for this option to be the right answer?” As a consultant, Dr. Martin asked this in a contentious standoff between executive-level mine owners who wanted to shut down an operation, and the local managers and engineers who wanted to keep it open. By asking the question, and getting both sides to examine the conditions that had to be true to make their proposal the right one, he broke them out of the confirmation biases fueling the argument.
Recognizing assumptions is not always easy, but it can lead to “out of the box” thinking. Airbnb runs one of the world’s largest accommodation services, yet doesn’t own a single hotel. Uber runs one of the world’s largest transportation services, yet doesn’t own a single taxi or bus. Most people thinking about investing in or starting an accommodation or transportation service would probably assume automatically that one would have to buy property or vehicles, perhaps not even recognizing that as an assumption. These two companies, however, questioned those assumptions, and revolutionized two industries.
Method Four: Use Controlled Experimentation
One way to test our assumptions is to experiment with prototypes. You may have heard the phrase “Go big or go home,” encouraging someone to go all out, 100% effort, when taking on a challenge. That is usually not good advice for making successful business decisions. Controlled experimentation is a more effective way to test and improve ideas, based on iterative design thinking and continuous improvement, particularly as they have evolved in the software development world into the “agile” process.
The fear of the unknown makes many of us hesitant to take action when we’re less than 100% sure of a solution. But our information will never be perfect, especially when we’re trying to predict the future. Many entrepreneurs have succeeded not by writing a 100-page business plan that tries to anticipate every situation, but by going out and trying to sell their service or product, learning what worked and what didn’t. They went with a 70% solution and worked up to the 100%. To work, however, the process depends upon constant monitoring of feedback and drawing the right lessons from mistakes, shortfalls, and other areas for improvement. This takes honesty and humility…it requires admitting we might be wrong in our beliefs or assumptions.
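One common, lightweight form of controlled experimentation is the A/B test: run two versions side by side and let the feedback decide. Here is a minimal Python sketch; the variant data and the 1.96 significance threshold are my assumptions for illustration, not numbers from any real business:

```python
import math

def ab_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: does variant B convert at a
    meaningfully different rate than variant A?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled rate under the assumption the variants are identical.
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical pilot data: old landing page (A) vs. new one (B).
p_a, p_b, z = ab_test(conversions_a=48, visitors_a=1000,
                      conversions_b=74, visitors_b=1000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}")
# |z| > 1.96 suggests the difference is unlikely to be chance alone.
```

A small pilot like this is the 70% solution in action: if the result is weak, keep iterating; if it holds up, scale it. Either way, the data gets a vote instead of our preconceptions.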
Method Five: Argue for the Opposite Choice
Imagine if on Facebook, every Hillary Clinton supporter had to publish one week of posts outlining all the positive points of Donald Trump, while every Trump supporter had to honestly publish every point that supported why Hillary should be president. Perhaps some heads would explode?
This, however, is one very effective method of subduing the Confirmation Bias Villain. Roger Martin and Jennifer Riel’s Creating Great Choices explains an effective decision-making process called Integrative Thinking. The first step in the method is to take a tough either/or choice, one which doesn’t seem to yield a satisfactory solution, and create or find two opposite, extreme models. Typical examples in the business world could include
- whether to strictly centralize some operation, such as corporate training, or completely decentralize it
- whether to make a product like software free, accessible, and modifiable (like Linux) to satisfy customers, or make it proprietary and restricted to gain revenue
- whether to standardize and mass-produce a product at the lowest cost, or customize it, increasing complexity and production cost
The second of the four Integrative Thinking steps helps us overcome confirmation bias by prescribing a Pro-Pro chart of the extreme models, rather than a Pro-Con chart. As Dr. Martin says in one interview, “You have to fall in love with the two models, sequentially.” The key is to keep an open mind and, just like someone who’s fallen head-over-heels in love, overlook the parts you don’t like…you don’t want to use those parts anyway, so there’s no use wasting energy finding them. There’s time in subsequent steps to do more analysis and sort out the warts and ugly spots of a model (which I will discuss in a later article, and which you can learn about in the introductory video below), but we can use this step by itself to challenge our pre-existing beliefs.
Method Six: Create a Safe Environment for Constructive Dissent
Most of us have probably seen or experienced toxic work environments. There are bosses out there with fragile egos who want to be surrounded by “yes” men and women, and who don’t treat dissent kindly. But if all the board members surrounding a CEO always agree, then the board is redundant. Those of us who aren’t CEOs or on boards do the same thing when we dig into research on a decision and pick out only the material that supports our viewpoint. At work and in our own minds, we need to create a safe environment for constructive dissent.
But by safe environment, we don’t mean delicately comfortable! In 1969, distinguished physicist Hans Mark became director of NASA’s Ames Research Center. Among other projects, Mark was instrumental in developing the Space Shuttle program. NASA credits him with the invention of the Murder Board (or “scrub-down,” for a cleaner metaphor), “a ruthless, uncompromising (but fair!) board of individuals who grilled those proposing new projects or lines of research.” Murder Board members are all experts in relevant fields, and they hold nothing back in questioning and criticizing a proposal’s assumptions, constraints, and proposed work-arounds…any aspect of the problem is up for scrutiny. Defending one’s idea in front of a Murder Board is decidedly uncomfortable, but the “safety” of the environment is maintained by mutual respect for expertise and a shared commitment to the single goal of an improved outcome, such as the innovative and successful Space Shuttle.
Creating a safe personal mental environment for constructive dissent is just as important. We often deflect, shoot down, or ignore feedback that doesn’t match what we want to believe. It takes bravery to invite challengers to try to shoot down our proposals, but as long as we’re willing to adapt in the face of legitimate critique, that process ultimately strengthens our ideas. In Thanks for the Feedback, authors Douglas Stone and Sheila Heen advise breaking down feedback in several ways to get the most advantage from it. For example, we must separate the content of feedback from our relationship with the feedback giver—even people we don’t like can have valuable insight into what might truly be a bad idea of ours.
Method Seven: Use an Outside View First, Then Adjust With the Inside View
Picture two sharply-dressed young college buddies, just graduated, sipping champagne in a lovely outdoor garden, watching their good friend dancing with the radiant bride whom he’s just married. They discuss how great those two are as a couple, how they seem so in love, how this luxurious ceremony and setting symbolizes the happy life they’ll have. Surely this couple will live happily ever after! If we asked them, “What are the odds this couple will get divorced?” they might weigh all those thoughts of their friend’s specific situation and give pretty low odds. These well-wishers are letting their “inside view,” or knowledge of a particular situation, influence their prediction.
But a more accurate answer would start with the not-so-fairy-tale divorce statistics of couples who marry young. That’s the “outside view,” or base rate, from which we should start evaluations. The base rate can come from any of several layers of the “reference class,” the broad category to which our decision belongs, and usually the closer to our specific case, the better. But choosing the correct reference class takes skill; the more specific we get, the less supporting data we’ll have. And in this era of big data analysis, we could get so specific that we include outlier characteristics that skew the results or make them less relevant. In the case of our young bride and groom, we could probably make a better guess by drilling down from overall US divorce rates, to divorce rates in our region, to divorce rates of college graduates getting married at age 22 in our region. However, looking at regional left-handed, 22-year-old college graduates born on a Tuesday (if we could find that statistic!) would not yield a more accurate prediction.
This all may seem like common sense, but the problem is that our minds resist using the outside view. The inside view…caught up in the emotion of swirling wedding gowns and live string quartets…is much more vivid, attention-grabbing, and optimistic. Our fast-reacting and intuitive System 1 of thinking focuses on the inside view; it takes active engagement of our higher-order System 2 to use the outside view properly. (See the previous article for an explanation of Systems 1 and 2.)
There are at least two ways that decision-makers use the outside view to improve outcomes. To predict the costs of proposed projects more accurately, forecasters use reference class forecasting, which draws on a large database of previous, similar projects to predict cost and time to completion. Another method is Bayesian statistics, a mathematical approach that anchors an estimate on a plausible base rate, but then factors in evidence specific to a case. In our marriage example, we might start with the basic reference class estimate of divorce rates for male college graduates married at age 22, but then adjust the probability for factors such as divorced parents, unequal alcohol consumption between partners, financial security, and religious beliefs. The mechanics of both methods are a bit complex, but they illustrate a proper outside-inside approach that avoids the usual error of being overly optimistic in decision-making.
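To show the flavor of that outside-in adjustment, here is a minimal Python sketch of a Bayesian-style update. The base rate and the likelihood ratios are made-up numbers for illustration only, not real divorce statistics:

```python
def bayes_update(prob, likelihood_ratio):
    """Adjust a probability with one piece of case-specific evidence.
    likelihood_ratio > 1 makes the outcome more likely; < 1, less likely."""
    odds = prob / (1 - prob)      # convert probability to odds
    odds *= likelihood_ratio      # fold in the evidence
    return odds / (1 + odds)      # convert back to a probability

# Hypothetical numbers for illustration only -- not real statistics.
p = 0.40                   # outside view: base rate for the reference class
p = bayes_update(p, 0.8)   # evidence: stable finances (slightly protective)
p = bayes_update(p, 1.5)   # evidence: divorced parents (raises the odds)
print(f"adjusted estimate: {p:.0%}")   # -> adjusted estimate: 44%
```

Each update starts from the outside view and lets specific evidence nudge the estimate, rather than letting a vivid inside view set the number from scratch.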
Escaping confirmation bias is not easy. As psychologist and behavioral economist Daniel Kahneman points out, even the difference between asking “Is Sam friendly?” and “Is Sam unfriendly?” will automatically send our minds searching for evidence of his friendliness in the first instance, or recalling all those times we saw Sam a little short-tempered in the second. The seven methods above provide an array of defenses against seeing badly-driven red sports cars everywhere just because of our preconceived notions. In the next decision-making article, we’ll explore ways to overcome mistakes due to short-term emotions.
So glad you mentioned Bayesian statistics, because that’s exactly what you were describing. Without getting technical, Bayesian methods differ from the frequentist statistics that most of us learn in school. A frequentist approach assumes data to be random and asks how well the facts fit a fixed statistical model. The Bayesian approach takes data to be fixed (which, in fact, they are) and asks which of two statistical models better describes the data. The Bayesian approach also asks how much it will cost us if our inferences turn out to be wrong. This is a powerful approach, because it can incorporate new information as we learn more, and it enables us to deal more intelligently with the real world. xkcd sums up the differences very nicely: https://xkcd.com/1132/. We almost never have the luxury of perfect knowledge; life requires that we make inferences and base decisions, even important, consequential ones, on incomplete knowledge. Bayesian methods enable us to make the best decisions humanly possible given the information available.
Thank you for your comment! Dr. Daniel Kahneman mentions Bayesian statistics as useful in combating confirmation bias and “disciplining intuition” in his work Thinking, Fast and Slow. Although I’m not personally able to do the math, it’s obviously a powerful concept. A colleague is using the method to perfect a security alert algorithm with some success.