Overconfidence: The Downfall of Decision-Making

Overconfidence sets us up for a long fall if we're not realistic about our limitations. We all carry cognitive biases that feed overconfidence, but we can counter them with 6 practical methods for making more successful decisions.


6 Practical Ways to Avoid Overconfident Decisions

Our collective self-confidence surprisingly defies logic.  One researcher found that “on nearly any dimension that is both subjective and socially desirable, most will see themselves as better than average.” Studies have shown:
  • Two-thirds of college professors rated themselves in the top 25% of professors.
  • 77% of Swedish drivers rated themselves safer than the median.
  • 65% of Americans believe they are above average in intelligence.
  • 70% of students rated themselves above average in leadership.
Psychologists call this phenomenon of overestimating our own qualifications and abilities the "Illusory Superiority" bias. It appears in many areas of life: estimates of our own intelligence, creativity, memory, health behavior, job performance, and more. Further studies have shown, however, that there's only a weak correlation between self-reported skill levels and actual performance. We obviously can't all be "above average," and an inflated ego makes us less aware of the risks that threaten our decisions.
The Greeks and Romans used the story of Icarus to highlight the dangers of overconfidence. Icarus donned a set of wings made by his father to escape the island where the two were imprisoned. His father warned him not to fly too low, lest the spray of the waves soak the feathers and drag him down, nor too high, lest the sun melt the wax that held the feathers together. Once aloft, however, Icarus proudly soared higher and higher, only to plummet on a featherless frame into the sea and drown. The story reminds us that proceeding without proper caution, humility, and respect for our limits can bring about catastrophe.

In a low-risk, steady-state environment, having higher-than-reality confidence levels may not be harmful. But in a VUCA (Volatile, Uncertain, Complex, Ambiguous) environment, overestimating our abilities can be dangerous. We need not lose all of our confidence, but effective decision-making requires a dose of humility as well: the ability to admit that we might not be as smart as we think we are, and that we might be wrong. Overconfidence leads to bad choices.

Our human problem is that overconfidence is hard to avoid. Along with this illusion of superiority, we appear to be wired with other cognitive errors that feed our overconfidence: Hindsight Bias, Outcome Bias, Survivorship Bias, and the Planning Fallacy. But we can anticipate this tendency and practice techniques that make our decisions more realistic. In the following discussion, we'll briefly look at the biases that cloud our self-assessments, and then explore six methods we can use to overcome overconfidence and improve our odds of success.

What leads to Overconfidence?

Hindsight Bias stems from the fact that, once our knowledge or opinion of something has changed, the human mind has difficulty recalling its own state of mind before that change. Our present selves underestimate how surprised our past selves were by past events. This is what psychologists Fischhoff and Beyth identified as the "I-knew-it-all-along" effect, and what behavioral economist Daniel Kahneman discusses as the Hindsight Bias. Experiments have shown that people polled before and after major events greatly overestimate their own accuracy in predicting the outcome. For example, students polled before and after the impeachment trial of President Clinton overestimated their previous level of certainty that he would be acquitted. Kahneman remarks in Thinking, Fast and Slow, "The tendency to revise the history of one's beliefs in light of what actually happened produces a robust cognitive illusion."

The inability to remember our past states of mind can lead us to false conclusions about the causes of the events we see, which is the Outcome Bias. Kahneman notes that Hindsight Bias "leads observers to assess the quality of a decision not by whether the process was sound but by whether its outcome was good or bad." Because our minds want a coherent story, we quickly fill in a cause and effect that credits our efforts and ignores the role of chance. Every decision we make involves some element of random chance that we could not possibly forecast, and many times we have a successful outcome despite our efforts, simply because we got lucky. While coaching operational efficiency and safety practices in high-risk environments, I've often observed workers crossing safety barriers as a shortcut. The outcome is usually "positive" in that no one gets hurt, but that doesn't make it a good decision. In a business environment, just because a risky project turned out profitable does not mean our plan was perfect; a confluence of events outside our planning may have brought the success. Kahneman gives the example of a surgeon performing a low-risk surgery in which a highly improbable and unpredictable event causes the death of the patient. Because of Outcome Bias, a jury in a malpractice suit would likely greatly overestimate the risk that could have been foreseen before the surgery, simply because they know the outcome and discount the role of chance.


Closely related to Outcome Bias is Survivorship Bias: the tendency to look only at the winners of a selection process when drawing conclusions about how they became winners. We are tempted to read the stories of a Steve Jobs or an Elon Musk for clues, thinking that if we just do what they did, we might be successful. But we don't have the huge volume of stories of people with similar talents and circumstances who didn't become billionaire CEOs. Their more numerous stories would put the role of chance in life into perspective, and would be far more educational about the pitfalls to avoid. It's the survivors who write the myriad "How I made it, and how you can too!" books, even though their lessons don't tell the full story.
Some enterprises may even use Survivorship Bias to entice customers. Imagine getting a newsletter from a financial planner who would like you to invest. The first letter includes free financial advice, including a prediction that a key stock price will go up. Lo and behold, the price goes up the following week, just as predicted. You get a second letter, with another stock prediction, and so it goes for ten weeks, with every week bringing an accurate call on the rise of another stock. By the end of 10 weeks, you're ready to let this superstar forecaster run all of your investments! But what you didn't see behind the scenes was that yours was only one of 1,024 newsletters that went out the first week, and half of them predicted the stock would rise, half that it would fall. Keeping track of the 50% that received an accurate prediction, the planner sent out 512 letters to that half, and so on, until in the tenth week you were the one lucky person who had seen all 10 predictions come true. Some mutual funds play a similar trick. They "incubate" funds before presenting them to clients, selecting only the survivors and leaving out the history of all the other funds that underperformed, so their investing acumen appears much higher than it really was.
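To see the mechanics of the trick, here is a minimal Python sketch using the hypothetical numbers from the scenario above (1,024 newsletters, halved each week):

```python
# A minimal sketch of the newsletter trick described above.
# The planner starts with 1,024 recipients; each week, only the half that
# happened to receive the "correct" prediction keeps getting letters.
recipients = 1024
for week in range(1, 11):
    recipients //= 2  # keep only those whose prediction came true
    print(f"After week {week}: {recipients} recipient(s) have seen only correct calls")

# After week 10 a single recipient is left who has watched ten straight
# "correct" predictions, even though every call was effectively a coin flip.
```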
Because of these biases, when we forecast or plan for the future we tend to take the best-case estimates of our abilities and circumstances, focusing more on past successes than failures and discounting the role of luck. The resulting Planning Fallacy means that we overestimate our ability to accomplish a plan on time, on budget, and with the benefits we had envisioned. A study by Oxford professor Bent Flyvbjerg produced what he calls the "iron law of megaprojects": over budget, over time, under benefits, over and over again. He found that 9 out of 10 projects had cost overruns, and that 50% overruns were common. The UK-France underwater "Chunnel" project was 80% over budget. In Thailand, Suvarnabhumi Bangkok International Airport was first approved by the government in 1991 with a US$4 billion budget, but construction didn't begin until 2002. The final cost of the first phase was US$5 billion (a 25% overrun), and the airport opened over a year behind an already delayed schedule. Structural and capacity problems still plague the project: expected capacity was 45 million passengers annually, which was quickly exceeded, requiring the old airport to be reopened. Bangkok's underground Mass Rapid Transit (MRT) had an even worse cost overrun of 67%, and suffered many delays due both to finances and to unexpected engineering difficulties.
Suvarnabhumi Bangkok International Airport went US$1 billion over budget, opened over a year behind schedule, and remains plagued by structural and capacity problems.
Bangkok's first underground rail, the Mass Rapid Transit (MRT) Blue Line, suffered numerous delays and went over budget by more than 67%.
The Planning Fallacy affects people on a personal level as well. US homeowners who planned kitchen renovations expected, on average, to spend $18,658; in reality, they ended up paying an average of $38,769, nearly 108% over budget. As British statesman Benjamin Disraeli once said, "What we anticipate seldom occurs; what we least expect generally happens."

The 6 Practical Methods to Control Overconfidence

The key to guarding against overconfidence is the lesson of Icarus and flying too close to the sun: proceed with proper caution, a dose of humility, and respect for our limits.

1. Learn to Be a Superforecaster

“Superforecasters” are people who have demonstrated an exceptional track record of making accurate probability forecasts in complex areas such as geopolitics, economics, public health and technology. Political scientist Philip Tetlock and “decision scientist” Barbara Mellers identified and trained Superforecasters to consistently win forecasting tournaments sponsored by the US intelligence community. One of the most surprising outcomes of these forecasting tournaments has been the extent to which even minimal training in methodology could improve forecasting ability, regardless of one’s area of expertise. Here are some of the key features for Superforecasting:
  • Look at the future as multiple possibilities, each with different odds of coming true. Make your forecast as a range of possibilities, not a single target.
  • State your forecasts as an "80% confidence interval," meaning that you have a reasonable best-case (high) and a reasonable worst-case (low) estimate that you believe contains the correct answer 80% of the time. For example, combining these first two methods, if we were asked to predict whether opening a new branch location would be profitable, we would frame the response as "I'm 80% confident that a new branch would have between a -5% and a +15% profit margin."
  • Base the odds largely on “Reference Forecasting,” or comparing the current problem with similar problems of the past. This is what Daniel Kahneman calls using the “outside view.” For example, in predicting the odds of a new restaurant succeeding, compare to the general success rates of new restaurants.
  • From the outside view, adjust using the “inside view” — those circumstances specific to the situation, or newly available data, to adjust the probabilities accordingly. For example, analyzing the unique positive and negative characteristics of a potential new restaurant to adjust profitability forecasts from the reference class of newly opened restaurants.
  • Learn basic statistics, including how to adjust probabilities given new evidence via Bayes' Theorem (see the sketch after this list).
  • Record the process and outcome of your decisions by holding review sessions, and use lessons learned to update the accuracy of this and future forecasts.
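As a rough illustration of the outside-view base rate being adjusted by inside-view evidence, here is a minimal Python sketch of a Bayesian update. All of the numbers are hypothetical:

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = prior * p_evidence_given_h
    denominator = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / denominator

# Hypothetical numbers: the reference class (outside view) says 60% of new
# branches become profitable; a favorable local market survey (inside view)
# shows up for 80% of branches that succeed but also for 30% that fail.
posterior = bayes_update(prior=0.60,
                         p_evidence_given_h=0.80,
                         p_evidence_given_not_h=0.30)
print(f"Updated probability of profitability: {posterior:.2f}")  # 0.80
```

The same arithmetic applies each time new evidence arrives: the posterior from one update becomes the prior for the next.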
For more help improving your forecasting, check out these resources:
  • "Six Rules for Effective Forecasting" by Paul Saffo
  • Philip Tetlock's "Ten Commandments for Aspiring Superforecasters"
  • Tetlock & Gardner's Superforecasting: The Art and Science of Prediction
  • Participate in the Hybrid Forecasting Competition

2. Learn from Past Failures

The best way to fight Survivorship Bias is to look constantly for lessons from failures: to find out why other decision makers, despite similar circumstances, failed. The overemphasis on unique success stories, without examining the pitfalls that brought others down, is one of the most pervasive biases in the business consulting industry. It is much easier to study the successes, the Amazons and Zappos of the world, because they stand out and their records survive. Sleuthing out the other enterprises, and the hazards or circumstances that brought them down, takes extra effort and analysis, but it gives you an edge in making smart decisions.

The story of an unsung hero from World War II perfectly illustrates how to recognize and counter Survivorship Bias. In the early years after the US entry into the war, bomber missions were practically suicide flights: US bomber crews had about a one-in-four chance of making it through their required 25 missions, and British attrition rates were even higher, close to 50%. The Army Air Force wanted to fortify the bombers with extra metal to increase their survivability. Slapping extra armor all over the plane was impractical, as it would make the heavy planes unflyable. Officials turned to a special team of scientists, the Statistical Research Group, who were using advanced mathematics to solve technical problems. They gave this team all available statistics on returning bombers, with plots of everywhere they had been hit, which was primarily along the wings, the fuselage, and around the tail gunner.
Illustration (not based on actual data) of the type of data taken from surviving bombers.
Abraham Wald, Mathematician
The Army brass was inclined to armor up those areas. But the team, and particularly a brilliant Jewish immigrant from Hungary, Abraham Wald, realized that the statistics were missing a key component—the bombers that didn’t survive! Wald assumed that in fact combat damage on planes was likely to be evenly distributed. Because the surviving bombers were showing a pattern that was NOT evenly distributed, he could determine those vulnerable areas which the missing bombers would have revealed. He used complex mathematics to extrapolate and determine critical areas, like the engines, that needed reinforcement. Recognizing the Survivorship Bias helped thousands of airmen survive.
A recent Business Insider UK article provides a list of 25 failed products that could be very instructive. From that list, here's just one puzzle worth figuring out: in 1975, Sony had the technically superior video tape product, Betamax, which had reached the market first, had significantly better video quality, and offered convenience advantages such as smaller size and faster tape winding. But in an expensive format battle, Sony lost the market to the lower-quality VHS format. In 2006, however, Sony's Blu-ray technology managed to beat a serious rival format, HD-DVD, which had reached the market first, was cheaper, was used exclusively by Universal Studios, Paramount, and DreamWorks, and was being promoted by Microsoft. A thoughtful investigation of these cases of failure and success would be extremely fruitful. What were the differences between them? Why did one format war end in failure and the other in success? How much of the result was simply chance? Stretching beyond Survivorship Bias helps us raise enlightening questions that we would otherwise miss.

3. Learn from Future Failures

Historical data, even if it is "Big Data," can only take us so far in predicting the probability of future success. Even the ancient philosopher Aristotle recognized that scientific observation of events was only useful for things that "could not be other than they are." For making decisions about an uncertain future, we need to think in terms of things as they could be. In an excellent Harvard Business Review article, Professors Roger Martin and Tony Golsby-Smith argue that "Executives need to deconstruct every decision-making situation into cannot [change] and can [change] parts and then test their logic." For a situation that can be changed, we can use our imagination to explore all the possible ways that a decision could fail.

One effective exercise of the imagination is the "Pre-Mortem," a form of prospective hindsight. You can use the creative nature of this exercise to reveal potential "failure modes" that can then be analyzed further. The basic idea is to write a story as if from the future (say, one year from now). In this story, you imagine that the idea you are presently considering was implemented, but failed. Your story is the analysis of why it failed: what went wrong, when it went wrong, who was involved, and why. Here's a basic how-to for a Pre-Mortem.

Step 1. Have individual team members independently generate ideas and write them down. The key to the exercise is to be as free-thinking and creative as possible in finding potential failure modes. Formulating multiple possibilities takes more innovative than analytical thinking, and engaging the more artistic part of the brain can help the ideas flow. Often, after studying a problem initially, disengaging for a little while will spur creativity. Here are some ways to activate the more innovative part of the brain:
  • Take a quiet walk alone, without any electronic devices.
  • Listen to or play instrumental music.
  • Enjoy a few funny or creative videos on YouTube or Vimeo, like some entertaining animal videos.

Step 2. Call a team meeting and post the written ideas for potential failure modes on the wall.

Step 3. Give the team time to browse and evaluate all the ideas.

Step 4. Discuss and outline a plausible Pre-Mortem story.

Step 5. Conduct a Failure Mode and Effects Analysis (FMEA). An FMEA organizes our thoughts about the risks to a decision we are considering. It analyzes three aspects of each potential failure mode:
  • Severity of the failure (Would it be a total disaster, or a minor setback?)
  • Probability of occurrence (Would it happen only one time in a million, or could it easily happen once in a hundred times?)
  • Detectability (How likely is it that the failure would escape our notice until it's too late to react? Harder-to-detect failures score higher.)
You rate each potential failure mode on each criterion with a score between one and ten, then multiply the three factors. That product is the Risk Priority Number (RPN), and the higher the number, the more attention that risk deserves. We can then explore ways to change our plan that might lower the severity of a failure, lower the chances that it occurs, or increase the chance of detecting it in time to prevent it. Below is a simplified example of performing this exercise:
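Here is a minimal Python sketch of the FMEA scoring just described. The failure modes and 1-to-10 ratings are hypothetical, and the detection score follows the common FMEA convention that a higher number means the failure is harder to detect:

```python
# Hypothetical failure modes with 1-10 ratings.
# Higher detection score = harder to detect (common FMEA convention).
failure_modes = [
    # (description,                     severity, occurrence, detection)
    ("Key supplier misses delivery",           7,          5,         4),
    ("Budget estimate off by more than 30%",   5,          7,         3),
    ("Regulatory approval denied",             9,          2,         2),
]

# Risk Priority Number = Severity x Occurrence x Detection
ranked = sorted(
    ((desc, sev * occ * det) for desc, sev, occ, det in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for desc, rpn in ranked:
    print(f"RPN {rpn:3d}  {desc}")

# The highest RPNs flag which failure modes deserve mitigation first:
# reduce severity, reduce occurrence, or improve detection.
```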
A shorthand version of this process for making quick decisions can boil down to a few simple questions: “What’s the worst thing that can happen?” “What’s the most likely thing that can go wrong?” “What can I do to prevent failure?” “How am I going to recover if the failure occurs?”

4. Fail Fast and Forward

Did you know that social science research has shown that too much information can worsen our decision-making? In a 1973 study, with results that have since been corroborated in other experiments, psychologist Paul Slovic gathered 88 distinct factors that would help professional horse bettors predict the outcome of horse races: past performance, jockey weight, jockey experience, days since the horse's last race, and so on. He gave the professional bettors a chance to pick what they thought were the top 5 most relevant factors, then the top 10, 20, and 40. Using historical data on 40 actual races, "sanitized" so that the bettors would not recognize them, he asked them to predict the outcomes given the top 5, then 10, 20, and 40 pieces of information they had identified as useful. In other words, each bettor predicted each race four times, each time with more information. The surprising result was that the added information had virtually no effect on the accuracy of the predictions. The professional handicappers' confidence, however, shot up with each increment of additional information. Consider the consequences: someone who is more confident in their decision will be willing to wager much larger capital, and yet they'll have no more chance of success than they had with less information.
The lesson here is to avoid overspending on resources to achieve a “perfect” outcome when you can implement an adequate solution and learn from it. This is much like the principle of design thinking, or agile thinking, which emphasizes producing a prototype, getting feedback, and making incremental improvements. This model acknowledges that we are severely handicapped in guessing the future, can become dangerously overconfident, and that experience can be the best teacher.

5. Set “Joker” and “Bingo” Calls (Tripwires for Taking Action)

A KC-135 Stratotanker refueling F-15 Eagles.
When, as a tanker aircraft commander, I briefed missions that involved aerial refueling of fighter aircraft practicing basic air-to-air maneuvers, I would always announce our "Joker" and "Bingo" fuel levels, and coordinate with the fighters to know theirs. Joker and Bingo were pre-set fuel levels that made everyone aware of a changing condition that called for action. An aircraft would call "Joker" over the radio as an initial warning that fuel was getting low: time to finish the current maneuvers and start forming up for the return to home base. "Bingo" was the "no kidding, it's time to go home now" call that ensured adequate fuel for landing. There could be no arguing the call, no matter how much fun everyone was having or how confident anyone was in their flying ability; once an aircraft reached Bingo fuel, it was time to go. In the same way, to fight overconfidence in our best-laid plans, we should pre-set a Joker call to give ourselves an initial preparation signal, and a Bingo tripwire that means "take action now." It's especially important to set this type of warning system when we've made assumptions that might change over time. In 1981, the photography giant Eastman Kodak issued an internal report that made three significant assessments of the digital photography market:
  1. Consumers would never go for lower quality digital pictures over prints from film.
  2. Consumers would prefer to have the joy and experience of handling and giving prints to family and friends over viewing photographic memories on electronic devices.
  3. Electronic devices would be too expensive to enjoy widespread use.
The first digital camera, invented at Kodak labs in 1975.
But then came rapid advances in digital imaging and storage, the Internet, mobile phones, WiFi, and cellular data plans. Kodak's failure wasn't its technology. The company could have pioneered digital photography; in fact it invented the world's first digital camera, a 3.6 kg (8 pound) device that took its first black-and-white digital image in December 1975. What Kodak failed to do was set a Joker or Bingo call that could break it out of its self-narrative as a film company and force it to invest fully in digital development. A Joker call could have been something along the lines of the following (sketched in code below):
  • Monitoring digital resolution advances (for example, technology reaching a digital resolution of 500×480)
  • Noting when a certain percentage of consumers are using electronic devices (for example, 10% of consumers carrying a portable device able to view digital pictures)
  • Setting a trigger for when electronic devices drop below a certain price (for example, portable electronic devices below US$900)
Tripwires that are planned ahead of time take the emotional investment out of the picture, and make the decision to move on to another plan easier and more logical.
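Here is a minimal Python sketch of pre-set Joker and Bingo tripwires in this spirit. The thresholds and market figures are purely illustrative assumptions, not Kodak's actual numbers:

```python
# Pre-set "Joker" (prepare) and "Bingo" (act now) thresholds for the
# hypothetical Kodak-style triggers above. All numbers are illustrative.
JOKER = {"resolution": 500 * 480, "adoption": 0.10, "price": 900}
BINGO = {"resolution": 1024 * 768, "adoption": 0.25, "price": 500}

def check_tripwires(market):
    """Return 'BINGO', 'JOKER', or None for the current market indicators."""
    if (market["resolution"] >= BINGO["resolution"]
            or market["adoption"] >= BINGO["adoption"]
            or market["price"] <= BINGO["price"]):
        return "BINGO"   # act now: commit to the alternate plan
    if (market["resolution"] >= JOKER["resolution"]
            or market["adoption"] >= JOKER["adoption"]
            or market["price"] <= JOKER["price"]):
        return "JOKER"   # initial warning: start preparing the transition
    return None

print(check_tripwires({"resolution": 640 * 480, "adoption": 0.12, "price": 850}))
# -> JOKER: time to start preparing, before emotion or sunk cost can argue otherwise
```

Because the thresholds are agreed on in advance, the signal itself does the arguing; no one has to win a debate in the moment to justify changing course.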

6. Over-Engineer

One final method for guarding against overconfidence in decision-making is a rather "brute force" solution: look at reference-class cases for historical averages of budget and time overruns, and insert a margin of error for the anticipated excess. Engineers design bridges and elevators with a margin of safety many times the required strength. An elevator that posts a "10-person limit" is probably designed with a safety factor to handle 30, 50, or more (although I don't recommend putting that to the test). In the same way, we can plan a "cushion" into budgeted projects.
Engineers design in a very large cushion for error on critical projects such as elevators.
Professor Flyvbjerg, who formulated the previously mentioned "iron law of megaprojects," describes how the governments of the United Kingdom and Denmark have mandated the use of reference class forecasting to account for overconfidence. For bids on public projects, planners must reference the historical records of similar projects (underground railways, airports, health services, etc.) and look at how often, and by how much, previous projects went over budget or past deadline. The planned budget takes the contractor's estimate and adds the historical rate of overruns in order to produce a more accurate prediction. Most importantly, the governments also create incentives so that contractors don't take advantage of this padding: contracts are written so that companies earn bonuses for hitting targets, or pay significant penalties for missing them. Over-engineering a decision must have these kinds of incentives built in to be practical.

We can apply the same principle to professional and personal decisions, provided accurate data has been or can be found. For example, suppose we want to decide whether or not to add a new function to a product, and need to know how much time and money it would cost. If good records have been kept about previous decisions to add features, with comparisons of predicted and actual costs and timelines, we can adjust the current estimate by the historical errors. Of course, past performance doesn't determine our fate, and some might hesitate to plan on missing deadlines or controls; hence the need to plan incentives that ensure optimum performance.
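Here is a minimal Python sketch of that adjustment, using made-up historical overruns as the reference class; a real exercise would draw on audited records of comparable projects:

```python
import statistics

# Hypothetical fractional cost overruns from comparable past projects
# (the reference class).
past_overruns = [0.25, 0.67, 0.80, 0.10, 0.45]
contractor_estimate = 4_000_000_000  # e.g. a US$4 billion bid

# Uplift the estimate by the typical (median) historical overrun.
uplift = statistics.median(past_overruns)           # 0.45 for these numbers
adjusted_budget = contractor_estimate * (1 + uplift)
print(f"Reference-class adjusted budget: ${adjusted_budget:,.0f}")  # $5,800,000,000
```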

Conclusion

Poor Icarus didn't perish from normal pride or confidence; his fault was the excess. People familiar with his story of flying too close to the sun often miss that he was also warned not to be afraid and fly too low, because he could just as easily have been dragged down with the same fatal result. Confidence is a positive trait that we like to see in workers; we just need to be aware of our tendency to over-rate our abilities, to understand our hindsight, outcome, and survivorship biases, and to avoid relying on over-optimistic scenarios when planning the future. Balancing our confidence with the 6 methods (forecasting by probabilities, learning from past failures, imagining future failures, quickly prototyping and learning from mistakes, setting conditions that call for action, and building cushions based on historical data) will keep us on the middle flight path between the sun and the waves.

This concludes the five-part series on cognitive biases and decision-making. If you've enjoyed the reading, or have something to add, please start a conversation in the comments. If you're ready to soar to a higher level of ability and success in your own life, please join other high performers in learning about and practicing decision-making and other soft power skills at Soft Power Skills Academy. I've created this 4-week series of online, face-to-face workshops to improve your leadership skills and human performance. The curriculum focuses on practical, goal-related projects rather than lectures, guided by my experience and expertise, with honest feedback from your peers while you build a lifelong network. The course is much more cost- and time-effective than an MBA, with similar benefits in mastering the skills that give you an edge in the marketplace.
