How We’d Like to Make Decisions: Rationally
How many times have you had a decision to make, pulled out a blank piece of paper, and started a list of “pros” versus “cons”? It seems like a pretty simple accounting process: adding benefits in one column, subtracting drawbacks in the other, and going with the net balance. Such a straightforward process would work pretty well if we were all perfectly rational creatures…it turns out, however, that we are much less rational in decision-making than we think.
How We Really Make Decisions: Irrationally
Research in behavioral economics—examining the psychology of human behavior to explain economic decision-making—has convincingly shown that we humans allow many “irrational” influences to affect our judgment, usually without realizing it. Scientists label these influences with terms like “biases”, “fallacies”, “effects”, and “heuristics” (mental shortcuts), and understanding how these influences work is the first step in guarding against them to make better decisions.
One of the best resources for improved decision-making that accounts for some of these irrationalities is the book Decisive: How to Make Better Choices in Life and Work by Chip and Dan Heath.
The Heath brothers identify four “villains” of decision-making:
- Narrow Framing
- Confirmation Bias
- Short-Term Emotion
- Overconfidence
To capture and neutralize these villains, the authors offer up a four-fold set of techniques, which they summarize with the acronym WRAP, standing for:
- Widen Your Options
- Reality Test Your Assumptions
- Attain Distance Before Deciding
- Prepare to Be Wrong
Before I summarize and expand on the WRAP solutions in a following article, I will first explore those four villains in more detail. In fact, there are many more than four enemies of rational decision-making. If we look up “list of cognitive biases” on Wikipedia, we’ll find a list of 109 “decision-making, belief, and behavioral biases”—obviously a bit overwhelming and not so practical! However, deepening our understanding of the more influential and subtly irrational ways of thinking will better prepare us to effectively use techniques such as those described in Decisive. Therefore, using the organizing frame of the four villains—narrow framing, confirmation bias, short-term emotion, and overconfidence—I will illustrate 18 aspects, or related biases, of these villains. For each item, I provide a definition and example, many of which come from the Nobel Prize-winning founder of behavioral economics, Daniel Kahneman.
His book Thinking, Fast and Slow takes a deep dive into the mind, with many surprising, counter-intuitive scenarios illustrating how wonderfully strange our thinking process is. In the information-packed 500+ pages are dozens of ground-breaking ideas that have shaken our understanding of the world, two of which stand out in their influence on decision-making. The first is how our minds so often fill in for missing information. Out of the millions of bytes of information that continuously flood our senses, our brain (which Kahneman divides into the fast-thinking, reactive “system 1” and the slower, more analytical “system 2”) can only process so much at a time. It creates shortcuts and other ways of dealing with information. We want a coherent view of the world, and sometimes those shortcuts make stories to provide that coherency. Often the details, usually supplied from our previous experiences, are accurate…but not always, leading to irrationalities. A second major finding of Kahneman and his late research partner Amos Tversky is that people experience the prospect of a loss and the prospect of a gain differently. In fact, they feel the pain of a potential loss about twice as intensely as the pleasure of a potential gain. The urge to avoid pain exerts a strong influence on decision-making, and not always to our rational benefit.
With that broad-brush background in mind, here are the 18 ways to make bad decisions grouped by villains:
The Narrow Framing Villain
- Bounded Rationality: What You See Is All There Is (WYSIATI)
- Framing Effect and Loss Aversion
- Anchoring Effect
- Choice Overload and Decision Fatigue
The Confirmation Bias Villain
- Confirmation Bias
- “Availability” Mental Shortcut
- Gambler’s Fallacy
- “Representative” Mental Shortcut
- Conjunction Fallacy
- Ambiguity Neglect and Aversion
The Short-Term Emotion Villain
- “Affect” Mental Shortcut
- Exaggerated Emotional Coherence: The Halo Effect
- Projection Bias
- Certainty and Possibility Effects
The Overconfidence Villain
- Sunk Cost Fallacy
- Hindsight and Outcome Biases
- Endowment Effect
- Planning Fallacy and Optimism Bias
The Narrow Framing Villain
Bounded Rationality: What You See Is All There Is (WYSIATI)
We limit our decisions to the information that we have on hand. We also fill in the blanks with a coherent story—the more coherent, the more convincing.
Example: Two candidates interview for an analyst job. One is a recent PhD graduate who had excellent grades in school. She has not yet developed a work history. In the interview process, she was dynamic and impressive. The other candidate has held a post-doctorate position for several years and has a very good record of productivity. However, she is shy and did not impress the interviewers. Which candidate is more likely to get hired?
We actually do not know much specific position-relevant information about the first candidate, but if we were the interviewers impressed by her lively presentation, we would be sorely tempted to fill in details for a coherent story that she would be a good analyst. The unfortunate second candidate, whose work record is much more relevant, would not be as likely to trigger the same vivid, coherent story.
Framing Effect and Loss Aversion
We apply different weights to the same thing depending on whether it is framed in a positive way (gain) or a negative way (loss). We weigh potential losses about twice as much as potential gains. We can make decisions that assign more importance to the form in which information is presented than to its content.
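The loss/gain asymmetry can be sketched with Kahneman and Tversky’s prospect-theory value function. The parameter values below (loss-aversion coefficient λ ≈ 2.25, curvature α ≈ 0.88) are the estimates from their 1992 paper, assumed here purely for illustration:

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function (parameters from Tversky & Kahneman, 1992).

    Gains are diminished (alpha < 1 models diminishing sensitivity);
    losses are amplified by the loss-aversion coefficient lam.
    """
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# A $100 loss "hurts" roughly twice as much as a $100 gain pleases:
print(value(100))                       # ~57.5 units of subjective value
print(value(-100))                      # ~-129.5
print(abs(value(-100)) / value(100))    # 2.25: the loss looms ~2x larger
```

The exact parameters vary across studies; the qualitative point is only that the curve for losses is steeper than the curve for gains.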
The following is an example of both Loss Aversion and how the Framing Effect can affect our decisions.
Example: Credit card companies were being pressured by gasoline station chains to allow dual pricing, so that consumers could be enticed with lower-priced gasoline without the extra expense of using a credit card. The credit card companies resisted, but when finally forced to allow dual pricing at gasoline stations, they made the stations offer cash discounts rather than a surcharge. Customers used credit cards more often because a surcharge would be considered a more painful loss, while a cash discount only a modest gain.
Anchoring Effect
We tend to overvalue, or “anchor on,” the first information considered or suggested for an unknown quantity. This can skew our decisions when they involve estimates of quantities such as budget or production capability.
Example: Real estate agents have been shown in studies to be influenced by the asking price of a property when they make their own assessment of its value, even when they think they are being independent. A large sample of agents, split into two groups, was shown through a house that was actually for sale, but with different sales brochures as part of the experiment. Half of the brochures listed an asking price far above the actual asking price, while the other half listed one far below. A control group of non-agents was also shown the house and brochures. The expert agents who had seen the high price anchored on a higher estimate, and the ones shown the low price underestimated the actual asking price. In fact, the agents, with their special knowledge, did only slightly better than the control group.
Choice Overload and Decision Fatigue
Too many choices can overwhelm our decision-making ability, and the quality of our choices deteriorates at the end of a long session of making decisions.
This phenomenon is the opposite of the narrow framing problem, but understanding it is necessary for one of the solutions to narrow framing—widening our options. We need to find a sweet spot between being too narrowly focused on a “one or the other” choice and taking on too many choices.
Example: People in supermarkets with overwhelming choices tend to buy less, and make more “impulse buys” at the check-out counter, after they have become fatigued with too many choices. This is one reason why grocery stores put candy at the check-out aisles, to take advantage of decreased resistance to poor choices.
This talk from psycho-economist Sheena Iyengar shows some fascinating results of offering too many choices to consumers.
The Confirmation Bias Villain
Confirmation Bias
The tendency to seek out only information that supports our pre-existing beliefs and values.
Example: When given responsibility for developing a new product, a team will seek out market data that supports their idea, and overlook contradictory data. Kodak tended to confirm their initial assessments that the market wasn’t ready for digital photography, and focused on keeping film and printing market share. The story of Kodak is in fact a bit more complex than this example (Dan and Chip Heath discuss the case in more length in Decisive), but it illustrates how confirmation bias can blind us to contradictory data.
“Availability” Mental Shortcut (Heuristic)
The tendency to make decisions based on how easy it is to recall relevant data. This includes giving more weight to the most recent memories. We may ignore information that is more relevant but not as available.
Example: Reading about a recent plane crash, and deciding to take a train on your next trip. Studies have shown that passenger train ridership significantly increases after the news of a major plane crash. Or, reading about a recent shark attack and avoiding a swim at the beach because you think it is too risky, while ignoring statistically more significant health risks of obesity or smoking, because those risks aren’t as available in your attention.
Gambler’s Fallacy
The expectation that unrelated past events will influence the present. Derives from assuming the world is less random than it is.
Example: Judges deciding asylum cases were found to be influenced by their own previous decisions. If they perceived that they had made too many consecutive decisions in the same direction, they tended to reverse direction on the next case.
Yale economist Toby Moskowitz and his co-authors, Daniel Chen and Kelly Shue, wrote a paper documenting this and other examples, “Decision-Making Under the Gambler’s Fallacy.” You can hear him explain the findings in an interview with Freakonomics co-author Stephen J. Dubner on the Freakonomics podcast.
“Representative” Mental Shortcut (Heuristic)
We tend to make decisions by comparing data to prototypes. We allow stereotypes to overly influence us. When we judge something to be essentially similar to other things in the same category, it can lead us to make unwarranted assumptions.
Example: We are told Jane is decisive and intelligent. If asked to consider her for a job, we decide she would be a good leader. But additional information, that she is also corrupt and cruel, exposes the problem of relying on this mental shortcut.
Conjunction Fallacy
More specific scenarios are easier for us to envision, so we overestimate the probability that they will come true.
This fallacy is closely related to the Representative Mental Shortcut.
Example 1: Linda is 31, single, outspoken, bright. She majored in philosophy. In college she was concerned with discrimination and social justice issues, and participated in anti-nuke demonstrations.
Which alternative is more likely?
- A: Linda is a bank teller.
- B: Linda is a bank teller and is active in the feminist movement.
Surprisingly, in multiple surveys, more people chose B, that she’s both a teller and an active feminist, even though this defies mathematical logic. The set of feminist bank tellers is a subset of the set of all bank tellers, so the conjunction can never be more probable than “bank teller” alone. However, the story paints a coherent picture closer to a stereotype in our minds, so we find it easier to believe that Linda is both a feminist and a teller.
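The set logic can be made concrete with a tiny simulation. The base rates below are hypothetical and chosen only for illustration; whatever values you pick, the conjunction count can never exceed the single-category count:

```python
import random

random.seed(1)

# Hypothetical, illustrative base rates -- not real survey data:
P_TELLER = 0.02     # chance a random person is a bank teller
P_FEMINIST = 0.30   # chance a random person is an active feminist

trials = 100_000
tellers = 0
feminist_tellers = 0
for _ in range(trials):
    is_teller = random.random() < P_TELLER
    is_feminist = random.random() < P_FEMINIST
    tellers += is_teller
    feminist_tellers += is_teller and is_feminist

# The conjunction "teller AND feminist" is a subset of "teller" alone,
# so it can never be the more probable alternative:
print(tellers, feminist_tellers)
assert feminist_tellers <= tellers
```

The assertion holds for any seed and any base rates, which is exactly why choosing B is a fallacy.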
Example 2: People were willing to pay more, on average, for flight insurance on an overseas flight to London, that specifically covered terrorism, over a policy that covered death for any reason.
Ambiguity Neglect and Aversion
Ambiguity Neglect is the tendency to ignore or discount the unfamiliar and unknown. We prefer to stick with the familiar and known in making our decisions. In our search for coherent stories, we tend to fill in ambiguous information with what we know, leading us to jump to conclusions. Ambiguity Aversion is related, in that we prefer taking risks with some knowable probability over taking unknown risks.
As former US Secretary of Defense Donald Rumsfeld said, “There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are things we don’t know we don’t know.” It is those last two categories of unknowns that make us most uncomfortable, so that we avoid ambiguous inputs and outcomes.
Example: In evaluating a project, we may avoid unknowns, and spend too little effort researching to explore them, or fill in ambiguous information with a picture that fits our preferences.
The Short-Term Emotion Villain
Affect Mental Shortcut (Heuristic)
Affect is a psychologist’s term for our experience of feelings or emotions, which have a powerful influence on our decision-making. People let their likes and dislikes determine their beliefs about the world. When asked “What do you think about …?”, which requires more mental effort, we tend to substitute the easier question “How do I feel about it?” Emotions are even more influential on decisions when time or resources are limited.
Example: When asked “How much are you willing to contribute to save an endangered species?” we substitute “How much do I like dolphins (or snail darters, Alabama cavefish, etc.)?”
Exaggerated Emotional Coherence: The Halo Effect
The tendency to take our favorable, or unfavorable, reaction to a limited set of characteristics that we observe about a person or thing, and apply our like or dislike to their other characteristics, even if we have not observed them. We tend to be heavily influenced by initial impressions, and fill in blanks in our information with our own coherent story.
Marketers use the Halo Effect all the time. If we like a certain sports player, or a certain actor, we will tend to like everything about those people, including the clothes they wear, the food they eat, the cars they drive, etc. The Halo Effect relates closely to the Confirmation Bias, as we look at people or situations selectively once we’ve made an initial judgment.
Example: We judge more favorably a person when we see a descriptive list that starts with positive characteristics, versus having a worse impression of a person whose description starts with negative characteristics…even when the list is the same!
Projection Bias
We overestimate the degree to which others agree with us. This also applies to how much our “present self” will agree with our “future self.”
Example: We often make decisions affecting our future based on our present emotional or physical state—without regard to how those states might change. Almost everyone has had the experience of shopping for groceries when hungry, and ending up with a cart full of food that our future self doesn’t really want once we’re no longer hungry. When we are in a happy mood, we tend to make decisions based on that emotion rather than putting more thought into them.
Certainty and Possibility Effects
We give more weight than we rationally should to highly unlikely outcomes, and less weight than justified by probabilities to outcomes that are almost certain.
These effects, illustrated in a rubric that Kahneman and Tversky call the “fourfold pattern,” are central insights of behavioral economics. As Kahneman explains, “people attach values to gains and losses rather than to wealth, and the decision weights they assign to outcomes are different than probabilities.” The closer we get to near certainty, say from 95% to 100%, the less decision weight we assign than is justified by the probability. Where we move from no chance to a small (say, 5%) chance, we assign a higher weight than is justified by strict probability. The combinations of the desire for gain, or the stronger fear of loss, with high- or low-probability events are illustrated in these four examples:
Certainty Effect Example in the Face of Gain: Settling for less than the expected value in an almost certain (95% chance of winning) million-dollar civil court case with a risk-adjustment company. The expected value of a 95% chance to win $1 million is $950,000, but they offer you $910,000, and you take it because you assign less weight to the almost-certain outcome than statistical probability would advise.
Certainty Effect Example in the Face of Loss: Deciding to risk continuing a declining business with a 95% chance of being overtaken by technology, because you have assigned less weight than statistically (rationally) justified to the almost certainty.
Possibility Effect Example in the Face of Gain: Responding to “you can’t win if you don’t play” by buying lottery tickets in large amounts for a very small chance to win a large prize, because you’ve assigned a higher weight to your chance of winning than is statistically justified.
Possibility Effect Example in the Face of Loss: Paying a larger than rational amount for insurance against highly unlikely events, because you’ve assigned a higher weight to a statistically very unlikely event.
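Both distortions fall out of a single curve. As a sketch, the probability weighting function Tversky and Kahneman fitted in 1992 (their estimated parameter γ ≈ 0.61 for gains, assumed here for illustration) reproduces the court-case and lottery examples above:

```python
def w(p, gamma=0.61):
    """Probability weighting function for gains (Tversky & Kahneman, 1992):
    small probabilities are overweighted, near-certainties underweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Certainty effect: a 95% chance "feels" like only ~79%,
# which is why a sure $910,000 can beat a gamble worth $950,000 in expectation.
expected = 0.95 * 1_000_000     # $950,000 risk-neutral expected value
felt = w(0.95) * 1_000_000      # ~$793,000 in decision weight
print(round(w(0.95), 3))        # ~0.793

# Possibility effect: a 5% chance "feels" like ~13%,
# which is why lottery tickets and insurance against rare events sell.
print(round(w(0.05), 3))        # ~0.132
```

The precise numbers depend on the fitted parameter, but any curve of this shape underweights the near-certain win (explaining the accepted $910,000 settlement) and overweights the long shot (explaining the lottery ticket and the overpriced insurance).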
The Overconfidence Villain
Sunk Cost Fallacy
The tendency, often because of emotional attachment, to factor irrecoverable costs that have already been spent into our judgment of the future value of things. In rational decision-making, sunk costs should be ignored.
The following is an example of both the Sunk Cost Fallacy, and how the Framing Effect may change our decisions. It also demonstrates a phenomenon called mental accounting. We tend to put money into mental accounts for specific purposes. In the case below, there is a difference between losing tickets to a play, which were likely already put into an “entertainment” account, and losing cash, which would belong to a more general account—changing the perspective on the decision. Here are the details:
Situation A: A man intends to see a Broadway play, and has purchased two tickets for $160 ahead of time. When he arrives at the theater, he realizes he’s lost the tickets. Will he buy two more tickets to see the play?
Situation B: A man intends to see a Broadway play, and to pay for tickets at the window. When he pulls his wallet out, he realizes that he’s missing $160, but has his credit card available. Will he buy the tickets?
Both cases are essentially the same: the man has $160 less, either in goods purchased or in cash, when he reaches the theater. In most cases, people answer that the man in Situation A, who lost the tickets, will likely not go see the play. This is because the tickets came out of a more limited mental account with a specific purpose. On the other hand, most people answer that in Situation B, the man will go see the play, using his credit card. This is because he has not, in his mind, restricted the money that can be used for the tickets.
Hindsight and Outcome Biases
In the Hindsight Bias, our limited ability to reconstruct our past states of knowledge leads us to underestimate how much past events surprised us or could not have been anticipated. In the related Outcome Bias, we judge the quality of our prior decisions by their outcomes, not by the process used to reach them.
These biases are somewhat related to the Gambler’s Fallacy, because they stem from our estimation of the randomness of events. An outcome may often come about because of sheer luck, but with our 20/20 hindsight, we ascribe the favorable outcome to our own skills instead of luck. This portends ill for future decisions, making us overconfident in our skills.
Example: We might judge a competent doctor’s decision in a surgery that had a large probability of success as either good or bad depending on whether a rare complication occurred—even though the complication had nothing to do with the doctor’s skill.
Endowment Effect
Once we own something, we put a higher value on it.
This is related to the Sunk Cost Fallacy, as we take ownership of the resources or time we may have invested in a project. The effect might make us overconfident in what we have, and less willing to trade for better options.
Example: In re-negotiating contracts, a collective bargaining team might demand more to give up previously awarded benefits, such as paid days off, than they would have been willing to pay were they bargaining for the first time for those benefits.
Planning Fallacy and Optimism Bias
The tendency to underestimate risks and the likelihood of adverse events, such as skin cancer or car accidents, and to overestimate the benefits of projects or events. Optimistic estimates assume the best-case scenario when planning.
Example: Suvarnabhumi Bangkok International Airport was first approved by the government in 1991 with a US$4 billion budget. Construction didn’t begin until 2002. The first phase cost US$5 billion and opened over a year behind schedule. Structural and capacity problems still plague the project. Expected capacity was 45 million passengers annually, which was quickly exceeded, requiring the old airport to be reopened.
As we can see, these biases, fallacies, effects, and mental shortcuts interconnect and overlap across categories. The mind is complex, with the faster, intuitive, emotion-driven system 1 interacting with the more deliberative, analytical, but slower (and, as Kahneman often points out, lazier) system 2. With a tremendous inflow of stimuli, the brain is constantly making decisions both with and without our awareness. Most of the time, the system works great at keeping us safe and alive. But we can be lulled into a sense of security, because the quick shortcuts can be fooled, particularly as we fill in missing information to construct a coherent story. Awareness of the pitfalls is a first step. In my upcoming article, I will explore the various ways, from Chip and Dan Heath as well as other sources, to mitigate the effects of the “four villains” and all the associated hazards to effective decision-making.
Copyright © 2017 by Robert Cummings All rights reserved.