My friend exemplifies one of the most common shortcomings in our critical thinking, confirmation bias: the tendency to overlook or ignore evidence that does not accord with the conclusions we have already reached. I’m not talking about people forming a conclusion and deliberately ignoring contradictory information; when you’re advocating a position, that’s often exactly your job. What I am referring to is overlooking information in spite of your best efforts or intentions. It may be harmless when it comes to judging a public figure, but it can have costly repercussions in more important areas.
Confirmation bias can cause us to ask questions in such a way that we get the answers we expect to hear, which is a common problem in market research. Motorola employed over 100 people and several consulting groups to conduct market research before launching its Iridium satellite phone service, and its survey questions indicated that there would be enthusiastic acceptance. One sample question began: “There will soon be a new personal telephone service which at reasonable cost will provide you with the capability to be reached or to place calls anywhere in the world using satellite technology, which is not limited in coverage like a cellular phone. To access the service you would have a small handset that fits in your pocket…” The survey did not mention that “reasonable cost” meant a handset price of $3,000 and $3 per minute usage fees, or that “anywhere” meant a line-of-sight view of an orbiting satellite, which meant the phone could not be used indoors. (Billion Dollar Lessons, Carroll and Mui) In reality, only 3,000 handsets were sold, and the service folded within a year of launch, costing $5 billion in the process. (Please see Iridium’s side of the story, at the end of this post.)
Motorola’s example demonstrates that if you try hard enough, you can usually find enough data to support your point of view. As psychologist J. Edward Russo says, “If you torture the data long enough, it will confess.”
How many people have been unjustly convicted through confirmation bias? 60 Minutes once ran a show in which three professional polygraphers were each asked to test a subject who, they were told, was suspected of stealing a camera. There had been no theft, and each polygrapher had a different suspect, but all three judged their subject to be deceptive.
Confirmation bias is one of the reasons that first impressions can be so important: we tend to notice only information that confirms that first impression. Sometimes it can even lead to self-fulfilling prophecies. Because we mostly notice evidence that supports our initial impression, people tend to live up or down to our expectations.
Confirmation bias can also close off diverse viewpoints that might make us better informed in general. For example, an Ohio State study showed that people spend 36% more time reading essays that support their positions. The bias is also one of the principal reasons that social prejudices can be so hard to eradicate. If I suddenly got it into my head that people in red cars drive aggressively, you can bet that I would see red cars everywhere and would not remember the safe ones.
What can you do about it?
Always solicit disconfirming evidence before making a decision. Ask yourself: what would be true if my hypothesis were not correct? Then try to find instances of that.
When you do come across disconfirming information, do as Darwin did, and write it down immediately, before you forget it!
Write down reasons why your assumptions and hypotheses might be wrong and try to prove them so.
Get others to try to poke holes in your ideas. As a side benefit, you will have much more confidence when you do present your ideas for final approval.
Try to frame your questions to seek disconfirming evidence. One successful industry analyst credits his success to this. For example, if he thinks price competition is decreasing, he will ask, “Is it true that price competition is increasing?”
Hold a PreMortem meeting. With some colleagues, imagine that it is some future time and your idea has failed, then try to figure out all the ways that failure could have happened. (The Power of Intuition, Klein)
If you’re a manager:
Ask your subordinates what other hypotheses they have considered or what evidence they have found that contradicts their point of view. If they say they haven’t found any, that could be a red flag.
Generate disagreement. Alfred Sloan, the Chairman of GM, once asked whether everyone agreed with a decision that had been made. When they did, he said, “If we are all in agreement on the decision – then I propose we postpone further discussion of this matter until our next meeting to give ourselves time to develop disagreement and perhaps gain some understanding of what the decision is all about.” One way to generate disagreement without generating defensiveness is to appoint a designated devil’s advocate. When someone is officially designated to argue the opposing side, it can reduce personal tensions; just be sure to rotate the position frequently.
Foster an environment where it is OK to bring bad news and to dissent. In its after-action reviews following training exercises, the US Army allows the lowliest private to criticize the commander’s performance, and everyone gets better as a result.
In case you’re wondering, I did try to find data that disconfirms the existence of confirmation bias.
The rest of the story: After this post first ran, I received a gracious note from an Iridium spokesperson, who, while not disputing the facts I wrote, informed me that Iridium relaunched its service in 2001 and now has over 413,000 subscribers and 2009 revenues of $318.9 million. In addition, its technology has been a crucial help in many natural disasters. It just goes to prove the old saw that sometimes failure is just an invitation to begin again more intelligently.