Book Recommendation: Superforecasting

Superforecasting: The Art and Science of Prediction, by Philip Tetlock and Dan Gardner, is a fascinating book, but I’m not sure if you should read it, for reasons that I explain at the end of this post.

There is a huge market for forecasting in our country, from political talking heads on TV, to investment advisors, to the approximately 20,000 intelligence analysts working in our intelligence establishment. But while the stakes for getting it wrong can be huge (see Iraq's WMD), there is no formal, reliable way of measuring or analyzing the track records of those doing the predicting. Pick your favorite commentator or columnist: what's their hit rate on predictions? That's impossible to answer, first because no one has compared what they said would happen with what did happen, and second because even if someone had, so many predictions are so vaguely worded that the forecaster can easily claim they meant something else and wiggle off the hook.

Philip Tetlock is trying to change that. Since the 1980s, he has been studying how good experts are at prediction (answer: only slightly better than a drunk monkey throwing darts). One of his findings was that the pundits who were most confident tended to be wrong more often, but they also got on TV more often. They are hired more for their ability to tell a compelling story with confidence than for their track record in forecasting.

This latest book details his findings from a four-year project funded by IARPA, the Intelligence Advanced Research Projects Activity, to test the forecasting performance of several different teams of experts. It was a large test, posing over 500 questions to more than 20,000 participants between 2011 and 2015. It was also rigorous, with questions designed to eliminate the wiggle-room problem, for example: "Will any country withdraw from the Eurozone in the next three months?" and "How many additional countries will report cases of the Ebola virus in the next eight months?"

The study found that about 2% of participants, whom he calls superforecasters, were consistently more accurate in their predictions. By identifying the superforecasters and then testing different combinations and variables, he was able to tease out what makes them successful, and the bulk of the book explains the traits, techniques, and habits of thought that make for superior judgment.[1]

The basic theme is that it's not superior intellect that distinguishes the superforecasters, but how they think. Here are just a few of his recommendations:

  • Break down tough problems into their components, and make estimates or judgments about those.
  • Pay attention to base rates first, and then adjust. For example, I may think my friend is very likely to strike it rich in a very difficult venture because I anchor on how smart he is. But if I begin with the base rate, say odds of 50 to 1 against success, I can double his chances and still conclude it's very unlikely (a worked version of this arithmetic appears after this list).
  • Be actively open-minded, not only being open to new information but looking for it. Once you have formed a judgment, pay attention to new information, especially anything that would call your initial judgment into question.
  • Write down your forecasts and your reasoning, because the mere fact of writing it will help distance you emotionally from your first prediction. If it’s important that you get it right, take the further step of writing down all the reasons you might be wrong, and then synthesize the two.
  • Expand your range of alternatives. Most people have a three-position dial for predictions: yes, no, and even odds. You can force yourself to be more systematic about your own thinking by adopting the seven-point scale recommended by the National Intelligence Council, shown here:

Remote – Very unlikely – Unlikely – Even chance – Probably/Likely – Very likely – Almost certain

Even better, use percentages. It won’t guarantee[2] you’re right, but it will force you to examine your own thinking and help you adopt a more nuanced viewpoint.
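
To make the base-rate arithmetic from the list above concrete, here is a minimal sketch in Python. The numbers are illustrative, not taken from the book:

```python
# Base-rate-first forecasting: start from the outside view, then adjust.
# Illustrative numbers only.

odds_against = 50                    # "50 to 1 against" success in ventures like this
base_rate = 1 / (odds_against + 1)   # outside view: about 2%

adjustment = 2.0                     # inside view: I think my friend is twice as likely as average
estimate = base_rate * adjustment

print(f"Base rate:         {base_rate:.1%}")   # ~2.0%
print(f"Adjusted estimate: {estimate:.1%}")    # ~3.9% -- still "very unlikely"
```

Even after doubling the base rate, the estimate stays under five percent, which is the point of anchoring on the outside view before layering on your personal impressions.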

There’s far more good advice than I can summarize here, but frankly I’m struggling a little in deciding whether to recommend that you read Superforecasting. On the plus side, I predict that it is very likely that if you read and apply its lessons, you will become a better thinker. On the other hand, there’s an even chance that you will become a less persuasive communicator, because an effective salesperson radiates confidence about the future they recommend, while effective forecasters are far more cautious and humble about their predictions.

My personal choice would be to begin with better thinking. First, for the obvious reason that you owe it to yourself and to others to give them your best thinking. Second, sustainable influence depends on credibility, which in the long run will correlate strongly with the accuracy of your predictions. It’s true that the TV pundits who are most confident in their predictions tend to be wrong more often, and they don’t suffer for it. But when people put their own reputations or money at stake based on what you predict, they tend to have longer memories.

[1] I use judgment in the sense that they are better predictors, not that they necessarily make better decisions.

[2] In fact, you may have noticed that the seven-point scale does not include certainty at either end.
