I’ve just finished reading The Death of Expertise, by Tom Nichols.
As a nation of rugged individualists who believe we’re all created equal, we Americans have always had a healthy skepticism about experts, a skepticism Alexis de Tocqueville noted as early as 1835. I remember one of my high school teachers defining an expert as “someone who learns more and more about less and less, until finally he knows everything about nothing.”
And there have been good reasons for that skepticism. First, expert mistakes have certainly cost us, with Exhibit 1 being the foreign policy elites who have gotten us into trouble from Vietnam to Iraq and many places in between. It’s also hard to trust experts when finding ones who contradict each other is as easy as switching channels, and experts who sell their opinions to the highest bidder or overstep their knowledge unfortunately attract more attention than their more cautious colleagues.
But focusing on the mistakes (or other shortcomings) of experts ignores their far more important contributions to our lives. The experts who got it wrong with the Challenger also got us to the moon; the chemists who gave us thalidomide also have saved or improved millions of lives with other drugs; and to give the foreign policy establishment their due, they also helped build the postwar world order that has prevented a war between major powers for over 70 years and has contributed to an unprecedented expansion of prosperity.
When you ignore the contributions of experts and focus only on their failings, you stand to lose far more than you gain, like burning down your house to kill the mouse you saw in your kitchen. So it’s smart to take a careful and informed approach to assessing expert advice. As the saying goes, “if you think an expert is expensive, try hiring an amateur.”
But that’s exactly the problem we’re running into today: we’re paying far more attention to the loud and simplistic amateurs than we should. The backlash against established expertise is turning (or, most likely, already has turned) healthy skepticism not only into unhealthy cynicism but into aggressive and willful ignorance. We value confidence far more than credentials, which is why we elected a man who says he is the only one who can fix things.
The internet was supposed to lift us all up by putting the accumulated knowledge of the world at our fingertips. Instead, according to Nichols, it has made us dumber, and I agree with him. Because anyone with a connection can create a slick website and reach the whole world with their opinions, the overwhelming quantity of crap tends to bury the quality. Sturgeon’s Law, which says 90% of everything is crap, is woefully deficient in describing the internet. For most people using it to do “research”, the internet is simply a powerful engine for confirmation bias. As if that’s not enough, Nichols also describes the impact of a higher education system that has misguidedly turned students into “customers”, and the proliferation of talk radio and cable TV stations that cater to every conceivable taste and perspective, so that no one ever has to run the risk of bumping into an uncomfortable fact.
After I put down the book, I jotted down some notes to try to answer the title question of this blog. (Since I’ve covered some aspects of this problem previously in my blog, what follows combines some of the ideas from The Death of Expertise and some of my previous thinking, and it’s impossible to separate the two. As a rule of thumb, if it sounds smart, credit Nichols.)
What do the experts owe us?
- Don’t overstate your case. Nichols is slightly guilty of this, starting with his title. Death is a pretty strong word, and campaign (from his subtitle) carries a slight whiff of conspiracy theory. The trend is real, and many have exploited it, but no one is guiding it.
- Stick to what you know. Linus Pauling deservedly won two Nobel Prizes, but tarnished his reputation when he touted vitamin C as a panacea (not to mention dabbling in eugenics).
- Be a foxy hedgehog. From a strong base of expert knowledge, become curious about the rest of the world and get comfortable with uncertainty and disagreement.
- Separate facts from opinions. Be clear in your own mind first, and then explicit about the difference in your public statements.
- Separate analysis from predictions. As Philip Tetlock has shown us, the average expert is just slightly more accurate than a drunk monkey throwing darts when it comes to making predictions.
- Be professional. Professionalism includes the above admonitions plus an obligation to the greater good—of your clients and even sometimes the general public.
What do we owe the experts?
- Look for signs that the expert you’re reading is following the rules above.
- Recognize that when it comes to expertise we are not all created equal. Don’t think that a half hour spent perusing Google returns gives you the right to argue with someone who has devoted their professional life to the topic.
- If you still feel the need to argue with experts (for example, I take issue with some of the ideas that our City’s traffic experts are trying to sell to the public), at least make a serious effort to learn the fundamentals of the topic first.
- Be careful what you put in your mind. If it’s true that you are what you eat, it’s even more true that you are what you read.
- Become a more critical thinker and learn how to identify quality. Here are a few recommendations for further reading that will better equip you for the task:
  - When Can You Trust the Experts? by Daniel Willingham
  - Superforecasting or Expert Political Judgment, by Philip Tetlock
  - For detecting business and management BS: The Halo Effect…and Eight Other Business Delusions that Deceive Managers, by Phil Rosenzweig, and Leadership BS, by Jeffrey Pfeffer
  - Thinking, Fast and Slow, by Daniel Kahneman
  - Curious, by Ian Leslie
I heartily recommend this book, but the irony of a book about the death of expertise is that those who most need to read it are the least likely to.
Superforecasting:
There is a huge market for forecasting in our country, from political talking heads on TV, to investment advisors, to the approximately 20,000 intelligence analysts working in our intelligence establishment. But while the stakes for getting it wrong can be huge (see Iraq’s WMD), there is no formal, reliable way of measuring or analyzing the track records of those doing the predicting. Pick your favorite commentator or columnist: what’s their hit rate? That’s impossible to answer, first because no one has compared what they said would happen to what actually happened, and second because so many of their predictions are so vaguely worded that most pundits can easily claim they meant something else and wiggle off the hook.
Philip Tetlock is trying to change that. Since the 1980s, he has been studying how good experts are at prediction (answer: just slightly better than a drunk monkey throwing darts). One of his findings was that the most confident pundits tended to be wrong more often, but they also got on TV more often. They are hired more for their ability to tell a compelling story with confidence than for their track record in forecasting.
This latest book details his findings from a four-year project funded by IARPA, the Intelligence Advanced Research Projects Activity, to test the forecasting performance of several different teams of experts. It was a large test, posing over 500 questions to more than 20,000 participants between 2011 and 2015. It was also rigorous, with questions designed to eliminate the wiggle room problem. For example: “Will any country withdraw from the Eurozone in the next three months? How many additional countries will report cases of the Ebola virus in the next eight months?”
The study found that about 2% of participants, whom he calls superforecasters, are consistently more accurate in their predictions. By identifying the superforecasters, and then testing different combinations and variables, he was able to tease out what makes them successful, and the bulk of the book explains the traits, techniques, and habits of thought that make for superior judgment.[1]
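Accuracy in Tetlock’s tournament was measured with Brier scores, which reward forecasters for putting high probabilities on things that happen and low probabilities on things that don’t. Here’s a minimal sketch of how that scoring works; the function and the example probabilities are mine, not taken from the book:

```python
def brier_score(p, outcome):
    """Two-term Brier score for a yes/no question.
    p is the forecast probability of "yes"; outcome is 1 if it
    happened, 0 if it didn't. 0 is a perfect score; 2 is the
    worst (total confidence in the wrong answer)."""
    return (p - outcome) ** 2 + ((1 - p) - (1 - outcome)) ** 2

# A hypothetical question that resolved "yes" (outcome = 1),
# and one confident forecast that went wrong (outcome = 0):
print(round(brier_score(0.70, 1), 4))  # cautious and right: 0.18
print(round(brier_score(0.99, 1), 4))  # confident and right: 0.0002
print(round(brier_score(0.99, 0), 4))  # confident and wrong: 1.9602
```

The asymmetry is the point: a confident forecaster who is right scores beautifully, but one who is wrong pays a heavy penalty, which is exactly the accountability the TV pundit circuit lacks.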
The basic theme is that it’s not superior intellect that distinguishes the SFs, but how they think. Here are just a few of his recommendations:
- Break down tough problems into their components, and make estimates or judgments about those.
- Pay attention to base rates first, and then adjust. For example, I may think that my friend is very likely to strike it rich in a very difficult venture, because I start with knowing how smart he is. But if I begin by considering that the odds are 50 to 1 against success, I could double his chances and still think it’s very unlikely.
- Be actively open-minded, not only being open to new information but looking for it. Once you have formed a judgment, pay attention to new information, especially anything that would call your initial judgment into question.
- Write down your forecasts and your reasoning, because the mere fact of writing it will help distance you emotionally from your first prediction. If it’s important that you get it right, take the further step of writing down all the reasons you might be wrong, and then synthesize the two.
- Expand your range of alternatives. Most people have a three-position dial about predictions: yes, no, and even odds. You can force yourself to become more systematic about your own thinking by adopting a 7-point scale as recommended by the National Intelligence Council as you see here:
Remote | Very unlikely | Unlikely | Even | Probably, likely | Very likely | Almost certain
Even better, use percentages. It won’t guarantee[2] you’re right, but it will force you to examine your own thinking and help you adopt a more nuanced viewpoint.
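The base-rate advice above is really just a bit of arithmetic: start from the outside view, then adjust for what you know about the specific case. A minimal sketch of the friend-strikes-it-rich example, where the doubling factor is purely illustrative:

```python
# Outside view first: 50-to-1 odds against success.
# Odds of a-to-1 against correspond to a probability of 1 / (a + 1).
base_rate = 1 / 51

# Inside view second: suppose knowing how smart my friend is
# justifies doubling his chances.
adjusted = 2 * base_rate

print(round(base_rate, 3))  # 0.02  -- roughly 2%
print(round(adjusted, 3))   # 0.039 -- still "very unlikely"
```

Even after a generous adjustment, the forecast stays near the bottom of the scale, which is what anchoring on the base rate buys you.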
There’s far more good advice than I can summarize, but frankly I’m struggling a little in deciding whether to recommend that you read Superforecasting. On the plus side, I predict that it is very likely that if you read and apply its lessons, you will become a better thinker. On the other hand, it’s an even chance that you will become a worse persuasive communicator. That’s because an effective salesperson radiates confidence about the future they recommend, while effective forecasters are far more cautious and humble about their predictions.
My personal choice would be to begin with better thinking. First, the obvious point: you owe it to yourself and to others to give them your best thinking. Second, sustainable influence depends on credibility, which in the long run correlates strongly with the accuracy of your predictions. It’s true that the TV pundits who are most confident in their predictions tend to be wrong most often, and they don’t suffer for it. But when people put their own reputations or money at stake based on what you predict, they tend to have longer memories.
[1] I use judgment in the sense that they are better predictors, not that they necessarily make better decisions.
[2] In fact, you may have noticed that the seven-point scale does not include certainty on either side.
Piggybacking
For starters, being curious makes you easier to talk to. Actually caring enough to want to know about the other person is what gets us to ask questions and thus use our ears more than our mouths. Curiosity focuses attention and shows caring… If you want to be interesting to other people, show an interest in them; when you’re curious about them, and about the things that they care about, you will find that they will talk to you at length.
That’s called empathic curiosity, by the way, and it’s one of the three forms of curiosity that Leslie describes in his book.
Empathic curiosity is a key quality for successful salespeople as well. It’s the central ingredient in outside-in thinking, in which you strive to get what you want by helping the other person get what they want. You can find out a lot about the other person because it’s part of your job, or you can be intrinsically curious about who they are and what makes them tick—and people can tell the difference.
Besides showing you care, your curiosity is what prompts you not to accept the easy, surface answers, and to dig deeper into situations—to ask why with the tenacity of a four-year-old until you get to the real issues. This can be extremely useful in consultative selling, and especially in negotiations, where the ability to understand others’ perspectives can help uncover their true interests behind their declared positions.
Persuasion also depends to a large extent on having something useful or important to say, and that requires a mind filled with knowledge about the world, which you can only get if you are truly and deeply curious about how things work and how people think. This is called epistemic curiosity, and it’s the mechanism that drives us to learn for the sake of learning. Epistemic curiosity built our modern world because it led humans to explore outside the safety of their fire, to sail out of sight of land, and to question what the authorities called wisdom.
Epistemic curiosity is what the book is mostly about. It’s what drives us to dig deep into the details and nuances of a topic. The big difference between epistemic curiosity and the shallower sort is that it requires effort, and that effort is repaid through deeper learning and greater understanding. Of course, when you’re truly curious, the effort is not work, it is joy. It’s also curiosity with a specific direction, where you are in control of your own effort and learning, not pulled along by the latest shiny distraction that comes your way.
However, curiosity is not all good. While it may not kill you, it can certainly kill your productivity. The form of curiosity that fills your otherwise productive time is diversive curiosity, and unfortunately it’s probably the most common. It’s what attracts us to novelty; it’s shallow and strives for instant gratification. Unlike epistemic curiosity, diversive curiosity controls you. As Leslie tells us, imagine what you would tell someone from fifty years ago about the future:
“I possess a device, in my pocket, that is capable of accessing the entirety of information known to man. I use it to look at pictures of cats and get into arguments with strangers.”
It doesn’t have to be that way. The internet can make you smarter or dumber, depending on how you use it. Be careful what you put in your mind: just as you are what you eat, you are what you read.
Curious is a fascinating blend of history and science. Chapter 2 explores the development—or lack thereof—of curiosity in children. Kids ask up to 100 questions per hour. Until about 30 months, their questions focus on what and where, and then they move on to why and how questions. Curiosity continues to flourish when adults answer the question and engage them with questions of their own, and dies when they don’t.
For me, the best chapter is the one in which Leslie demolishes the trendy idea that we don’t have to learn anything deeply anymore because we can just Google it. We’re told that it’s more important to think critically and be creative than to stuff our minds with facts. The problem is that critical thinking and creativity require a deep database, because that’s the only way our minds can make meaningful connections. In his words, “Creativity starts in combination”, and you need a lot of useful information in your mind to make the necessary combinations.
Today we’re in the Age of Answers. The thing about Google is that it is very good at finding answers to things you specifically want to know, but it’s terrible at helping you stumble across things you don’t yet know you want to know more about.
The benefit of building your store of knowledge is why, according to recent research cited in this book, curiosity may be as important to success as intelligence and grit. It provides the intrinsic motivation to learn that keeps you engaged. To the curious, every day and every encounter is a new opportunity for growth.
Fortunately, curiosity is a state, not a trait. This means you can increase your general level of curiosity. Leslie provides seven suggestions.
7 ways to stay curious:
- Stay foolish: don’t let success quench your curiosity.
- Build the database: facts are not separate bits of knowledge; they are nodes in a network of knowledge. “Knowledge loves knowledge.”
- Forage like a foxhog: is it better to have deep or broad knowledge? Leslie’s take on the comparisons between hedgehogs and foxes is that you need T-shaped knowledge: deep in one area, combined with breadth.
- Ask the big Why: understanding others’ motivations will make you a better negotiator and influencer.
- Be a thinkerer: ideas are nothing without the hard work to bring them to fruition. Thinking and action have to go together, so you need both the big picture and the small details.
- Question your teaspoons: anything can be interesting if you study it closely enough.
- Turn puzzles into mysteries: puzzles can lose their interest when solved; mysteries can intrigue forever.
Well-researched and well-written, Curious is a fascinating book, which I strongly recommend[1].
[1] The only thing I did not like is the poor quality of the citations, because they make it difficult for those of us who are curious to dig even deeper into the topic.