Clear thinking


Is Expertise Dead?

I’ve just finished reading The Death of Expertise: The Campaign Against Established Knowledge and Why it Matters, by Tom Nichols. Nichols is a Professor of National Security Affairs at the Naval War College, and he is deeply concerned about the dumbing-down of our national discourse, in which the loudest and most simplistic opinions seem to carry far more weight than the carefully nuanced views of experts.

As a nation of rugged individualists who believe we’re all created equal, we Americans have always had a healthy skepticism about experts, something Alexis de Tocqueville noted as early as 1835. I remember one of my high school teachers defining an expert as “someone who learns more and more about less and less, until finally he knows everything about nothing.”

And there have been good reasons for that skepticism. First, expert mistakes have certainly cost us, with Exhibit 1 being the foreign policy elites who have gotten us into trouble from Vietnam to Iraq and many places in between. It’s also hard to trust experts when finding ones who contradict each other is as easy as switching channels, and when experts who sell their opinions to the highest bidder or overstep their knowledge unfortunately get far more attention than those who are more cautious.

But focusing on the mistakes (or other shortcomings) of experts ignores their far more important contributions to our lives. The experts who got it wrong with the Challenger also got us to the moon; the chemists who gave us thalidomide also have saved or improved millions of lives with other drugs; and to give the foreign policy establishment their due, they also helped build the postwar world order that has prevented a war between major powers for over 70 years and has contributed to an unprecedented expansion of prosperity.

When you ignore the contributions of experts and focus only on their failings, you stand to lose far more than you gain, like burning down your house to kill the mouse you saw in your kitchen. So it’s smart to take a careful and informed approach to assessing expert advice. As the saying goes, “if you think an expert is expensive, try hiring an amateur.”

But that’s exactly the problem we’re running into today—we’re paying far more attention to the loud and simplistic amateurs than we should. The backlash against established expertise is turning (or, more likely, has already turned) healthy skepticism not only into unhealthy skepticism and cynicism but into aggressive and willful ignorance. We value confidence far more than credentials, which is why we elected a man who says he is the only one who can fix things.

The Internet was supposed to lift us all up by putting the accumulated knowledge of the world at our fingertips. Instead, according to Nichols, it has made us dumber, and I agree with him. Because anyone with a connection can create a slick website and reach the whole world with their opinions, the overwhelming quantity of crap tends to bury the quality. Sturgeon’s Law, which says that 90% of everything is crap, woefully understates the case where the internet is concerned. For most people using it to do “research”, the internet is simply a powerful engine for confirmation bias. As if that’s not enough, Nichols also describes the impact of a higher education system that has misguidedly turned students into “customers”, and the proliferation of talk radio and cable TV stations that cater to every conceivable taste and perspective, so that no one ever has to run the risk of encountering an uncomfortable fact.

After I put down the book, I jotted down some notes to try to answer the title question of this post. (Since I’ve covered some aspects of this problem previously on this blog, what follows combines ideas from The Death of Expertise with some of my previous thinking, and it’s impossible to separate the two. As a rule of thumb, if it sounds smart, credit Nichols.)

What do the experts owe us?

  • Don’t overstate your case. Nichols is slightly guilty of this, starting with his title. Death is a pretty strong word, and the word campaign carries a slight whiff of conspiracy theory. The trend is real, and many have exploited it, but no one is guiding it.
  • Stick to what you know. Linus Pauling deservedly won two Nobel prizes, but tarnished his reputation when he touted Vitamin C as a panacea (not to mention dabbling in eugenics).
  • Be a foxy hedgehog. From a strong base of expert knowledge, become curious about the rest of the world and get comfortable with uncertainty and disagreement.
  • Separate facts from opinions. Be clear about the difference in your own mind first, and then be explicit about it in your public statements.
  • Separate analysis from predictions. As Philip Tetlock has shown us, the average expert is just slightly more accurate than a drunk monkey throwing darts when it comes to making predictions.
  • Be professional. Professionalism includes the above admonitions plus an obligation to the greater good—of your clients and even sometimes the general public.

What do we owe the experts?

  • Look for signs that the expert you’re reading is following the rules above.
  • Recognize that when it comes to expertise we are not all created equal. Don’t think that a half hour spent perusing Google returns gives you the right to argue with someone who has devoted their professional life to the topic.
  • If you still feel the need to argue with experts (for example, I take issue with some of the ideas that our City’s traffic experts are trying to sell to the public), at least make a serious effort to learn the fundamentals of the topic first.
  • Be careful what you put in your mind. If it’s true that you are what you eat, it’s even more true that you are what you read.
  • Become a more critical thinker and learn how to identify quality. Here are a few recommendations for further reading that will better equip you for the task:

  • When Can You Trust the Experts? by Daniel Willingham
  • Superforecasting or Expert Political Judgment, by Philip Tetlock
  • For detecting business and management BS: The Halo Effect…and Eight Other Business Delusions that Deceive Managers, by Phil Rosenzweig, and Leadership BS, by Jeffrey Pfeffer
  • Thinking, Fast and Slow, by Daniel Kahneman
  • Curious, by Ian Leslie

I heartily recommend this book, but the irony of a book about the death of expertise is that those who most need to read it are the least likely to.


At the End of 2016, We Desperately Need to Remember These Two Ideas

Regardless of whether your candidate won or lost, the 2016 political season in the US has seriously undermined two key beliefs that I have long held about persuasive communication. The first is that the truth matters, and the second is that moderation of thought and expression is a virtue. Maybe I’ve taken them for granted because they seem obvious, but it’s clear that someone needs to speak out in their defense.

Two of those defenders—whose opinions and approach I greatly respect—are Josh Bernoff, author of Writing Without Bullshit, and Peter Wehner, a columnist for the New York Times. Since they can say it much better than I can, I simply give you these two links and strongly recommend you read them:

Bernoff: The Truth Foundation

Wehner: Moderate Is Not a Dirty Word (This is the title of the print edition of the opinion piece in the NYT; for some reason it differs from the online edition’s.)


Bad Simplicity, Complexity, Good Simplicity

The best way to add value and reduce waste in communication is to provide your audience with “simplicity on the other side of complexity”.

I first came across this phrase in a fascinating blog post titled Simplicity on the Other Side of Complexity, by John Kolko, who in turn took it from Justice Oliver Wendell Holmes: “I would not give a fig for the simplicity on this side of complexity, but I would give my life for the simplicity on the other side of complexity.”

The phrase resonated with me because it explains so clearly an idea that I am constantly trying to impart to participants in my lean communication and presentations classes. One of the most common reasons that companies bring me in to work with their teams is that their executives complain that they have to sit through unnecessarily complex and excessively detailed presentations, where masses of data substitute for clear understanding. The common complaint is “I ask what time it is, and they tell me how to build a watch.”

When you have to make a presentation to busy high-level decision makers—whether internally or to a potential customer—it’s because they need information to grasp the key issues about a complex decision, so that they can take effective action to improve outcomes. The simplicity with which you express that information adds clear value for them, because it reduces uncertainty and saves time and effort. But only if it’s good simplicity, which resides on the other side of complexity.

The gist of Kolko’s article is that the learning curve for a complex issue looks like a bell curve (I’ve added the labels to his simple drawing, and the rest of this article includes my own additions to his ideas):

[Figure: the simplicity curve, with labels added]

At the far left, when people first approach an issue, they know very little about it, but of course that doesn’t stop most of them from jumping to conclusions or forming opinions. People who are intellectually lazy or who use motivated reasoning don’t proceed past this stage, and they enjoy the confidence born of ignorance. They forget what H.L. Mencken said: “Explanations exist; they have existed for all time; there is always a well-known solution to every human problem — neat, plausible, and wrong.” For examples of this type of presentation, just think back to the past two weeks of US Presidential conventions. This kind of simplicity is almost pure waste, because it’s either wrong or trivial.

But you’re a business professional, so you can’t afford to be intellectually lazy: you know that if you present a simplistic and shallow case you will be eviscerated in the boardroom. So you take your time, gather evidence and analyze it, and see greater depth and nuance in the issues. If you stick with it, you eventually become an expert; you “get it”, so that’s the next natural stopping point. Complexity is the point at which the average expert feels ready to present their findings. The biggest mistake they make at this point is including far more detail than the listener needs to use those findings, whether out of defensiveness or an inability to connect to what the listener cares about. As one CEO told me, “I get a lot of detail but very little meaning.” They may have added value, but there is still a significant amount of waste, in the form of time, effort, and confusion.

[Figure: simplicity on the other side of complexity]

Outstanding expert presenters know that you never truly know how well you understand something until you try to explain it to others, so they take the next logical step. They add value for their listeners by distilling all their hard-won complexity into just the meaning the listener needs for their purposes. They know exactly why the listener needs the information, and they give them just what they need to know to make the best possible decision, so that there is zero waste. Most of the time, they go beyond merely providing information and advocate for a specific decision (which is a given if it’s a sales presentation)—but their advocacy is based on highly informed judgment.

The tools for achieving this type of good simplicity are the tools of lean communication: outside-in thinking to know what the listener needs, Bottom Line Up Front to provide meaning right away, the SO WHAT filter to root out waste, and pull to make adjustments as necessary.

Before you decide to strive for good simplicity, I would be remiss not to point out one personal risk you run: if your goal is to call attention to how smart you are, this may not be the best way. As Kolko says, “The place you land on the right—the simplicity on the other side of complexity—is often super obvious in retrospect. That’s sort of the point: it’s made obvious to others because you did the heavy lifting of getting through the mess.”

But if your goal is to get the right things done, simplicity on the other side of complexity is the only way.


Book Recommendation: Superforecasting

Superforecasting: The Art and Science of Prediction, by Philip Tetlock and Dan Gardner, is a fascinating book, but I’m not sure if you should read it, for reasons that I explain at the end of this post.

There is a huge market for forecasting in our country, from political talking heads on TV, to investment advisors, to the approximately 20,000 analysts working in our intelligence establishment. But while the stakes for getting it wrong can be huge (see Iraq’s WMD), there is no formal, reliable way of measuring or analyzing the track records of those doing the predicting. Pick your favorite commentator or columnist: what’s their hit rate on predictions? That’s impossible to answer, first because no one has compared what they said would happen to what did happen, and second because so many of their predictions are so vaguely worded that they can easily claim they meant something else and wiggle off the hook.

Philip Tetlock is trying to change that. Since the 1980s, he has been studying how good experts are at prediction (answer: just slightly better than a drunk monkey throwing darts). One of his findings was that the most confident pundits tended to be wrong more often, but they also got on TV more often: they are hired more for their ability to tell a compelling story with confidence than for their track record in forecasting.

This latest book details his findings from a four-year project funded by IARPA, the Intelligence Advanced Research Projects Activity, to test the forecasting performance of several different teams of experts. It was a large test, posing over 500 questions to more than 20,000 participants between 2011 and 2015. It was also rigorous, with questions designed to eliminate the wiggle-room problem, such as “Will any country withdraw from the Eurozone in the next three months?” and “How many additional countries will report cases of the Ebola virus in the next eight months?”

The study found that about 2% of participants, whom he calls superforecasters, are consistently more accurate in their predictions. By identifying the superforecasters, and then testing different combinations and variables, he was able to tease out what makes them successful, and the bulk of the book explains the traits, techniques, and habits of thought that make for superior judgment.[1]

The basic theme is that it’s not superior intellect that distinguishes the superforecasters, but how they think. Here are just a few of his recommendations:

  • Break down tough problems into their components, and make estimates or judgments about those.
  • Pay attention to base rates first, and then adjust. For example, I may think my friend is very likely to strike it rich in a very difficult venture, because I start from knowing how smart he is. But if I begin by considering that the odds are 50 to 1 against success, I could double his chances and still think it very unlikely (the arithmetic is worked out just after this list).
  • Be actively open-minded, not only being open to new information but looking for it. Once you have formed a judgment, pay attention to new information, especially anything that would call your initial judgment into question.
  • Write down your forecasts and your reasoning, because the mere fact of writing it will help distance you emotionally from your first prediction. If it’s important that you get it right, take the further step of writing down all the reasons you might be wrong, and then synthesize the two.
  • Expand your range of alternatives. Most people have a three-position dial for predictions: yes, no, and even odds. You can force yourself to become more systematic about your own thinking by adopting the 7-point scale recommended by the National Intelligence Council, shown here:

Remote | Very unlikely | Unlikely | Even | Probably/Likely | Very likely | Almost certain

Even better, use percentages. It won’t guarantee[2] you’re right, but it will force you to examine your own thinking and help you adopt a more nuanced viewpoint.
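
To make the base-rate arithmetic from the list above concrete (the 50-to-1 odds are the example’s own assumption, not a real figure), here is a minimal worked version:

  odds against success:      50 to 1
  implied probability:       1 / (50 + 1) ≈ 0.02, or about 2%
  doubled for his talent:    2 × 0.02 ≈ 0.04, or about 4%

Even after doubling, the chance of success is only around 4%, still firmly in “very unlikely” territory; the base rate swamps my impression of how smart he is.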

There’s far more good advice than I can summarize, but frankly I’m struggling a little in deciding whether to recommend that you read Superforecasting. On the plus side, I predict that it is very likely that if you read and apply its lessons, you will become a better thinker. On the other hand, it’s an even chance that you will become a worse persuasive communicator. That’s because an effective salesperson radiates confidence about the future they recommend, while effective forecasters are far more cautious and humble about their predictions.

My personal choice would be to begin with better thinking. First, for the obvious reason that you owe it to yourself and to others to give them your best thinking. Second, sustainable influence depends on credibility, which in the long run will correlate strongly with the accuracy of your predictions. It’s true that the TV pundits who are most confident in their predictions tend to be wrong more often, and they don’t suffer for it. But when people are putting their own reputations or money at stake based on what you predict, they tend to have longer memories.

[1] I use judgment in the sense that they are better predictors, not that they necessarily make better decisions.

[2] In fact, you may have noticed that the seven-point scale does not include certainty on either side.
