I recently wrote a recommendation that you read Enlightenment Now, by Steven Pinker. I believe it is one of the best and most important books I’ve read in a long time, and I want to say the same thing about Factfulness, by Hans Rosling, Ola Rosling, and Anna Rosling Rönnlund.
Just like Pinker, Rosling[1] contends that in almost every important measure, the world is better off than it ever has been in history and continues to improve. But his book is different in several ways, which is why I view it not as a substitute for EN, but as a complement to it, and a highly readable and fascinating one, at that.
Like Pinker, the Roslings provide a lot of surprising material, but they go one better by letting you test yourself, and then compare your performance to thousands of others, and to the documented truth. Chances are, you will be surprised. Here are a couple of examples:[2]
In all low-income countries across the world today, how many girls finish primary school?
A: 20 percent
B: 40 percent
C: 60 percent
There are 2 billion children in the world today, aged 0 to 15. How many children will there be in the year 2100, according to the United Nations?
A: 4 billion
B: 3 billion
C: 2 billion
How did you do? If you gave chimps in the zoo the same tests, they would average 33% correct. But human audiences tend to do much worse, and the errors are invariably skewed toward the pessimistic side.
These tests are important because they clearly illustrate how many misconceptions the general public in the developed world carries about the state of the world. Many of us have a much darker view of the world than the facts warrant. Far from going to hell in a handbasket, the world is getting better by most important measures, including health, wealth, education and violence. But most of us get the facts wrong, and often it’s the most highly educated who do worst.
So why is that a problem? Isn’t it safer and more prudent to be more worried, rather than less? The problem is that being a hypochondriac at the global level carries some of the same risks as at the personal level. First, we spend too much on unnecessary remedies, some of which have unintended consequences, and not enough on bigger problems. Second, when we really do find something wrong, others may not listen. Third, it can feed an “us vs. them” mentality which demagogues are quick to hijack for their own purposes.
Factfulness differs from EN in another important way. Whereas Pinker blames our wrong thinking on those who attack the fundamental ideas of reason, science, humanism and progress, Rosling finds the fault in ourselves—more specifically, how our brains work, explaining ten instincts we all have and how we can work around them. These include:
- The gap instinct: we see the world in binary terms, with rich countries and poor, and a large gap between them. In reality, the vast majority of people live in middle income countries.
- The negativity instinct: we notice the bad more than the good, which is a theme I’ve written about before. And because bad events are most likely to make the news, the well-read may be the most wrongly informed.
- The destiny instinct: the idea that our innate characteristics determine our destinies. “They” have always been this way and will never be able to change.
Each instinct is covered in a separate chapter which also contains useful antidotes and work-arounds. For example, to counter the destiny instinct, Rosling suggests the following:
- Keep track of gradual improvements
- Update your knowledge
- Talk to Grandpa
- Collect examples of cultural change
These and all the other suggestions in the book comprise the tools of factfulness, which I inferred from my reading to be the application of critical thinking to the latest reliable data in order to more clearly understand the true state of things. Rosling, of course, has an even better definition: factfulness is “the stress-reducing habit of carrying only the opinions for which you have strong supporting facts.” If you develop the habit, how can you fail to be a persuasive communicator?
Factfulness is not a function of education, as we’ve seen; anyone can develop it. In fact, Rosling tells an amazing and moving story about how an uneducated African woman saved his life by using aspects of factfulness to deliver a speech that dispersed an angry crowd.
Besides being highly informative, Factfulness is very readable, which should come as no surprise to anyone who has seen any of Rosling’s TED talks or other videos on YouTube. His flair for simplifying and illustrating complex topics shows through on every page. If you want to understand the interplay between international finance and preventing malaria, for example, the story about the plot to punch the pharma CEO in the face is worth the price of the entire book.
My tagline is “clear thinking, persuasively communicated.” I can’t think of a better representative of that thought than this book.
[1] Although there are three authors, Rosling writes in the first person and is the principal author.
[2] The answer to both is C. Throughout the book, your safest bet is to pick the most favorable answer.
I’ve just finished reading The Death of Expertise, by Tom Nichols.
As a nation of rugged individualists who believe we’re all created equal, we Americans have always had a healthy skepticism about experts, a skepticism Alexis de Tocqueville noted as early as 1835. I remember one of my high school teachers defining an expert as “someone who learns more and more about less and less, until finally he knows everything about nothing.”
And there have been good reasons for that skepticism. First, expert mistakes have certainly cost us, with Exhibit 1 being the foreign policy elites who have gotten us into trouble from Vietnam to Iraq and many places in between. It’s also hard to trust experts when finding experts who contradict each other is as easy as switching channels, and experts who sell their opinion to the highest bidder or overstep their knowledge to gain attention unfortunately get more attention than those who are more cautious.
But focusing on the mistakes (or other shortcomings) of experts ignores their far more important contributions to our lives. The experts who got it wrong with the Challenger also got us to the moon; the chemists who gave us thalidomide also have saved or improved millions of lives with other drugs; and to give the foreign policy establishment their due, they also helped build the postwar world order that has prevented a war between major powers for over 70 years and has contributed to an unprecedented expansion of prosperity.
When you ignore the contributions of experts and focus only on their failings, you stand to lose far more than you gain, like burning down your house to kill the mouse you saw in your kitchen. So it’s smart to take a careful and informed approach to assessing expert advice. As the saying goes, “if you think an expert is expensive, try hiring an amateur.”
But that’s exactly the problem we’re running into today: we’re paying far more attention to the loud and simplistic amateurs than we should. The backlash against established expertise is turning (or, more likely, has already turned) healthy skepticism not just into unhealthy skepticism and cynicism but into aggressive and willful ignorance. We value confidence far more than credentials, which is why we elected a man who says he is the only one who can fix things.
The Internet was supposed to lift us all up, by putting the accumulated knowledge of the world at our fingertips. Instead, according to Nichols it has made us dumber, and I agree with him. Because anyone with a connection can create a slick website and reach the whole world with their opinions, the overwhelming quantity of crap tends to bury the quality. Sturgeon’s Law, which says 90% of everything is crap, is woefully deficient in describing the internet. For most people using it to do “research”, the internet is simply a powerful engine for confirmation bias. As if that’s not enough, Nichols also describes the impact of a higher education system that has misguidedly turned students into “customers”, and the proliferation of talk radio and cable TV stations that cater to every conceivable taste and perspective, so that no one ever has to run the risk of running into an uncomfortable fact.
After I put down the book, I jotted down some notes to try to answer the title question of this blog. (Since I’ve covered some aspects of this problem previously in my blog, what follows combines some of the ideas from The Death of Expertise and some of my previous thinking, and it’s impossible to separate the two. As a rule of thumb, if it sounds smart, credit Nichols.)
What do the experts owe us?
- Don’t overstate your case. Nichols is slightly guilty of this, starting with his title. Death is a pretty strong word, and the word campaign carries a slight whiff of conspiracy theory to it. It’s definitely a trend that many have exploited, but no one is guiding it.
- Stick to what you know. Linus Pauling deservedly won two Nobel prizes, but tarnished his reputation when he touted Vitamin C as a panacea (not to mention dabbling in eugenics).
- Be a foxy hedgehog. From a strong base of expert knowledge, become curious about the rest of the world and get comfortable with uncertainty and disagreement.
- Separate facts from opinions. Be clear in your own mind first, and then explicit about the difference in your public statements.
- Separate analysis from predictions. As Philip Tetlock has shown us, the average expert is just slightly more accurate than a drunk monkey throwing darts when it comes to making predictions.
- Be professional. Professionalism includes the above admonitions plus an obligation to the greater good—of your clients and even sometimes the general public.
What do we owe the experts?
- Look for signs that the expert you’re reading is following the rules above.
- Recognize that when it comes to expertise we are not all created equal. Don’t think that a half hour spent perusing Google returns gives you the right to argue with someone who has devoted their professional life to the topic.
- If you still feel the need to argue with experts (for example, I take issue with some of the ideas that our City’s traffic experts are trying to sell to the public), at least make a serious effort to learn the fundamentals of the topic first.
- Be careful what you put in your mind. If it’s true that you are what you eat, it’s even more true that you are what you read.
- Become a more critical thinker and learn how to identify quality. Here are a few recommendations for further reading that will better equip you for the task:
  - When Can You Trust the Experts? by Daniel Willingham
  - Superforecasting or Expert Political Judgment, by Philip Tetlock
  - For detecting business and management BS: The Halo Effect…and Eight Other Business Delusions that Deceive Managers, by Phil Rosenzweig, and Leadership BS, by Jeffrey Pfeffer
  - Thinking, Fast and Slow, by Daniel Kahneman
  - Curious, by Ian Leslie
I heartily recommend this book, but the irony of a book about the death of expertise is that those who most need to read it are the least likely to.
Two of those defenders—whose opinions and approach I greatly respect—are Josh Bernoff, author of Writing Without Bullshit, and Peter Wehner, a columnist for the New York Times. Since they can say it much better than I can, I simply give you these two links, and strongly recommend you read them:
Bernoff: The Truth Foundation
Wehner: Moderate Is Not a Dirty Word (This is the title of the print edition of the opinion piece in the NYT; for some reason it differs from the online edition’s.)
The best way to add value and reduce waste in communication is to provide your audience with “simplicity on the other side of complexity”.
I first came across this phrase in a fascinating blog post titled: Simplicity
The phrase resonated with me because it explains so clearly an idea that I am constantly trying to impart to participants in my lean communication and presentations classes. One of the most common reasons that companies bring me in to work with their teams is that their executives complain that they have to sit through unnecessarily complex and excessively detailed presentations, where masses of data substitute for clear understanding. The common complaint is “I ask what time it is, and they tell me how to build a watch.”
When you have to make a presentation to busy high-level decision makers—whether internally or to a potential customer—it’s because they need information to grasp the key issues about a complex decision, so that they can take effective action to improve outcomes. The simplicity with which you express information clearly adds value to them, because it reduces uncertainty and saves time and effort. But only if it’s good simplicity, which resides on the other side of complexity.
The gist of Kolko’s article is that the learning curve for a complex issue looks like a bell curve (I’ve added the labels to his simple drawing, and the rest of this article adds my own thoughts to his):
At the far left, when people first approach an issue, they know very little about it, but of course that doesn’t stop most of them from jumping to conclusions or forming opinions. People who are intellectually lazy or who use motivated reasoning don’t proceed past this stage and enjoy the confidence born of ignorance. They forget what H.L. Mencken said: “Explanations exist; they have existed for all time; there is always a well-known solution to every human problem — neat, plausible, and wrong.” For examples of this type of presentation, just think back to the past two weeks of US Presidential conventions. This kind of simple is almost pure waste, because it’s either wrong, or it’s trivial.
But you’re a business professional, so you can’t afford to be intellectually lazy; you know that if you present a simplistic and shallow case you will be eviscerated in the boardroom. So you take your time, gather evidence and analyze it, and see greater depth and nuance in the issues. If you stick to it, you eventually become an expert; you “get it,” so that’s the next natural stopping point. Complexity is the point at which the average expert feels ready to present their findings. The biggest mistake they make, however, is including far more detail than the listener needs, whether out of defensiveness or an inability to connect to what the listener cares about. As one CEO told me, “I get a lot of detail but very little meaning.” They may have added value, but there is still a significant amount of waste, in the form of time, effort, and confusion.
Outstanding expert presenters know that you never truly know how well you know something until you try to explain it to others, so they take the next logical step. They add value to their listeners by distilling all their hard-won complexity into just the meaning the listener needs for their purposes. They know exactly why the listener needs the information, and give them just what they need to know to make the best possible decision, so that there is zero waste. Most times, they go beyond merely providing information and advocate for a specific decision (which is a given if it’s a sales presentation)—but it’s based on highly informed judgment.
The tools for achieving this type of good simplicity are the tools of lean communication: Outside-in thinking to know what the listener needs, Bottom Line Up Front to provide meaning right away, SO WHAT filter to root out waste, and pull to make adjustments as necessary.
Before you decide to strive for good simplicity, I would be remiss in not pointing out one personal risk you might run: if your goal is to call attention to how smart you are, it may not be the best way. As Kolko says, “The place you land on the right—the simplicity on the other side of complexity—is often super obvious in retrospect. That’s sort of the point: it’s made obvious to others because you did the heavy lifting of getting through the mess.”
But if your goal is to get the right things done, simplicity on the other side of complexity is the only way.