Book Recommendation: Thinking, Fast and Slow

Read it fast, and then again slowly

One of the highlights of my intellectual life was discovering the work of Daniel Kahneman and Amos Tversky in Peter Bernstein’s Against the Gods. Since then, I’ve read some of Kahneman’s more scholarly work and have come across his name in dozens of books about thinking and decision-making. So, when I learned that he had written Thinking, Fast and Slow, I was eager to get it and am happy to be one of the first to recommend it.

As a fan of his work, I am surely biased, but as a student of his work I have taken special pains in this review not to let that bias affect my thinking. Fortunately, this time both my intuition and my reason agree that it is an important book for anyone who wants to learn more about the complexities and oddities that characterize our thinking, perceptions, and decision-making.

The key theme of the first section of the book is that we all have two currents of thought running simultaneously in our heads. Think of a hybrid car, which runs quietly on electric power during leisurely driving but calls on the gasoline engine for surges of performance. System 1 thinking is the electric power: fast, effortless, and running mainly below the level of our consciousness. System 2 is slower, more logical, and often difficult to engage.

Clear thinking is hard to do and rarer than we think. Our brains consume a disproportionate share of our energy, so they have evolved to conserve mental effort wherever possible. One of the ways we do this is by taking mental shortcuts (heuristics) to arrive quickly at “good enough” answers for most of life’s questions and challenges. System 1 generally serves us well.

Except when it doesn’t. In complex situations those shortcuts can take us in the wrong direction. When we face a difficult question, one of our most common shortcuts is to substitute an easier question. For example, the question “Will this multimillion-dollar investment deliver an ROI that exceeds our hurdle rate?” often becomes “Do I trust the person who says it will?”

You can see the obvious implications for persuasion, regardless of which side of the persuasion attempt you’re on.

Another important idea is What You See Is All There Is (WYSIATI). Even when we don’t have complete information to answer a difficult question, we often treat the information we do have as all we need. One consequence of this is the halo effect, in which a salient judgment about a person carries over into other, unrelated judgments about them. For example, people who are perceived as good-looking also tend to be seen as more intelligent, more capable, and so on.

We are remarkably good at some types of judgments, such as inferring the intentions of another person from a momentary glance. But we are also bad at other types of judgments, such as statistical thinking and some economic choices. In the second main section of the book, Kahneman shows us how our judgments deviate from the utility-maximizing “best choices” that economists tell us we should make.

I’ve written before about loss aversion and framing effects because of their close connection with persuasion. So often, it’s not the choice that makes the difference, but how the choice is described. For example, consider the following scenario.

A person with lung cancer can choose between radiation and surgery. Surgery has a better record for long-term survival, but it is riskier in the short term. In studies, participants were given one of the following two descriptions:

  • The one-month survival rate is 90%.
  • There is 10% mortality in the first month.

Which would you choose?

Participants in the studies chose surgery 84% of the time when given the first description, but only 50% of the time when given the second. The choices are exactly the same, yet the description makes a big difference. Disturbing, but maybe not surprising.

What was most surprising about the studies is that the framing effect applied to physicians just as strongly as to the general population.

What makes so many of these errors especially sinister is that we are overconfident in our own certainties and abilities. In fact, often the people who are most wrong are in the worst position to know it.

As you can see, education alone is not enough of a guard against irrationality. It would be nice if Kahneman had given us some practical advice on how to improve our judgments, but he tells us that “little can be achieved without a considerable investment of effort.” If you read this book, you will at least be better able to recognize the situations in which you should be on your guard and make the extra mental effort to think the choice through.

Many excellent books have come out in recent years, but most of them are built on the original thinking and research of Daniel Kahneman, so why not go directly to the source? Most people I talk to have never heard of Kahneman, and I didn’t want to tell you more about him until you had a chance to react to some of his ideas first. I guess I should mention that he is the only psychologist ever to have won the Nobel Prize in Economics.

In this brief review, I’ve only scratched the surface of the dozens of examples of the ways our thinking can go astray. Although most of the judgments and choices we make turn out all right, sometimes we need the extra horsepower that System 2 can provide. Any serious student or practitioner of persuasion—or of thinking clearly and resisting persuasion—should read and re-read this book.
