Having already read and thoroughly enjoyed Maria Konnikova’s two previous books,[1] I pre-ordered her newest book as soon as I learned about it. The Biggest Bluff did not disappoint. I believe it’s her best book yet, and I thoroughly recommend it.
First, let me summarize what it’s about. Konnikova is a psychology PhD who studies human cognition and decision making. In 2016, for various professional and personal reasons, she set herself the challenge of learning how to play poker from scratch, with the hugely ambitious goal of finishing in the money at a World Series of Poker event—within one year.
What strengths did she bring to the table? A strong academic grasp of the science of decision-making; a deep capacity to learn; and, most importantly, a humble recognition of her own ignorance. Not least, she chose her teacher wisely: Erik Seidel, a legendary poker champion who also has an incredible gift for teaching.
Reading The Biggest Bluff is like peeling an onion, because there are at least four layers to this deep book.
On the surface, The Biggest Bluff is about poker. I don’t play poker, so I can’t comment on how well it does its job. I did have trouble following some of the descriptions of certain hands she played, but those few instances didn’t keep me from following the often suspenseful action or grasping the meaning of what’s going on.
Second, it’s a book about psychology, which is the primary reason I bought it. It’s one thing to study decision-making in other subjects in a laboratory setting; it’s another challenge entirely to study it in yourself, under unrelenting time pressure and intimidation, with real money at stake. Konnikova learns and teaches useful insights about attention, emotional control, working with your own cognitive biases, reading people, and acquiring expertise. If the book had stopped here, I would still have considered it well worth reading. The next two layers are an unexpected bonus.
Third, The Biggest Bluff is a memoir that reads like a novel, a hero’s journey of self-discovery and personal transformation. As a neophyte, she had to adapt and learn quickly in order to survive her quest. As a woman in the heavily male-dominated world of poker, she had to learn to endure appalling insults and attempts at intimidation. Although it took longer than her initial one-year target, she ultimately fulfills her quest, gaining far more than she intended when she began. And something tells me that Konnikova’s journey is not over yet.
The fourth layer, and the one that resonated most strongly with me personally, is as a philosophy book, particularly Stoic philosophy. I’m not sure if she meant it that way; she mentions the Stoics only once, near the end of the book, but their philosophy weaves throughout it. Three key themes of Stoicism are also key themes in this book: knowing what you can and can’t control; managing your perception and interpretation of reality; and staying calm and rational whether you’re winning or losing, especially the latter.
I strongly recommend you read this book if you’re interested in any one of the layers. You may even find other layers I’ve missed.
[1] Mastermind: How to Think Like Sherlock Holmes and The Confidence Game: Why We Fall for It…Every Time.
On a scale of 1-10, how good are you at spotting when others are trying to scam you?
If you rated yourself higher than a five, you’d better stay with me for this entire post. It’s a story of some very smart people—people who should have known better—who were fooled for a very long time and lost millions of dollars in the process.
I’m switching sides for at least one episode because I’ve recently become fascinated by how con artists work. I first became interested when I was preparing for my podcast on instant trust, and I read a book called The Confidence Game, by Maria Konnikova. At about the same time, my son recommended that I read Bad Blood: Secrets and Lies in a Silicon Valley Startup, by John Carreyrou. Both books are stark reminders that persuasive communication can be used for evil as well as for good, and it’s helpful to know how people pull off cons that seem unbelievable in retrospect.
Let’s start with Bad Blood. In a nutshell, the story is this: Theranos, a company started by an attractive and charismatic 19-year-old Stanford dropout, sets out in 2003 to make a huge dent in the universe of healthcare by developing a revolutionary technology that makes it possible to perform hundreds of blood tests using a single drop of blood. It’s a powerful promise, and it attracts investors ranging from professors and seasoned tech entrepreneurs to the likes of former Secretary of State George Shultz, retired General James Mattis, Henry Kissinger, and Rupert Murdoch; the company also signs contracts worth hundreds of millions of dollars with Walgreens and Safeway. Theranos raked in over $700 million in capital and was at one time valued at $9 billion, making its founder, Elizabeth Holmes, the youngest self-made billionaire in history.
Holmes may have initially had sincere aspirations to deliver on her dream, but somewhere along the way it turned into a big, bad, elaborate deception. It finally got exposed and began crashing down in 2015, when Carreyrou wrote about it in the Wall Street Journal.
How did a company fool so many sophisticated people for so long? If people with such smarts and experience can be so easily fooled for so long, what hope is there for us ordinary mortals? Actually, as I will talk about, being smart is not necessarily a defense. In fact, being of above average intelligence may actually be a liability.
I believe any elaborate deception requires active participation by both sides in the transaction. This in no way implies that there is anywhere near moral equivalence between someone who deliberately sets out to deceive and their victims, but the thoughts and behaviors of the victims are certainly contributing factors. Let’s look at both sides and see how Elizabeth Holmes was able to pull it off for so long:
- She was extremely charismatic. She had a very intense way of looking at someone, and she spoke with great sincerity and conviction.
- She looked the part. She fit the story she was telling, and people had heard the story before: the gifted, passionate dropout who transformed an entire industry, a la Bill Gates and especially Steve Jobs. In fact, she encouraged the comparison by dressing only in black turtlenecks.
- She was totally ruthless with the truth. She could easily look someone in the eye and tell the most outlandish lies.
- She showed no empathy or conscience. She was willing to do anything to protect her version of the story, from hiring lawyers to intimidate and harass those who expressed doubts to even putting patients at risk.
Even smart people fall into common mental traps
Even the most analytical and careful thinkers take shortcuts or bend to certain biases, and here are just a few of the factors that Holmes exploited.
Social proof. Even smart people don’t have time to research the biochemistry of blood analysis, so they take a shortcut by relying on the words and actions of people they trust.
Halo effect. When someone exhibits positive outward qualities, such as looking and sounding professional and competent, it’s much easier to think they’re good at other things as well, such as being a good manager or scientist.
Confirmation bias. Once you build an attractive story in your mind, it’s almost guaranteed that you will ignore evidence that does not fit that narrative, or you will find convenient explanations.
Fear of Missing Out. If you’re Walgreens and you don’t take the plunge, what happens if CVS does and makes millions?
Highly intelligent people may be more vulnerable
Anyone can fall into the mental traps listed above, but highly intelligent people also have two additional handicaps, which may make them even more vulnerable.
Ricky Jay, a professional magician, says, “For me, the ideal audience would be Nobel Prize winners…their egos tell them they can’t be fooled.”
But one Nobel Prize winner, Richard Feynman, said, “The first principle is not to fool yourself. And you’re the easiest person to fool.” And keep in mind that he was speaking to the 1974 graduating class of Caltech when he said that.
What makes smart people so easy to fool? First, it goes to what Ricky Jay said. They know they’re smart, so they think they can’t be fooled. They don’t actually imagine that the person sitting across from them is smarter than they are (at least in this particular situation). That means they won’t even listen when someone tells them they’re wrong. George Shultz’s own grandson was one of the first to blow the whistle on what Theranos was doing, yet Shultz sided with Holmes. He actually told his own grandson, “I don’t think you’re dumb, but I do think you’re wrong.”
Second, smart people are very clever at coming up with rational explanations for things that don’t look right. No peer-reviewed publications? That’s to prevent others from stealing their advanced ideas. Negative press? That’s caused by competitors trying to stop them. Missed deliveries? That’s because of the earthquake in Japan. Those types of explanations are easier to think of than the simple fact that they may just be wrong.
So, what can you do about it?
- Konnikova says the key to resisting persuasion is to have “a strong, unshakeable, even, sense of self. Know who you are no matter what, and hold on to that no matter what.”
- Be objective. There’s a simple hack to help you distance yourself emotionally from the decision. Pretend that someone you know came up to you and asked your advice on whether to invest or not.
- Have an exit script. If you start losing money, it can be tempting to throw more in to salvage it. Know what your limits are before you enter into the transaction, and stick to them.
- Be very suspicious of secrecy and time pressure.
- Search for disconfirming information; actively search for evidence that you may be wrong.
OK, now that I’ve armed you with the tools, go ahead and listen to the rest of the podcast and see if you pass the test at the end!
Here’s a test: what single catastrophe killed more people worldwide than any other? It was the Spanish flu epidemic of 1918, which killed 50-100 million people, probably more people than the two world wars combined.[1] One contributor to the death toll was the fact that countries then at war suppressed information about it, which made it less likely to be contained. And one reason that we haven’t had a recurrence within several orders of magnitude of that is that governments around the world, especially the US government, gather and freely share huge amounts of data on diseases.
The average citizen is well aware of the dangers of nuclear war, terrorism and crime, so we accept and even embrace the institutions and people who protect us from them, such as the military, Homeland Security and first responders. When risks are vivid and potentially catastrophic, we don’t mind throwing vast sums at them, because we “get it”.
But it’s the less obvious risks that may threaten us the most in the long run, precisely because we don’t think enough about them and we begrudge every penny spent on preventing them. We don’t pay that much attention to weather, contaminated food, or insufficient health care and nutrition, but they have killed far more Americans than the more easily imagined risks.
It takes a gifted writer such as Lewis (Liar’s Poker, Moneyball, The Big Short) to make us understand and care about the “fifth risk”: project management. This refers to the enormous array of projects that the Federal government runs to address these hidden systemic risks, and the vast amount of data that makes that possible. So many of those projects and data are under threat today.
The average citizen does not understand data in general and does not care about it, unless maybe it’s about their fantasy football team. More specifically, the average citizen does not know:
- How much their safety and prosperity depend on national data—its collection, storage, analysis and application
- Who does most of the collection, storage, analysis and application
- Why that system is under threat today
The book is fascinating because Lewis is a master at telling the stories of interesting, dedicated individuals who work at all levels within the Federal government. Most of us think of them as grey, faceless bureaucrats, even “lazy or stupid” (I have to admit I’ve been guilty of that myself) but many of them are doing fascinating and even thrilling work. Lewis introduces us to unsung heroes who are smart enough to make far more money in the private sector but do what they do for reasons other than money: the mission, wanting to make a difference in people’s lives, a sense of being called to serve.
It also introduces us to the work that Federal agencies such as Commerce, Energy, and Agriculture do to save us from risks potentially as deadly as foreign enemies, or even deadlier. Unfortunately, they’re like the offensive linemen of the Government: they usually only get noticed when they fail. How many people are alive today because hurricane forecasts have become so much better? How many are alive because they did not die from the flu, or because the electricity grid has not succumbed to the half million cyber intrusion attempts it suffers per year? How many kids avoid malnutrition because of government programs, and how many more people avoid becoming victims of violent crime? You can thank government for that, because it is the only institution with the resources to collect the data to understand these problems and to design and implement projects large enough to solve them.
The book is disturbing because many of those projects are at risk. They’ve always been at risk when politicians strive to cut budgets (and there’s no doubt a lot of fat and waste that needs cutting), but the present administration takes it to a whole different level. It’s a level that is not just oblivious to data, but openly hostile to it—willful ignorance, if you will. It’s the only administration that did not send large transition teams to learn all about the agencies they were about to take over, to ensure a smooth handoff.
And as this administration took over, DJ Patil, the government’s Chief Data Scientist, “watched with wonder as the data disappeared across the Federal government”: inspection reports of businesses accused of animal abuse, records of consumer complaints, even detailed crime data, and of course anything having to do with climate change. It doesn’t matter whether you’re Democrat or Republican, that’s data that you and I paid for, and they’re taking it away from us.
Why does this matter? The first answer is that the less data you have, the greater the chance that undetected risks will come to pass. The second answer is more fundamental: when government depends on the consent of the governed, how can making the governed less informed be a good thing?
This post is a little outside my usual persuasive communication content, but as a reader of this blog you probably care about clear thinking, supported by hard facts, so I strongly recommend you read The Fifth Risk.
[1] “A Deadly Touch of Flu”, The Economist, September 29, 2018, p.75.
I recently wrote a recommendation that you read Enlightenment Now, by Steven Pinker, calling it one of the best and most important books I’ve read in a long time, and I want to say the same thing about Factfulness, by Hans Rosling, Ola Rosling, and Anna Rosling Rönnlund.
Just like Pinker, Rosling[1] contends that in almost every important measure, the world is better off than it ever has been in history and continues to improve. But his book is different in several ways, which is why I view it not as a substitute for EN, but as a complement to it, and a highly readable and fascinating one, at that.
Like Pinker, the Roslings provide a lot of surprising material, but they go one better by letting you test yourself, and then compare your performance to thousands of others, and to the documented truth. Chances are, you will be surprised. Here are a couple of examples:[2]
In all low-income countries across the world today, how many girls finish primary school?
A: 20 percent
B: 40 percent
C: 60 percent
There are 2 billion children in the world today, aged 0 to 15 years old. How many children will there be in the year 2100, according to the United Nations?
A: 4 billion
B: 3 billion
C: 2 billion
How did you do? If you gave chimps in the zoo the same tests, they would average 33% correct. But human audiences tend to do much worse, and the errors are invariably skewed toward the pessimistic side.
These tests are important because they clearly illustrate how many misconceptions the general public in the developed world carries about the state of the world. Many of us have a much darker view of what’s happening than is actually the case. Far from going to hell in a handbasket, the world is getting better by most important measures, including health, wealth, education and violence. But most of us get the facts wrong, and often it’s the more highly educated who are the worst.
So why is that a problem? Isn’t it safer and more prudent to be more worried, rather than less? The problem is that being a hypochondriac at the global level carries some of the same risks as at the personal level. First, we spend too much on unnecessary remedies, some of which have unintended consequences, and not enough on bigger problems. Second, when we really do find something wrong, others may not listen. Third, it can feed an “us vs. them” mentality which demagogues are quick to hijack for their own purposes.
Factfulness differs from EN in another important way. Whereas Pinker blames our wrong thinking on those who attack the fundamental ideas of reason, science, humanism and progress, Rosling finds the fault in ourselves—more specifically, how our brains work, explaining ten instincts we all have and how we can work around them. These include:
- The gap instinct: we see the world in binary terms, with rich countries and poor, and a large gap between them. In reality, the vast majority of people live in middle income countries.
- The negativity instinct: we notice the bad more than the good, which is a theme I’ve written about before. And because bad events are most likely to make the news, the well-read may be the most wrongly informed.
- The destiny instinct: the idea that our innate characteristics determine our destinies. “They” have always been this way and will never be able to change.
Each instinct is covered in a separate chapter which also contains useful antidotes and work-arounds. For example, to counter the destiny instinct, Rosling suggests the following:
- Keep track of gradual improvements
- Update your knowledge
- Talk to Grandpa
- Collect examples of cultural change
These and all the other suggestions in the book comprise the tools of factfulness, which I inferred from my reading to be the application of critical thinking to the latest reliable data in order to more clearly understand the true state of things. Rosling, of course, has an even better definition: factfulness is “the stress-reducing habit of carrying only the opinions for which you have strong supporting facts”. If you develop the habit, how can you fail to be a persuasive communicator?
Factfulness is not a function of education, as we’ve seen; anyone can develop it. In fact, Rosling tells an amazing and moving story about how an uneducated African woman saved his life by using aspects of factfulness to deliver a speech that dispersed an angry crowd.
Besides being highly informative, Factfulness is very readable, which should come as no surprise to anyone who has seen any of his TED talks or other videos on YouTube. Rosling’s flair for simplifying and illustrating complex topics shows through on every page. If you want to understand the interplay between international finance and preventing malaria, for example, the story about the plot to punch the pharma CEO in the face is worth the price of the entire book.
My tagline is “clear thinking, persuasively communicated.” I can’t think of a better representative of that thought than this book.
[1] Although there are three authors, Rosling writes in the first person and is the principal author.
[2] The answer to both is C. Throughout the book, your safest bet is to pick the most favorable answer.