Clear thinking

Won’t Get Fooled Again

In my books and blogs, I try hard to ensure that what I write is backed up by evidence, and so I rely heavily on scientific papers, particularly those reporting the results of psychological studies. But what if much of what I read is wrong?

That may be the case, according to Stuart Ritchie, in his new book, Science Fictions: How Fraud, Bias, Negligence and Hype Undermine the Search for Truth.

But Ritchie is not anti-science. Far from it. Think of his book as an intervention by a concerned family member, who only wants the best for science.

Much of what you read in the popular press about scientific “advances” is incorrect. That’s partly the fault of the media, but a large part of the blame rests on scientists themselves, and the way they react to a system that incentivizes poor scientific behavior and practice. It reminds me of what W. Edwards Deming said: “A bad system will beat a good person every time.”

“Every time” is surely an exaggeration when it comes to science, but the situation is bad enough. Researchers have failed to replicate about half of all studies in psychology. It makes for depressing reading to learn that papers I have cited in my books and blog posts are among them, including those on power posing, the fixed vs. growth mindset, and the marshmallow test. Other fields, including economics, neuroscience, and biology, show similar patterns, even if they are not as dire.

How does it happen? Ritchie cites several reasons, but the most important is one that most of my readers will easily recognize. Scientists, like salespeople, are generally subject to a quota, except for them it’s about publishing papers rather than revenue. The number of papers published, the frequency with which they are cited in other papers, and the prestige of the journals they appear in are the measures used in tenure and grant decisions.

But as Goodhart’s Law states, “when a measure becomes a target, it ceases to be a good measure.” Just as salespeople quickly figure out ways to game their system, it should not be a surprise that scientists do so as well. A few resort to outright fraud; some inadvertently or purposely let their bias affect the results they find and/or report; some make mistakes in their haste to publish; and the journals themselves, and especially the popular press, hype even modest findings beyond what the results show.

What to do about it? Ritchie lays out a lot of ideas to reform the system. They seem sound, albeit perhaps a bit idealistic. I’m not a scientist myself, so I’m not sure I can credibly comment on his ideas. But as a consumer of science, I can focus on what I can do about it—and what you can do about it. It behooves me to be more critical and careful. For me, the most important section is the appendix, which offers ten suggestions for how to read a scientific journal article. Much of this I already knew, but having a clearer picture of the scope of the problem will make me much more careful and skeptical in the future. I will take more care in reading individual studies and look to see whether the results have been replicated elsewhere.

I can’t guarantee that I won’t get fooled again, but the authors are going to have to work a lot harder for that to happen!

Because what choice is there? In the end, even though it can be discouraging to read about the shortcomings of science, it’s worth paraphrasing what Churchill said about democracy: “It’s the worst form of pursuing knowledge…except for all the others that have been tried.”


When Can-Do Conflicts with Candor

There will be plenty of post-mortem analysis of the US intelligence community’s failure to anticipate the precipitous collapse of the Afghan military over the past few weeks. It will take a lot of explaining (or rationalization) to show why, after the US spent more than $80 billion over 20 years training the Afghan government forces, such a massive effort yielded such pitiful results.

Whatever transpires from the analysis, it will almost certainly contain multiple factors and reasons, and it would be irresponsible to speculate this early on all of these. When disasters happen, the dots line up and connect perfectly in hindsight, but of course the view is never that clear before it happens.

But one of the factors is worth discussing here, because it applies just as well to the business world. An article in today’s New York Times states: “Part of the problem, according to former officials, is that the can-do attitude of the military frequently got in the way of candid, accurate assessments of how the Afghan security forces were doing.”

It sounds obvious, but it’s hard to prevent. Imagine being a junior officer asked to report on how your training efforts are going. What would be the effect on your career prospects if you candidly reported that, despite your best efforts, you were pessimistic about their impact?

Now imagine yourself being an account executive or a product manager in a forecasting meeting. Would you do it any differently?

Probably not. The power of positive thinking has long been enshrined in American culture. And overall, I believe that has been a good thing. It has enabled us to accomplish incredible feats and helped build the most powerful economy in the world.

But it can go too far. Can-do is almost a religion, in which pessimism is a sin and realism is suspect. So, when someone reports or forecasts to their managers, they’ll naturally tend to shade toward the bright side, even when they think they’re being absolutely candid. Those managers will in turn shade the reports slightly when they pass them up to the next level, so you can imagine the potential for overconfidence by the time the “ground truth” reaches the top.

If you’re a leader, here are three things you can do to reconcile can-do and candor:

  • Take this saying out of your vocabulary: “Don’t bring me a problem unless you bring me a solution.” If your subordinate finds out about a problem that they’re unequipped to handle, do you really want them to keep quiet about it?
  • Go to the gemba. That’s lean-speak for going to the scene of the action to see for yourself.
  • Break the path into detail. In his book Perfectly Confident, business professor Don Moore cites the well-known finding that 93% of people rate themselves in the top half of all drivers. But when people are asked to rate themselves on individual skills such as signaling, using mirrors, or backing up, their “overplacement” shrinks substantially. Detail helps the dots line up into a truer picture.

When Smart People Do Dumb Things

On a scale of 1-10, how good are you at spotting when others are trying to scam you?

If you rated yourself higher than a five, you’d better stay with me for this entire post. It’s a story of some very smart people—people who should have known better—who were fooled for a very long time and lost millions of dollars in the process.

I’m switching sides for at least one episode because I’ve recently become fascinated by how con artists work. I first became interested when I was preparing for my podcast on instant trust and read a book called The Confidence Game, by Maria Konnikova. At about the same time, my son recommended that I read Bad Blood: Secrets and Lies in a Silicon Valley Startup, by John Carreyrou. Both books are stark reminders that persuasive communication can be used for evil as well as for good, and it’s helpful to know how people pull off cons that seem unbelievable in retrospect.

Let’s start with Bad Blood. In a nutshell, the story is this: a company named Theranos, started in 2003 by an attractive and charismatic 19-year-old Stanford dropout, sets out to make a huge dent in the universe of healthcare by developing a revolutionary technology that makes it possible to perform hundreds of blood tests using a single drop of blood. It’s a powerful promise, and it attracts investors ranging from professors and seasoned tech entrepreneurs to the likes of former Secretary of State George Shultz, retired General James Mattis, Henry Kissinger, and Rupert Murdoch; the company also signs contracts worth hundreds of millions of dollars with Walgreens and Safeway. It raked in over $700 million in capital and was at one time valued at $9 billion, making its founder, Elizabeth Holmes, the youngest self-made billionaire in history.

Holmes may have initially had sincere aspirations to deliver on her dream, but somewhere along the way it turned into a big, bad, elaborate deception. It finally got exposed and began crashing down in 2015, when Carreyrou wrote about it in the Wall Street Journal.

How did a company fool so many sophisticated people for so long? If people with such smarts and experience can be so easily fooled for so long, what hope is there for us ordinary mortals? Actually, as I will talk about, being smart is not necessarily a defense. In fact, being of above average intelligence may actually be a liability.

I believe any elaborate deception requires active participation by both sides of the transaction. This in no way implies moral equivalence between someone who deliberately sets out to deceive and their victims, but the thoughts and behaviors of the victims are certainly contributing factors. Let’s look at both sides and see how Elizabeth Holmes was able to pull it off for so long:

  • She was extremely charismatic. She had a very intense way of looking at people, and she spoke with great sincerity and conviction.
  • She looked the part. She fit the story she was telling, and people had heard the story before: the gifted, passionate dropout who transforms an entire industry, a la Bill Gates and especially Steve Jobs. In fact, she encouraged the comparison by dressing only in black turtlenecks.
  • She was totally ruthless with the truth. She could easily look someone in the eye and tell the most outlandish lies.
  • She showed no empathy or conscience. She was willing to do anything to protect her version of the story, from hiring lawyers to intimidate and harass those who expressed doubts to even putting patients at risk.

Even smart people fall into common mental traps

Even the most analytical and careful thinkers take shortcuts or bend to certain biases, and here are just a few of the factors that Holmes exploited.

Social proof. Even smart people don’t have time to research the biochemistry of blood analysis, so they take a shortcut by relying on the words and actions of people they trust.

Halo effect. When someone exhibits positive outward qualities, such as looking and sounding professional and competent, it’s much easier to think they’re good at other things as well, such as being a good manager or scientist.

Confirmation bias. Once you build an attractive story in your mind, it’s almost guaranteed that you will ignore evidence that does not fit that narrative, or you will find convenient explanations.

Fear of Missing Out. If you’re Walgreens and you don’t take the plunge, what happens if CVS does and makes millions?

Highly intelligent people may be more vulnerable

Anyone can fall into the mental traps listed above, but highly intelligent people also have two additional handicaps, which may make them even more vulnerable.

Ricky Jay, a professional magician, says, “For me, the ideal audience would be Nobel Prize winners…their egos tell them they can’t be fooled.”

But one Nobel Prize winner, Richard Feynman, said, “The first principle is that you must not fool yourself, and you are the easiest person to fool.” And keep in mind that he was speaking to the 1974 graduating class of Caltech when he said that.

What makes smart people so easy to fool? First, it goes back to what Ricky Jay said. They know they’re smart, so they think they can’t be fooled. They don’t imagine that the person sitting across from them is smarter than they are (at least in this particular situation). That means they won’t even listen when someone tells them they’re wrong. George Shultz’s own grandson was one of the first to blow the whistle on what Theranos was doing, and Shultz sided with Holmes. He told his grandson, “I don’t think you’re dumb, but I do think you’re wrong.”

Second, smart people are very clever at coming up with rational explanations for things that don’t look right. No peer-reviewed journals? That’s to keep others from stealing their advanced ideas. Negative press? That’s caused by competitors trying to stop them. Missed deliveries? That’s because of the earthquake in Japan. Those kinds of explanations are easier to accept than the simple possibility that they may just be wrong.

So, what can you do about it?

  • Konnikova says the key to resisting persuasion is to have “a strong, unshakeable sense of self. Know who you are no matter what, and hold on to that no matter what.”
  • Be objective. There’s a simple hack to help you distance yourself emotionally from the decision. Pretend that someone you know came up to you and asked your advice on whether to invest or not.
  • Have an exit script. If you start losing money, it can be tempting to throw more in to salvage it. Know what your limits are before you enter into the transaction, and stick to them.
  • Be very suspicious of secrecy and time pressure.
  • Search for disconfirming information; actively search for evidence that you may be wrong.

OK, now that I’ve armed you with the tools, go ahead and listen to the rest of the podcast and see if you pass the test at the end!


Dare to Disagree – But Be Smart About It

“If we are all in agreement on the decision – then I propose we postpone further discussion of this matter until our next meeting to give ourselves time to develop disagreement and perhaps gain some understanding of what the decision is all about.” – Alfred P. Sloan


In recent podcasts, I’ve stressed the value of going along with your conversational partner in order to achieve an agreeable, smooth flow. But in this podcast, I am going to take the opposite side of that argument—to disagree with myself, essentially.

That’s because disagreements are not only inevitable but can be extremely valuable in producing the best thinking and results. Unfortunately, most people are uncomfortable with disagreement. As Margaret Heffernan notes, 85% of executives admitted that they had issues at work that they were afraid to raise.

So, there are clearly advantages to daring to disagree, but you can also be smart about it. In this podcast I explain the risks of being too agreeable, the benefits of challenging others’ thinking, and some approaches you can use to be constructive in your disagreement while preserving relationships.

Risks of the “accepting” approach

  • Taking a bad idea too far
  • Leaving important things unsaid
  • Focusing too much on “being nice” can distract from thinking about the issue
  • Lack of clarity

Benefits of constructive disagreement

  • More clarity and less risk of misunderstanding
  • Speaking up may encourage others to do the same
  • Great way to pressure-test your ideas and conviction
  • Encourage diversity of thought

How to engage in constructive disagreement

  • Don’t make it personal
  • Keep the big picture in mind
  • Have an open mind and be open to persuasion
  • Use your imagination to find a third way that satisfies both parties

In the end, honest disagreement can be one of the highest signs of respect you can offer someone, because it treats them as an intelligent person who is willing to listen to reason and who cares about a greater good than pure self-interest.

If you disagree with anything I say in this podcast, please pay me the respect of letting me know.

See also: When Is It Your Duty to Disagree?
