Clear thinking

Knowledge May Be Free, But It’s Not Worthless

I know!

A commentary by Bret Stephens in Tuesday’s Wall Street Journal got me thinking about the importance of “rote knowledge” in today’s world. In his article, he laments that graduates are leaving college with vast knowledge gaps because of the trendy idea in education that learning how to think matters more than cramming your head with facts.

Of course, this idea has been around a long time, but it’s even more deeply embedded in the popular imagination since we have ubiquitous access to the world’s information. We have Google if we need to look something up, and Siri can answer practically any question we have. The implication is that since knowledge is free, it is worthless.

The problem with that line of thinking is that it rests on a false tradeoff: knowledge and effective thinking are not mutually exclusive; in fact, critical thinking is impossible without knowledge. In other words, cramming your head with facts does not make you a worse thinker. On the contrary, “rote knowledge” can make you a better thinker in many ways. Here are just a few:

Better critical thinking: Although critical thinking is partly about evaluating the logic of someone’s argument, it’s also about being able to compare their view of reality to yours, and being able to generate alternative points or explanations. Facts fit into our brains in patterns, and those patterns help us to filter incoming information. Something rings true or false depending on how it interacts with the existing patterns in our minds. Richer patterns make for more reliable critical filters.

If you don’t have a deep well of knowledge at your command, anytime you hear or read something you have to take it at face value until you have a chance to look it up, and that takes time which you will not always have.

Learn faster: In this world of rapid and accelerating change, the capacity to learn is a crucial asset. But the rate of learning is dependent on how much you already know. It’s virtually impossible to learn anything “new” without connecting it to something you already know. It’s a snowball effect: the more you know, the faster you learn, and the faster you earn, and so on.

More innovative: Innovation doesn’t spring from ignorance. Knowledge is the raw material of innovation. It proceeds from adding to existing knowledge or making new connections between disparate ideas. More knowledge exponentially increases the possible connections.

Improve focus and attention: The discipline of studying and learning something in depth, of memorizing and of testing your knowledge, strengthens your powers of attention and focus. This is something I’m personally experiencing, as I have embarked on a project to learn as much French as possible before my trip to Paris at the end of June.

Intuition and decision making: Even intuition benefits from knowledge. Indeed, expert intuition may be no more than rapid pattern recognition that goes on underneath our slower logical thinking processes. Chess masters don’t think any more steps ahead than mediocre players, according to those who have studied the source of their expertise. They also don’t run through huge numbers of possible moves like a supercomputer. Their minds don’t have to run through an endless series of bad moves in order to find a few good ones to choose from. Rather, they have deep stored databases of patterns and moves that they recognize; they can “see” where those patterns will lead and the top two or three alternatives come into their minds. They win through superior knowledge, not any superhuman skill at decision making.

Logical thinking: Knowledge even makes pure logic easier. In a well-known demonstration, psychologists test our logical thinking by showing four cards, each with either a letter or a number showing, and asking which cards you would have to turn over to prove or disprove the statement: “If a card has a vowel on one side, then it has an even number on the other side.” Most people get the answer wrong (the correct choice is the vowel card and the odd-numbered card), and this is supposed to show our deficiencies in logic. Yet when the same problem is posed in terms of a familiar real-life scenario, such as deciding whose ID to check to guard against underage drinking, almost everyone gets it right. In other words, existing knowledge makes it easier to follow the logic.
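For readers who like to see the card task worked out, here is a short Python sketch. It uses the classic version of the puzzle with cards showing A, K, 4, and 7; the specific card faces are my assumption, since the article doesn’t list them.

```python
# Wason selection task (classic A, K, 4, 7 version - an assumed setup).
# Rule to test: "If a card has a vowel on one side, it has an even number
# on the other side." A card is worth flipping only if what's on its
# hidden side could falsify the rule.

VOWELS = set("AEIOU")

def could_falsify(visible):
    """Return True if turning this card over could disprove the rule."""
    if visible.isalpha():
        # A letter card can break the rule only if it shows a vowel
        # (its hidden side might then be an odd number).
        return visible in VOWELS
    # A number card can break the rule only if it shows an odd number
    # (its hidden side might then be a vowel).
    return int(visible) % 2 == 1

cards = ["A", "K", "4", "7"]
print([c for c in cards if could_falsify(c)])  # ['A', '7']
```

Most people correctly pick the A but also flip the 4, which can’t falsify the rule no matter what’s on its back; the counterintuitive but necessary pick is the 7.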

So there you have it: six ways, off the top of my head, in which knowledge adds to good thinking. I’m sure I could have found more, if I wanted to look them up.


Selling to the Amoeba Brain?

Who’s the decision maker?

I bet you never thought you would learn about one-celled organisms in a blog about persuasion, but bear with me for a few paragraphs because I want to make an important point.

It’s been quite the fashion over the past few years in sales and persuasion circles to focus on our three brains: the reptilian brain, the rat brain, and the human brain. The idea is pretty simple: our human brains have evolved over eons in a different environment than our modern world; since evolution by definition proceeds from what went before, as our newer brain structures and functions evolved, the old structures remain and continue to be quite active. It’s kind of like the separation of powers in the Federal government: all three branches get involved in the process. So, if you want to persuade someone, you have to appeal to the simpler brains as well as to our logical faculties.

That may be true, but why stop at the reptile brain? If we trace our ancestry even further, we all evolved from one-celled organisms—amoebas, if you will, and we were amoebas even longer than we were reptiles. So, should we also tailor our persuasive efforts to the amoeba brain that surely lurks within all of us?

I can see it now: make sure you have a lot of light when you make your presentations, because amoebas move toward the light. You wouldn’t have to explain your solution, because people can get it by osmosis. We could call it “celling”. I realize I’ve reached the point of absurdity, but unfortunately so have many of the “scientific” persuasion experts.

The basic idea is sound, as long as it’s not carried too far. Aristotle, the father of persuasion science, made it the core of his Rhetoric, acknowledging that persuasive appeals comprise three strands: ethos, pathos, and logos. More recently, research has categorized and measured myriad ways in which our decisions and behaviors deviate from the purely rational. Science has learned a lot about the brain in the past few decades, and technologies such as functional MRI let researchers see in real time into our brain activity as we make decisions, respond to stimuli, and so on. A lot of that research has confirmed, refined, or changed our understanding of how our minds work.

But new scientific discoveries are inevitably seized upon immediately by modern-day snake oil salesmen to add some legitimacy to their half-baked ideas. The ink was barely dry on the first edition of Darwin’s Origin of Species when the Social Darwinists hijacked his theories to justify their own notions of how society should be organized. In the same spirit, a lot of experts have picked up on the colorful pictures showing various areas of the brain lighting up during controlled laboratory experiments to market their services to companies, promising that they can read consumers’ brains and know what makes them tick, even better than the consumers themselves can. (By the way, if you’ve ever been inside of one of those machines, you know just how “natural” that situation is.)

When a book applies the idea to B2B sales, telling us that “In spite of our modern ability to analyze and rationalize complex scenarios and situations, the old brain will regularly override all aspects of this analysis and, quite simply, veto the new brain’s conclusions,”[1] then the idea has gone too far.

I have a lot of respect for the work of one of the deans of persuasion science, Robert Cialdini, but even he goes a little too far. His six principles of influence are presented as so powerful that you can’t sell a good idea without them, and you can definitely sell bad ideas with them. You can trigger fixed-action patterns that cause people to act like a mother turkey does when she hears “cheep-cheep,” and that’s what his book is about.[2] Cialdini at least acknowledges the importance of rationality—in a footnote—saying that of course material self-interest is important, but that it goes without saying.

Yet, that’s the problem: it doesn’t go without saying. When a Harvard Business School professor tells us that “what you say is less important than how you say it”, and “style trumps content”, then it has to be said.

Content has to come first

This article is a plea for a little more, well, rationality in the understanding of what it takes to get ideas approved and products sold. Of course it’s important to be able to appeal to more than just the rational parts of the brains of your persuasive target. I’ve written about ways to do that many times on this blog. But it has to start with a sound, logical and defensible business or personal case—with what Cialdini called material self-interest.

One of the oldest sayings in sales is that you should sell the sizzle, not the steak. But what happens when they buy the appetizing sizzle and then find out the steak is crappy? They won’t come back. That’s why you have to make absolutely sure you have an excellent steak before you worry about the sizzle. Regardless of how many persuasive cues you employ, or which regions of the brain light up in the fMRI, if the idea does not work, you soon won’t, either. Bad ideas are bad ideas, no matter how they’re dressed up.

One problem with fixating on techniques to appeal to the old brain is that they distract from the main job of putting together a strong proposal. There are people who spend most of their time learning “the tricks of the trade” in the hope of finding shortcuts, when they should be learning the trade itself. I’ve seen people put more time into the choice of fonts for their slides than into critically examining the strength of their ideas.

Whether you’re trying to get a proposal approved internally or selling a solution to a customer, most business decisions are complex enough to require extensive data, deep analysis, and careful decisions. That’s the reality of our modern world, which was built by the human brain. Never forget that it’s still the most important decision maker.


[1] Patrick Renvoise and Christophe Morin, Neuromarketing, viii.

[2] Robert B. Cialdini, Influence: Science and Practice.


Headed to Davos?

Don’t believe it

I know that at least one of my readers is headed to Davos this week for the World Economic Forum, the annual gathering of world leaders, business titans, economists and other assorted experts, who come to find out what some of the world’s foremost experts have to say about our future.

The rest of us will have to find out what they say via the news media, but the point of this article is that it won’t be a good idea to take what they say literally and rush out to buy survival supplies. Regardless of how famous the speaker—in fact, because of how famous the speaker—be very, very skeptical. And don’t return home and start spreading the gloom and doom you are sure to hear.

The general theme of all the talks will be about how screwed up our system is and how we should absolutely follow their advice for major changes to our economies, societies and indeed our way of thinking.


The More You Learn the Less You Know: How to Maintain a Healthy Learning Diet

This will not end well

Is it possible to learn more and more and know less and less? It depends on what and how you learn.

We live in a wonderful era of lifelong learning and easy access to the world’s knowledge. As a result, more and more of what we learn, we learn on our own, outside of the guidance of formal education.

In many ways, this is a good thing. Speaking for myself, I have always been an advocate of learning and personal growth. As a free-range grazer in the noosphere, I’m an active consumer of information about topics that interest me: with main courses comprising about 75-100 nonfiction books a year, and constant snacking on magazine articles, blogs, and videos. Much like you, I have a voracious appetite for new learning, and the world’s table is spread with abundance and variety.

This is especially true in psychology and the social sciences, where it seems every day brings a fascinating new study about how we think and make decisions. Writers like Malcolm Gladwell and Daniel Pink know how to turn that research into palatable packages that go down easily because of their artful combination of solid research, sweetened with compelling stories and vivid detail.

Unfortunately, without using some good judgment and discipline, it’s too easy to stuff ourselves with empty calories of fluff and trick ourselves into thinking that we’re acquiring a real education. We need less sugar and more salt—it may be bad for you in an actual physical diet, but most of us don’t take enough of it when we read stuff that interests us.

Because we have so much choice in what we read, we tend to read material that is easy to grasp and that we already agree with, so it’s possible that instead of learning more, we may be merely embedding false information into our minds even more firmly. In those cases, it’s possible to read more and more and know less and less about something.

You can try to fight this tendency by being choosy about what you consume. Books carry the equivalent of nutrition labels in the form of the author’s qualifications, index, bibliography and notes; look those over to get a sense of what you’re about to stick into your brain. When listening to speakers, if they say “studies show”, without showing the studies, be very skeptical.

These precautions should help, but if you stop there you may be even more susceptible to error. In sports, the introduction of better protective equipment sometimes leads to worse injuries because it can make athletes more reckless. The same may be true in your reading. It’s possible that the fact that you’ve been rigorous in choosing what to read may make you less skeptical of its claims while reading it. After all, if a Nobel winner said it, or it’s in the Harvard Business Review, who are you to doubt it?

There are two good reasons to keep the salt handy even when reading a book from a “trusted” source. First, there is excellent documentation that expert opinion can be extremely unreliable.

Second, most of the studies that they cite in the footnotes are probably wrong, or even if reliable are cherry-picked to support the author’s conclusions. I realize that it’s ironic that I cite a paper to support the assertion that most published research papers are false, but John P.A. Ioannidis’ essay “Why Most Published Research Findings Are False,” makes for very interesting and disconcerting reading. (And it was written before the recent scandals involving falsification of research data.)

Unfortunately, the “findings” least likely to be true are the most likely to catch and stick in our attention. We’ve all had the experience of reading something in the paper like, “Scientists show rutabaga lowers risk of knuckle cancer.”[1] Here’s just one reason why you should not immediately rush out to the produce aisle: Maybe nineteen other research teams studied the link between rutabagas and knuckle cancer and found no correlation—they would not have bothered to try to publish their “negative” results. Even if they did publish their results, which study do you think the reporter would use to generate an article people would read? And, if you read articles on both sides of the issue, which would be more likely to stick in your mind when you went to the store?

To make the problem even worse, even if you read absolute proof that the information you read was false, you are much more likely to remember the vivid claim than the solid but boring refutation.

Guidelines for a healthy knowledge diet

Doubt first; look for contradictory information. Your default position should be doubt and skepticism, not immediate acceptance. At the risk of pushing the metaphor too far, chew thoroughly before swallowing. When you read something that rings true, it may be easy to bring examples to mind that support it, but you should also try to think of counterexamples. We all suffer from confirmation bias, which blinds us to contradictory information.

Go deeper. The footnotes aren’t just for decoration. You can’t verify everything you read, but you can find the original source in the footnotes and read it yourself to see what it says.

Learn just a tiny little bit of statistics. At least enough to understand how much weight you should put on reported results. At a minimum, you should understand sample sizes, correlation and effect sizes. (If this sentence made your eyes glaze over, read this.)

Fire bullets, not cannonballs. This advice comes from Great by Choice, by Jim Collins and Morten Hansen. It means you should not bet the farm on something new and untried by making big or irreversible changes to your business strategies or processes. Experiment, measure the results, and make adjustments as necessary. When you’re sure, then you can fire the cannonball.

Don’t be so damned sure of yourself all the time. As we’ve seen, it’s possible that the more educated you are, the more wrong you are likely to be. Be open-minded and willing to listen to others. Certainty shuts down learning.
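The statistics guideline above can be made concrete with a quick simulation. This is an illustrative sketch of my own, not from the article: it measures how often two completely unrelated variables show a strong correlation at different sample sizes, which is one reason small studies deserve extra salt.

```python
# How often does pure noise look like a real finding?
# We correlate two unrelated random variables many times and count
# how often the correlation coefficient exceeds a "strong" threshold.
import random

random.seed(0)

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def spurious_rate(n, trials=2000, threshold=0.5):
    """Fraction of trials where unrelated data shows |r| > threshold at size n."""
    hits = 0
    for _ in range(trials):
        xs = [random.random() for _ in range(n)]
        ys = [random.random() for _ in range(n)]
        if abs(correlation(xs, ys)) > threshold:
            hits += 1
    return hits / trials

print(spurious_rate(5))    # small samples: strong "correlations" appear often
print(spurious_rate(100))  # large samples: almost never
```

With only five data points, impressive-looking correlations pop out of random noise a substantial fraction of the time; with a hundred, they essentially vanish. That gap is the intuition behind asking “how big was the sample?” before believing a reported result.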


[1] I purposely used a silly example to avoid another phenomenon: if something sticks in our minds when we read it, we tend to believe it even after it has been proven to be false.
