Published by the Freedom From Religion Foundation, Inc.

Steven Pinker: Rationality and why it matters

Steven Pinker says that rational people “have fewer accidents and mishaps, better financial health and employment outcomes, and are less likely to be swindled by medical or psychic or, for that matter, religious charlatans.” (Photo by Ingrid Laas)
The FFRF convention audience listens intently to Steven Pinker’s speech. (Photo by Ingrid Laas)

Steven Pinker gave this speech (edited for length) on Nov. 19, 2021, at the Boston Park Plaza during FFRF’s national convention. To watch the full speech, go to ffrf.us/speeches-2021. He was introduced by FFRF Board Chair Stephen Hirtle.

Stephen Hirtle: I am pleased to introduce our next speaker, Steven Pinker, who is a cognitive scientist, experimental psychologist, linguist and bestselling popular science author. Steve also serves as FFRF honorary president and has kindly recorded a 30-second commercial for FFRF that has run on “CBS Sunday Morning” and “Late Night with Stephen Colbert,” among other outlets. Steve Pinker is the Johnstone Family Professor in the Department of Psychology at Harvard University and is known for his advocacy of evolutionary psychology and the computational theory of mind.

His newest book, which also directly relates to FFRF’s mission, is called Rationality: What It Is, Why It Seems Scarce, Why It Matters. Please welcome the distinguished Steven Pinker.

By Steven Pinker

Thank you. It’s an honor to return to the Freedom From Religion Foundation’s annual meeting.

Rationality presents us with a puzzle. On the one hand, our species has walked on the moon, taken photographs of our planet, plumbed the secrets of the cosmos, of life, of mind. At the same time, a majority of Americans aged 18 to 24 think that astrology is “very” or “sort of” scientific. Large proportions believe in conspiracy theories, such as that Covid vaccines are a plot by Bill Gates to implant microchips into our bodies, or that the American “deep state” houses a cabal of Satan-worshiping cannibalistic pedophiles.

Many of us believe in paranormal woo-woo, such as possession by the devil, extrasensory perception, ghosts and spirits, witches and spiritual energy in mountains, trees and crystals. This is the puzzle I tried to deal with in my book Rationality: What It Is, Why It Seems Scarce, Why It Matters.

What is rationality?

Well, let’s begin at the beginning. What is rationality? I think a good characterization is the use of knowledge to attain goals. How knowledge ought to be used to attain a goal is specified by normative models, sets of tools that spell out how one ought to reason if one wants to attain that goal rationally. Different models for different goals.

Logic: It’s the set of tools for deducing new true propositions from existing ones. An awareness of logic can help us avoid fallacies, such as affirming the consequent. “Every creative genius was laughed at in his time. People laugh at my ideas. Therefore, I’m a creative genius.” 
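To make the form of the fallacy explicit, here is a standard logic-textbook rendering (the notation is mine, not from the speech): modus ponens is valid, while affirming the consequent is not.

\[
\underbrace{P \rightarrow Q,\;\; P \;\;\therefore\; Q}_{\text{modus ponens (valid)}}
\qquad\qquad
\underbrace{P \rightarrow Q,\;\; Q \;\;\therefore\; P}_{\text{affirming the consequent (invalid)}}
\]

In the joke’s terms: being a genius implies being laughed at, but being laughed at does not imply being a genius.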

Probability: The likelihood of an event depends on the number of occurrences as a proportion of the number of opportunities. An awareness of how to calculate probability helps us avoid fallacies like the availability bias, in which the subjective likelihood of an event depends on how easily you can recall anecdotes and images.

Bayes’ rule: We should give credence to a hypothesis to the extent that it is credible a priori, it is consistent with the evidence, and the evidence is uncommon across the board. That allows us to avoid fallacies such as base-rate neglect, as in the case of a woman I know whose 2-year-old daughter suffered from twitches. A family doctor said, “Oh, perhaps she has Tourette’s syndrome,” because Tourette’s patients tend to have twitches, ignoring the fact that Tourette’s syndrome is rare in the population, whereas twitches are common. This friend of mine rediscovered Bayesian reasoning just working it through for herself.
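A rough calculation shows how the base rate dominates here; the numbers below are illustrative guesses, not figures from the speech:

\[
P(\text{Tourette's} \mid \text{twitches}) \;=\; \frac{P(\text{twitches} \mid \text{Tourette's})\, P(\text{Tourette's})}{P(\text{twitches})}
\;\approx\; \frac{0.9 \times 0.005}{0.15} \;=\; 0.03
\]

Even if nearly all Tourette’s patients twitch, a twitching toddler is still very unlikely to have Tourette’s, because the prior probability of the syndrome is tiny and twitches are common.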

The theory of rational choice: A rational actor chooses the option with the greatest expected utility. It helps us avoid buying extended warranties, which a large percentage of American consumers do. Does it really make sense to take out a health insurance policy on your toaster?
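A back-of-the-envelope expected-value calculation, with invented numbers, shows why the warranty is usually a bad bet:

\[
\mathbb{E}[\text{payout}] \;=\; P(\text{failure}) \times \text{repair cost} \;=\; 0.05 \times \$80 \;=\; \$4
\]

If the warranty costs $25, you are paying far more than the expected payout; the gap is the seller’s margin, which is why extended warranties are promoted so aggressively.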

Signal detection theory: A fallible observer cannot know for certain whether an observation is a real signal or bogus noise, and must set a decision cutoff that trades off misses and false alarms according to their costs. This allows us to avoid fallacies such as, “We should deal with misconduct by making it easier to convict the accused, without heightening the sensitivity of forensic methods.” That is exactly equivalent to saying we should punish more innocent people.
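The tradeoff can be written as a toy expected-cost expression (the notation is mine, for illustration):

\[
\text{Cost}(c) \;=\; C_{\text{miss}} \cdot P(\text{miss} \mid c) \;+\; C_{\text{false alarm}} \cdot P(\text{false alarm} \mid c)
\]

Lowering the cutoff c makes convictions easier, which reduces misses but necessarily raises false alarms; only improving the sensitivity of the evidence itself (better forensic methods) reduces both kinds of error at once.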

Game theory: Or, how to make rational choices when the payoffs depend on someone else’s rational choices. An awareness of game theory can help us avoid fallacies such as the idea that we can avoid climate change just by convincing everyone that it’s in their interest to conserve. The problem is that it is not in the interest of any individual to conserve unless everyone else is making the same decision at the same time and is guaranteed to stick with it. Otherwise, a person who conserves suffers the misery of waiting for a bus in the rain, shivering in the winter, sweltering in the summer, while his compatriots enjoy the comfort of cars and air conditioners and heaters. Or, he could be a free rider and enjoy all the benefits of consuming fossil fuels, and his decision will not by itself harm the planet. It is, to be sure, in everyone’s interest if everyone conserves, but it’s in no one’s interest to conserve individually.
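The conservation dilemma has the structure of a public-goods or prisoner’s-dilemma game. A toy payoff matrix, with numbers invented purely for illustration (each cell lists your payoff, then the others’), makes the logic concrete:

\[
\begin{array}{l|cc}
 & \text{Others conserve} & \text{Others consume} \\ \hline
\text{You conserve} & (2,\ 2) & (-1,\ 3) \\
\text{You consume} & (3,\ -1) & (0,\ 0)
\end{array}
\]

Whatever the others do, consuming gives you the higher payoff (3 beats 2, and 0 beats -1), so individual rationality favors consuming; yet universal conservation (2, 2) leaves everyone better off than universal consumption (0, 0). That is why exhortation alone fails and binding collective agreements are needed.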

Finally, causal inference to distinguish causation from correlation: One must manipulate the putative cause holding all else constant. And this allows us to avoid fallacies like failing to rule out confounds. My favorite illustration comes from an old joke in which an Orthodox Jewish couple beseech their rabbi for advice. The wife is sexually unsatisfied, and it is written in the Talmud that a man is responsible for his wife’s sexual pleasure. Well, the rabbi strokes his beard, and he says, “Well, here’s an idea. Why don’t you hire a young, strapping, handsome young man? And the next time you make love, have him wave a towel over you, and the fantasies will help the missus achieve satisfaction.” Well, they try it. And, sure enough, nothing happens. They go back to the rabbi, who strokes his beard again, and says, “Well, this time, let’s try a slight variation. This time, have the young man make love to your wife and you, the husband, will wave the towel while they do it.” And sure enough, she achieves an Earth-shaking, screaming orgasm. And the husband says to the young man, “Schmuck! Now that’s the way you wave a towel!”

Explaining irrationality

Well, now the question that everyone is waiting for. I know this because as soon as I told people I was teaching a course on rationality and then writing a book on rationality, the frequently asked question was, “If people can be rational, why does humanity seem to be losing its mind? How do you explain, professor, the conspiracy theories and fake news and post-truth rhetoric and paranormal woo-woo?”

Not an easy question to answer, and I think the explanation has at least four parts. 

The most obvious is motivated reasoning. I mentioned that rationality is always in service of a goal. That goal is not necessarily objective truth. It can also be to win an argument in which the stakes matter to you. It’s not surprising that tobacco companies deploy considerable ingenuity to try to persuade us that smoking is harmless. And as Upton Sinclair said, “It is hard to get a man to understand something when his livelihood depends on not understanding it.” 

Another goal is to show how wise and moral your group (your religion, your tribe, your political sect) is, and how stupid and evil the opposing one is. This is called the “myside bias,” which is the subject of an important new book, The Bias That Divides Us, by the psychologist Keith Stanovich. He argues that this is the most robust and pervasive of the cognitive biases documented by psychology.

Let me give you an example. Is this syllogism valid? “If college admissions are fair, then affirmative action laws are no longer necessary. College admissions are not fair. Therefore, affirmative action laws are necessary.” Well, think about it for a second. In fact, this is not a valid syllogism. It commits the fallacy of denying the antecedent: if P then Q; not P; therefore not Q. That is illogical. And a majority of liberals commit the fallacy, while most conservatives do not. If you ask a conservative for the explanation, they say, “Well, we told you all along, liberals are illogical.” But not so fast.

Here’s another syllogism: “If less severe punishments deter people from committing crime, then capital punishment should not be used. Less severe punishments do not deter people from committing crime. Therefore, capital punishment should be used.” It is another example of the fallacy of denying the antecedent. But this time, it’s conservatives who commit the fallacy, and liberals don’t.

What the two problems have in common is that, in both cases, people ratify the conclusion that is congenial to their political ideology in the first place, and they’re not so good at tracking logic that seems to be inconsistent with it. In other words, politics makes you illogical. Quite literally.

A second part of the explanation is that we’re all vulnerable to primitive intuitions. For example, people are liable to the intuition of dualism. A person has a body and a mind, and there is a short step to imagine that there can be minds without bodies. So, you get spirits and souls and ghosts and afterlife, reincarnation, ESP. We imagine that there is something immaterial, ineffable, invisible going on that happens to be linked to their body. And from there, it’s a short step to imagine it unlinked from their body. 

It’s natural to think that living things contain some kind of invisible essence that gives them their form and powers, from which it’s a short step to think that disease is caused by an adulteration of one’s essence by some foreign contaminant. That intuition makes people resistant to vaccines, and vaccine resistance is as old as vaccines. Because, when you think about it, the last thing that you’d want to do to prevent disease is inject a version of the disease pathogen into the tissues of your body. That is deeply unintuitive, but that’s what we’re asked to do when we get vaccinated. Likewise, genetically modified organisms, known to be completely innocuous, give many people the willies. The same intuition makes people susceptible to homeopathy and herbal remedies.

We are vulnerable to intuitions of teleology. We know that our plans and artifacts are designed with a purpose, our purpose. From there, it’s a short step to imagine that the world is designed with a purpose, leading to beliefs in creationism, astrology, synchronicity and the vague sense that everything happens for a reason. There are no coincidences. 

Now, these primitive intuitions are unlearned, whereas objective truths are acquired only by trusting legitimate expertise: scientists, historians, journalists, government record-keeping agencies. Few of us can really justify our beliefs, including our true beliefs.

Finally, I think there’s a key distinction between what I call realist and mythological beliefs. Why do people believe outlandish fake news and conspiracy theories? Well, part of the answer is that it depends on what you mean by belief. Bertrand Russell said it is undesirable to believe a proposition when there is no ground whatsoever for supposing it is true. I suspect that most people in this room think that this is an obvious, trite, banal, commonsense observation. In fact, it is a radical, unnatural manifesto. For most people, through most of history, grounds for supposing something is true have been just one of many reasons to hold a proposition.

Reality or mythology zone

I suspect people hold two kinds of beliefs. The first kind are beliefs in the reality zone: the physical objects around them, the other people they deal with face to face, the memory of their interactions. Even among people who believe in chemtrails or lizard people, or who are 9/11 truthers, a lot of them hold jobs and keep food in the fridge and gas in the car and get the kids clothed and fed and off to school on time. It’s not that they are irrational throughout their lives; there are just certain zones in which they seem to depart from ordinary, verifiable cause-and-effect reasoning.

The second zone, the mythology zone, covers the distant past, the unknowable future, faraway peoples and places, remote corridors of power (CEO boardrooms, presidential palaces), the microscopic, the cosmic, the counterfactual, the metaphysical. Here, people hold beliefs because they’re entertaining, they’re uplifting, they’re empowering, they’re morally edifying. Whether they are true or false is unknowable and irrelevant. And indeed, for most of our history they were unknowable, before we had science and government record-keeping and responsible journalism and historians and so on.

One example that hardly needs to be mentioned in this room is religion. A remarkable phenomenon that accompanied the publication of the quartet of books a dozen years ago by the “new atheists” (Sam Harris, Christopher Hitchens, Richard Dawkins and Daniel Dennett) is that the furious counterreaction was not so much that they were wrong and that there is plenty of evidence for the existence of God, but rather that it’s somehow inappropriate or uncouth to consider the existence of God to be a matter of truth and falsity in the first place. You hold it because it is a good thing to believe, not because it is factually accurate.

As an example, consider Pizzagate, the predecessor of QAnon, according to which Hillary Clinton ran a child sex ring in the basement of Comet Ping Pong, a Washington, D.C., pizzeria. A typical response among holders of this theory was to leave a one-star Google review of the restaurant saying, and I quote, “The pizza was incredibly under-baked and suspicious-looking men gave funny looks to my 5-year-old son.” Now, this isn’t the kind of reaction you would have if you literally thought that children were being raped in the basement. Instead, you might call the police. So, what is it? What do people mean when they say, “I believe that Hillary Clinton ran a child sex ring”? What they really are saying is, “I believe Hillary Clinton is so depraved that she’s capable of running a child sex ring,” or, perhaps even more accurately, “Boo, Hillary.” That is, beliefs can be expressions of moral convictions. 

How to be more rational

All this raises the question: How can we become more rational again? There’s not a simple answer. I think part of the solution is that the tools of formal rationality should become second nature. Rationality should be the fourth “R” taught in school, along with reading, writing and ’rithmetic. 

We should have a greater awareness of the fallacies that the unaided human mind is prone to. We should promote the norm that beliefs should be based on evidence and that changing your mind when the evidence changes should be taken as a sign of strength, not of flip-flopping or weakness.

Perhaps most importantly, institutions with rationality-promoting rules must be safeguarded. The great achievements of human rationality were not the product of some single genius granting his brainchild to the world, but rather of institutions, societies and professions in which people voice hypotheses and other people can criticize them. In that way, one person can notice and make up for another’s biases. Each of us is rather poor at spotting our own biases, a phenomenon sometimes called the “bias bias”: all of us think everyone else is biased, but not us. On the other hand, we’re pretty good at spotting other people’s biases, and in a community of people where you’re allowed to do that, the collective can become more rational than anyone is individually.

What do I mean by rationality-promoting institutions? 

• We have science with its rules for empirical testing and peer review.

• We’ve got democratic government with its checks and balances. 

• Journalism with its mechanisms of editing and fact checking and cultivating a reputation for accuracy. 

• The judicial system with its adversarial proceedings instead of just entrusting verdicts to a hanging judge. 

• Academia, at least in theory, with freedom of inquiry and open debate. 

• And, among electronic media, Wikipedia, with its system of corrections based on a commitment to neutrality and objectivity.

What it means is that the credibility and objectivity of these rationality-promoting institutions must be safeguarded. They are a precious resource: people are disabused of weird beliefs to the extent that they trust such institutions, and that trust has to be earned.

Experts should show their work. We should not have public health authorities just issuing pronouncements like edicts or dogmas; rather, they should explain the rationale behind their recommendations. Fallibility should be acknowledged. No one’s perfect; it is inevitable that people will make mistakes, and mistakes bring down the credibility of the whole institution if its claims were presented as the pronouncements of an oracle or a priesthood. And gratuitous politicization should be avoided.

Unfortunately, that’s something that’s far from getting better; it’s getting worse as more and more of our institutions brand themselves as branches of the political left, as when public health authorities said: “To reduce the spread of Covid, people should stay away from Make America Great Again rallies, but it’s OK to attend Black Lives Matter rallies because the cause of social justice is so important that it’s worth coming down with Covid.” And that is what they said. It is, to put it mildly, a strategic blunder in terms of securing the credibility and objectivity of public health institutions.

Why rationality matters

Finally, why rationality matters. Rationality certainly matters to our lives. Again, this is a conclusion that I hardly need to make to this audience. People who do follow the normative models and avoid cognitive fallacies, on average, have fewer accidents and mishaps, better financial health and employment outcomes, and are less likely to be swindled by medical or psychic or, for that matter, religious charlatans. Rationality drives material progress. In my book Enlightenment Now, I presented a large number of graphs that show that over the decades and centuries, longevity, peace, prosperity, safety and quality of life have all increased. 

But the universe contains no force that carries us ever upward. Quite the contrary: it contains a number of forces that are at best indifferent to our well-being and, at worst, appear to be trying to grind us down, pandemics being the most obvious example.

Progress comes from deploying reason to improve human flourishing. That is, when people apply their brainpower to the goal of making people better off, every once in a while they will succeed. If we retain the solutions that make people better off and try not to repeat our mistakes, things can get better. That’s all there is to progress.

The power of rationality to guide moral progress is of a piece with its power to guide material progress and wise choices in our lives. Our ability to eke increments of well-being out of a pitiless world and to be good to others despite our flawed nature depends on grasping impartial principles that transcend our parochial experience. We are a species that has been endowed with an elementary faculty of reason and that has discovered formulas and institutions that magnify its scope. They awaken us to ideas and expose us to realities that confound our intuitions, but are true for all that. Thank you.
