A User's Guide to Rational Thinking

Cut through flawed assumptions and false beliefs — including your own — with these strategies.

By Christie Aschwanden
May 28, 2015 (updated May 21, 2019)

In the digital age, information is more plentiful than ever, but parsing truth from the abundance of competing claims can be daunting. Whether the subject is Ebola, vaccines or climate change, speculation and conspiracy theories compete with science for the public’s trust. Our guide to rational thinking is here to help. In the following pages, you’ll learn tools to identify the hallmarks of irrational thinking, evaluate evidence, recognize your own biases and develop strategies to transform shouting matches into meaningful discussions.

The Irrationalist in You

We’re programmed for irrational thought. 

Irrational thinking stems from cognitive biases that strike us all. “People don’t think like scientists; they think like lawyers. They hold the belief they want to believe and then they recruit anything they can to support it,” says Peter Ditto, a psychologist who studies judgment and decision-making at the University of California, Irvine. Motivated reasoning — our tendency to filter facts to support our pre-existing belief systems — is the standard way we process information, Ditto says. “We almost never think about things without some preference in mind or some emotional inclination to want one thing or another. That’s the norm.”

If you think you’re immune, you’re not alone. We’re very good at detecting motivated reasoning and biases in other people, Ditto says, but terrible at seeing it in ourselves. Spend a few minutes in honest reflection, and chances are you will find a few examples from your own life. Whether we’re telling ourselves that we’re better-than-average drivers, despite those traffic tickets, or insisting we’ll get through a 40-hour to-do list in a single day, we’re all prone to demonstrably false beliefs.

Much of our thinking on contentious issues is influenced by our pre-established social or cultural groups, says Dan Kahan, a law professor and science communication researcher at Yale Law School. Kahan studies cultural cognition — the idea that the way people process information is heavily determined by their deep-seated values and cultural identities. We don’t have time to evaluate every piece of evidence on every issue, so we look to people we trust in our in-groups to help us make judgments, Kahan says. Once a certain idea or stance becomes associated with a group we belong to (part of what Kahan calls a cultural identity), we become more inclined to adopt that position; it’s a way to show that we belong.

If you consider yourself an environmentalist, for instance, you’re primed to adopt the view that hydraulic fracturing, or “fracking” — the controversial method of oil and gas extraction that involves cracking rock with pressurized fluids — poses a threat to the environment and human health. On the other hand, if you’re a conservative, you’re more apt to believe that fracking is harmless, since this is the stance taken by others in that group.

Being science-literate won’t protect you from such biases, Kahan says. His research has found that people who score high on measures of science comprehension tend to be more polarized than others on contentious issues. In one such study, Kahan and his research team surveyed a diverse sample of about 1,500 American adults regarding their political views. The team asked them to do a calculation designed to test their ability to slow down and do the math, rather than taking gut-reaction shortcuts that can lead to the wrong answer. The researchers presented the same math problem framed two different ways: as a nonpolitical question, and as a question looking into a politically charged issue, such as gun control. They found that the people who scored the best on the nonpolitical math problem fared the worst when the same problem was presented as a politically charged issue. The better your knowledge of science and the stronger your ability to understand numbers and make sense of data, the more adept you are at fitting the evidence to the position held by your group identity, Kahan says.

Is it possible to overcome these internal biases that sidetrack our thinking? The Center for Applied Rationality (CFAR) thinks so. This nonprofit group, based in Berkeley, Calif., holds workshops and seminars aimed at helping people develop habits of thought that break through biases.

The first step toward overcoming bias is to recognize and accept your fallibility, says Julia Galef, president and co-founder of CFAR. “We tend to feel bad about ourselves when we notice we’ve been wrong,” she says, but if you punish yourself, you create a disincentive for searching for truth. “We try to encourage people to congratulate themselves when they notice a flaw in their belief or realize that the argument someone else is making has some basis,” Galef says.

Another trick Galef recommends is the flip — turn your belief around. Ask yourself, “What are some reasons I might be wrong?” This strategy forces you to turn your attention to contrary evidence, which you might be motivated to overlook if you simply listed reasons for your views. Consider what it would look like for you to be wrong on this issue. Is any of the evidence compatible with this opposite view? Would you be inclined to believe this opposite argument if it were being promoted by someone from your own political party or social group? The answers can help you determine the strength of your position, Galef says, and whether it’s time to reconsider it.

A Field Guide to Irrational Arguments

Scientific explanations are based on evidence and subject to change when new facts come to light. Irrational ones rely on assumptions and involve only the facts that support a chosen side. Here are five hallmarks of irrational arguments.

The science is nitpicked to fan doubt: Rather than considering the totality of the evidence, unscientific arguments cherry-pick data, mischaracterize research methods or results, or even make outright false claims. For instance, people who insist that vaccines cause autism may point to the known dangers of mercury as evidence, even though mercury is no longer a component of most vaccines, and studies have found no link between vaccines and autism. When a proponent’s explanation of the research contradicts that of the scientists who conducted it, that’s a good sign they’re peddling this type of false doubt.

The science is rejected based on implications, not data: Instead of taking issue with the evidence itself, these types of arguments focus on the perceived implications, says Josh Rosenau, programs and policy director at the National Center for Science Education. “People will say, ‘Well, if evolution is true, then we don’t have souls, or we should all behave like animals.’ ” Never mind that the science doesn’t actually say anything about how people should behave. If the science seems to repudiate beliefs that people hold dear, there’s a huge incentive to engage in motivated reasoning, lest one’s worldview come crashing down.

Scientists’ motives and reasons are attacked: Critics often turn to personal attacks on scientists to cast doubt on their findings. Instead of criticizing the science itself, these lines of argument suppose that scientists have rigged their research to support the scientific consensus. Paul Offit, director of the Vaccine Education Center at the Children’s Hospital of Philadelphia, co-invented a life-saving rotavirus vaccine. Anti-vaccine crusaders seized on his association with it to imply that his advocacy stems from his financial interest in vaccines and ties to pharmaceutical companies. Offit has concluded that some of these people cannot be convinced. “It doesn’t matter how much data you show that person. If in their hearts they’re conspiracy theorists, you can’t convince them.”

Legitimate disagreements among scientists are amplified to dismiss the science: Evolution is one of the foundations of modern biology, but biologists are still discovering details of how evolution works. When geneticists offer contrary ideas about how speciation occurs, they’re debating the nuts and bolts of how evolution works, not arguing over whether it happens. Yet people fighting the teaching of evolution in schools may seize on legitimate scientific disputes as reasons to dismiss the scientific theory altogether, Rosenau says. When they present gadflies or scientists whose views are out of step with the majority of the field as the most trustworthy experts on an issue, that’s another red flag.

Appeals are made in the name of “fairness”: People touting this argument say we should just teach kids “both sides,” as if there were exactly two sides in equal proportions, even though there aren’t, Rosenau says. In most cases, this appeal is invoked to give false equivalence to a concept like intelligent design, which lacks evidence. If not counteracted, this approach can lend legitimacy to debates without scientific merit, Rosenau says.

6 Strategies for Conversing With Someone Who Has Irrational Ideas

When you encounter, say, some neighbors who refuse to vaccinate their children because of long-debunked fears of autism and mercury poisoning, it’s tempting to throw facts at them. But — as you know if you’ve ever tried this approach — bombarding people with evidence is doomed to fail. If you want any chance of engaging in a meaningful conversation, you’ll need better tactics. Here are six worth trying. We can’t promise they’ll work, but they’ll give you a fighting chance. 

Be a good listener and make a connection: As much as we’d like to think otherwise, most human judgments aren’t based on reason, but on emotion, says Ditto, the UC Irvine psychologist. Aim to forge a personal connection that makes the other person inclined to see you as “one of us.” Research by Yale’s Kahan has shown that people tend to adopt beliefs associated with their cultural groups. So look for common ground.

That means listening with respect, says Randy Olson, a scientist-turned-filmmaker and author of Don’t Be Such a Scientist. “Do not lecture. Nobody wants to hear that,” he says. Instead of throwing out a bunch of facts, ask questions. Show that you’re open to what the other person has to say. “Don’t rise above them; approach them at their level,” Olson says. The moment you create a divide (by implying that you’re smart and they’re not, for instance) you’ve lost the debate. Ultimately, it makes no difference how much evidence you’ve got. If you want your message to register, you have to speak it in a voice that’s trusted and likable, Olson says.

Figure out where they’re coming from and devise a frame that speaks to that: When people cling to irrational beliefs, it’s often because they’re somehow tied to their identity or social group. Whenever possible, present your argument in a way that fits, rather than challenges, the other person’s self-identity, says Julia Galef, co-founder of CFAR. For example, imagine you are trying to convince a friend who thinks of herself as bold and decisive that it’s OK to change her mind about an issue on which she’d taken a public stand.

One way to do this, Galef says, would be to frame an about-face as a gutsy and strong move.

Usually it’s not an aversion to science that motivates people to tout unscientific ideas, but some underlying cultural, social or personal issue, says Rosenau, of the National Center for Science Education. For instance, he says that many evangelicals he’s encountered see evolution as a repudiation of their religious beliefs. As long as they view it that way, they can’t endorse evolution without giving up their identity — and they’re unlikely to do that, no matter how compelling your facts, Rosenau says. The solution? Find a way to talk about evolution that doesn’t force them to abandon their group identity or belief system. “I might say, ‘Did you know that [National Institutes of Health Director] Francis Collins is an evangelical Christian?’ Then we might have a real conversation and talk about what it means to be a Christian who accepts that evolution is true.”

Affirm their self-worth before knocking down their erroneous beliefs: When your facts challenge people’s self-identities, their immediate impulse will be to reject them — that’s human nature, says Brendan Nyhan, a political scientist at Dartmouth College. When the thought of giving up a tightly held belief feels like a threat to our identity or world view, we’re prone to reject it out of hand.

One way to circumvent this problem is to make the person feel positive about themselves before presenting evidence that might topple their self-image. In one study, Nyhan and his colleagues had volunteers participate in an exercise designed to bolster their feelings of self-worth, such as remembering a time they felt good about themselves or recalling a value that was important to them, before presenting them with information that contradicted their beliefs about political events. The results showed that the self-affirming drills increased participants’ willingness to accept uncomfortable facts.

In real life, this might look more like an exchange that happened between my husband and me while we were backpacking. Coming to a fork in the trail, I insisted that we needed to go one way, while Dave was sure the other way was correct. It turned out that he was right, and I knew he was right, but I didn’t like what that said about me — that I have a poor sense of direction. This notion contradicts the vision I have of myself as a competent person. But when Dave laughed about it, and told me how funny it was that a smart person like me could get lost, I was suddenly able to accept his facts because they no longer challenged my beliefs about myself. By telling me that I can be a smart person and also get lost, he gave me a way to accept his directions and still feel good about myself.

Focus on the facts, not the misconceptions: When trying to counteract a myth, a natural response is to present and then debunk it. But tread carefully, says Nyhan. Studies show that repeating a misconception in order to disprove it often ends up reinforcing the erroneous idea in people’s minds.

In one study, volunteers viewed a pamphlet debunking myths about flu vaccines. Immediately afterward, people correctly sorted myths from facts, but just a half-hour later, they performed worse on this sorting task than they had before reading the pamphlet. Reading the myths connected them to flu shots in participants’ minds, Nyhan says. People remembered reading those things about the flu shot, but over time they forgot which were true and which were false.

Instead of reiterating myths, Nyhan advises finding a simple, truthful message to present. If you overwhelm the person with a long list of complex explanations, you risk triggering the so-called overkill backfire effect, driving them toward the simpler explanation that’s more appealing.

“A simple myth is more cognitively attractive than an overcomplicated correction,” write researchers John Cook and Stephan Lewandowsky in The Debunking Handbook.

Ask the person to explain what they know: People who feel sure of their position set a high bar for contrary evidence, Ditto says, but often such confidence stems from a misperception that they know more than they actually do, a phenomenon researchers call the illusion of explanatory depth. Break this illusion, and the person may become more open to your position, Ditto says.

A study published in the journal Psychological Science in 2013 found that when people were asked to explain the details of how a political policy they supported would work, their beliefs on the issue became more moderate. Asking people to explain what’s behind their beliefs seems to make them scrutinize what they know, which in turn can force them to recognize the gaps in their knowledge. As a result, they become less sure of their position and possibly more open to what you have to say.

I recently tried this approach with an acquaintance who expressed concern that vaccines would harm her baby. What, exactly, was she worried might happen? Halfway through her attempt at an explanation, she admitted that she wasn’t really sure how immunizations would hurt him, but it scared her to give such a young child so many shots at once. She didn’t change her mind then and there, but she did agree to read some information I gave her to ease her fears.

Engage in person, not in writing: It’s no secret that people can behave poorly online. When you’re having a discussion in the abstract, it’s easy to set people off without being conscious of it, since you miss the body language and other social cues that would normally inform your behavior, says Chris Mooney, co-author of Unscientific America. “Once the emotions are working, responses are hot rather than cold, and pretty soon everybody’s circling wagons,” Mooney says. It’s more difficult to spiral into mindless rants and name-calling when you’re engaging someone face-to-face than when you’re arguing with their avatar. If you want to have a real debate, Mooney says, “go have a beer. Don’t argue on Facebook.”
