
Why Do People Behave Nicely?

No one may ever know unless social psychologists shake off their fascination with jerks.

By Ethan Watters
Dec 1, 2005


On the television show The Bachelor, Rachel lies to her fellow contestants about last night's date. Over on The Amazing Race, Jonathan shoves his wife after she slows them down en route to the finish line. On The Apprentice, Maria attacks Wes, then Donald Trump fires them both.

In just a few years, more than 100 reality television shows have contrived to get contestants to act like jerks, and audiences love it.

Sure, contestants sometimes form noble alliances, and the occasional romance blossoms, but the behavior that viewers talk about the next day at the water cooler invariably involves contestants behaving maliciously or embarrassing themselves by cracking under pressure. Although it's clear that participants are purposely placed in coercive situations, we nonetheless think we are seeing something real and noteworthy about the character and the psychology of fellow humans.

Perhaps that fascination explains why so many experiments in the field of psychology read like the premise for a reality TV series. Consider the most famous of all social psychology experiments, Stanley Milgram's "Behavioral Study of Obedience," published in 1963. After answering a newspaper ad, volunteers (all men) arrive at a Yale University laboratory, where a man in a gray lab coat asks for help in a "learning experiment." The subject is instructed to administer a shock to a stranger in an adjoining room when the stranger answers a question incorrectly. The shocks are mild at first, but after each wrong answer the experimenter asks the subject to deliver a stronger voltage. The cries from the stranger in the other room grow more agonized as the shock is increased in 15-volt increments. (The shocks aren't real; the "stranger" is merely acting.) If the subject hesitates, the man in the lab coat says sternly, "Please continue." If the subject still balks, he is first told, "The experiment requires that you go on," then, "It is absolutely essential that you continue," and then, "You have no other choice, you must go on."

By the time the subjects deliver what they believe to be a "very strong shock," some are sweating, trembling, stuttering, or biting their lips. In the most interesting reaction, which would have made for great television, some of the subjects experience uncontrollable fits of nervous laughter. One 46-year-old encyclopedia salesman is so overcome by a seizure of laughter that the experiment has to be stopped to allow him to recover.

What drew attention to Milgram's paper was his report that most of the randomly selected men were coaxed into hitting a switch labeled "Danger: Severe Shock," administering a supposed 420-volt zap. Milgram was surprised that although "subjects have learned from childhood that it is a fundamental breach of moral conduct to hurt another person against his will," most were willing to do so.

Milgram was inspired to figure out why prison guards at World War II Nazi death camps willingly followed horrifying orders. That question still rings out today, not only on TV shows like Survivor or The Apprentice but also on the network news, as corporate executives steal millions, terrorists behead innocents, and prison-camp guards in Afghanistan, Iraq, and Cuba mistreat inmates. We are fascinated, troubled, and desperate to know how human behavior can go so wrong, fearful that we, too, might behave badly in a similar situation.

For more than a century, psychologists have attempted to get to the root of evil and error. What they have discovered is not encouraging. Milgram and earlier researchers demonstrated that the ability to act rationally can be subverted by crowds or by pressure from authority figures. Recent studies show that humans, even when left alone, are prone to bewildering mistakes and biases.

"Basically,the job of the social psychologist has been to demonstrate how people screw up," says Joachim Krueger, associate professor of psychology at Brown University. By night, he has been mesmerized by both Survivor and, more recently, by the naked ambition and displays of status on The Apprentice. By day, however, he has become convinced that misconduct is only half the story. Evil and error, he argues, cannot be grasped without first understanding why humans often do the right thing. If he is correct, the first century of social psychology study may one day be likened to the early days of medicine, when doctors sought cures for diseases by practicing procedures like trepanning without any true inkling of how the body functions.

Recently, Krueger and a colleague, David Funder at the University of California at Riverside, published a paper calling for a reorientation of the field. Without a greater effort to examine how humans do things well, they argue, a "distorted view" emerges that "yields a cynical outlook on human nature." Another researcher summarized their argument this way: Krueger and Funder are asking researchers to abandon the "people are stupid school of social psychology."

Krueger is tall and soft-spoken, his voice accented by his native German. His cinder-block office is neat and unadorned. One afternoon, to explain why social psychology became so obsessed with human errors and why that obsession may itself be in error, Krueger began pulling books off his shelves, offering a trip through the history of this science.

+++

Social psychology crystallized in the 19th century around a concern with crowd behavior: Why do otherwise reasonable individuals become irrational or even dangerous when placed in a mob of people? By the middle of the 20th century, social psychologists had widened their research to examine how people can be influenced to make incorrect judgments or cross moral boundaries. In the 1950s, Solomon Asch, a pioneer in social psychology, pitted naive test subjects against a group of strangers who made bizarre judgments about the relative lengths of lines. Pressured to conform to the group, subjects often disregarded the obvious visual evidence and adopted the prevailing judgment.

About the time of Milgram's experiment, Princeton University professor John Darley studied why bystanders, when confronted with strangers in distress, sometimes respond by walking away or closing the drapes. Inspired by the case of Kitty Genovese, a New York City murder victim whose cries for help failed to rouse her neighbors to action, Darley showed that test subjects were less likely to aid a stranger if they thought they were just one among several witnesses.

Despite evidence of sheeplike behavior, many researchers still assumed that individuals, on their own, could be counted on to be rational and moral. The sea change came in the 1970s, from insights gleaned through research on economic decision making. In a series of articles and books, the psychologists Daniel Kahneman, who later won the Nobel Prize in Economics, and Amos Tversky rejected the long-held notion that humans are rational actors in a marketplace. Rather than using all the information available and calculating the best decision, they argued, the human mind relies on "quick and dirty" heuristics (mental shortcuts, or rules of thumb) to make decisions.

Social psychologists, including Krueger, jumped in to investigate these rules of thumb. Because the rules aren't always rational, researchers thought they would be exposed in situations where test subjects were led to make mistakes. In effect, the psychologists started looking for errors—and for experiments that would prompt them to occur.

"Like many other graduate students, I thought this stuff was so cool," Krueger says, holding a book containing some of Kahneman and Tversky's work. "The task before us was to set up experiments that would show errors and biases, and those mistakes would tell us what was really going on with human cognition. Of course, what was really going on was always something bad—a departure from some researcher's idea of how the mind should work."

Krueger's interest was stereotyping. In the late 1980s and early 1990s, he published papers showing how people use arbitrary categories to make judgments. On hot August days, for instance, people look forward to the first day in September, as if turning a page on the calendar would suddenly make the weather cooler. Krueger found that people make two errors in this case: They underestimate temperature changes within a month (assuming, for instance, that August will be uniformly hot) and overestimate the changes in temperature that will occur when the month ends.

Since then, revelations of human misperception and bias have popped up in social psychology studies like toadstools after a rain. We humans have a variety of ways of perceiving ourselves as smarter, more skilled, and more appealing than we are in reality. Most drivers, for example, say they drive more safely than the average person, even though that is a statistical impossibility. People also tend to consider themselves more attractive than others say they are. We tend to underestimate the chance that past events will recur, like winning two poker hands in a row. Likewise, we incorrectly assume that because a basketball player has made his last five shots he will make the sixth (the "hot hand" fallacy). We overestimate small risks, like being killed by a terrorist, yet underestimate much larger ones, like being killed in a traffic accident.

The list goes on: the "hindsight bias," the "systematic distortion effect," the "false uniqueness effect," the "just world bias," the "clouded judgment effect," and the "external agency illusion." And just in case you think you're hip to your own biases, researchers have unveiled the "bias blind spot," in which you see biases in others but overlook them in yourself.

+++

Taking this research at face value, one might conclude that when people are not misjudging the world around them, they are lying to themselves about their own abilities and motivations. In one famous study, people were found to be "insensitive," beset by "ignorance," "general misconceptions," and a whole range of "shortcomings and biases." Krueger remembers a popular debate among social psychologists over which metaphor best drives home the depth of the mind's failings: Should researchers view the mind as a "cognitive miser," emphasizing our limited resources and reliance on irrelevant clues, or is the mind more accurately depicted as a "totalitarian ego," pursuing self-esteem at the cost of self-deception? Is your mind a Scrooge or a Stalin?

By the mid-1990s, Krueger began to wonder about the value of finding mistakes in human reasoning. His daughter was a toddler, and like many parents, he had become fascinated with her development. "I was overawed with the day-to-day advances in her thinking," he recalls. "What I was admiring was not her rational thought but her development of intuitive, associative, and automatic reasoning. In other words, I was admiring the same kind of thinking that social psychology researchers were finding fault with when they studied adults."

Was human reasoning really so flawed? Perhaps the errors lay in the means by which psychologists sought to explain them. Human thinking, Krueger notes, is of two broad types. There are the snap judgments we make on the fly, like assessing whether a person approaching us on the street is welcoming or threatening. And there are the activities to which we apply the full force of our minds, like preparing a business presentation or solving a math problem. That laborious reasoning has long been assumed to represent the gold standard of human thinking. It is the type of reasoning that social psychologists themselves employ. Test subjects, however, are typically placed in a situation and required to guess, react, or estimate. Later, the researcher analyzes the behavior at length, through the lens of statistics or logic. Whenever there is a disparity, the test subject is assumed to be displaying the error or bias, not the researcher.

Another problem with the studies, Krueger says, is that researchers are "null-hypothesis testing." Basically, they begin with the premise that the human mind is rational and then look for any deviation. Good behavior or moments of rationality are ignored because the intent is to study bad behavior. It's not unlike reality television: Unless there is some bad behavior, the research has nothing to show.

"I began to think that by comparing human judgment to objective reality, we were missing a bigger picture," says Krueger. "We were chronicling mistakes but stopping short of asking why such behavioral or cognitive tendencies existed or what general purpose they might serve. I began to think that bias and error couldn't be the end of the story."

The mind wants to believe that the line between good and bad behavior is clear. Looking again at the Milgram shock experiment, one wants to consider the subjects who administered "shocks" under order as cowards and those who refused as heroes. But imagine a different Milgram study. What if, when subjects showed up at the lab, they instead were confronted with smoke pouring out of the windows and a firefighter who told them, "Quick, help me carry this hose into this burning building." What would we think of those who followed authority in that situation? What would we think of those who refused?

It is an uncomfortable fact that the soldiers who ran the Nazi death camps and the soldiers who liberated them were all acting under orders from superiors. There is a world of difference in the moral implications of what they did, but the human tendency to obey authority resulted in both evil and good. Krueger's challenging question is this: Wouldn't scientists learn as much or more about mental mechanisms like obedience if they took its advantages into account? Couldn't we learn more about the bad by studying the good, or at least by examining bad and good behavior in the same context?

A rethinking of one particular classic of error research had a dramatic influence on Krueger's thinking. In a now famous study, Lee Ross and colleagues at Stanford University asked students if they would walk around campus wearing a sandwich board that read "Eat at Joe's." The test subjects who agreed to do this embarrassing task predicted that 62 percent of others approached to carry the sign would do it. But test subjects who refused to carry the sign thought that only 33 percent of others would agree to do it when asked. Researchers concluded that they had found a new bias in reasoning, which they called the "false consensus effect"—that people have the naive tendency to project their individual attitudes, values, and behaviors onto the majority.

Krueger was impressed by a critique of the study. Robyn Dawes, a professor Krueger had studied under, countered that the students who predicted that their opinions would be in the majority were not making an error at all but rather were taking their own opinion as a legitimate piece of data. "By definition, most people are in the majority most of the time," explains Krueger. "Therefore, if you assume that your opinion will match that of the majority, you will be right more often than not."
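Dawes's arithmetic is easy to check with a toy simulation (a sketch with made-up parameters, not anything from his critique): give each member of a group a yes-or-no opinion drawn from an unknown base rate, let everyone predict that the majority agrees with them, and count how often that projection is right.

```python
import random

random.seed(42)

TRIALS = 10_000
GROUP_SIZE = 101  # an odd group size avoids tied votes

correct = 0
for _ in range(TRIALS):
    # Each question has some unknown base rate of "yes" opinions.
    p_yes = random.random()
    opinions = [random.random() < p_yes for _ in range(GROUP_SIZE)]
    majority_says_yes = sum(opinions) > GROUP_SIZE // 2
    # Social projection: every member predicts that the majority
    # shares his or her own opinion.
    correct += sum(opinion == majority_says_yes for opinion in opinions)

print(f"Projection is right {correct / (TRIALS * GROUP_SIZE):.0%} of the time")
```

Under these made-up parameters the projecting guessers turn out to be right roughly three-quarters of the time, comfortably better than the coin flip that guessing with no information would amount to.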

Krueger has taken this thinking a step further, to study the personal and social benefits of such behavior. In doing so, he may have cracked the "prisoner's dilemma," a classic problem in both social psychology and economics. In the prisoner's dilemma, you are asked to imagine yourself alone in a cell, with an unseen companion isolated in a separate cell. You both are under suspicion of having committed a crime together, but the police don't have the evidence to convict you—yet. If you agree to betray your companion by testifying against him and he chooses to remain silent, you will be freed (zero years); if you both rat on each other, you receive a near-maximum sentence of three years. If you remain silent, and your companion does, too, you both receive only minimal time (one year), but if you stay quiet and your companion betrays you, you receive full punishment—five years, the sucker's outcome. Which choice, betrayal or silence, assures you the least time?

+++

Many researchers have assumed that the logical choice is betrayal, since your potential outcomes, depending on what the other prisoner does, are zero or three years—less time on average than the consequences of staying silent (one or five years). Yet when faced with this problem, most laypeople make the illogical choice to remain silent. Why?

The answer, Krueger believes, is that they are employing social projection: They assume that the second prisoner will act the same way they will, and then they incorporate that assumption into the decision-making process. By that reasoning, the choice comes down to mutual betrayal (three years) or mutual cooperation (one year). Cooperation becomes the logical choice.
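The arithmetic behind both lines of reasoning is compact enough to sketch in a few lines of code. The sentences below are the ones quoted above; the framing as two small functions is an illustration, not anything from Krueger's work.

```python
# Sentences in years, as described in the scenario above; less time is better.
SENTENCES = {
    ("betray", "silent"): 0,  # you testify, your companion stays quiet
    ("betray", "betray"): 3,  # you rat on each other
    ("silent", "silent"): 1,  # mutual cooperation
    ("silent", "betray"): 5,  # the sucker's outcome
}

def dominant_move():
    # The classical reasoning: whatever the companion does, betrayal
    # gives the shorter sentence (0 < 1 and 3 < 5), so it dominates.
    for partner in ("silent", "betray"):
        assert SENTENCES[("betray", partner)] < SENTENCES[("silent", partner)]
    return "betray"

def projected_move():
    # Krueger's social projection: assume the companion will choose as
    # you do, then compare only the two mirrored outcomes.
    return min(("betray", "silent"), key=lambda move: SENTENCES[(move, move)])

print(dominant_move())   # "betray": 0 or 3 years, versus 1 or 5 for silence
print(projected_move())  # "silent": 1 year beats 3
```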

The mind-bending part of Krueger's theory is that participants are assuming that other people will act like them before they themselves decide how they are going to act. People don't decide on a strategy and then assume people will act similarly. Rather, they assume similarity and then act on that assumption. Krueger believes this may explain why we do many socially conscious acts, such as taking time to vote even though we know that our individual vote probably won't make a difference. The assumption that people will act like us actually influences our decision to participate.

"The result is that there are higher levels of cooperation in groups where people project their beliefs on others," says Krueger. "The collective good is a by-product of this. In this model there is no conflict between acting selfishly and acting for the public good. The latter comes from the former."

Walking off the Brown campus one evening, I ask Krueger about evil. If human reasoning has all these heretofore unknown positive aspects, how does one account for the horrors on the nightly news? Does social psychology have any hope of really understanding human misbehavior?

I'm not alone in wondering. Commenting on Krueger and Funder's paper, developmental psychologist Michael Maratsos of the University of Minnesota argues that the truly troubling revelation of Milgram's experiment was the extent of conformity and cruelty, "given how little the subjects had at stake." Throughout history, people have willingly done horrible things to avoid punishment or gain status; Maratsos cites foot binding, slavery, and recent corporate scandals as examples. Isn't it reasonable to begin studying humans, as Maratsos does, with the conclusion that people are "basically a disappointment"?

On the subject of morality, Krueger seems uncomfortable. He has talked admiringly, almost longingly, of research on vision, where issues of "good" and "bad" don't apply. No one expresses alarm when a researcher figures out a way to trick our visual perceptions. Visual misperceptions produced in the laboratory are assumed to reveal the mechanisms by which vision functions well in the real world. That isn't so with the science of human interactions.

"I'm not making the case that human behavior is wonderful and is the way it should be," Krueger says at last. "What I'm saying is that the field has been out of balance in pursuing errors and biases, and because of that we don't know as much about either the good or the bad behavior as we should. You can't understand the bad without understanding the good."

As Krueger and I walk, our attention is drawn across the street. A group of high school students is gathered at a bus stop. Suddenly there is a quick movement, some shouting, and a young man jumps up and begins running. We stand there straining to determine what is happening. Are the voices raised in distress? Is the young man running in retreat? My first thought is that I am the subject of a social psychology experiment. I glance at Krueger, then around me. Are there cameras or grad students hidden in the bushes, recording my reaction?

We walk on, but I'm slightly shaken. Should we have done something? We agree that there was nothing to do, but someone apparently thought differently: A police car soon comes by with its siren on. Over dinner, I harangue Krueger with questions. What was his impression of what we saw? How did that situation compare to the classic studies of bystander intervention? All the questions boil down to one: Did I do the right thing?

"Most things that happen to you in the world happen quickly," Krueger says.

"Fortunately, our fast and frugal reasoning tends to serve us very well in the long run. Life is an experiment without a control group. You will never know how your actions would be different had the situation been slightly different. That's why we do experiments. One thing is for certain: You can't carry all the research around and have a bird's-eye view of your own behavior in every moment. You'd break down."

Perhaps, I suggest, there is solace, even absolution, to be gained by viewing human misbehavior in a wider context. "Yes," Krueger says. "I think the next wave of research will take us to a place of greater balance and acceptance. If we come to a more realistic and accurate self-understanding, we may be better able to forgive ourselves and others."

Of course, nothing will stop us from categorizing behavior as right, wrong, good, or evil. But understanding behavior and judging it are two different tasks; the first is scientific, the second is not. When it comes to understanding, it might be more fruitful to approach ourselves with wonderment instead of disappointment.

"I watch my kids," Krueger says, "and even when they are doing something that annoys me, I'm thinking that they are acting just the way they should, as the highly evolved mammal that they are. There is a Zen master who said something like 'Humans are perfect, but they could use a little improvement.' To the Aristotelian mind that idea would be a contradiction; it would be gibberish. To me it has great appeal."
