James Gee, a professor of learning sciences at the University of Wisconsin, was profoundly humbled when he first played a video game for preschool-age kids called Pajama Sam: No Need to Hide When It’s Dark Outside. Gee’s son Sam, then 6, had been clamoring to play the game, which features a little boy who dresses up like his favorite action hero, Pajama Man, and sets off on adventures in a virtual world ruled by the dastardly villain Darkness. So Gee brought Pajama Sam home and tried it himself. “I figured I could play it and finish it so I could help Sam,” says Gee. “Instead, I had to go and ask him to help me.”
Gee had so much fun playing Pajama Sam that he subsequently decided to try his hand at an adult video game he picked at random off a store shelf—an H. G. Wells–inspired sci-fi quest called The New Adventures of the Time Machine. “I was just blown away when I brought it home at how hard it was,” he says.
Gee’s scholarly interest was also piqued. He sensed instantly that something provocative was happening in his mind as he struggled to complete the puzzles of the time machine. “I hadn’t done that kind of new learning since graduate school. You know, as you get older, you kind of rest on your laurels.”
Gee’s epiphany led him to the forefront of a wave of research into how video games affect cognition. Bolstered by the results of laboratory experiments, Gee and other researchers dared to suggest that gaming might be mentally enriching. These scholars are the first to admit that games can be addictive, and indeed part of their research explores how games connect to the reward circuits of the human brain. But they now recognize the cognitive benefits of playing video games: pattern recognition, system thinking, even patience. Lurking in this research is the idea that gaming can exercise the mind the way physical activity exercises the body: It may be addictive because it’s challenging.
All of this, of course, flies in the face of the classic stereotype of gamers as attention deficit–crazed stimulus junkies, easily distracted by flashy graphics and on-screen carnage. Instead, successful gamers must focus, have patience, develop a willingness to delay gratification, and prioritize scarce resources. In other words, they think.
The video game Tetris, among the earliest games to launch the industry, involves falling tile-like tetrominoes that a player must quickly maneuver so they fit into open spaces at the bottom of the screen. In the early 1990s, Richard Haier, a professor of psychology at the University of California at Irvine, tracked cerebral glucose metabolic rates in the brains of Tetris players using positron-emission tomography (PET) scanners. The glucose rates show how much energy the brain is consuming, and thus serve as a rough estimate of how much work the brain is doing. Haier measured the glucose levels of novice Tetris players as their brains labored to usher the falling blocks into correct locations. Then he took their levels again after a month of regular play. Even though the test subjects had improved their game performance by a factor of seven, Haier found that their glucose levels had decreased. It appeared that the escalating difficulty of the game trained the test subjects to manipulate the Tetris blocks mentally with such skill that they barely broke a cognitive sweat completing levels that would have utterly confounded them a month earlier.
Nearly a decade after Haier’s study, Gee hit upon an explanation. He found that even escapist fantasy games are embedded with one of the core principles of learning—students prosper when the subject matter challenges them right at the edge of their abilities. Make the lessons too difficult and the students get frustrated. Make them too easy and they get bored. Cognitive psychologists call this the “regime of competence” principle. Gee’s insight was to recognize that the principle is central to video games: As players progress, puzzles become more complex, enemies swifter and more numerous, and underlying patterns more subtle. Most games don’t allow progress until you’ve reached a certain level of expertise.
To understand why games might be good for the mind, begin by shedding the cliché that they are about improving hand-eye coordination and firing virtual weapons. More than 70 percent of video games contain no more bloodshed than a game of Risk, and are popular because they challenge mental dexterity. The Sims, one of the best-selling game franchises, involves almost no hand-eye coordination or quick reflexes. One manages a household of characters, each endowed with distinct drives and personality traits, each cycling through an endless series of short-term needs (companionship, say, or food), each enmeshed in a network of relationships with other characters. Playing the game is a nonstop balancing act. Even a violent game like Grand Theft Auto involves networks of characters that the player must navigate and master, picking up clues and detecting patterns.
Gee contends that the way gamers explore virtual worlds mirrors the way the brain processes multiple, but interconnected, streams of information in the real world. “Basically, how we think is through running perceptual simulations in our heads that prepare us for the actions we’re going to take,” he says. “By modeling those simulations, video games externalize how the mind works.”
Even if Gee is right and video games are learning machines, one question remains: Do the skills learned in the virtual world translate into the real one?
The answer comes from a slew of recent studies, one of which began when Shawn Green, an ardent gamer then working as a cognitive sciences research assistant, joined University of Rochester cognitive sciences professor Daphne Bavelier on a project investigating visual perception in video game players. On standard tests that measure attention span and information-processing time, Green found that gamers consistently outperformed nongamers. When Green tweaked the tests to make them challenging enough that the gamers wouldn't have perfect scores, the nongamers sometimes performed so poorly that their answers might as well have been random guesses. The researchers addressed an admitted weakness of the study—that visually intelligent people were more likely to be attracted to video games in the first place—by immersing a group of nonplayers for a week in the World War II game Medal of Honor. They found that this group's skills on the standard visual tests improved as well.
Green did the initial research as part of his honors thesis, and after graduation, he and Bavelier continued the study. Nature published the results in May 2003. Since then the pair has also found that gamers can visually track more objects simultaneously than nongamers and that playing video games improves this ability. Their latest research on the visual precision of gamers is forthcoming in Psychological Science and the Journal of Experimental Psychology. Green says his main interest is the brain’s plasticity, but cautiously concedes there may be practical applications to playing video games. “Strong peripheral vision is useful to law enforcement, firefighters, and the military. They need those enhanced skills,” he adds.
The notion that video games can develop abilities that apply to real-world situations has been expressed by many and is increasingly being put to the test. In October 2006 the Federation of American Scientists (FAS) endorsed video games as a potential means for teaching “higher-order thinking skills, such as strategic thinking, interpretive analysis, problem solving, plan formulation and execution, and adaptation to rapid change.” They cited “owners mode,” a component of the video football game Madden, which lets players manage an NFL team, as teaching basic business skills. Team games, such as EverQuest and World of Warcraft, develop cooperation and communication skills that the FAS says are useful in business settings.
A prime example of gaming that tangibly improves professional technique comes from James Rosser, director of the Advanced Medical Technology Institute at Beth Israel Medical Center in New York City. He found that laparoscopic surgeons who played games for more than three hours a week made 37 percent fewer errors than their nongaming peers, thanks to improved hand-eye coordination and depth perception. In November 2006 the Harvard Business School Press published a book by John Beck, who compared three distinct groups of white-collar professionals: hard-core gamers, occasional gamers, and nongamers. His findings contradict nearly all the preconceived ideas about the impact of games. The gaming population turned out to be consistently more social, more confident, and more comfortable solving problems creatively. Gamers also showed no evidence of reduced attention spans compared with nongamers. “It wasn’t surprising that gamers were more competitive, or more strategic, but the social and leadership skills that they exhibit don’t fit the stereotype of a loner in the basement,” Beck says.
The U.S. military has long supported the premise that learning through games can prepare soldiers for the complex, rapid-fire decision making of combat. Since 2002, the Army has offered successive versions of its own game, America’s Army, which lets potential recruits play at everything from boot camp to Special Forces missions. According to the gamemakers at West Point, the purpose of America’s Army is to “give the player an idea of what it’s like for real U.S. Army soldiers to train for duty.” More than 4.5 million registered players have completed the game’s basic training.
In the fall of 2003 two media researchers at the University of Southern California set up a study to look at the patterns of brain activity triggered by violent video games. Peter Vorderer and René Weber booked time on an fMRI machine, loaded a popular game called Tactical Ops on an adjoining computer console, and watched one test subject after another pretend to be part of a Special Forces team trying to prevent a terrorist attack.
Before Vorderer and Weber even looked at any of the brain scans, they were surprised by the behavior of the dozen or so adults who volunteered for the test. Participating in an fMRI study involves lying for extended periods of time in an extremely confined and loud space. Even a mildly claustrophobic person can find the experience intolerable, and most people need a break after 20 minutes. But most of the Tactical Ops players happily stayed in the machine for at least an hour, oblivious to the discomfort and noise because they were so entranced by the game.
The genesis of this reaction may lie in the neurotransmitter dopamine. A number of studies have revealed that game playing triggers dopamine release in the brain, a finding that makes sense, given the instrumental role that dopamine plays in how the brain handles both reward and exploration. Jaak Panksepp, a neuroscientist collaborating with the Falk Center for Molecular Therapeutics at Northwestern University, calls the dopamine system the brain’s “seeking” circuitry, which propels us to explore new avenues for reward in our environment. The game world is teeming with objects that deliver clearly articulated rewards: more life, access to new levels, new equipment, new spells. Much of the crucial work in game interface design revolves around keeping players apprised of the rewards available to them and how urgently those rewards are needed.
If you create a system in which rewards are both clearly defined and achieved by exploring an environment, you’ll find human brains drawn to those systems, even if they’re made up of virtual characters and simulated sidewalks. It’s likely those Tactical Ops players in an fMRI machine were able to tolerate the physical discomfort of the machine because the game environment so powerfully stimulated the brain’s dopamine system.
Of course, dopamine is also involved in the addictiveness of drugs. “The thing to remember about dopamine is that it’s not at all the same thing as pleasure,” says Gregory Berns, a neuroscientist at Emory University School of Medicine in Atlanta, who looks at dopamine in a cultural context in his book, Satisfaction. “Dopamine is not the reward; it’s what lets you go out and explore in the first place. Without dopamine, you wouldn’t be able to learn properly.”
What kind of cognitive skills should we expect to find in the Pokémon generation? Not surprisingly, Gee has a list. “They’re going to think well about systems; they’re going to be good at exploring; they’re going to be good at reconceptualizing their goals based on their experience; they’re not going to judge people’s intelligence just by how fast and efficient they are; and they’re going to think laterally. In our current world with its complex systems that are quite dangerous, those are damn good ways to think.”
Gee’s remarks remind me of an experience I had a few years ago, introducing my 7-year-old nephew to SimCity 2000, the best-selling urban simulator that lets you create a virtual metropolis on your computer, build highways and bridges, zone areas for development, and raise or lower taxes. Based on the player’s decisions, neighborhoods thrive or decline, streets get overrun with traffic or remain wastelands, and criminals prosper or disappear. When I walked my nephew through the game, I gave him only the most cursory overview of the rules; I was mostly just giving him a tour of the city I’d built. But he was absorbing the rules nonetheless. At one point, I showed him a block of rusted, crime-ridden factories that lay abandoned and explained that I’d had difficulty getting this part of my city to come back to life. He turned to me and said, “I think you need to lower your industrial tax rates.” He said it as calmly and as confidently as if he were saying, “I think we need to shoot the bad guy.”
In a 20-minute tour of SimCity, my nephew had learned a fundamental principle of urban economics: Some areas zoned for specific uses can falter if the zone-specific taxes are too high. Of course, if you sat my 7-year-old nephew down in an urban studies classroom, he would be asleep in 10 seconds. But just like those Tactical Ops players happily trapped for an hour in an fMRI, something in the game world had pulled at him. He was learning in spite of himself.
Discover ran an earlier version of this article in 2005.