A Psychologist in Cyberspace
Photograph by Eric Weeks
Until the advent of modern technology, our closest bonds were with our fellow humans. These days, millions of people develop close relationships with robot dogs, Tamagotchi toys, and virtual characters and kingdoms they encounter or create on the Internet. Sherry Turkle, a psychologist who directs the Initiative on Technology and Self at the Massachusetts Institute of Technology, has spent decades studying the intimate bonds we form with our artifacts and how they shape who we are. She shared her insights into "cyber-analysis" with senior associate editor Josie Glausiusz.
What first triggered your interest in the ways humans bond with machines? I came to MIT in 1976 to teach the sociology of science and saw the intensity and passion of my students' relationships with computers. They used computer metaphors to think about their minds, like "debugging a problem." I realized that the computer is a very evocative object, and the idea of studying the technological world opened up for me.
Why are people so eager to imbue inanimate objects, such as robots and computers, with human emotions? From the earliest stages of life, we have a very profound need to connect. Infants experience themselves as if the objects in the world are part of them and they are part of the objects in the world. These objects, such as Linus's baby blanket or a teddy bear, are perceived as being imbued with the self. A computer, too, can evoke in its users a sense of connection and personality.
Why would anyone experience deep feelings for an object like a robot? Toys like the Tamagotchi—those little egglike digital toys that need you to feed them and clean up after them—ask for nurturance. By doing so, they push a very profound button in us. As a species, we're programmed to attach to the things we take care of and that blossom under our care. People don't love their plants or talk to them simply because they feel a connection with plants in general; they do it because those are the plants they nurture.
Could we ever come to care for robots in the way we care for friends or siblings? We do care for robots. People who have Sony AIBO robot dogs are teaching them tricks, are thinking about their welfare, are bonded with them, and want to take them places. Children who have created pets online are worrying about those pets, and whether they've taken care of them, and whether they should get them a new coat. But it's a different kind of caring. Just because you have a robot dog doesn't mean that a biological dog should lose all of its allure.
Is it possible to reverse this psychology? For instance, could robots someday nurture us? A lot of people are very excited about the use of robots for the elderly. They see this as a humanitarian application of robotics, to help people who need companionship and to give them their medicine, take their blood pressure, and so on. To me, it's important to study not just what kind of new relationship an elderly person has with the robot but what that relationship is doing to us as people. It used to be very meaningful for children to hang out with their grandmothers. The danger is that we may legitimize taking some of those interactions out of the human experience.
Where else do you imagine that robots might replace people? We already seem to have decided that we want robots to be our cash machines. Are we going to have robots as nurses and nannies? Are we going to have robots as our soldiers? Because it's a different world if war is not about killing people but rather about machines in combat. At a certain point, one could imagine that the machines would be fighting other machines. So the moral, ethical, and human dimensions of what war is about would change.
Would you want to live in a world in which intelligent robots were integrated into our lives? Intelligent robots are going to be an integral part of our lives. The question is what roles they are going to have in the human life cycle, and what aspects of that life cycle we are going to preserve as a "sacred space" that belongs to humans. Nurturing children is something we would want to aggressively keep for ourselves. If you give your child a robot nanny, or robotic dolls as playmates, that does something to you as well as to your child. But there are some things, like ATMs, that are better than a person. Maybe we'd rather have robots collecting the garbage, and certainly working in dangerous environments.
Has society been changed by our interactions with the Internet? Definitely. The ability to join online communities, or to play out aspects of the self different from what your physical self permits, has profoundly changed what is available to the human psyche. One of my students formed a friendship on the Internet with a person who turned out to be profoundly physically impaired. Certain aspects of that person's self—the vivaciousness, the sense of exploration, of risk-taking—would not have had an opportunity to express themselves without the sociability the Internet provides.
Do you worry about the potential for dishonesty on the Internet? It's a place where people experiment with identity. Medieval times had festivals and fairs for that kind of play. As long as we know that it's a space for that kind of play—that somebody calling themselves "fabulous hot babe" might be an 80-year-old guy in a nursing home in Miami—it's good. Now, you don't want that on the site where your American Express card is processed. As long as we keep these spaces separate, I think the Internet as a place for identity play is good.
Have you ever adopted an alternative identity online? I've experimented with being a man and seen how people responded to me differently. I found it quite a fascinating exercise. One of the things that a lot of women notice in virtual communities is that if you're a man, people stop offering to help you—especially when there is a lot of technical stuff to do.
Can people become too attached to their computers? There are two completely separate issues. One is the computer itself, and the second is what's on the computer. If you told me that you were writing a novel, and that you were thrilled and excited and gratified because you're so lucky that you can spend seven, eight, nine hours a day working on your novel on the computer, I'd say, "Hey, Josie, send me the first draft; I'm thrilled for you." Now, if you told me you're spending that time engaged in violent, aggressive mind games, I'd say, "Well, why are you doing that?" The fact that you're doing it on a computer is the last thing on my mind.