Even Trash Can Robots Need Social Skills

Lovesick Cyborg
By Jeremy Hsu
May 8, 2015 (updated Nov 19, 2019)


Credit: Stanford University

Pity the trash can robot. When it tried to offer its services as a waste receptacle in a Stanford University cafeteria, some people pointedly ignored the robot despite its attempts to get their attention. One person even gave the trash robot a kick to move it along. Unlike the protocol droid C-3PO from "Star Wars," the trash can robot took the abuse in stride rather than blurting out "How rude!"

The trash robot was part of a Stanford University experiment designed to test how people interact with robots in a more natural setting outside the lab. Such information could prove valuable as designers try to create more sophisticated robots capable of reading human social signals. A kick from a person is an obvious social signal to "go away." But the Stanford researchers, working with a colleague from the University of Southern Denmark, found that most people who didn't want the robot's services signaled their lack of interest by avoiding social interaction with the robot entirely.

"We are particularly interested in how people will behave when they encounter robots 'in the wild' as they go about their daily activities; what they do to signal or interact with the robot, and how they make sense of the interaction," said Wendy Ju, executive director of Interaction Design Research at Stanford University and a coauthor of the paper.

That means robots of the future may need the artificial intelligence to interpret social signals, or the lack of them, in order to interact well with people. The experiment was detailed in a paper published online from the Association for the Advancement of Artificial Intelligence 2015 Spring Symposia.

Will Wiggle for Trash

The robot in this particular case was just a remote-controlled trash can sitting on top of an iRobot Create mobile base. Two cameras hidden in the trash can's handles provided video and audio to a researcher who used a laptop to control the robot's movements. The researchers weren't testing the robot's intelligence; they were using it to gauge how people reacted as it offered its services.

A public cafeteria on the Stanford University campus provided the setting for the robot and the unknowing human participants over the course of an hour and a half. That single afternoon's worth of cafeteria observations, described in the paper, was just one part of a larger, multi-afternoon study. The paper describes 26 interactions in which the trash can robot approached people sitting alone and in groups.

The robot tried out two behaviors. In the first, it approached to within about five feet, stopped and wiggled; if the person or people seemed inviting, the robot would drive closer, and if anyone threw trash into it, it responded by wiggling again. In the second, more direct approach, the robot simply drove up to the table where people were sitting, sometimes aiming to bump into a chair.
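The paper doesn't publish the operator's control code, and in any case a human behind the cameras was making the judgment calls live. Still, the two behaviors are concrete enough to sketch. What follows is a minimal, hypothetical Python sketch of that protocol; the class and method names, and the 1.5-foot close-approach distance, are illustrative assumptions rather than anything from the study.

# Hypothetical sketch: the trash can robot's two approach behaviors.
# The real robot was teleoperated, so a researcher made these calls in
# real time; this just encodes the protocol described in the paper.
# All names are illustrative assumptions.

class MobileBase:
    """Stand-in for the iRobot Create base; prints the commands it receives."""

    def drive_toward(self, target, stop_at_ft):
        print(f"driving toward {target}, stopping {stop_at_ft} ft away")

    def turn(self, degrees):
        print(f"turning {degrees} degrees")


class TrashBotController:
    APPROACH_DISTANCE_FT = 5.0  # stop-and-wiggle distance from the study

    def __init__(self, base):
        self.base = base

    def wiggle(self):
        # Quick left-right rotation, used as a greeting and as a thank-you
        # whenever someone deposited trash.
        for angle in (15, -30, 15):
            self.base.turn(angle)

    def indirect_approach(self, target, invited):
        # Behavior 1: stop about five feet away, wiggle, and close the
        # remaining distance only if the person seems inviting (a judgment
        # the hidden operator made from the camera feeds).
        self.base.drive_toward(target, stop_at_ft=self.APPROACH_DISTANCE_FT)
        self.wiggle()
        if invited:
            self.base.drive_toward(target, stop_at_ft=1.5)  # assumed distance

    def direct_approach(self, target):
        # Behavior 2: drive straight up to the table, sometimes nudging a chair.
        self.base.drive_toward(target, stop_at_ft=0.0)

    def on_trash_received(self):
        self.wiggle()


if __name__ == "__main__":
    bot = TrashBotController(MobileBase())
    bot.indirect_approach("table near the window", invited=True)
    bot.on_trash_received()

Run as-is, the stub base just prints the drive and turn commands; on real hardware those calls would map onto the Create's motion commands.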

To Ignore or Not Ignore

The encounters from the experiment provided some useful lessons about how future service robots might need to interpret human social behaviors. People who had trash to throw away gave plenty of social signals, such as looking at the robot, gesturing, smiling and sometimes waving the trash in question. But people who didn't want the robot's services usually avoided giving any social signals at all. Many refused to turn their bodies or even their heads in acknowledgment of the robot's presence. One person made a point of looking in the opposite direction.

Those people showed their disinterest in much the same way they would with another person, said Stephen Yang, a Ph.D. student in engineering at Stanford University and coauthor of the paper. But he added that some people pretended to focus on something else while keeping the robot in the corner of their eye. Others only looked at the robot once it had turned away to face a different direction. That suggests the people were more interested in the robot than they let on. Yang shared one example that was not in the paper:

For example, there was an undergraduate who very nonchalantly interacted with the trash barrel by tossing her trash into the trash barrel and then turning back to ‘work.’ As the robot pulled away and turned the other direction, she takes out her cell phone and starts recording the robot. When the robot suddenly swivels around to face her again, she immediately points her phone back towards the ground and pretends to be just casually using it. I would go as far as to say she seemed embarrassed to be ‘caught’ by the robot.

One of the more uncommon reactions came from a lone student who acknowledged the robot in a friendly manner without having any trash to throw away. She looked between her laptop and the approaching robot, smiling, before saying "no trash" very slowly. Most of the people who ignored the robot were sitting alone in the cafeteria. By comparison, the robot received a much more favorable response from groups of people sitting in the cafeteria. But even the groups of people sitting together had to first check amongst themselves — through talking, shared laughs and glances — to decide whether or not they would respond to the robot.

Lessons for Future Robots

Such group actions reflect the psychological urge to seek "social proof" from other people in order to confirm the correct behavior in a given situation, said Kerstin Fischer, an associate professor at the Institute for Design and Communication at the University of Southern Denmark and lead author of the paper. She explained that people can feel a lot of social insecurity when faced with the unfamiliar situation of interacting with a robot. That might explain why individuals facing the trash can robot by themselves usually chose to ignore it; they didn't have anyone else to turn to for social cues about how to behave.

But that's not the only complication in human-robot interactions. Past studies suggest that robots can have a tough time getting people's attention when people don't fully understand the robot's function or intentions, when they are already busy, and when they have nobody else to turn to for social proof, Fischer said. The Stanford group tried to eliminate some of those complications by giving the trash can robot a very straightforward design and testing it in a situation where people would genuinely need its services and might not have much else to do.

The bigger lesson of the experiment is that service robots will need some way to interpret human behavior and social cues. That kind of social intelligence could make the difference between a helpful robot and an annoying one. "This study shows the significant amount of signaling and monitoring that machines need to do to determine whether people are willing to engage in interactions," Ju said. "The robot can only interpret a lack of social signals if there is a model of the social signals that the robot is looking for in the context the robot is deployed in. The social signals themselves are very overt and not difficult for a robot to detect."
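Ju's point, that an absent signal only means something against a model of the signals you expected, can be made concrete with a toy example. The sketch below is purely illustrative and not from the paper: it scores the cues the study reports (gaze, gestures, smiles, waved trash), and treats the absence of all of them as the active disinterest most solo diners displayed. The cue names and weights are invented for illustration.

# Toy illustration (not from the paper): a lack of social signals only
# becomes meaningful against an explicit model of the signals expected
# in this context. Cue names and weights are invented assumptions.

from enum import Enum


class Decision(Enum):
    APPROACH = "approach"
    WAIT = "wiggle and wait"
    WITHDRAW = "withdraw"


# Cues the study reports people used to invite the robot's service.
ENGAGEMENT_CUES = {
    "gaze_at_robot": 1.0,
    "gesture": 1.5,
    "smile": 1.0,
    "waving_trash": 3.0,
}


def interpret(observed_cues):
    """Score observed cues against the expected-signal model."""
    score = sum(ENGAGEMENT_CUES.get(cue, 0.0) for cue in observed_cues)
    if score >= 3.0:   # strong invitation, e.g. waving trash
        return Decision.APPROACH
    if score > 0.0:    # some acknowledgment: linger and signal back
        return Decision.WAIT
    # No modeled signals at all: in this context, read it as disinterest.
    return Decision.WITHDRAW


if __name__ == "__main__":
    print(interpret({"waving_trash"}))   # Decision.APPROACH
    print(interpret({"gaze_at_robot"}))  # Decision.WAIT
    print(interpret(set()))              # Decision.WITHDRAW

The design point is the last branch: an empty cue set is not "no information." Given a model of what engagement looks like in a cafeteria, silence reads as a polite "go away."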
