(Credit: Madeline Gannon/Instructables) You don't need to be Tony Stark to have a robot assistant anymore. During a fellowship at Autodesk’s Pier 9 workshop, Madeline Gannon, a Ph.D. candidate at Carnegie Mellon University, combined motion capture technology with a robotic arm to create an interactive system that reads human motions and responds to them accordingly. In the video below, watch a robotic arm observe and follow Gannon, turning and flexing to match her movements like a charmed snake.
Opening Its Eyes
The robot “sees” with a Vicon motion camera system paired with reflective wearable markers that tell the robot where to look. Quipt, the control software Gannon developed, acts as an interpreter, translating human motions into instructions for the robot. The actions themselves come from Robo.Op, an open-source library created by Gannon that contains movement commands. Quipt also includes an Android app that provides a continuous readout of its commands so controllers can stay updated. The robot arm works with humans through a few basic modes of operation. It can follow by tracking, say, a human hand while remaining a set distance away. It can also mirror movements: turning to look where a person is looking, for example. The robot can also be told to avoid markers, moving to keep a certain distance away at all times.
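To make the three modes concrete, here is a minimal sketch of how tracked marker positions could be translated into robot targets. This is not Quipt's actual API; the function names, the one-dimensional simplification, and all parameters are illustrative assumptions.

```python
# Conceptual sketch (assumed names, 1-D positions for simplicity):
# turning a tracked marker position into a robot-arm target for the
# three behaviors described above.

def follow(marker_pos, standoff):
    """Track the marker while holding a set standoff distance behind it."""
    return marker_pos - standoff

def mirror(marker_pos, axis_origin):
    """Reflect the marker's position about a fixed point, so the arm
    moves opposite the person, like a mirror image."""
    return 2 * axis_origin - marker_pos

def avoid(robot_pos, marker_pos, min_dist):
    """Retreat whenever the marker comes within min_dist of the robot;
    otherwise hold the current position."""
    gap = robot_pos - marker_pos
    if abs(gap) >= min_dist:
        return robot_pos  # already far enough away
    direction = 1 if gap >= 0 else -1
    return marker_pos + direction * min_dist
```

In a real system these targets would be recomputed every time the motion-capture system reports a new marker position, so the arm continuously re-plans as the person moves.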
Madeline Gannon plays with her robot. (Credit: Madeline Gannon/Instructables) Ultimately, the program allows robots to respond directly to human actions instead of the typed commands most robots rely on today. Preprogrammed command sets are helpful on an assembly line where the job is repetitive, but it's far more difficult to preprogram a robot to thrive in a dynamic environment, such as a construction site. Because Quipt constantly receives inputs in the form of human motion, robots can adapt on the fly to stimuli in their surroundings. This could conceivably allow a robot to be employed anywhere a human needs to use a tool.
A New Frontier for Robotics
While movement-tracking technology is by no means revolutionary, Quipt significantly extends the functionality of interactive robots. By pairing motion inputs with a large library of commands, robots can move and interact with more freedom than before, opening up a range of new tasks. And by making the technology open-source, Gannon hopes to stimulate interest in a field hampered by prohibitively high costs of entry — robots are expensive and require specialized knowledge to program and use.
An app allows Madlab to control the robot from a smartphone. (Credit: Madlab.cc) With her design studio Madlab, Gannon is working to open up the field of robotics to designers and artists without requiring an engineering degree. She states on the Madlab website:
“Adapting [Industrial Robots] for uses outside of manufacturing domains can change industrial robots from adversaries into collaborators. Instead of developing ways for IR’s to replace human labor, a collaborative model can create ways for IR’s to augment, amplify and extend human capabilities and creativity.”
Madlab's projects include an app that lets users control a robot with a swipe of a finger, and another that instructs a robot to copy shapes drawn on a screen. Another featured project is a robot arm that rocks a cradle, responding to a baby's cries with more vigorous motion. If robots are going to share our workspaces, we're going to have to teach them to act a bit more human.