Last night's Terminator: The Sarah Connor Chronicles brought into the foreground an idea that has been floating around in the background of the Terminator franchise for some time: that the flesh-and-blood bodies surrounding terminator exoskeletons are based on real people. In the future, a young woman called Allison Young falls into the hands of Skynet, and given that she looks exactly like the terminator Cameron, we have a fair idea of how things are going to turn out for her.

In the real world, how close are we to creating not just a generic android, but a doppelgänger of a specific individual? In terms of physical appearance, we're already there, assuming the duplicate is allowed to sit down and avoid expansive hand gestures. Creating even robot-looking robots that can walk and gesture like a human is still a tall order, but solid progress is being made. Making a robot that looks like a given human is well within the gift of any halfway decent prop-making company; after all, Andy Warhol commissioned a look-alike robot back in the '80s.

Of course, making something that looks like a specific person act like that person is the really big problem. Currently, animatronic heads designed to mimic an individual's mannerisms rely on skilled puppeteers to produce believable results. For a machine to act well enough to fool a friend of the original human for more than a few moments probably requires solving the problem of creating true artificial intelligence in the first place (which, within the Terminator universe, is a given). However, you might get some of the way there by borrowing one of the principles behind today's chat bots: weave together, more or less at random, various canned gestures copied from the original subject, much as video game makers do when they capture movements from real athletes to make games like Madden NFL. A rough sketch of the idea follows below.
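Here is a minimal Python sketch of that canned-gesture approach, purely for illustration: the clip names, durations, and compatibility rules are all invented. A library of short motion clips captured from the subject gets chained together at random, with each clip only allowed to follow clips it plausibly blends with.

```python
import random

# Hypothetical library of motion clips captured from the original subject.
# "follows" lists which clips this one may come after, so transitions look smooth.
GESTURE_CLIPS = {
    "idle_blink": {"duration": 1.2, "follows": {"idle_blink", "head_tilt", "shrug"}},
    "head_tilt":  {"duration": 0.8, "follows": {"idle_blink", "smile", "shrug"}},
    "smile":      {"duration": 1.5, "follows": {"idle_blink", "head_tilt"}},
    "shrug":      {"duration": 1.0, "follows": {"idle_blink"}},
}

def weave_gestures(total_seconds, start="idle_blink"):
    """Randomly chain compatible clips until the requested duration is filled."""
    sequence = [start]
    elapsed = GESTURE_CLIPS[start]["duration"]
    current = start
    while elapsed < total_seconds:
        # Only consider clips whose rules allow them to follow the current clip.
        candidates = [name for name, clip in GESTURE_CLIPS.items()
                      if current in clip["follows"]]
        current = random.choice(candidates)
        sequence.append(current)
        elapsed += GESTURE_CLIPS[current]["duration"]
    return sequence

if __name__ == "__main__":
    # Generate roughly ten seconds of "behavior" from the canned clips.
    print(weave_gestures(10.0))
```

It wouldn't fool anyone for long, but it captures the spirit of the chat-bot trick: no understanding, just a big enough library of recorded behavior stitched together plausibly.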