Webcams and video-conferencing software like Skype have really enhanced the way we communicate with friends, family, and colleagues around the world. Even so, talking to a floating head on a computer screen can still feel pretty cold, and it doesn't look like we're going to get a teleportation device any time soon (le sigh). However, researchers at Stanford University are hoping to make that interaction a little more lifelike with a computer that can mimic human motions.
David Sirkin and Wendy Ju of Stanford's Center for Design Research created a motorized flat-screen display that mimics human gestures like shrugging, nodding, and laughing. They built it by adding motors to an Apple iMac G4 and linking it to software that reads a person's movements and instructs the G4's movable arm to perform one of nine motions.
These motions -- nodding up and down for yes, shaking side to side for no, and leaning in and out, among others -- are controlled by the user with a Wii game controller. The team also added a robotic arm for extra effects, such as tapping on the table to get someone's attention.
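To picture how a setup like this might wire controller input to the display's motions, here's a minimal sketch in Python. The gesture names, button mapping, and `perform` function are all illustrative assumptions on my part, not Stanford's actual software -- the article only tells us there are nine motions driven by a Wii controller.

```python
# Hypothetical sketch: dispatching controller button presses to one of
# the display's gestures. Names and mappings are assumptions for
# illustration, not the real Stanford implementation.

GESTURES = {
    "nod": "tilt the screen up and down (yes)",
    "shake": "swivel the screen side to side (no)",
    "lean_in": "pivot the screen toward the viewer",
    "lean_out": "pivot the screen away from the viewer",
    "tap": "tap the robotic arm on the table",
}

# Assumed button-to-gesture mapping for a Wii-style controller.
BUTTON_MAP = {
    "UP": "nod",
    "LEFT": "shake",
    "A": "lean_in",
    "B": "lean_out",
    "HOME": "tap",
}

def perform(button: str) -> str:
    """Translate a controller button into a motion command string."""
    gesture = BUTTON_MAP.get(button)
    if gesture is None:
        return "idle"  # unmapped buttons do nothing
    return GESTURES[gesture]

print(perform("UP"))  # tilt the screen up and down (yes)
```

In a real system, the returned command would drive the motors on the display's arm rather than just print a string.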
I like the idea in theory, though I have my doubts about how much more personal conversations will be with the added motions, especially when you hear all those mechanical sounds. However, Sirkin and Ju said their project got a warm reception at last month's Human-Robot Interaction conference in Boston, as people thought the "consistency between physical and onscreen action improved understanding of the messages that remote participants communicated" and made the caller seem more involved and friendly.
What do you guys think?
(Via New Scientist)