Robotic Body Language
The ins and outs of body language make interacting with another person complex enough. But what if your counterpart isn't a person at all? In recent years, we have been dealing more and more with intelligent robots, which now provide services ranging from military training to medical care. Though robots' faces can be made to look human, improving their expressions, gestures, and ability to read body language remains a key challenge.
In 2009, researchers at Carnegie Mellon University conducted a revealing study. In the room were a robot and a table with several objects on it. People asked the robot yes/no questions to determine which object it was going to choose. When the robot gave no visual cues, it took an average of 5.5 questions for people to guess the correct item. But when the robot's eyes occasionally glanced at the item, people needed only five questions. Tellingly, the eye movement made a difference only when a humanoid robot called Geminoid was involved; it made no difference when a non-humanoid robot was used.
In addition to being more expressive, robots can also benefit from knowing how to read body language. One application where this can help is Intelligent Tutoring Systems (ITS). ITS robots are used as private tutors, teaching assistants, and training specialists. Scientists are working on ways to make them more "affect-aware," meaning they can judge whether a person is interested, bored, or tired. Researchers at UC San Diego's Machine Perception Lab have already taught an ITS to recognize smiles. Over time, as robots learn to recognize more emotions, they'll become better at their tasks.