Teaching Robots to Move Like Humans

March 6, 2011

When people communicate, the way they move conveys as much as the words that come out of their mouths. But what about when robots communicate with people? How can robots use non-verbal communication to interact more naturally with humans? Researchers at the Georgia Institute of Technology found that when robots move in a more human-like fashion, with one movement leading into the next, people can not only better recognize what the robot is doing, but also better mimic it themselves. The research is being presented today at the Human-Robot Interaction conference in Lausanne, Switzerland.

“It’s important to build robots that meet people’s social
expectations because we think that will make it easier for people to understand
how to approach them and how to interact with them,” said Andrea Thomaz, assistant
professor in the School of Interactive Computing at Georgia Tech’s College of
Computing.

Thomaz, along with Ph.D. student Michael Gielniak, conducted a study asking how easily people could recognize what a robot was doing just by watching its movements.

“Robot motion is typically characterized by jerky movements,
with a lot of stops and starts, unlike human movement which is more fluid and
dynamic,” said Gielniak. “We want humans to interact with robots just as they
might interact with other humans, so that it’s intuitive.”

Using a series of human movements recorded in a motion-capture lab, the researchers programmed their robot, Simon, to perform them. They then optimized that motion so that more joints moved at the same time and individual movements flowed into one another, in an attempt to make it more human-like. Finally, they asked their human subjects to watch Simon and identify the movements he made.
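The article does not spell out the optimization itself, but the idea of one movement flowing into the next can be sketched as a simple cross-fade between consecutive motion-capture segments. The snippet below is only an illustrative assumption, not the researchers' actual algorithm; the function name, overlap window, and toy two-joint trajectories are all hypothetical.

```python
# Illustrative sketch only: blend two recorded motion segments so one
# gesture flows into the next instead of stopping and restarting.
import numpy as np

def blend_segments(seg_a, seg_b, overlap=20):
    """Cross-fade the last `overlap` frames of seg_a into the first
    `overlap` frames of seg_b so the joints keep moving through the
    transition instead of pausing between gestures."""
    overlap = min(overlap, len(seg_a), len(seg_b))
    # Linear cross-fade weights: seg_a fades out as seg_b fades in.
    w = np.linspace(0.0, 1.0, overlap)[:, None]
    transition = (1.0 - w) * seg_a[-overlap:] + w * seg_b[:overlap]
    return np.vstack([seg_a[:-overlap], transition, seg_b[overlap:]])

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 100)[:, None]
    # Two toy "gestures" for a hypothetical two-joint arm (angles over time).
    wave = np.hstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    point = np.hstack([0.5 * t, 1.0 - t])
    motion = blend_segments(wave, point, overlap=25)
    print(motion.shape)  # (175, 2): one continuous trajectory, no dead stop
```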

“When the motion was more human-like, human beings were able
to watch the motion and perceive what the robot was doing more easily,” said
Gielniak.

In addition, they tested the algorithm they used to create
the optimized motion by asking humans to perform the movements they saw Simon
making. The thinking was that if the movement created by the algorithm was
indeed more human-like, then the subjects should have an easier time mimicking
it. Turns out they did.

“We found that this optimization we do to create more
life-like motion allows people to identify the motion more easily and mimic it
more exactly,” said Thomaz.

The research Thomaz and Gielniak are doing is part of a broader effort to get robots to move more like humans do. In future work, the pair plan to look at how to get Simon to perform the same movements in varied ways.

“So, instead of having the robot move the exact same way
every single time you want the robot to perform a similar action like waving, you
always want to see a different wave so that people forget that this is a robot
they’re interacting with,” said Gielniak.
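As a rough illustration of that goal, one simple way to make each wave look slightly different is to perturb a base joint trajectory with a smooth, low-amplitude random offset. This is an assumption for illustration, not the researchers' planned method; the function name, envelope, and amplitude are hypothetical.

```python
# Illustrative sketch only: generate small, smooth variations of a gesture.
import numpy as np

def vary_gesture(base, amplitude=0.1, rng=None):
    """Return a variant of `base` (frames x joints) by adding a smooth
    random offset that is zero at the start and end, so each playback
    looks slightly different but begins and finishes in the same pose."""
    rng = np.random.default_rng() if rng is None else rng
    frames, joints = base.shape
    envelope = np.sin(np.linspace(0.0, np.pi, frames))[:, None]
    # Per-joint random scale gives every playback its own character.
    offset = amplitude * envelope * rng.uniform(-1.0, 1.0, size=(1, joints))
    return base + offset

if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 100)[:, None]
    wave = np.hstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
    for _ in range(3):
        # Mid-gesture pose differs slightly on every run.
        print(vary_gesture(wave)[50])
```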

Video: Teaching Robots to Move Like Humans