
Researchers Create Autonomous Robot That Mimics Human Expressions

Image: Creative Machines Lab

Researchers in the Creative Machines Lab at Columbia Engineering have developed a new autonomous robot called EVA, which has a soft and expressive face that matches the expressions of nearby humans. 

The new research is set to be presented on May 30 at the ICRA conference. The blueprints for the robot are open-sourced on HardwareX.

Hod Lipson is the James and Sally Scapa Professor of Innovation (Mechanical Engineering) and director of the Creative Machines Lab.

“The idea for EVA took shape a few years ago, when my students and I began to notice that the robots in our lab were staring back at us through plastic, googly eyes,” said Lipson.

Humanizing Robots

According to Lipson, he noticed a similar trend in grocery stores, where he observed restocking robots that were humanized with things like name badges.

“People seemed to be humanizing their robotic colleagues by giving them eyes, an identity, or a name,” he said. “This made us wonder, if eyes and clothing work, why not make a robot that has a super-expressive and responsive human face?”

Roboticists have been working toward convincing robotic faces for a long time, but the task is not easy. Robotic body parts have traditionally been made of metal or hard plastic, which are too stiff to move the way human tissue does. Robotic hardware has also often been difficult to work with, given the number of circuits, sensors, and heavy, power-hungry motors involved.

Video: Smile Like You Mean It: Driving Animatronic Robotic Face with Learned Models

Constructing EVA

The project began several years ago in Lipson’s lab, when undergraduate student Zanwar Faraj led a team of students to build the robot’s physical “machinery.” EVA was first constructed as a disembodied bust, and the robot can now express the six basic emotions: anger, disgust, fear, joy, sadness, and surprise. EVA can also produce more nuanced expressions using artificial “muscles” in the form of cables and motors. These artificial muscles pull on specific points on EVA’s face, enabling the robot to mimic the movements of the more than 42 tiny muscles attached to the skin and bones of the human face.
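To make the cable-and-motor idea concrete, here is a minimal sketch of how expression presets for a cable-driven face might be commanded and blended. The motor names and pull values are hypothetical placeholders, not taken from the EVA hardware:

```python
# Minimal sketch (not the EVA codebase): commanding a cable-driven face.
# Motor names and position values below are hypothetical placeholders.
import numpy as np

# Each "muscle" is a cable wound on a motor; a normalized position in [0, 1]
# sets how far that cable pulls its anchor point on the soft face.
MOTORS = ["brow_left", "brow_right", "lid_upper", "cheek_left",
          "cheek_right", "lip_corner_left", "lip_corner_right", "jaw"]

# Hypothetical expression presets: one normalized pull per motor.
EXPRESSIONS = {
    "joy":      np.array([0.2, 0.2, 0.3, 0.8, 0.8, 0.9, 0.9, 0.4]),
    "surprise": np.array([0.9, 0.9, 0.9, 0.3, 0.3, 0.2, 0.2, 0.8]),
    "sadness":  np.array([0.6, 0.6, 0.5, 0.1, 0.1, 0.0, 0.0, 0.2]),
}

def blend(a: str, b: str, t: float) -> np.ndarray:
    """Linearly interpolate between two presets for an in-between expression."""
    return (1 - t) * EXPRESSIONS[a] + t * EXPRESSIONS[b]

if __name__ == "__main__":
    target = blend("joy", "surprise", 0.5)   # a "delighted" in-between face
    for motor, pos in zip(MOTORS, target):
        print(f"{motor:>18s}: pull {pos:.2f}")
```

Blending hand-tuned presets like this is only a stand-in for the nuanced expressions described above; as the next sections explain, the real robot learns its mapping from data rather than relying on fixed tables.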

“The greatest challenge in creating EVA was designing a system that was compact enough to fit inside the confines of a human skull while still being functional enough to produce a wide range of facial expressions,” Faraj said.

To address this, the researchers used 3D printing to create parts with complex shapes that could be integrated seamlessly with EVA’s skull.

“I was minding my own business one day when EVA suddenly gave me a big, friendly smile,” Lipson said. “I knew it was purely mechanical, but I found myself reflexively smiling back.”

Programming the AI 

The team then moved on to programming the AI that drives EVA’s facial movements. The robot relies on deep learning to “read” and then mimic the expressions of nearby humans, and EVA learns by trial and error while watching videos of itself.
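In data terms, “watching videos of itself” amounts to pairing motor commands with camera frames of the robot’s own face. The sketch below illustrates that self-observation step under stated assumptions; the FaceRig and Camera classes are hypothetical stand-ins, not the lab’s actual interfaces:

```python
# Minimal sketch (not the authors' pipeline) of the self-observation step:
# drive the face with random motor commands and log what the camera sees.
# `FaceRig` and `Camera` are hypothetical stand-ins for the real hardware.
import numpy as np

class FaceRig:
    """Hypothetical interface to the cable-driving motors."""
    def __init__(self, n_motors: int = 8):
        self.n_motors = n_motors
    def set_positions(self, cmd: np.ndarray) -> None:
        pass  # would send normalized cable positions to the motor controllers

class Camera:
    """Hypothetical camera pointed at the robot's own face."""
    def grab(self) -> np.ndarray:
        return np.zeros((64, 64, 3), dtype=np.uint8)  # placeholder frame

def collect_self_dataset(steps: int = 10_000):
    rig, cam = FaceRig(), Camera()
    commands, frames = [], []
    for _ in range(steps):
        cmd = np.random.rand(rig.n_motors)   # random "trial" expression
        rig.set_positions(cmd)
        commands.append(cmd)                 # the "error" signal comes later,
        frames.append(cam.grab())            # when a network pairs cmd with frame
    return np.stack(commands), np.stack(frames)

if __name__ == "__main__":
    cmds, imgs = collect_self_dataset(steps=100)
    print(cmds.shape, imgs.shape)  # (100, 8) (100, 64, 64, 3)
```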

According to Boyuan Chen, Lipson’s PhD student who led the software phase, EVA’s facial movements were too complex to be governed by pre-defined rules, so the team used several deep neural networks to create EVA’s brain.

The robot’s brain first needed to learn to use its complex system of mechanical muscles to generate facial expressions, and then it had to learn how to “read” human faces in order to mimic them.

The team filmed EVA making random faces for hours, and EVA’s internal neural network then learned to pair muscle motion with the video footage, giving the robot a sense of how its own face worked. A second network was used to match EVA’s self-image with the image of a human face, and after several refinements, EVA could read human facial gestures from a camera and respond by mimicking the person’s expression.
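As a rough illustration of that two-network split, here is a minimal PyTorch sketch. The architectures, sizes, and the assumption that motor commands can be regressed directly from frames are illustrative choices, not the published model:

```python
# Hedged sketch of the two-network idea described above: net_self pairs EVA's
# own camera frames with the motor commands that produced them; net_mimic maps
# a human face image onto that same command space so the robot can reproduce
# the expression it sees. All sizes and details here are assumptions.
import torch
import torch.nn as nn

N_MOTORS = 8          # assumed number of cable motors
IMG = 64              # assumed square frame size

def small_cnn(out_dim: int) -> nn.Sequential:
    """Tiny convolutional encoder: face frame -> vector."""
    return nn.Sequential(
        nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
        nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
        nn.Flatten(),
        nn.Linear(32 * (IMG // 4) ** 2, out_dim),
    )

# Network 1: "self-model" -- infer which motor commands made this self-frame.
net_self = small_cnn(N_MOTORS)
# Network 2: "mimicry" -- map a *human* face frame into EVA's command space.
net_mimic = small_cnn(N_MOTORS)

def train_self_model(frames, commands, epochs=5):
    """Supervised pairing of self-footage with the random commands behind it."""
    opt = torch.optim.Adam(net_self.parameters(), lr=1e-3)
    for _ in range(epochs):
        pred = torch.sigmoid(net_self(frames))
        loss = nn.functional.mse_loss(pred, commands)
        opt.zero_grad(); loss.backward(); opt.step()

@torch.no_grad()
def mimic(human_frame: torch.Tensor) -> torch.Tensor:
    """At run time: camera frame of a person -> motor commands to send."""
    return torch.sigmoid(net_mimic(human_frame.unsqueeze(0))).squeeze(0)

if __name__ == "__main__":
    # Stand-in data shaped like the recordings described in the article.
    frames = torch.rand(32, 3, IMG, IMG)
    commands = torch.rand(32, N_MOTORS)
    train_self_model(frames, commands)
    print(mimic(torch.rand(3, IMG, IMG)))   # 8 normalized cable positions
```

Only the self-model training loop is shown in this toy version; training net_mimic would additionally require data linking human expressions to EVA’s own self-images, which is the matching step the article describes.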

These types of robots could be used in places like hospitals, schools, and homes. 

“There is a limit to how much we humans can engage emotionally with cloud-based chatbots or disembodied smart-home speakers,” said Lipson. “Our brains seem to respond well to robots that have some kind of recognizable physical presence.”

“Robots are intertwined in our lives in a growing number of ways, so building trust between humans and machines is increasingly important,” Chen added.

 

Alex McFarland is an AI journalist and writer exploring the latest developments in artificial intelligence. He has collaborated with numerous AI startups and publications worldwide.