Case Western Reserve University researchers are moving toward robots with superior emotional intelligence. They’re advancing the artificial intelligence (AI) that’s already being used in today’s interactive video games, with lofty goals of creating next-gen personalized robots that can read human emotions in real time.
The researchers report that their robots can identify human feelings and emotions faster than previous machines while maintaining comparable accuracy, achieving a 98 percent success rate.
“Even a three-second pause can be awkward,” says Kiju Lee, the Nord Distinguished Assistant Professor in mechanical and aerospace engineering at the Case School of Engineering. “It’s hard enough for humans—and even harder for robots—to figure out what someone feels based solely on their facial expressions or body language. All of the layers and layers of technology—including video capture—to do this also unfortunately slows down the response.” Lee worked with Xiao Liu, a graduate student, on the AI project.
What’s the secret behind the robot’s improved speed? According to Case Western Reserve University, the researchers combined “two pre-processing video filters to another pair of existing programs to help the robot classify emotions based on more than 3,500 variations in human facial expression.”
However, this pales in comparison to a human’s ability to register more than 10,000 facial expressions. So the team used computers with deep-learning algorithms for information processing and was able to classify the many expressions into just seven emotion categories—fear, surprise, happiness, disgust, sadness, anger, and neutral.
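The final step of such a pipeline—mapping a network’s raw scores over the seven emotion categories to a single label—can be sketched as follows. This is an illustrative example, not the researchers’ actual system; the scores, function names, and category ordering are assumptions for demonstration.

```python
import math

# The seven emotion categories described in the article (order is assumed).
EMOTIONS = ["fear", "surprise", "happiness", "disgust", "sadness", "anger", "neutral"]

def softmax(logits):
    """Convert raw network scores into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify_expression(logits):
    """Map seven raw scores from a trained model to the most likely emotion."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=lambda i: probs[i])
    return EMOTIONS[best], probs[best]

# Hypothetical scores for one face image; the third category scores highest.
label, confidence = classify_expression([0.1, 0.3, 2.9, 0.2, 0.1, 0.4, 1.1])
print(label)  # happiness
```

In a real system the logits would come from a convolutional network trained on labeled face images; the key point is that thousands of expression variations collapse into just seven output classes.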
The researchers believe emotionally intelligent robots could help in a variety of scenarios, such as detecting early signs of depression, providing social and emotional support for older adults, and reading the subtle facial expressions of people on the autism spectrum.
“These social robots will take some time to catch on in the U.S.,” says Lee. “But in places like Japan, where there is a strong culture around robots, this is already beginning to happen. In any case, our future will be side-by-side with emotionally intelligent robots.”
Recently, Lee and Liu presented their work at the 2018 IEEE Games, Entertainment, and Media Conference.