If robots are our future assistants, they need to get much smarter. And the challenge isn’t simply enhancing current AI and machine-learning capabilities. Robots need to be socially smarter, too!
Most of us have heard news stories about the need for self-driving cars to get better at reading unusual traffic situations.
But researcher Jason R. Wilson has been working on a different type of learning: social-emotional intelligence.
Wilson presented his findings at the Artificial Intelligence for Human-Robot Interaction Symposium earlier this year.
Wilson is teaching robots to read human nonverbal social cues.
Knowing how to read human body language will allow robots to “make decisions” about when to intervene, when to offer help, and when to leave humans in peace.
Verbal cues from humans, such as the phrase “I’m not sure,” are one signal that tells a robot to intervene. Nonverbal cues, such as a human’s gaze toward the robot, are also part of the training.
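As a rough illustration of how such cues might be combined, here is a minimal sketch. The function name, cue phrases, and gaze threshold are all hypothetical assumptions for this example, not Wilson’s actual model:

```python
# Hypothetical sketch of cue-based intervention logic. The phrases,
# threshold, and function signature are illustrative assumptions only.

def should_intervene(utterance: str, gaze_on_robot_seconds: float) -> bool:
    """Decide whether a robot should offer help, based on one verbal cue
    (hedging phrases like "I'm not sure") and one nonverbal cue
    (sustained gaze toward the robot)."""
    # Verbal cue: the person voices uncertainty or asks for help.
    verbal_uncertainty = any(
        phrase in utterance.lower()
        for phrase in ("i'm not sure", "i don't know", "help")
    )
    # Nonverbal cue: assume sustained gaze (over 2 seconds) toward the
    # robot signals a request for assistance.
    nonverbal_request = gaze_on_robot_seconds > 2.0
    return verbal_uncertainty or nonverbal_request

print(should_intervene("Hmm, I'm not sure about this step", 0.5))  # True
print(should_intervene("This is easy", 0.2))                       # False
```

A real system would, of course, rely on learned models of speech and gaze rather than fixed rules; the sketch only shows the shape of the decision.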
Now that researchers are working on making robots socially aware, maybe they can make other technology less annoying, too!
Photo by Andy Kelly on Unsplash