If robots are our future assistants, they need to get much smarter. And the challenge isn’t simply enhancing current AI and machine-learning capabilities. Robots need to be socially smarter, too!
Most of us have heard news stories about the need for self-driving cars to get better at reading unusual traffic situations.
But researcher Jason R. Wilson has been working on a different type of learning: social-emotional intelligence.
Wilson presented his findings at the Artificial Intelligence for Human-Robot Interaction Symposium earlier this year.
Wilson is teaching robots to read human nonverbal social cues.
Knowing how to read human body language will allow robots to “make decisions” about when to intervene, when to offer help, and when to leave humans in peace.
Verbal cues from humans, such as the phrase “I’m not sure,” are one signal a robot can use to know when to intervene. Nonverbal cues, such as a human’s gaze toward the robot, will also be taught.
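To make the idea concrete, here is a minimal, hypothetical sketch (not Wilson’s actual system) of how a robot might combine one verbal cue and one nonverbal cue to decide whether to offer help. The function name, phrase list, and gaze threshold are all assumptions for illustration.

```python
# Hypothetical rule-based check: combine a verbal cue and a gaze cue
# to decide whether a robot should offer help.

UNCERTAIN_PHRASES = {"i'm not sure", "i don't know", "can you help"}

def should_intervene(utterance: str, gaze_on_robot_secs: float) -> bool:
    """Return True if the human's cues suggest the robot should offer help."""
    # Verbal cue: the person expresses uncertainty out loud.
    said_uncertain = any(p in utterance.lower() for p in UNCERTAIN_PHRASES)
    # Nonverbal cue: a sustained gaze toward the robot is treated as
    # an implicit request for help (1.5 s threshold is an assumption).
    looked_at_robot = gaze_on_robot_secs >= 1.5
    return said_uncertain or looked_at_robot

print(should_intervene("Hmm, I'm not sure about this step", 0.0))  # True
print(should_intervene("Got it, thanks", 0.2))                     # False
```

A real system would replace these hand-written rules with learned models over speech and gaze tracking, but the decision it supports is the same: intervene, offer help, or leave the person in peace.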
Now that researchers are working on making robots socially aware, maybe they can make other technology less annoying, too!
Have some fun tech news to share? Get in touch with us on LinkedIn or Tweet us. We’re always happy to talk!
–parin
Managing Partner
Photo by Andy Kelly on Unsplash
With over two decades of experience, Parin leads StratMg, an expert demand-generation agency that helps industrial manufacturing clients achieve unambiguous, quantified organic sales growth across the US, EMEA & APAC.
Parin has built and positioned StratMg as a value-added marketing services provider, creating a culture of quantified, sales-driven marketing initiatives that lead to sustained business growth through channel management, diversification, new customer acquisition and retention strategies, and tactical execution.