Robots are becoming more common in everyday life, but their communication skills still lag behind. One key capability that could improve human-robot interaction is the ability to read and respond to human emotional cues.
That way, robots could intervene when they are genuinely needed and stay out of the way the rest of the time. Researchers at Franklin &amp; Marshall College are now working on enabling socially assistive robots to process and respond to the social signals humans give off, as reported by TechXplore.
“I’m interested in designing robots that help people with daily tasks, such as cooking dinner, learning math, or assembling furniture from Ikea,” Jason R. Wilson, one of the researchers who conducted the study, told TechXplore. “I’m not looking to replace the people who help with these tasks. Instead, I want robots to be able to complement human help, especially when we don’t have enough people to help.”
Wilson’s work was published in a paper on arXiv and presented at the AI-HRI (Artificial Intelligence for Human-Robot Interaction) Symposium 2021 last week. It demonstrates a new method that lets robots determine for themselves when it is appropriate to step in and help their human counterparts. This, Wilson argues, allows the people being helped to maintain their dignity.
The new method relies on people conveying that they need help, both verbally and nonverbally. A verbal cue can be as simple as someone saying “I’m not sure,” while a nonverbal cue can be something as subtle as a person’s gaze. Wilson and his team devised a way for robots to automatically process such gaze-related signals in useful ways.
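To make the idea concrete, here is a minimal sketch of how a robot might combine a verbal cue with a gaze-based nonverbal cue to decide whether to offer help. The function name, the phrase list, and the gaze threshold are illustrative assumptions for this article, not details taken from Wilson's paper.

```python
# Hypothetical decision rule combining verbal and nonverbal cues.
# The phrases, the 2-second gaze threshold, and the function itself
# are assumptions made for illustration only.

UNCERTAIN_PHRASES = {"i'm not sure", "i don't know", "hmm"}

def should_offer_help(utterance: str, gaze_on_robot_s: float) -> bool:
    """Return True if the person appears to want assistance.

    utterance: the most recent transcribed speech (may be empty).
    gaze_on_robot_s: seconds the person has spent looking at the robot.
    """
    # Verbal cue: the person voices uncertainty out loud.
    verbal_cue = any(p in utterance.lower() for p in UNCERTAIN_PHRASES)
    # Nonverbal cue (assumed heuristic): a sustained gaze toward the
    # robot is read as a silent request for help.
    nonverbal_cue = gaze_on_robot_s >= 2.0
    return verbal_cue or nonverbal_cue
```

With this rule, `should_offer_help("I'm not sure this leg fits here", 0.0)` would trigger an offer of help, while `should_offer_help("looks good", 0.5)` would leave the person alone, matching the goal of intervening only when needed.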
Initial testing of the new technique has proved promising, meaning robots that can detect both verbal and nonverbal cues may soon be coming to a bot near you.