Abstract:
Humanoid robots are becoming technically more capable of executing complex movements, showing human-like gestures, sometimes even facial expressions, and acting in general. While this lays the basis for making robot theater/enactments more and more interesting for the audience, another key component is flexibility in the flow of an event, moving beyond simple pre-scripting. Here a method is introduced that relies on audio processing, clustering, and machine learning techniques to evaluate the audience's applause, allowing the robot to infer a self-evaluation of its actions. In a second step, this information is combined with the humanoid robot's body language to alter the flow of the event and display a reaction for the audience.
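The abstract does not detail the pipeline, but the idea of rating applause via audio features and clustering can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the paper's method: it frames a mono signal, computes per-frame RMS energy as the sole feature, and uses a tiny 1-D k-means to separate "weak" from "strong" applause. The function names (`frame_features`, `rate_applause`), the frame length, and the synthetic demo signals are all hypothetical.

```python
import math
import random

def rms(frame):
    """Root-mean-square energy of one audio frame."""
    return math.sqrt(sum(x * x for x in frame) / len(frame))

def frame_features(signal, frame_len=512):
    """Split a mono signal into frames and return per-frame RMS energies."""
    return [rms(signal[i:i + frame_len])
            for i in range(0, len(signal) - frame_len + 1, frame_len)]

def kmeans_1d(values, k=2, iters=50):
    """Tiny 1-D k-means; returns the cluster centroids in ascending order."""
    lo, hi = min(values), max(values)
    centroids = [lo + (hi - lo) * j / (k - 1) for j in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda j: abs(v - centroids[j]))
            clusters[idx].append(v)
        centroids = [sum(c) / len(c) if c else centroids[j]
                     for j, c in enumerate(clusters)]
    return sorted(centroids)

def rate_applause(signal, centroids):
    """Map mean frame energy to 'weak' or 'strong' via the nearest centroid."""
    feats = frame_features(signal)
    mean_energy = sum(feats) / len(feats)
    weak_c, strong_c = centroids
    return "strong" if abs(mean_energy - strong_c) < abs(mean_energy - weak_c) else "weak"

# Hypothetical demo: two synthetic 'recordings' of applause at different loudness.
random.seed(0)
quiet = [random.uniform(-0.1, 0.1) for _ in range(4096)]
loud = [random.uniform(-0.8, 0.8) for _ in range(4096)]
centroids = kmeans_1d(frame_features(quiet + loud))
print(rate_applause(quiet, centroids), rate_applause(loud, centroids))  # → weak strong
```

A real system would likely use richer spectral features and trained models rather than raw RMS with two clusters, but the two-step structure (extract audio features, then cluster/classify them into an applause rating the robot can react to) mirrors the approach the abstract outlines.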
Date of Conference: 15-17 November 2016
Date Added to IEEE Xplore: 02 January 2017
ISBN Information:
Electronic ISSN: 2164-0580