Abstract:
Humanoid robots are expected to operate in human environments, where physical interactions are unavoidable. Whole-body control methods that handle multi-contact interactions are therefore required. Emerging touch-sensing technologies are fundamental for acquiring rich, online information about these physical interactions with the environment, and they enable novel control systems that exploit tactile sensor information efficiently, producing reactive and compliant robots capable of interacting with their surroundings. In this paper, we present a novel control framework that integrates the multi-modal tactile information of a robot skin with different control strategies, producing dynamic behaviours suitable for Human-Robot Interaction (HRI). The framework was experimentally evaluated on a full-size humanoid robot covered with more than 1260 skin cells distributed over its whole body. The results show that multi-modal tactile information can be fused hierarchically with multiple control strategies, producing active compliance in a stiff, position-controlled humanoid robot.
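The abstract does not describe the controller in detail, but active compliance on a position-controlled robot is commonly realized with an admittance scheme: a measured contact force is mapped through virtual mass-spring-damper dynamics into a position offset added to the reference command. The sketch below is a minimal, hypothetical illustration of that idea only; the class, gains, and variable names are assumptions for exposition, not the authors' implementation.

# Minimal admittance-control sketch (hypothetical, not the paper's framework):
# a contact force measured by the skin is turned into a small position
# offset, so a stiff position-controlled joint behaves compliantly.
import numpy as np

class AdmittanceFilter:
    """Maps a measured contact force to a position offset via the
    virtual dynamics m*x_dd + d*x_d + k*x = f_ext (explicit Euler)."""

    def __init__(self, mass=1.0, damping=20.0, stiffness=100.0, dt=0.001):
        self.m, self.d, self.k, self.dt = mass, damping, stiffness, dt
        self.x = 0.0   # compliant position offset
        self.xd = 0.0  # offset velocity

    def step(self, f_ext):
        # Solve the virtual dynamics for the offset acceleration,
        # then integrate velocity and position one time step.
        xdd = (f_ext - self.d * self.xd - self.k * self.x) / self.m
        self.xd += xdd * self.dt
        self.x += self.xd * self.dt
        return self.x

# Usage: shift the nominal position reference by the compliant offset.
filt = AdmittanceFilter()
q_ref = 0.5                        # nominal joint position command [rad]
f_skin = 3.0                       # normal force reported by a skin cell [N]
q_cmd = q_ref + filt.step(f_skin)  # compliant command sent to the joint

With zero measured force the offset decays back to zero and the robot tracks its nominal reference; under sustained contact the virtual stiffness and damping determine how far and how smoothly the joint yields.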
Date of Conference: 20-24 May 2019
Date Added to IEEE Xplore: 12 August 2019