Abstract:
This paper addresses the problem of using the tactile feedback generated by a robotic skin to discriminate a human hand touch from a generic contact. Humans understand collaboration intentions through different sensing modalities such as vision, hearing, and touch. Among these, physical interaction is mainly used for demonstrating or correcting a motion, and it usually starts by touching the other person's body with the hands. Until recently, achieving the same in human-robot cooperation was difficult due to the lack of large-scale tactile systems functionally similar to human skin. Our approach consists of transforming the measurements of sensors distributed over the robot body into a convenient 2D representation of the contact shape, i.e., a contact image, and then applying image classification techniques to discriminate a human touch from unexpected collisions. Experiments have been performed on a robotic skin composed of 768 pressure sensors integrated on a Baxter robot forearm. More than 1800 contact images were generated from 43 different persons for training and testing two machine learning algorithms: Bag of Visual Words and Convolutional Neural Networks. The experimental results show that both approaches are valid, achieving a classification accuracy higher than 96%.
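
To make the described pipeline concrete, the following is a minimal Python sketch of the two stages named in the abstract: forming a 2D contact image from distributed pressure readings, then classifying it with a small convolutional network. Everything here is an illustrative assumption rather than the authors' implementation: the paper does not specify the taxel layout, so a hypothetical 24x32 grid (24*32 = 768 sensors) is assumed, and the ContactCNN architecture is a generic placeholder, not the network evaluated in the paper.

import numpy as np
import torch
import torch.nn as nn

GRID_H, GRID_W = 24, 32  # hypothetical taxel layout: 24*32 = 768 sensors

def taxels_to_contact_image(readings: np.ndarray) -> np.ndarray:
    """Map a flat vector of 768 pressure readings to a normalized
    2D contact image with values in [0, 1]."""
    img = readings.reshape(GRID_H, GRID_W).astype(np.float32)
    peak = img.max()
    return img / peak if peak > 0 else img

class ContactCNN(nn.Module):
    """Small binary classifier: human touch vs. generic contact."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 24x32 -> 12x16
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 12x16 -> 6x8
        )
        self.classifier = nn.Linear(16 * 6 * 8, 2)  # two contact classes

    def forward(self, x):  # x: (N, 1, 24, 32)
        return self.classifier(self.features(x).flatten(1))

# Usage: classify one (synthetic) frame of skin readings.
readings = np.random.rand(GRID_H * GRID_W)    # stand-in for real taxel data
image = taxels_to_contact_image(readings)
batch = torch.from_numpy(image)[None, None]   # shape (1, 1, 24, 32)
logits = ContactCNN()(batch)
print("predicted class:", logits.argmax(dim=1).item())

Treating the skin as an image sensor in this way is what lets standard image classifiers (here a CNN; the paper also evaluates Bag of Visual Words) be applied directly to contact-shape discrimination.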
Date of Conference: 24-28 September 2017
Date Added to IEEE Xplore: 14 December 2017
Electronic ISSN: 2153-0866