Abstract:
We present a system for learning haptic affordance models of complex manipulation skills. The goal of a haptic affordance model is to improve task completion by characterizing the feel of a particular object-action pair. We use learning from demonstration to provide the robot with an example of a successful interaction with a given object. We then use environmental scaffolding and a wrist-mounted force/torque (F/T) sensor to collect grounded examples (successes and unsuccessful “near misses”) of the haptic data for the object-action pair. From this, we build one “success” Hidden Markov Model (HMM) and one “near-miss” HMM for each object-action pair. We evaluate this approach with five different actions on seven different objects to learn two specific affordances (open-able and scoop-able). We show that by building a library of object-action pairs for each affordance, we can successfully monitor a trajectory of haptic data to determine if the robot finds an affordance.
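The monitoring step described above can be sketched in a few lines of numpy: score an incoming force/torque trajectory under both the "success" HMM and the "near-miss" HMM for the object-action pair, and report whichever explains it better. This is only an illustrative sketch, not the authors' implementation: the real system learns HMM parameters from wrist-mounted F/T demonstrations, whereas here the two toy Gaussian HMMs, their hand-picked parameters, and the 1-D force-magnitude signal are all assumptions for demonstration.

```python
import numpy as np

def gauss_logpdf(x, means, variances):
    """Log density of observation x (D,) under S diagonal Gaussians (S, D)."""
    return -0.5 * np.sum(np.log(2 * np.pi * variances)
                         + (x - means) ** 2 / variances, axis=1)

def hmm_loglik(obs, hmm):
    """Log-likelihood of a trajectory under a Gaussian HMM
    (forward algorithm, computed in log space for stability)."""
    log_trans = np.log(hmm["trans"])
    alpha = np.log(hmm["start"]) + gauss_logpdf(obs[0], hmm["means"], hmm["vars"])
    for x in obs[1:]:
        step = alpha[:, None] + log_trans          # (S, S): prev state -> next state
        m = step.max(axis=0)
        alpha = (m + np.log(np.exp(step - m).sum(axis=0))
                 + gauss_logpdf(x, hmm["means"], hmm["vars"]))
    m = alpha.max()
    return m + np.log(np.exp(alpha - m).sum())

# Toy 2-state models over a 1-D force magnitude (hand-picked, illustrative
# numbers).  "Success": force rises once the tool engages the object.
# "Near miss": the force never rises.
success_hmm = dict(start=np.array([0.99, 0.01]),
                   trans=np.array([[0.80, 0.20], [0.05, 0.95]]),
                   means=np.array([[1.0], [5.0]]),
                   vars=np.array([[1.0], [1.0]]))
near_miss_hmm = dict(start=np.array([0.99, 0.01]),
                     trans=np.array([[0.80, 0.20], [0.05, 0.95]]),
                     means=np.array([[1.0], [1.5]]),
                     vars=np.array([[1.0], [1.0]]))

def classify(obs):
    """Label a haptic trajectory by whichever model explains it better."""
    ll_s = hmm_loglik(obs, success_hmm)
    ll_m = hmm_loglik(obs, near_miss_hmm)
    return "success" if ll_s > ll_m else "near-miss"

engaged = np.array([[1.0], [1.1], [0.9], [4.8], [5.2], [5.0], [4.9]])
slipped = np.array([[1.0], [0.9], [1.1], [1.0], [0.8], [1.2], [1.0]])
print(classify(engaged), classify(slipped))  # → success near-miss
```

In the full system, one such model pair is stored per object-action pair in the affordance library, so the same likelihood comparison can be run against every pair to decide whether the robot has found a given affordance.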
Date of Conference: 07-10 March 2016
Date Added to IEEE Xplore: 14 April 2016
Electronic ISSN: 2167-2148