Abstract:
Intelligent object manipulation is critical for a robot to operate effectively in a household environment. Many grasp planners can estimate grasps from object shape alone, but these approaches often perform poorly because they lack key information about non-visual object characteristics. Object model databases can account for this information, but existing methods for database construction are time- and resource-intensive. We present an easy-to-use system for constructing a grasp database from crowdsourced demonstrations. The method requires no equipment other than the robot itself, and non-expert users can demonstrate grasps through an intuitive web interface with virtually no training. We show that the crowdsourced grasps are sufficient for object manipulation, and that the demonstration approach outperforms purely vision-based grasp planning for a wide variety of object classes.
Date of Conference: 14-18 September 2014
Date Added to IEEE Xplore: 06 November 2014