Fusing visual and tactile sensing for 3-D object reconstruction while grasping | IEEE Conference Publication | IEEE Xplore



Abstract:

In this work, we propose to reconstruct a complete 3-D model of an unknown object by fusion of visual and tactile information while the object is grasped. Assuming the object is symmetric, a first hypothesis of its complete 3-D shape is generated from a single view. This initial model is used to plan a grasp on the object which is then executed with a robotic manipulator equipped with tactile sensors. Given the detected contacts between the fingers and the object, the full object model including the symmetry parameters can be refined. This refined model will then allow the planning of more complex manipulation tasks. The main contribution of this work is an optimal estimation approach for the fusion of visual and tactile data applying the constraint of object symmetry. The fusion is formulated as a state estimation problem and solved with an iterative extended Kalman filter. The approach is validated experimentally using both artificial and real data from two different robotic platforms.
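The abstract states that visual-tactile fusion is formulated as a state estimation problem and solved with an iterated extended Kalman filter (IEKF). As an illustration of the IEKF measurement update the paper relies on, here is a minimal sketch in Python. The toy measurement model (a range constraint on a 2-D point, standing in for a tactile contact on the object surface) and all variable names are assumptions for illustration, not the authors' actual symmetry-constrained object model.

```python
import numpy as np

def iekf_update(x0, P0, z, h, H_jac, R, n_iter=5):
    """Iterated EKF measurement update: relinearize the measurement
    function h around the current iterate x rather than only around
    the prior mean x0."""
    x = x0.copy()
    for _ in range(n_iter):
        H = H_jac(x)                       # Jacobian at current iterate
        S = H @ P0 @ H.T + R               # innovation covariance
        K = P0 @ H.T @ np.linalg.inv(S)    # Kalman gain
        # IEKF innovation: compensates for the shifted linearization point
        x = x0 + K @ (z - h(x) - H @ (x0 - x))
    P = (np.eye(len(x0)) - K @ H) @ P0     # posterior covariance
    return x, P

# Toy example (hypothetical): fuse a noisy range measurement (a stand-in
# for a tactile contact constraint) into a visual prior on a 2-D point.
h = lambda x: np.array([np.linalg.norm(x)])
H_jac = lambda x: (x / np.linalg.norm(x)).reshape(1, -1)

x0 = np.array([2.0, 1.0])      # prior mean (e.g., from the visual model)
P0 = np.eye(2) * 0.5           # prior covariance
z = np.array([2.0])            # measured range (e.g., from touch)
R = np.array([[0.01]])         # tactile measurement noise

x, P = iekf_update(x0, P0, z, h, H_jac, R)
```

Because the tactile measurement noise is much smaller than the visual prior covariance, the posterior estimate is pulled strongly toward the measured range while the covariance shrinks, which is the behavior the fusion in the paper exploits.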
Date of Conference: 06-10 May 2013
Date Added to IEEE Xplore: 17 October 2013
Print ISSN: 1050-4729
Conference Location: Karlsruhe, Germany
