Abstract:
Increasingly widely available haptic sensors mounted on articulated hands offer new sensory channels that can complement shape extraction from vision, enabling more robust handling of objects when vision is restricted or unavailable. However, estimating object shape from haptic interaction data is a difficult challenge due to the complexity of the contact interaction between the movable object and the sensor surfaces, which leads to a coupled estimation problem of shape and object pose. While efficient solutions to the underlying SLAM problem are known for vision, the available information is much sparser in the tactile case, making a straightforward adoption of standard SLAM algorithms difficult. In the present paper, we therefore explore whether a biologically inspired model based on dynamic neural fields can offer a route towards a practical algorithm for tactile SLAM. Our study focuses on a restricted scenario in which a two-fingered robot hand manipulates an n-gon with a fixed rotational axis. We demonstrate that our model can accumulate shape information from reasonably short interaction sequences and autonomously build a representation despite significant ambiguity of the tactile data arising from the rotational periodicity of the object. We conclude that the presented framework may be a suitable basis for solving the tactile SLAM problem in more general settings, which will be the focus of subsequent work.
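To give a concrete impression of the kind of mechanism the abstract refers to, the following is a minimal, hypothetical sketch of an Amari-type dynamic neural field defined over the object's rotation angle, accumulating evidence from simulated tactile contact events. It is not the paper's implementation; all parameter values, the interaction kernel, and the contact model are illustrative assumptions.

    # Minimal sketch of an Amari-type dynamic neural field on a periodic
    # (angular) dimension, as one plausible reading of the abstract's approach.
    # All parameters and the contact model are illustrative assumptions,
    # not taken from the paper.
    import numpy as np

    N = 180                                   # field sampling (2-degree bins)
    theta = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
    u = np.zeros(N)                           # field activation over object angle

    def periodic_gaussian(center, sigma):
        """Gaussian bump wrapped around the circular angle dimension."""
        d = np.angle(np.exp(1j * (theta - center)))   # signed angular distance
        return np.exp(-0.5 * (d / sigma) ** 2)

    # Mexican-hat lateral kernel: local excitation, broader inhibition
    w = 2.0 * periodic_gaussian(0.0, 0.15) - 1.0 * periodic_gaussian(0.0, 0.60)

    def step(u, contact_angle, dt=0.05, tau=1.0, h=-0.5):
        """One Euler step of tau * du/dt = -u + h + S + (w convolved with f(u))."""
        f = 1.0 / (1.0 + np.exp(-5.0 * u))                # sigmoidal output
        S = 1.5 * periodic_gaussian(contact_angle, 0.10)  # tactile contact input
        # circular convolution of kernel and field output via FFT
        lateral = np.real(np.fft.ifft(np.fft.fft(w) * np.fft.fft(f))) * (2 * np.pi / N)
        return u + dt * (-u + h + S + lateral) / tau

    # Accumulate evidence from a short sequence of noisy contact angles.
    for contact in np.random.normal(np.pi / 3, 0.05, size=200):
        u = step(u, contact)

    print("peak estimate (rad):", theta[np.argmax(u)])

Repeated contacts near the same object angle drive a self-stabilizing activation peak, which is the sense in which such a field can "accumulate shape information" over an interaction sequence despite noisy individual measurements.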
Date of Conference: 14-18 September 2014
Date Added to IEEE Xplore: 06 November 2014