
Implicit Sensorimotor Mapping of the Peripersonal Space by Gazing and Reaching


Abstract:

Primates often perform coordinated eye and arm movements, contextually fixating and reaching towards nearby objects. This combination of looking and reaching to the same target is used by infants to establish an implicit visuomotor representation of the peripersonal space, useful for both oculomotor and arm motor control. In this work, taking inspiration from such behavior and from primate visuomotor mechanisms, a shared sensorimotor map of the environment, built on a radial basis function framework, is configured and trained through the coordinated control of eye and arm movements. Computational results confirm that the approach is especially suitable for the problem at hand and for implementation on a real humanoid robot. Through exploratory gazing and reaching actions, either free or goal-based, the artificial agent learns to perform direct and inverse transformations between stereo vision, oculomotor, and joint-space representations. The integrated sensorimotor map, which contextually represents the peripersonal space through different vision and motor parameters, is never made explicit; rather, it emerges from the interaction of the agent with the environment.
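
The abstract describes a map built on a radial basis function framework and trained from paired gazing and reaching samples. The sketch below is a generic Gaussian RBF regressor, not the authors' implementation: the input/output dimensions, parameter values, and the toy kinematics are all assumptions, meant only to illustrate how such a map could translate hypothetical oculomotor parameters into arm joint angles after being fitted on exploratory data.

```python
# Minimal sketch (illustrative, not the paper's implementation): a Gaussian
# RBF map fitted on paired gaze/arm samples, as one way a shared sensorimotor
# representation could translate oculomotor coordinates into joint angles.
import numpy as np


class RBFMap:
    """Maps an input vector (e.g., eye pan/tilt/vergence) to an output
    vector (e.g., arm joint angles) through a fixed set of Gaussian units."""

    def __init__(self, centers, sigma):
        self.centers = centers      # (K, d_in) RBF centers, assumed fixed
        self.sigma = sigma          # Gaussian width (assumed value)
        self.weights = None         # (K, d_out) linear readout

    def _activations(self, X):
        # Gaussian activation of each unit for each input sample
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def fit(self, X, Y, reg=1e-3):
        # Regularized least squares on the linear readout weights
        A = self._activations(X)
        self.weights = np.linalg.solve(A.T @ A + reg * np.eye(A.shape[1]), A.T @ Y)

    def predict(self, X):
        return self._activations(X) @ self.weights


# Toy usage: learn a gaze -> arm mapping from random exploratory samples.
rng = np.random.default_rng(0)
gaze = rng.uniform(-1.0, 1.0, size=(500, 3))            # pan, tilt, vergence (hypothetical)
arm = np.column_stack([np.sin(gaze[:, 0] + gaze[:, 2]),  # stand-in, made-up kinematics
                       np.cos(gaze[:, 1]) * gaze[:, 2]])

centers = rng.uniform(-1.0, 1.0, size=(64, 3))           # fixed random centers
rbf = RBFMap(centers, sigma=0.4)
rbf.fit(gaze, arm)
print("mean absolute prediction error:", np.abs(rbf.predict(gaze) - arm).mean())
```

In the same spirit, a second RBF map fitted with inputs and outputs swapped would give the inverse transformation, so the agent can move between visual, oculomotor, and joint-space descriptions of the same target.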
Published in: IEEE Transactions on Autonomous Mental Development (Volume: 3, Issue: 1, March 2011)
Page(s): 43 - 53
Date of Publication: 28 January 2011

