DOI: 10.1145/2559636.2559816
HRI Conference Proceedings · poster

Learning hand-eye coordination for a humanoid robot using SOMs

Published: 03 March 2014

ABSTRACT

Hand-eye coordination is an important motor skill acquired in infancy that precedes pointing behavior. Pointing facilitates social interaction by directing the attention of engaged participants, and is thus essential for the natural flow of human-robot interaction. Here, we attempt to explain how pointing emerges from sensorimotor learning of hand-eye coordination in a humanoid robot. During a body-babbling phase with a random-walk strategy, the robot learned mappings of its joints for different arm postures. These arm joint configurations were used to train biologically inspired models consisting of SOMs. We show that such a model, implemented on a robotic platform, accounts for pointing behavior when humans present objects out of reach of the robot's hand.
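The babbling-then-SOM pipeline described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the joint count, ranges, map size, and learning schedule are all assumed values, and the SOM is a small from-scratch Kohonen map (the paper itself uses the MiniSom library, reference [6]).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical body-babbling phase: a random walk over 4 arm joint
# angles (radians), standing in for the robot's recorded postures.
steps = rng.normal(scale=0.05, size=(500, 4))
postures = np.clip(np.cumsum(steps, axis=0), -1.5, 1.5)

# Minimal self-organizing map (Kohonen, 1982): a 6x6 grid of weight
# vectors, each the same dimension as a posture sample.
grid = 6
weights = rng.uniform(-1.5, 1.5, size=(grid, grid, 4))
coords = np.stack(np.meshgrid(np.arange(grid), np.arange(grid),
                              indexing="ij"), axis=-1)

def train(weights, data, epochs=20, lr0=0.5, sigma0=3.0):
    """Pull the best-matching unit and its grid neighbors toward each
    sample, with learning rate and neighborhood width decaying over time."""
    t_max = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in data:
            lr = lr0 * (1 - t / t_max)
            sigma = sigma0 * (1 - t / t_max) + 0.5
            # best-matching unit: grid node whose weights are closest to x
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighborhood on the grid, centered on the BMU
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                       / (2 * sigma ** 2))
            weights += lr * g[..., None] * (x - weights)
            t += 1
    return weights

weights = train(weights, postures)

# After training, any posture maps to a discrete map unit; in the paper's
# setting, SOM activations like this would drive the pointing arm posture.
d = np.linalg.norm(weights - postures[0], axis=-1)
bmu = np.unravel_index(np.argmin(d), d.shape)
print("best-matching unit:", bmu)
```

In the actual system two such maps (one per modality) would be coupled, so that a visual input can be mapped through to an arm configuration; the sketch above shows only the single-map training step.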

References

  1. J. L. Elman, E. A. Bates, M. H. Johnson, A. Karmiloff-Smith, D. Parisi, and K. Plunkett. Rethinking Innateness: A Connectionist Perspective on Development. MIT Press, 1996.
  2. V. V. Hafner and G. Schillaci. From field of view to field of reach - could pointing emerge from the development of grasping? Frontiers in Computational Neuroscience, 2011.
  3. T. Kohonen. Self-organized formation of topologically correct feature maps. Biological Cybernetics, 43(1):59-69, 1982.
  4. M. Lungarella and G. Metta. Beyond gazing, pointing, and reaching: A survey of developmental robotics. In Epigenetic Robotics, pages 81-89, 2003.
  5. A. Morse, J. de Greeff, T. Belpaeme, and A. Cangelosi. Epigenetic robotics architecture (ERA). IEEE Transactions on Autonomous Mental Development, 2(4):325-339, 2010.
  6. G. Vettigli. MiniSom: minimalistic and NumPy based implementation of the self organizing maps. https://github.com/JustGlowing/minisom, 2013.

Published in
      HRI '14: Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction
      March 2014
      538 pages
      ISBN:9781450326582
      DOI:10.1145/2559636

      Copyright © 2014 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Acceptance Rates

      HRI '14 Paper Acceptance Rate: 32 of 132 submissions, 24%. Overall Acceptance Rate: 242 of 1,000 submissions, 24%.
