
Controlling gaze with an embodied interactive control architecture

Applied Intelligence

Abstract

Human-Robot Interaction (HRI) is a growing field of research that targets the development of robots that are easy to operate, more engaging, and more entertaining. Natural, human-like behavior is considered by many researchers to be an important goal of HRI. Research on human-human communication has revealed that gaze control is one of the major interactive behaviors humans use in close encounters. Human-like gaze control is therefore one of the key behaviors a robot should have in order to interact naturally with human partners. Developing human-like gaze control that integrates easily with the robot's other behaviors requires a flexible robotic architecture. Most available robotic architectures were designed with autonomous robots in mind. Although robots developed for HRI are usually autonomous, their autonomy is combined with interactivity, which places additional demands on the architectures that support them. This paper reports the development and evaluation of two gaze controllers built on EICA (the Embodied Interactive Control Architecture), a new cross-platform robotic architecture for HRI applications designed to meet those challenges, emphasizing how low-level attention focusing and action integration are implemented. Evaluation of the gaze controllers revealed human-like behavior in terms of mutual attention, gaze toward the partner, and mutual gaze. The paper also reports a novel Floating Point Genetic Algorithm (FPGA) for learning the parameters of the gaze controller's various processes.
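
The abstract does not specify how EICA performs low-level action integration, so the following Python sketch illustrates one plausible scheme only: several concurrent processes each propose a gaze target with an activation level, and the motor command blends them by activation-weighted averaging. The names GazeProposal and integrate, and the weighting scheme itself, are hypothetical assumptions, not the authors' implementation.

```python
# Illustrative sketch of low-level action integration for gaze control.
# Assumption: competing behaviors (e.g. mutual gaze, mutual attention) each
# emit a gaze target plus an activation, and the final gaze is a weighted blend.
from dataclasses import dataclass


@dataclass
class GazeProposal:
    target: tuple[float, float, float]  # 3-D gaze point in robot coordinates
    activation: float                   # attention/priority weight in [0, 1]


def integrate(proposals: list[GazeProposal]) -> tuple[float, float, float]:
    """Blend competing gaze proposals by activation-weighted averaging."""
    total = sum(p.activation for p in proposals)
    if total == 0.0:
        return (0.0, 0.0, 1.0)  # hypothetical default: look straight ahead
    return tuple(
        sum(p.activation * p.target[i] for p in proposals) / total
        for i in range(3)
    )


# Example: a mutual-gaze process wants the partner's face, while a
# mutual-attention process wants the object the partner is attending to.
face = GazeProposal(target=(0.0, 1.5, 1.0), activation=0.8)
obj = GazeProposal(target=(0.5, 0.2, 0.7), activation=0.4)
print(integrate([face, obj]))
```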
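The FPGA's operators are likewise not described in this abstract. The sketch below assumes standard real-valued GA machinery (arithmetic crossover, Gaussian mutation, tournament selection, elitism) over parameters normalized to [0, 1]; the fitness function is a placeholder, where the paper would instead score how human-like the resulting gaze behavior is.

```python
# Minimal floating-point genetic algorithm sketch; operators and fitness are
# illustrative assumptions, not the paper's published implementation.
import random


def evaluate(params):
    # Placeholder fitness: distance to an arbitrary target vector. In the
    # paper this would measure human-likeness of the generated gaze behavior.
    target = [0.3, 0.7, 0.5, 0.9]
    return -sum((p - t) ** 2 for p, t in zip(params, target))


def tournament(pop, fits, k=3):
    # Pick the fittest of k randomly chosen individuals.
    contenders = random.sample(range(len(pop)), k)
    return pop[max(contenders, key=lambda i: fits[i])]


def crossover(a, b, alpha=0.5):
    # Arithmetic (blend) crossover: child is a random affine combination,
    # clipped back into the [0, 1] parameter range.
    w = random.uniform(-alpha, 1 + alpha)
    return [min(1.0, max(0.0, w * x + (1 - w) * y)) for x, y in zip(a, b)]


def mutate(genome, sigma=0.1, rate=0.2):
    # Gaussian perturbation of each real-valued gene, clipped to [0, 1].
    return [
        min(1.0, max(0.0, g + random.gauss(0, sigma)))
        if random.random() < rate else g
        for g in genome
    ]


def fpga(n_params=4, pop_size=30, generations=100):
    pop = [[random.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [evaluate(ind) for ind in pop]
        elite = pop[max(range(pop_size), key=lambda i: fits[i])]
        pop = [elite] + [
            mutate(crossover(tournament(pop, fits), tournament(pop, fits)))
            for _ in range(pop_size - 1)
        ]
    return max(pop, key=evaluate)


if __name__ == "__main__":
    print(fpga())
```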

Author information

Correspondence to Yasser Mohammad.

About this article

Cite this article

Mohammad, Y., Nishida, T. Controlling gaze with an embodied interactive control architecture. Appl Intell 32, 148–163 (2010). https://doi.org/10.1007/s10489-009-0180-0
