DOI: 10.1145/2401836.2401839

Perception of gaze direction for situated interaction

Published: 26 October 2012

Abstract

Accurate human perception of a robot's gaze direction is crucial for designing natural and fluent situated multimodal face-to-face interaction between humans and machines. In this paper, we present an experiment with 18 test subjects aimed at quantifying how different gaze cues, synthesized on the Furhat back-projected robot head, affect the accuracy with which humans perceive the spatial direction of gaze. The study first quantifies the accuracy of perceived gaze direction in a human-human setup, and compares it to synthesized gaze movements under different conditions: viewing the robot's eyes frontally or from a 45-degree side view. We also study the effect of 3D gaze, achieved by controlling both eyes to indicate the depth of the focal point (vergence); the use of gaze versus head pose; and the use of static versus dynamic eyelids. The findings are highly relevant to the design and control of robots and animated agents in situated face-to-face interaction.




Published In

Gaze-In '12: Proceedings of the 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction
October 2012
88 pages
ISBN: 9781450315166
DOI: 10.1145/2401836
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery, New York, NY, United States



      Author Tags

      1. ECA
      2. eyelids
      3. furhat
      4. gaze perception
      5. head pose
      6. robot head
      7. situated interaction
      8. talking head

      Qualifiers

      • Research-article

Conference

ICMI '12: International Conference on Multimodal Interaction
October 26, 2012
Santa Monica, California

      Acceptance Rates

Overall acceptance rate: 19 of 21 submissions (90%)


Bibliometrics & Citations

Article Metrics

• Downloads (last 12 months): 15
• Downloads (last 6 weeks): 1

Reflects downloads up to 16 Feb 2025.

Cited By
• (2024) Children's Word Learning from Socially Contingent Robots Under Active vs. Passive Learning Conditions. In Proceedings of the 2024 ACM/IEEE International Conference on Human-Robot Interaction, 669-677. DOI: 10.1145/3610977.3634931. Online publication date: 11-Mar-2024.
• (2021) From Social Gaze to Indirect Speech Constructions: How to Induce the Impression That Your Companion Robot Is a Conscious Creature. Applied Sciences, 11(21), 10255. DOI: 10.3390/app112110255. Online publication date: 1-Nov-2021.
• (2021) I Can See It in Your Eyes: Gaze as an Implicit Cue of Uncanniness and Task Performance in Repeated Interactions With Robots. Frontiers in Robotics and AI, 8. DOI: 10.3389/frobt.2021.645956. Online publication date: 7-Apr-2021.
• (2021) Cognitive Impact of Anthropomorphized Robot Gaze. ACM Transactions on Human-Robot Interaction, 10(4), 1-14. DOI: 10.1145/3459994. Online publication date: 14-Jul-2021.
• (2019) Berrick: a low-cost robotic head platform for human-robot interaction. 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), 559-566. DOI: 10.1109/SMC.2019.8913932. Online publication date: Oct-2019.
• (2018) Accuracy of Perceiving Precisely Gazing Virtual Agents. In Proceedings of the 18th International Conference on Intelligent Virtual Agents, 263-268. DOI: 10.1145/3267851.3267852. Online publication date: 5-Nov-2018.
• (2017) Social eye gaze in human-robot interaction. Journal of Human-Robot Interaction, 6(1), 25-63. DOI: 10.5898/JHRI.6.1.Admoni. Online publication date: 26-May-2017.
• (2015) Eye gaze tracking for a humanoid robot. 2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids), 318-324. DOI: 10.1109/HUMANOIDS.2015.7363561. Online publication date: Nov-2015.
• (2014) Tutoring Robots. Innovative and Creative Developments in Multimodal Interaction Systems, 80-113. DOI: 10.1007/978-3-642-55143-7_4. Online publication date: 2014.
• (2013) Towards rich multimodal behavior in spoken dialogues with embodied agents. 2013 IEEE 4th International Conference on Cognitive Infocommunications (CogInfoCom), 817-822. DOI: 10.1109/CogInfoCom.2013.6719212. Online publication date: Dec-2013.
