DOI: 10.1145/3365610.3365643

Evaluation of attention inducing effects using ubiquitous humanlike face robots

Published: 26 November 2019

ABSTRACT

A human gaze not only indicates that someone is looking at something; it also affects the people nearby. For example, previous studies have shown that observers' attention is drawn in the direction of another person's gaze, and that awareness of being watched influences human behavior. As android robots come to play an active role in society, we can expect such gaze-based communication and information transmission to become possible between humans and robots. However, to the best of our knowledge, no study has yet investigated the effect of robot gaze on people in situations where many robots exist ubiquitously in daily life. In this study, we therefore evaluate the attention-inducing effect of the gaze of face robots installed in a daily living environment. We present several examples of services for situations in which face robots are deployed ubiquitously in daily life, and we verified the attention-inducing effect in three test cases. In the experiment, participants turned their attention in the direction of a face robot's gaze only a small number of times. Nevertheless, the results demonstrated the feasibility of the service and identified factors that influence attention induction, such as the surrounding situation, the user's awareness of the face robots, and the timing of attention induction.

        Published in
          MUM '19: Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia
          November 2019
          462 pages
          ISBN:9781450376242
          DOI:10.1145/3365610

          Copyright © 2019 ACM


          Publisher

          Association for Computing Machinery

          New York, NY, United States


          Qualifiers

          • research-article

          Acceptance Rates

Overall Acceptance Rate: 190 of 465 submissions, 41%