Work in Progress

A Field Study on Pedestrians’ Thoughts toward a Car with Gazing Eyes

Published: 19 April 2023

ABSTRACT

Eye-gaze techniques are a promising direction for designing an external human-machine interface (eHMI) for a self-driving car. Several prior "eye" studies exist; however, because running a study in a real environment is difficult, prior designs were typically evaluated in controlled VR environments. It remains unclear how physical eyes mounted on a car affect pedestrians' thoughts in a real-world outdoor setting. To answer this question, we built a set of physical eyes sized for a real car, mounted them on the car, drove it in a public open space, activated the eyes, and performed eye-gaze interaction with pedestrians without giving them any prior explanation. We administered a questionnaire to collect pedestrians' thoughts and conducted an inductive thematic analysis. By comparing our findings with previous results through a literature review, we highlight the significance of physically implementing the "eye concept" for future research.


Supplemental Material

- 3544549.3585629-video-preview.mp4 (mp4, 47.5 MB)
- 3544549.3585629-talk-video.mp4 (mp4, 169.3 MB)


Published in

CHI EA '23: Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems
April 2023, 3914 pages
ISBN: 9781450394222
DOI: 10.1145/3544549

      Copyright © 2023 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


      Qualifiers

      • Work in Progress
      • Research
      • Refereed limited

      Acceptance Rates

Overall acceptance rate: 6,164 of 23,696 submissions, 26%

