ABSTRACT
Applying eye-gaze techniques to the design of an external human-machine interface (eHMI) for a self-driving car is a promising direction. Several prior "eye" studies exist; however, because running a study in a real environment is difficult, prior research was often evaluated in controlled VR environments. It therefore remains unclear how physical eyes mounted on a car affect pedestrians’ thoughts in a real-world outdoor environment. To answer this question, we built a set of physical eyes sized for a real car, mounted them on the vehicle, drove it in a public open space, activated the eyes, and performed eye-gaze interaction with pedestrians without giving them any prior explanation. We administered a questionnaire to collect pedestrians’ thoughts and conducted an inductive thematic analysis. By comparing our findings with previous results through a literature review, we highlight the significance of a physical implementation of the "eye" concept for future research.
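The analysis step described above — free-text questionnaire responses coded and then grouped into higher-level themes, as in inductive thematic analysis — can be sketched as follows. The responses, codes, and theme groupings below are hypothetical illustrations, not the study's actual data.

```python
from collections import defaultdict

# Hypothetical coded questionnaire responses: (response, assigned code).
# In an inductive analysis, codes emerge from the data rather than from
# a predefined codebook.
coded_responses = [
    ("The eyes made the car feel alive", "anthropomorphism"),
    ("I felt the car noticed me", "perceived awareness"),
    ("The eyes looked at me before I crossed", "perceived awareness"),
    ("It was a bit creepy at first", "discomfort"),
]

# Hypothetical grouping of low-level codes into higher-level themes.
code_to_theme = {
    "anthropomorphism": "car as social agent",
    "perceived awareness": "car as social agent",
    "discomfort": "emotional response",
}

# Tally how many responses support each theme.
theme_counts = defaultdict(int)
for _, code in coded_responses:
    theme_counts[code_to_theme[code]] += 1

for theme, n in sorted(theme_counts.items()):
    print(f"{theme}: {n}")
```

In practice the coding pass is done manually by multiple researchers; a tally like this only summarizes how strongly the data supports each emergent theme.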
A Field Study on Pedestrians’ Thoughts toward a Car with Gazing Eyes