DOI: 10.1145/3125739.3132578

Research article | Open access

Exploring Gaze-Activated Object With the CoffeePet

Published: 27 October 2017

Abstract

The feeling of being looked back at when we look at someone, and of that person being aware that we are looking at them, is fundamental to social interaction. This situation can only occur if both parties are aware of each other's presence. Building on this idea, this research explores the possibility of designing a gaze-sensitive object: how people can relate to an object using only their eyes. In this paper, we present the CoffeePet, a gaze-activated coffee machine fitted with two small OLED screens that display animated eyes. These eyes respond to the user's gaze behavior. We use a sensor module (Omron HVC) to detect and track the user's eyes in real time, enabling the user to interact with the CoffeePet simply by moving their own eyes. The CoffeePet can also automatically brew and pour coffee from its spout when this seems appropriate during the interaction. We further describe the system, the modifications made to the real product, and an experimental plan to compare users' perceptions of the CoffeePet's eyes and to investigate whether users realize that their gaze behavior influences the CoffeePet's reactions.
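
The abstract outlines the interaction loop (detect the user's eyes, have the animated eyes respond, and brew when the moment seems appropriate), but the paper's implementation is not reproduced on this page. As a rough illustration only, the Python sketch below shows one way such a gaze-activated loop could be structured. Every name in it (read_gaze, set_eyes, brew_coffee, the dwell threshold) is a hypothetical stand-in, and both the Omron HVC's serial command protocol and the OLED eye driver are deliberately stubbed out.

```python
import time
import random

# Hypothetical stand-ins: the paper does not publish its code, so the
# Omron HVC's serial protocol and the OLED eye driver are stubbed here.

DWELL_SECS = 3.0  # assumed mutual-gaze duration before brewing triggers


def read_gaze():
    """Stand-in for polling the Omron HVC face/gaze detector.

    Returns (face_found, looking_at_machine). A real implementation
    would issue the HVC's detection command over serial and parse the
    reply; here we fake the result so the loop is runnable.
    """
    face = random.random() < 0.8
    looking = face and random.random() < 0.5
    return face, looking


def set_eyes(state):
    """Stand-in for driving the two OLED 'eyes' (idle wander, return gaze)."""
    print(f"eyes -> {state}")


def brew_coffee():
    """Stand-in for actuating the machine's brew-and-pour mechanism."""
    print("brewing coffee")


def run():
    gaze_since = None  # start of the current mutual-gaze episode, if any
    while True:
        face, looking = read_gaze()
        if face and looking:
            set_eyes("return_gaze")  # look back at the user
            if gaze_since is None:
                gaze_since = time.monotonic()
            elif time.monotonic() - gaze_since >= DWELL_SECS:
                brew_coffee()        # sustained mutual gaze: brew
                gaze_since = None
        else:
            set_eyes("idle")         # blink/wander when not being watched
            gaze_since = None
        time.sleep(0.1)              # ~10 Hz polling


if __name__ == "__main__":
    run()
```

The dwell threshold is one plausible way to separate sustained mutual gaze from an incidental glance; the actual trigger condition the CoffeePet uses is not specified in the abstract.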

    Published In

    HAI '17: Proceedings of the 5th International Conference on Human Agent Interaction
    October 2017
    550 pages
ISBN: 9781450351133
DOI: 10.1145/3125739

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. anthropomorphism
    2. eye tracking
    3. gaze sensitive object
    4. gaze-based interaction
5. human-object interaction
    6. perceptual crossing

    Conference

    HAI '17

    Acceptance Rates

Overall Acceptance Rate: 121 of 404 submissions, 30%
