Poster
DOI: 10.1145/3384657.3385333

Understanding Face Gestures with a User-Centered Approach Using Personal Computer Applications as an Example

Published: 06 June 2020

Abstract

While face gesture input has been proposed by researchers, the question of which gestures are practical in everyday use remains open. We present the first comprehensive investigation of user-defined face gestures as an augmented input modality. Based on a focus group discussion, we developed three sets of tasks and asked participants to spontaneously produce face gestures to complete them. We report the findings of a user study and discuss participants' preferences among the elicited face gestures. The results inform the development of future interaction systems that utilize face gestures.
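The poster does not publish its analysis pipeline, but elicitation studies of this kind conventionally quantify consensus on each task (referent) with the agreement rate of Vatavu and Wobbrock (CHI '15). The sketch below shows that computation in Python; the face-gesture labels in the example are hypothetical and are not data from this study.

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for one referent, after Vatavu & Wobbrock (CHI '15).

    `proposals` lists the gesture labels elicited from participants for a
    single referent; identical labels mean those participants agreed.
    AR(r) = sum(|Pi| * (|Pi| - 1)) / (|P| * (|P| - 1)), which is 0 when every
    proposal is unique and 1 when all participants propose the same gesture.
    """
    n = len(proposals)
    if n < 2:
        return 1.0  # a lone proposal trivially agrees with itself
    groups = Counter(proposals)  # partition P into groups of identical proposals
    return sum(k * (k - 1) for k in groups.values()) / (n * (n - 1))

# Hypothetical example: six participants propose face gestures for "zoom in".
print(agreement_rate(["raise eyebrows"] * 4 + ["open mouth", "wink"]))  # 0.4
```

A gesture set is then typically assembled by assigning each referent its most frequently proposed gesture, with AR(r) flagging referents where consensus is too low to standardize on a single gesture.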


Cited By

  • (2022) HandyGaze: A Gaze Tracking Technique for Room-Scale Environments using a Single Smartphone. Proceedings of the ACM on Human-Computer Interaction 6, ISS, 143–160. DOI: 10.1145/3567715
  • (2021) Understanding the Design Space of Mouth Microgestures. Proceedings of the 2021 ACM Designing Interactive Systems Conference (DIS '21), 1068–1081. DOI: 10.1145/3461778.3462004
  • (2020) Face Commands - User-Defined Facial Gestures for Smart Glasses. 2020 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 374–386. DOI: 10.1109/ISMAR50242.2020.00064

        Published In

        AHs '20: Proceedings of the Augmented Humans International Conference
        March 2020, 296 pages
        ISBN: 9781450376037
        DOI: 10.1145/3384657

        Publisher

        Association for Computing Machinery, New York, NY, United States

        Publication History

        Published: 06 June 2020


        Author Tags

        1. Face gesture
        2. elicitation study
        3. user-centered
        4. user-defined gesture

        Qualifiers

        • Poster
        • Research
        • Refereed limited


        Conference

        AHs '20: Augmented Humans International Conference
        March 16–17, 2020, Kaiserslautern, Germany

