DOI: 10.1145/3098279.3098538

EarTouch: turning the ear into an input surface

Published: 04 September 2017

Abstract

In this paper, we propose EarTouch, a sensing technology that turns the ear into an input surface: the user controls applications by slightly pulling or touching the ear, and an enhanced earphone detects the resulting deformation. We envision EarTouch as an "eyes-free" interface for applications such as music players, navigation systems, and calendars. EarTouch measures ear deformation with optical sensors attached to the earphone, which sense the distance from the earphone to the skin inside the ear; touching the ear with a finger deforms the skin and changes these distances. Multiple gestures are recognized by classifying the sensor readings with a support vector machine (SVM). EarTouch was validated through a set of user studies.
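The pipeline the abstract describes can be sketched roughly as follows. This is a hypothetical illustration, not the authors' implementation: the sensor count, the gesture labels, the simulated distance readings, and the use of scikit-learn are all assumptions made for the example.

```python
# Hypothetical sketch of the recognition pipeline: an SVM classifies
# gestures from the earphone-to-skin distance readings of optical
# sensors mounted on an earphone. All numbers here are made up.
import numpy as np
from sklearn import svm

rng = np.random.default_rng(0)
N_SENSORS = 4  # assumed number of photo-reflective sensors

def sensor_frame(gesture: int) -> np.ndarray:
    """Simulate one frame of earphone-to-skin distance readings.
    Each gesture deforms the ear differently, shifting the distances."""
    baseline = np.array([10.0, 12.0, 9.0, 11.0])  # resting distances (a.u.)
    return baseline + 1.5 * gesture + rng.normal(0.0, 0.2, N_SENSORS)

# Collect labeled training frames for three hypothetical gestures.
GESTURES = [0, 1, 2]  # e.g. pull down, pull back, touch the tragus
X = np.array([sensor_frame(g) for g in GESTURES for _ in range(30)])
y = np.array([g for g in GESTURES for _ in range(30)])

# Train an SVM, as the abstract describes, and classify a new frame.
clf = svm.SVC(kernel="rbf")
clf.fit(X, y)
pred = clf.predict(sensor_frame(1).reshape(1, -1))[0]
```

In practice the features would come from the real sensors rather than a simulator, and per-user calibration would likely be needed, since ear shape and earphone fit vary between wearers.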





    Published In

    MobileHCI '17: Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services
    September 2017
    874 pages
    ISBN:9781450350754
    DOI:10.1145/3098279

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. earphone
    2. photo reflective sensor
    3. skin deformation

    Qualifiers

    • Research-article

    Funding Sources

    • JSPS KAKENHI

    Conference

    MobileHCI '17

    Acceptance Rates

    MobileHCI '17 paper acceptance rate: 45 of 224 submissions (20%)
    Overall acceptance rate: 202 of 906 submissions (22%)



    Cited By

    • (2025) OpenEarable 2.0: Open-Source Earphone Platform for Physiological Ear Sensing. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 9(1), 1-33. DOI: 10.1145/3712069. Online publication date: 3-Mar-2025.
    • (2025) SkinRing: Ring-shaped Device Enabling Wear Direction-Independent Gesture Input on Side of Finger. 2025 IEEE/SICE International Symposium on System Integration (SII), 386-392. DOI: 10.1109/SII59315.2025.10871054. Online publication date: 21-Jan-2025.
    • (2024) Exploring User-Defined Gestures as Input for Hearables and Recognizing Ear-Level Gestures with IMUs. Proceedings of the ACM on Human-Computer Interaction 8(MHCI), 1-23. DOI: 10.1145/3676503. Online publication date: 24-Sep-2024.
    • (2024) Designing More Private and Socially Acceptable Hand-to-Face Gestures for Heads-Up Computing. Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 637-639. DOI: 10.1145/3675094.3678994. Online publication date: 5-Oct-2024.
    • (2024) EarHover: Mid-Air Gesture Recognition for Hearables Using Sound Leakage Signals. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-13. DOI: 10.1145/3654777.3676367. Online publication date: 13-Oct-2024.
    • (2024) EarAuthCam: Personal Identification and Authentication Method Using Ear Images Acquired with a Camera-Equipped Hearable Device. Proceedings of the Augmented Humans International Conference 2024, 119-130. DOI: 10.1145/3652920.3653059. Online publication date: 4-Apr-2024.
    • (2024) Exploring Uni-manual Around Ear Off-Device Gestures for Earables. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(1), 1-29. DOI: 10.1145/3643513. Online publication date: 6-Mar-2024.
    • (2024) BodyTouch. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(4), 1-22. DOI: 10.1145/3631426. Online publication date: 12-Jan-2024.
    • (2023) Can You Ear Me? Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(3), 1-23. DOI: 10.1145/3610925. Online publication date: 27-Sep-2023.
    • (2023) TouchLog: Finger Micro Gesture Recognition Using Photo-Reflective Sensors. Proceedings of the 2023 ACM International Symposium on Wearable Computers, 92-97. DOI: 10.1145/3594738.3611371. Online publication date: 8-Oct-2023.
