DOI: 10.1145/2047196.2047210

A tongue input device for creating conversations

Published: 16 October 2011

Abstract

We present a new tongue input device, the tongue joystick, for use by an actor inside an articulated-head character costume. Using our device, the actor can maneuver through a dialogue tree, selecting clips of prerecorded audio to hold a conversation in the voice of the character. The device is constructed of silicone sewn with conductive thread, a unique method for creating rugged, soft, low-actuation force devices. This method has application for entertainment and assistive technology. We compare our device against other portable mouth input devices, showing it to be the fastest and most accurate in tasks mimicking our target application. Finally, we show early results of an actor inside an articulated-head costume using the tongue joystick to interact with a child.
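
To make the interaction concrete: the abstract describes the actor flicking the tongue joystick in discrete directions to walk a dialogue tree whose nodes trigger prerecorded clips. The sketch below is a minimal, hypothetical illustration, not code from the paper; the node names, clip filenames, and the four-direction mapping are assumptions made only for the example.

```python
# Hypothetical sketch of dialogue-tree navigation driven by discrete
# tongue-joystick directions. Node names, clip filenames, and the
# four-direction mapping are illustrative; they are not from the paper.
from dataclasses import dataclass, field


@dataclass
class DialogueNode:
    """One conversational state: an audio clip plus direction-labelled branches."""
    clip: str                                      # prerecorded audio clip played on entry
    branches: dict = field(default_factory=dict)   # direction -> next DialogueNode


def step(node: DialogueNode, direction: str) -> DialogueNode:
    """Follow one joystick gesture; stay on the current node if no branch matches."""
    next_node = node.branches.get(direction, node)
    if next_node is not node:
        print(f"play {next_node.clip}")            # stand-in for triggering audio playback
    return next_node


# A toy greeting subtree: the actor flicks the joystick to pick each reply.
goodbye = DialogueNode("goodbye.wav")
joke    = DialogueNode("tell_joke.wav", {"down": goodbye})
hello   = DialogueNode("hello_friend.wav", {"left": joke, "down": goodbye})
root    = DialogueNode("greeting.wav", {"up": hello})

state = root
for gesture in ["up", "left", "down"]:             # e.g. decoded from the joystick sensor
    state = step(state, gesture)
```

Running the sketch with the gesture sequence up, left, down prints the three clips that would be played in turn; in the costume, the direction events would presumably come from the silicone-and-conductive-thread joystick contacts the abstract describes.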

Supplementary Material

JPG File (fp134.jpg)
MOV File (fp134.mov)



    Published In

    UIST '11: Proceedings of the 24th annual ACM symposium on User interface software and technology
    October 2011
    654 pages
    ISBN:9781450307161
    DOI:10.1145/2047196
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. dialogue tree
    2. interface
    3. mouth
    4. turn-taking

    Qualifiers

    • Research-article

    Conference

    UIST '11

    Acceptance Rates

UIST '11 Paper Acceptance Rate: 67 of 262 submissions, 26%
Overall Acceptance Rate: 561 of 2,567 submissions, 22%


    Article Metrics

    • Downloads (last 12 months): 10
    • Downloads (last 6 weeks): 0
    Reflects downloads up to 17 Feb 2025


    Cited By

    • (2023) TactTongue: Prototyping ElectroTactile Stimulations on the Tongue. Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 10.1145/3586183.3606829, pages 1-14. Online publication date: 29-Oct-2023.
    • (2023) TOFI: Designing Intraoral Computer Interfaces for Gamified Myofunctional Therapy. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 10.1145/3544549.3573848, pages 1-8. Online publication date: 19-Apr-2023.
    • (2019) ChewIt. An Intraoral Interface for Discreet Interactions. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 10.1145/3290605.3300556, pages 1-13. Online publication date: 2-May-2019.
    • (2018) Earable TEMPO: A Novel, Hands-Free Input Device that Uses the Movement of the Tongue Measured with a Wearable Ear Sensor. Sensors, 18(3):733, 10.3390/s18030733. Online publication date: 1-Mar-2018.
    • (2018) TongueInput: Input Method by Tongue Gestures Using Optical Sensors Embedded in Mouthpiece. 2018 57th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE), 10.23919/SICE.2018.8492690, pages 1219-1224. Online publication date: Sep-2018.
    • (2018) A System for Training Stuffed-Suit Posing Without a Suit. Mobile Computing, Applications, and Services, 10.1007/978-3-319-90740-6_11, pages 183-200. Online publication date: 6-May-2018.
    • (2017) You as a Puppet. Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, 10.1145/3126594.3126608, pages 217-228. Online publication date: 20-Oct-2017.
    • (2017) Non-contact human computer interaction system design and implementation. Proceedings of the Second IEEE/ACM International Conference on Connected Health: Applications, Systems and Engineering Technologies, 10.1109/CHASE.2017.114, pages 312-320. Online publication date: 17-Jul-2017.
    • (2016) LumiO. Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 10.1145/2971648.2971704, pages 605-615. Online publication date: 12-Sep-2016.
    • (2014) Tongue-able interfaces. Proceedings of the 16th international ACM SIGACCESS conference on Computers & accessibility, 10.1145/2661334.2661395, pages 277-278. Online publication date: 20-Oct-2014.
