
Pitch in non-verbal vocal input

Published: 01 June 2009

Abstract

Recently, numerous prototypes of user interfaces have been presented that are based on the interpretation of non-verbal sounds produced by the user, such as humming, whistling, or hissing. These sounds can be characterized by a number of properties, such as pitch, volume, or timbre, which the user may intentionally vary while producing the sound. These properties (or their profiles over time) can be mapped to different actions: to trigger an action or modify an input value, the user produces the corresponding acoustic gesture.
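
As a minimal illustration of this idea (a sketch, not part of the paper), the code below estimates the pitch of a short audio frame with a simple autocorrelation method and maps it to a normalized control value, e.g. for driving a slider or a rate of change. The sample rate, frame size, pitch range, and voicing threshold are assumed values chosen for the example.

```python
import numpy as np

SAMPLE_RATE = 16_000          # Hz (assumed)
PITCH_RANGE = (100.0, 500.0)  # usable humming range in Hz (assumed)

def estimate_pitch(frame, sample_rate=SAMPLE_RATE):
    """Rough autocorrelation pitch estimate; returns None if no clear pitch."""
    frame = frame - frame.mean()
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    if corr[0] <= 0:
        return None
    # Search for the strongest autocorrelation peak within the expected range.
    lo = int(sample_rate / PITCH_RANGE[1])
    hi = int(sample_rate / PITCH_RANGE[0])
    lag = lo + int(np.argmax(corr[lo:hi]))
    if corr[lag] < 0.3 * corr[0]:  # crude voicing threshold (assumed)
        return None
    return sample_rate / lag

def pitch_to_control(pitch_hz):
    """Map pitch linearly onto a normalized 0..1 input value."""
    lo, hi = PITCH_RANGE
    return float(np.clip((pitch_hz - lo) / (hi - lo), 0.0, 1.0))

# Example: a synthetic 220 Hz hum yields a control value near 0.3.
t = np.arange(1024) / SAMPLE_RATE
hum = np.sin(2 * np.pi * 220.0 * t)
p = estimate_pitch(hum)
if p is not None:
    print(f"pitch ~ {p:.1f} Hz -> control value {pitch_to_control(p):.2f}")
```

A real system of the kind discussed in the paper would additionally smooth the pitch track over successive frames and segment the sound into discrete acoustic gestures before mapping it to actions.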




Published In

ACM SIGACCESS Accessibility and Computing, Issue 94
June 2009
23 pages
ISSN: 1558-2337
EISSN: 1558-1187
DOI: 10.1145/1595061

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 01 June 2009
Published in SIGACCESS, Issue 94

Qualifiers

  • Research-article

Cited By

  • (2024) A Relative Pitch Based Approach to Non-verbal Vocal Interaction as a Continuous and One-Dimensional Controller. Human-Computer Interaction, pp. 169-186. DOI: 10.1007/978-3-031-60449-2_12. Online publication date: 29-Jun-2024.
  • (2022) Evaluating Singing for Computer Input Using Pitch, Interval, and Melody. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, pp. 1-15. DOI: 10.1145/3491102.3517691. Online publication date: 29-Apr-2022.
  • (2015) Control of word processing environment using myoelectric signals. Journal on Multimodal User Interfaces, 9(4), pp. 299-311. DOI: 10.1007/s12193-015-0200-9. Online publication date: 16-Oct-2015.
  • (2014) 1-of-N selection in GUIs using myoelectric sensors. 2014 5th IEEE Conference on Cognitive Infocommunications (CogInfoCom), pp. 67-72. DOI: 10.1109/CogInfoCom.2014.7020421. Online publication date: Nov-2014.
  • (2011) An interactive Whistle-to-Music composing system based on transcription, variation and chords generation. Multimedia Tools and Applications, 53(1), pp. 253-269. DOI: 10.1007/s11042-010-0510-6. Online publication date: 1-May-2011.
