DOI: 10.1145/1520340.1520452
Extended Abstract · CHI Conference Proceedings

A hand clap interface for sonic interaction with the computer

Published: 04 April 2009

Abstract

We present a hand clapping interface for sonic interaction with the computer. The current implementation is built on the Pure Data (PD) software. The interface exploits the cyclic nature of hand clapping and recognizes the clap type, enabling interactive control over different applications. Three prototype applications of the interface are presented: a virtual crowd of clappers, controlling the tempo of music, and a simple sampler. Preliminary tests indicate that rather than having total control via the interface, the user negotiates with the computer to control the tempo.
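
To give a concrete sense of the kind of processing such an interface involves, the sketch below is a minimal, hypothetical Python example, not the authors' Pure Data implementation: claps are detected as short-time energy bursts in the audio signal, and the tempo is estimated from the intervals between successive claps. The function names, frame size, and thresholds are illustrative assumptions rather than details taken from the paper, which additionally classifies the clap type.

import numpy as np

def detect_clap_onsets(signal, sample_rate, frame_size=512,
                       threshold_ratio=4.0, min_gap_s=0.15):
    """Return onset times (in seconds) of frames whose energy jumps well
    above the local average -- a crude stand-in for a clap detector."""
    n_frames = len(signal) // frame_size
    frames = signal[:n_frames * frame_size].reshape(n_frames, frame_size)
    energy = (frames ** 2).mean(axis=1)
    # Running average of frame energies, used as an adaptive threshold.
    local_avg = np.convolve(energy, np.ones(8) / 8, mode="same") + 1e-12

    onsets, last_onset = [], -np.inf
    for i, (e, avg) in enumerate(zip(energy, local_avg)):
        t = i * frame_size / sample_rate
        # A clap appears as a frame far louder than its neighbourhood;
        # min_gap_s suppresses multiple hits within one clap transient.
        if e > threshold_ratio * avg and (t - last_onset) >= min_gap_s:
            onsets.append(t)
            last_onset = t
    return onsets

def estimate_tempo_bpm(onset_times):
    """Estimate tempo from the median interval between successive claps."""
    if len(onset_times) < 2:
        return None
    return 60.0 / float(np.median(np.diff(onset_times)))

if __name__ == "__main__":
    # Synthetic test: four impulsive "claps" at 120 BPM in low-level noise.
    sr = 44100
    audio = 0.01 * np.random.randn(2 * sr)
    for k in range(4):
        start = int(k * 0.5 * sr)
        audio[start:start + 200] += np.hanning(200)  # short burst ~ a clap
    onsets = detect_clap_onsets(audio, sr)
    print("claps at", [round(t, 2) for t in onsets],
          "-> tempo ~", round(estimate_tempo_bpm(onsets), 1), "BPM")

In the paper's setting, a tempo estimate of this kind could drive applications such as the virtual crowd of clappers or the tempo-control prototype described in the abstract.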

Supplementary Material

MOV File (p3175.mov)



Published In

CHI EA '09: CHI '09 Extended Abstracts on Human Factors in Computing Systems
April 2009
2470 pages
ISBN:9781605582474
DOI:10.1145/1520340
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. audio interfaces
  2. hand clapping
  3. human-computer interaction
  4. sonic interaction design

Qualifiers

  • Extended-abstract

Conference

CHI '09

Acceptance Rates

CHI EA '09 paper acceptance rate: 385 of 1,130 submissions (34%)
Overall acceptance rate: 6,164 of 23,696 submissions (26%)

Cited By

  • (2024) EchoTap: Non-Verbal Sound Interaction with Knock and Tap Gestures. International Journal of Human–Computer Interaction, 1-22. DOI: 10.1080/10447318.2024.2348837. Online publication date: 3-Jun-2024.
  • (2020) Handclap for Acoustic Measurements: Optimal Application and Limitations. Acoustics 2(2), 224-245. DOI: 10.3390/acoustics2020015. Online publication date: 26-Apr-2020.
  • (2020) Facilitating Temporal Synchronous Target Selection through User Behavior Modeling. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 3(4), 1-24. DOI: 10.1145/3369839. Online publication date: 14-Sep-2020.
  • (2020) Non-Verbal Auditory Input for Controlling Binary, Discrete, and Continuous Input in Automotive User Interfaces. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-13. DOI: 10.1145/3313831.3376816. Online publication date: 21-Apr-2020.
  • (2017) Multi-kinect Skeleton Fusion for Enactive Games. Interactivity, Game Creation, Design, Learning, and Innovation, 173-180. DOI: 10.1007/978-3-319-55834-9_20. Online publication date: 18-Mar-2017.
  • (2015) Exploring Felt Qualities of Embodied Interaction with Movement and Sound. Arts and Technology, 77-85. DOI: 10.1007/978-3-319-18836-2_10. Online publication date: 24-May-2015.
  • (2013) Mobile rhythmic interaction in a sonic tennis game. CHI '13 Extended Abstracts on Human Factors in Computing Systems, 2903-2906. DOI: 10.1145/2468356.2479570. Online publication date: 27-Apr-2013.
  • (2011) Real-time recognition of percussive sounds by a model-based method. EURASIP Journal on Advances in Signal Processing 2011, 1-14. DOI: 10.1155/2011/291860. Online publication date: 1-Jan-2011.
  • (2011) Rhythmic blueprints. Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments, 3-4. DOI: 10.1145/2181037.2181038. Online publication date: 28-Sep-2011.
  • (2010) Basic exploration of narration and performativity for sounding interactive commodities. Proceedings of the 5th international conference on Haptic and audio interaction design, 65-74. DOI: 10.5555/1887984.1887994. Online publication date: 16-Sep-2010.
