
Acoustic control of mouse pointer

  • SHORT PAPER

Universal Access in the Information Society

Abstract

This paper describes the design and implementation of a system for controlling the mouse pointer using non-verbal sounds such as whistling and humming. Two control modes have been implemented: an orthogonal mode (in which the pointer moves at variable speed, either horizontally or vertically at any one time) and a melodic mode (in which the pointer moves at a fixed speed in any direction). A preliminary user study with four users indicates that the orthogonal mode was easier to operate and that humming was less tiring for the users than whistling. The developed system may serve as an inexpensive alternative pointing device for people with motor disabilities.
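The abstract describes the two control modes only at the level of behaviour, so the following is a minimal, hypothetical sketch of how a pitch-driven pointer controller of this kind could be structured. It assumes pitch-based input (the abstract does not state which acoustic feature drives the pointer), and the constants (SAMPLE_RATE, REFERENCE_PITCH, SILENCE_RMS) and the specific pitch-to-speed and pitch-to-angle mappings are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a pitch-driven pointer controller (not the authors' code).
# Assumes mono audio frames as NumPy arrays and a fixed sample rate; actually moving
# the OS pointer would require a platform-specific call not shown here.

import numpy as np

SAMPLE_RATE = 44100          # assumed audio sampling rate (Hz)
FRAME_SIZE = 2048            # assumed analysis window length (samples)
REFERENCE_PITCH = 440.0      # assumed "neutral" tone (Hz); deviation from it drives motion
SILENCE_RMS = 0.01           # assumed energy threshold separating sound from silence


def estimate_pitch(frame: np.ndarray) -> float | None:
    """Return the dominant frequency of a frame via an FFT peak, or None for silence."""
    if np.sqrt(np.mean(frame ** 2)) < SILENCE_RMS:
        return None
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    peak_bin = int(np.argmax(spectrum[1:])) + 1          # skip the DC bin
    return peak_bin * SAMPLE_RATE / len(frame)


def orthogonal_velocity(pitch: float | None, axis_is_horizontal: bool) -> tuple[float, float]:
    """Variable-speed motion along one axis at a time (orthogonal mode, as described)."""
    if pitch is None:
        return 0.0, 0.0
    # Illustrative mapping: speed grows with the pitch offset from the reference tone.
    speed = 0.05 * (pitch - REFERENCE_PITCH)
    return (speed, 0.0) if axis_is_horizontal else (0.0, speed)


def melodic_velocity(pitch: float | None, fixed_speed: float = 5.0) -> tuple[float, float]:
    """Fixed-speed motion in any direction (melodic mode, as described):
    the pitch is mapped onto a movement angle."""
    if pitch is None:
        return 0.0, 0.0
    angle = 2 * np.pi * ((pitch - REFERENCE_PITCH) % 200.0) / 200.0   # illustrative mapping
    return fixed_speed * np.cos(angle), fixed_speed * np.sin(angle)


# Example use on one synthetic frame (a 523 Hz tone, roughly C5):
t = np.arange(FRAME_SIZE) / SAMPLE_RATE
tone = 0.2 * np.sin(2 * np.pi * 523.0 * t)
dx, dy = orthogonal_velocity(estimate_pitch(tone), axis_is_horizontal=True)
```

In such a scheme the orthogonal mode would still need some extra signal (for example a short pause or a different sound type) to switch between the horizontal and vertical axes; the abstract does not say how the paper resolves this.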



Author information

Corresponding author: Adam J. Sporka

Cite this article

Sporka, A.J., Kurniawan, S.H. & Slavík, P. Acoustic control of mouse pointer. Univ Access Inf Soc 4, 237–245 (2006). https://doi.org/10.1007/s10209-005-0010-z
