
Recognition of Hearing Needs from Body and Eye Movements to Improve Hearing Instruments

  • Conference paper
Pervasive Computing (Pervasive 2011)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 6696)

Abstract

Hearing instruments (HIs) have emerged as true pervasive computers, continuously adapting the hearing program to the user’s context. However, current HIs are not able to distinguish different hearing needs within the same acoustic environment. In this work, we explore how information derived from body and eye movements can be used to improve the recognition of such hearing needs. We conduct an experiment that creates an acoustic environment in which two different hearing needs arise: actively conversing, and working while colleagues hold a conversation in a noisy office. For eleven participants, we record body movements at nine body locations, eye movements using electrooculography (EOG), and sound using commercial HIs. Using a support vector machine (SVM) classifier with person-independent training, we improve recognition accuracy from 77% using sound alone to 92% using body movements. With a view to future implementation in a HI, we then perform a detailed analysis of the sensors attached to the head, achieving a best accuracy of 86% using eye movements compared to 84% using head movements. Our work demonstrates the potential of additional sensor modalities for future HIs and motivates investigating the wider applicability of this approach to further hearing situations and needs.
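
The person-independent training mentioned in the abstract corresponds to evaluating on a participant whose data was never seen during training, which avoids overfitting to person-specific movement patterns. The sketch below illustrates one common way to do this, a leave-one-participant-out evaluation of a linear SVM using scikit-learn; the feature matrix, labels, window counts, and classifier settings are illustrative assumptions, not the authors' actual pipeline.

    import numpy as np
    from sklearn.svm import LinearSVC
    from sklearn.model_selection import LeaveOneGroupOut
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    def person_independent_accuracy(X, y, participant_ids):
        """Leave-one-participant-out accuracy of a linear SVM.

        X: (n_windows, n_features) features derived from body/eye-movement sensors
        y: binary hearing-need label per window
           (e.g. 0 = work while others converse, 1 = active conversation)
        participant_ids: participant label per window (e.g. 11 participants)
        """
        logo = LeaveOneGroupOut()
        accuracies = []
        for train_idx, test_idx in logo.split(X, y, groups=participant_ids):
            # Train on all participants except the held-out one.
            clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0, max_iter=10000))
            clf.fit(X[train_idx], y[train_idx])
            # Test on the held-out participant only.
            accuracies.append(clf.score(X[test_idx], y[test_idx]))
        return float(np.mean(accuracies))

    # Example with synthetic data: 11 participants, 20 windows each, 30 features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(220, 30))
    y = rng.integers(0, 2, size=220)
    groups = np.repeat(np.arange(11), 20)
    print(person_independent_accuracy(X, y, groups))

Averaging the per-participant accuracies, as above, reports how well the classifier generalizes to an unseen user; with real sensor features this is the quantity a comparison such as 77% (sound) versus 92% (body movements) would refer to.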


Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Tessendorf, B. et al. (2011). Recognition of Hearing Needs from Body and Eye Movements to Improve Hearing Instruments. In: Lyons, K., Hightower, J., Huang, E.M. (eds) Pervasive Computing. Pervasive 2011. Lecture Notes in Computer Science, vol 6696. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21726-5_20

  • DOI: https://doi.org/10.1007/978-3-642-21726-5_20

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-21725-8

  • Online ISBN: 978-3-642-21726-5

  • eBook Packages: Computer Science (R0)
