A Multimodal Approach to Accessible Web Content on Smartphones

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 7383)

Abstract

Mainstream smartphones can now be used to implement efficient speech-based and multimodal interfaces. The current status and continued development of mobile technologies open up possibilities for smartphone interface design that were unattainable only a few years ago. Better and more intuitive multimodal interfaces for smartphones can provide access to information and services on the Internet through mobile devices, enabling users with different abilities to access this information at any place and at any time. In this paper we present our current work in the area of multimodal interfaces on smartphones. We have implemented a multimodal framework and used it as the foundation for the development of a prototype, which has been used in a user test. There are two main contributions: 1) how we have implemented W3C’s multimodal interaction framework on smartphones running the Android OS, and 2) the results from user tests and interviews with blind and visually impaired users.
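As background for contribution 1), the W3C multimodal interaction framework (see references 12 and 13) organises a system around modality components (for example speech and touch input, or speech output) that deliver interpreted user input, typically represented in EMMA, to a central interaction manager. The sketch below is only an illustration of that event flow in plain Java; every class and method name in it is hypothetical and is not taken from the paper's Android implementation, which is not shown in this preview.

```java
// Illustrative sketch of the W3C MMI event flow: modality components forward
// interpreted user input to an interaction manager, which decides how the
// application responds. All names are hypothetical, not from the paper.
import java.util.ArrayList;
import java.util.List;

interface ModalityComponent {
    String name();
}

/** Interpreted input, loosely modelled on an EMMA interpretation. */
class InputEvent {
    final String modality;       // e.g. "speech" or "touch"
    final String interpretation; // e.g. "open link 3"
    final double confidence;

    InputEvent(String modality, String interpretation, double confidence) {
        this.modality = modality;
        this.interpretation = interpretation;
        this.confidence = confidence;
    }
}

/** Central coordinator: receives events from all modalities and dispatches. */
class InteractionManager {
    private final List<InputEvent> history = new ArrayList<>();

    void onInput(InputEvent event) {
        history.add(event);
        // A full system would fuse events across modalities and consult
        // dialogue state; here we simply act on the latest interpretation.
        System.out.printf("[%s] %s (confidence %.2f)%n",
                event.modality, event.interpretation, event.confidence);
    }
}

/** Example modality component that forwards speech recognition results. */
class SpeechInput implements ModalityComponent {
    private final InteractionManager manager;

    SpeechInput(InteractionManager manager) { this.manager = manager; }

    public String name() { return "speech"; }

    void recognized(String utterance, double confidence) {
        manager.onInput(new InputEvent(name(), utterance, confidence));
    }
}

public class MmiSketch {
    public static void main(String[] args) {
        InteractionManager manager = new InteractionManager();
        SpeechInput speech = new SpeechInput(manager);
        speech.recognized("read the main heading", 0.87);
    }
}
```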


References

  1. Anegg, H., Niklfeld, G., Schatz, R., Simon, R., Wegscheider, F.: Multimodal interfaces in mobile devices – the MONA project. In: MobEA II – Emerging Applications for Wireless and Mobile Access Workshop at the 13th International World Wide Web Conference. Citeseer (2004)

  2. Hedvall, P.O.: Towards the Era of Mixed Reality: Accessibility Meets Three Waves of HCI. In: Holzinger, A., Miesenberger, K. (eds.) USAB 2009. LNCS, vol. 5889, pp. 264–278. Springer, Heidelberg (2009)

  3. Krajnc, E., Feiner, J., Schmidt, S.: User Centered Interaction Design for Mobile Applications Focused on Visually Impaired and Blind People. In: Leitner, G., Hitz, M., Holzinger, A. (eds.) USAB 2010. LNCS, vol. 6389, pp. 195–202. Springer, Heidelberg (2010)

  4. Krüger, A., Butz, A., Müller, C., Stahl, C., Wasinger, R., Steinberg, K.E., Dirschl, A.: The connected user interface: Realizing a personal situated navigation service. In: Proceedings of the 9th International Conference on Intelligent User Interfaces, pp. 161–168. ACM (2004)

  5. Nardelli, L., Orlandi, M., Falavigna, D.: A multi-modal architecture for cellular phones. In: Proceedings of the 6th International Conference on Multimodal Interfaces, pp. 323–324. ACM (2004)

  6. Oviatt, S.: Ten myths of multimodal interaction. Communications of the ACM 42(11), 74–81 (1999)

  7. Oviatt, S.: Designing robust multimodal systems for universal access. In: Proceedings of the 2001 EC/NSF Workshop on Universal Accessibility of Ubiquitous Computing: Providing for the Elderly, pp. 71–74. ACM (2001)

  8. Paay, J., Kjeldskov, J.: Understanding and modelling built environments for mobile guide interface design. Behaviour and Information Technology 24(1), 21–36 (2005)

  9. Ringland, S.P.A., Scahill, F.J.: Multimodality - the future of the wireless user interface. BT Technology Journal 21(3), 181–191 (2003)

  10. Turunen, M., Hakulinen, J., Salonen, E.P., Kainulainen, A., Helin, L.: Spoken and multimodal bus timetable systems: design, development and evaluation. In: Proceedings of 10th International Conference on Speech and Computer (SPECOM 2005), pp. 389–392 (2005)

  11. Turunen, M., Hurtig, T., Hakulinen, J., Virtanen, A., Koskinen, S.: Mobile Speech-based and Multimodal Public Transport Information Services. In: Proceedings of MobileHCI 2006 Workshop on Speech in Mobile and Pervasive Environments. Citeseer (2006)

  12. W3C: EMMA: Extensible MultiModal Annotation markup language, http://www.w3.org/TR/emma/

  13. W3C: W3C Multimodal Interaction Framework, http://www.w3.org/TR/mmi-framework/

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Knudsen, L.E., Holone, H. (2012). A Multimodal Approach to Accessible Web Content on Smartphones. In: Miesenberger, K., Karshmer, A., Penaz, P., Zagler, W. (eds) Computers Helping People with Special Needs. ICCHP 2012. Lecture Notes in Computer Science, vol 7383. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31534-3_1

  • DOI: https://doi.org/10.1007/978-3-642-31534-3_1

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-31533-6

  • Online ISBN: 978-3-642-31534-3
