FingerTalkie: Designing a Low-Cost Finger-Worn Device for Interactive Audio Labeling of Tactile Diagrams

Conference paper

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12182)

Abstract

Traditional tactile diagrams for visually impaired (VI) people use short Braille keys and annotations to provide additional information on separate Braille legend pages. Frequent navigation between the tactile diagram and these annex pages during exploration makes diagram comprehension inefficient. We present the design of FingerTalkie, a finger-worn device that uses discrete colors on a color-tagged tactile diagram for interactive audio labeling of graphical elements. Through an iterative design process involving 8 VI users, we designed a unique offset point-and-click technique that enables bimanual exploration of diagrams without hindering the tactile perception of the fingertips. Unlike existing camera-based and finger-worn audio-tactile devices, FingerTalkie supports one-finger interaction and works in any lighting condition without calibration. We conducted a controlled experiment with 12 blindfolded sighted users to evaluate the usability of the device. Furthermore, a focus-group interview with 8 VI users shows their appreciation for FingerTalkie's ease of use and support for two-handed exploration, as well as its potential to improve the efficiency of comprehending tactile diagrams by replacing Braille labels.
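
The paper itself does not include source code. As a rough illustration of the interaction described above, the sketch below shows one way a reading from a generic RGB color sensor could be matched to the nearest color in a pre-authored palette and mapped to an audio label when the user performs a point-and-click. All names (PALETTE, read_rgb, play_audio), the example labels, and the matching threshold are hypothetical assumptions for illustration, not details taken from the paper.

```python
# Illustrative sketch only (not the authors' implementation). Assumes a generic
# RGB color sensor returning readings in the 0-255 range and a pre-authored
# mapping from discrete palette colors to recorded audio labels.

import math

# Hypothetical palette used to tag diagram regions, each paired with an audio label.
PALETTE = {
    (255, 0, 0): "audio/red_region.wav",    # e.g. "nucleus"
    (0, 255, 0): "audio/green_region.wav",  # e.g. "cell membrane"
    (0, 0, 255): "audio/blue_region.wav",   # e.g. "mitochondrion"
}

MATCH_THRESHOLD = 80.0  # max Euclidean RGB distance to accept a palette match


def nearest_label(rgb):
    """Return the audio file of the closest palette color, or None when the
    reading is too far from every tagged color (e.g. blank paper)."""
    best_file, best_dist = None, float("inf")
    for color, audio_file in PALETTE.items():
        dist = math.dist(rgb, color)
        if dist < best_dist:
            best_file, best_dist = audio_file, dist
    return best_file if best_dist <= MATCH_THRESHOLD else None


def on_click(read_rgb, play_audio):
    """Handle one point-and-click: sample the sensor and speak the label."""
    label = nearest_label(read_rgb())
    if label is not None:
        play_audio(label)
```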

Notes

  1. ‘Glass’ sound file in the macOS sound effects.

  2. ‘Basso’ sound file in the macOS sound effects.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Nasser, A., Chen, T., Liu, C., Zhu, K., Rao, P. (2020). FingerTalkie: Designing a Low-Cost Finger-Worn Device for Interactive Audio Labeling of Tactile Diagrams. In: Kurosu, M. (eds) Human-Computer Interaction. Multimodal and Natural Interaction. HCII 2020. Lecture Notes in Computer Science, vol. 12182. Springer, Cham. https://doi.org/10.1007/978-3-030-49062-1_32

  • DOI: https://doi.org/10.1007/978-3-030-49062-1_32

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-49061-4

  • Online ISBN: 978-3-030-49062-1

  • eBook Packages: Computer Science, Computer Science (R0)
