
Vibrotactile Encoding of Object Features and Alert Levels for the Visually Impaired

  • Conference paper
Haptic Interaction (AsiaHaptics 2022)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14063)


Abstract

Representing object features can help visually impaired people better comprehend their surroundings. The tacton (tactile icon) is an effective means of extracting and expressing information non-visually by exploiting users’ tactile perception capacities. However, existing vibrotactile displays mainly emphasize directional guidance, and the number of object features they can represent is very limited. To leverage the egocentric spatial cognition habits and high tactile sensitivity of visually impaired users, this research proposes a user-centered vibrotactile cueing strategy that conveys 30 kinds of spatial information through 30 tactons played by 4 vibrators on the back and front sides of a pair of gloves. Three parameters, vibration sequence, stimulus location, and intensity, are used to encode 10 typical objects located in 3 directions with 2 alert levels. User tests in both laboratory and natural settings were conducted to evaluate the validity of the strategy. The recognition accuracy of the designed tactons reached 98.99% with a recognition time of less than 0.6 s, indicating that the strategy can provide practical assistance for visually impaired users in perceiving and responding to the pre-defined spatial information. The multi-parameter tactons make it possible to encode a wide variety of spatial information by exploiting the communication capacity of the tactile channel of visually impaired users.
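
To make the encoding concrete, the sketch below (Python, written for this summary rather than taken from the paper) shows one way such a multi-parameter tacton codebook could be organized: the vibration sequence identifies the object, the stimulus location identifies the direction, and the intensity marks the alert level. The object names, the permutation-based sequence rule, and the way the alert level is overlaid on the 30 base patterns are all illustrative assumptions.

```python
from dataclasses import dataclass
from itertools import permutations, product

# 10 hypothetical object types, 3 directions, and 2 alert levels (all names assumed).
OBJECTS = ["stairs", "door", "chair", "table", "person",
           "pillar", "curb", "vehicle", "pet", "trash bin"]
DIRECTIONS = ["left", "front", "right"]
ALERT_LEVELS = {"notice": "low", "warning": "high"}   # alert level -> drive intensity

# Give each object a distinct firing order of the 4 glove vibrators
# (a stand-in for the paper's hand-designed vibration sequences).
SEQUENCES = dict(zip(OBJECTS, permutations(range(4))))

@dataclass(frozen=True)
class Tacton:
    sequence: tuple   # firing order of the 4 vibrators -> object identity
    location: str     # emphasized stimulus location -> direction
    intensity: str    # drive amplitude -> alert level

def make_tacton(obj: str, direction: str, alert: str) -> Tacton:
    """Compose a tacton from the three encoding parameters."""
    return Tacton(SEQUENCES[obj], direction, ALERT_LEVELS[alert])

if __name__ == "__main__":
    # 10 objects x 3 directions = 30 base spatiotemporal patterns;
    # the alert level only modulates the intensity of an existing pattern.
    base_patterns = {(SEQUENCES[o], d) for o, d in product(OBJECTS, DIRECTIONS)}
    print(len(base_patterns))                         # -> 30
    print(make_tacton("stairs", "front", "warning"))  # high-intensity variant of the "stairs, front" tacton
```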



Author information

Correspondence to Liwen He.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

He, L., Wang, Y., Luo, H., Wang, D. (2023). Vibrotactile Encoding of Object Features and Alert Levels for the Visually Impaired. In: Wang, D., et al. Haptic Interaction. AsiaHaptics 2022. Lecture Notes in Computer Science, vol 14063. Springer, Cham. https://doi.org/10.1007/978-3-031-46839-1_5


  • DOI: https://doi.org/10.1007/978-3-031-46839-1_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-46838-4

  • Online ISBN: 978-3-031-46839-1

  • eBook Packages: Computer Science, Computer Science (R0)
