Abstract
Representation of object features can help visually impaired people better comprehend their surrounding environment. A tacton (tactile icon) is an effective method for extracting and expressing information non-visually by utilizing users’ tactile perception capacities. However, existing vibrotactile displays mainly emphasize directional guidance, and the number of object features they can represent is very limited. To leverage the egocentric spatial cognition habits and high tactile perception sensitivity of visually impaired users, this research proposes a user-centered vibrotactile cueing strategy that conveys 30 kinds of spatial information through 30 tactons played by 4 vibrators on the back and front sides of a pair of gloves. Three parameters, namely vibration sequence, stimulus location, and intensity, are used to encode 10 typical objects located in 3 directions with 2 alert levels. User tests in both laboratory and natural settings were conducted to evaluate the validity of the strategy. The recognition accuracy of the designed tactons reached 98.99%, with a recognition time of less than 0.6 s, indicating that this strategy can provide practical assistance for visually impaired users in perceiving and responding to the pre-defined spatial information. The multi-parameter tactons make it possible to encode a wide variety of spatial information by exploiting the communication capacity of the tactile channel of visually impaired users.
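The combinatorial encoding described in the abstract (three tacton parameters covering 10 objects in 3 directions with 2 alert levels) can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual mapping: the object names, the choice of firing-order permutations, and the side/intensity assignments are all assumptions made for the example.

```python
from dataclasses import dataclass
from itertools import permutations

# 4 vibrators: one on the back and one on the palm side of each glove
VIBRATORS = ("L_back", "L_palm", "R_back", "R_palm")
DIRECTIONS = ("left", "front", "right")
ALERTS = {"normal": 0.5, "urgent": 1.0}  # intensity encodes alert level

# 10 object classes (placeholder names; the paper's actual set is not given here)
OBJECTS = ["stairs", "door", "curb", "pole", "bench",
           "bicycle", "car", "person", "tree", "trashcan"]

# Object identity -> vibration sequence: pick 10 distinct firing orders
# of the 4 vibrators (4! = 24 permutations are available; only 10 are needed)
SEQUENCES = list(permutations(range(len(VIBRATORS))))[:len(OBJECTS)]

@dataclass(frozen=True)
class Tacton:
    sequence: tuple       # firing order of vibrator indices (object identity)
    side: str             # "left", "right", or "both" (direction cue)
    intensity: float      # amplitude in [0, 1] (alert-level cue)

def encode(obj: str, direction: str, alert: str) -> Tacton:
    """Map (object, direction, alert level) to a distinct tacton."""
    seq = SEQUENCES[OBJECTS.index(obj)]
    side = {"left": "left", "right": "right", "front": "both"}[direction]
    return Tacton(sequence=seq, side=side, intensity=ALERTS[alert])

# 10 objects x 3 directions = 30 distinct (sequence, side) tactons,
# each presented at one of 2 intensities -> 60 distinguishable cues.
all_cues = {(o, d, a): encode(o, d, a)
            for o in OBJECTS for d in DIRECTIONS for a in ALERTS}
```

The design point the sketch illustrates is that each parameter carries one independent dimension of meaning, so the number of representable cues grows multiplicatively with the parameters rather than with the number of vibrators.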
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
He, L., Wang, Y., Luo, H., Wang, D. (2023). Vibrotactile Encoding of Object Features and Alert Levels for the Visually Impaired. In: Wang, D., et al. Haptic Interaction. AsiaHaptics 2022. Lecture Notes in Computer Science, vol 14063. Springer, Cham. https://doi.org/10.1007/978-3-031-46839-1_5
Print ISBN: 978-3-031-46838-4
Online ISBN: 978-3-031-46839-1
eBook Packages: Computer Science (R0)