
A Walking Assistant Robotic System for the Visually Impaired Based on Computer Vision and Tactile Perception

  • Published in: International Journal of Social Robotics

Abstract

Perceiving environmental information and walking independently are challenging tasks for the visually impaired. This paper presents a novel walking assistant robotic system based on computer vision and tactile perception. A novel rollator structure provides strong physical support. A Kinect device serves as the eyes of the user, capturing color and depth images of the environment ahead, while ultrasonic sensors detect the evenness of the road surface. A wearable vibro-tactile belt conveys environmental information to the user through different vibration modes. A feature-extraction method for safe walking directions based on depth-image compression is proposed, and a background-difference method is used to detect moving objects so that the user can perceive changing environmental conditions. Experimental results show that the wearable vibro-tactile belt is practical and that the walking assistant robotic system is effective in helping the visually impaired walk independently.
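The two vision steps named in the abstract can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the thresholds, and the use of a column-wise minimum as the "compression" of the depth image are all assumptions for demonstration.

```python
import numpy as np

def background_difference(background, frame, threshold=25):
    """Simple background-difference detection: pixels whose intensity
    differs from the background model by more than `threshold` are
    flagged as candidate moving-object pixels."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold

def safe_directions(depth, min_clearance_mm=1500):
    """Toy depth-image compression: collapse each column of the depth
    image to its minimum value (the nearest return in that viewing
    direction), then report the columns whose clearance exceeds
    `min_clearance_mm` as candidate safe walking directions."""
    profile = depth.min(axis=0)          # 1-D free-distance profile
    return np.flatnonzero(profile >= min_clearance_mm)
```

In a full system the binary motion mask would be cleaned with morphological filtering before object extraction, and the safe-direction indices would be mapped to vibration patterns on the tactile belt.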






Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant 61325018.

Author information


Corresponding author

Correspondence to Aiguo Song.


About this article


Cite this article

Ni, D., Song, A., Tian, L. et al. A Walking Assistant Robotic System for the Visually Impaired Based on Computer Vision and Tactile Perception. Int J of Soc Robotics 7, 617–628 (2015). https://doi.org/10.1007/s12369-015-0313-z


Keywords

Navigation