
A wearable guidance system with interactive user interface for persons with visual impairment

Multimedia Tools and Applications

Abstract

We propose a wearable system that helps visually impaired persons walk to their destination. After a destination is selected, the system computes an optimal path and guides the user along it, using marker positions and identifiers detected by a camera (indoors) or positioning data from a GPS receiver (outdoors). At the same time, it uses multiple ultrasonic sensors to avoid obstacles in the path. In addition, we propose a fast correction algorithm that reduces the positioning error of GPS data, and we apply a map-matching algorithm when the user strays from the correct path. We evaluate the spatial layout in front of the user against predefined patterns and determine the appropriate avoidance direction by analysing those patterns. The system safely guides a visually impaired person to the destination through an interactive user interface.
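As a rough illustration of the pattern-based avoidance step summarized above, the sketch below thresholds readings from three forward-facing ultrasonic sensors into an occupancy pattern and looks that pattern up in a predefined table to choose an avoidance action. The three-sensor layout, the 120 cm threshold, and the pattern table are illustrative assumptions, not the configuration or values used in the paper.

```python
# Hypothetical sketch of pattern-based avoidance: each forward-facing
# ultrasonic distance is thresholded into "blocked"/"clear", and the
# resulting (left, center, right) pattern selects an action from a
# predefined table. All constants here are assumed for illustration.

OBSTACLE_THRESHOLD_CM = 120  # assumed range below which a sensor "sees" an obstacle

# Assumed three-sensor layout: (left, center, right) blocked flags -> action
AVOIDANCE_PATTERNS = {
    (False, False, False): "forward",
    (False, True,  False): "left",     # obstacle ahead, both sides clear
    (True,  True,  False): "right",    # left and center blocked
    (False, True,  True):  "left",     # center and right blocked
    (True,  False, False): "forward",  # obstacle off to the side only
    (False, False, True):  "forward",
    (True,  False, True):  "forward",  # gap straight ahead
    (True,  True,  True):  "stop",     # fully blocked
}

def avoidance_direction(distances_cm):
    """Map (left, center, right) ultrasonic distances to an avoidance action."""
    pattern = tuple(d < OBSTACLE_THRESHOLD_CM for d in distances_cm)
    return AVOIDANCE_PATTERNS[pattern]
```

For example, readings of (200, 80, 200) cm yield the pattern (clear, blocked, clear) and a "left" turn; a lookup table like this keeps the per-reading decision constant-time, which matters on a wearable device.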




Acknowledgments

This research was supported by the Next-Generation Information Computing Development Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT & Future Planning (No. 2012M3C4A7032781). This work was supported by an INHA UNIVERSITY Research Grant.

Author information

Correspondence to Byeong-Seok Shin.


About this article


Cite this article

Lee, JH., Kim, D. & Shin, BS. A wearable guidance system with interactive user interface for persons with visual impairment. Multimed Tools Appl 75, 15275–15296 (2016). https://doi.org/10.1007/s11042-014-2385-4


Keywords

Navigation