
A Bio-Inspired Model for Robust Navigation Assistive Devices: A Proof of Concept

  • Conference paper
ICT for Health, Accessibility and Wellbeing (IHAW 2022)

Abstract

This paper presents an implementation and real-world evaluation of a new bio-inspired predictive navigation model for mobility control, particularly suited to visually impaired people. The model relies on interactions between formal models of three types of neurons involved in navigation tasks in the mammalian brain (namely place cells, grid cells, and head direction cells) to build a topological model of the environment in the form of a decentralized navigation graph. Previously tested in virtual environments, the model demonstrated a high tolerance to motion drift and robustness to environment changes. This paper presents an implementation of this navigation model based on a stereoscopic camera and evaluates its ability to map an unknown real environment and guide a person through it. The evaluation results confirm that the proposed bio-inspired navigation model can build a path map and guide a person along this path while remaining robust to environment changes, estimating traveled distances with an error rate below 2% over test paths of up to 100 m. These results open the way toward efficient wearable assistive devices for the navigation of visually impaired people.
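The graph-building principle summarized above (place cells recruited from a path-integrated position estimate, linked into a decentralized navigation graph, with head-direction-style bearings used for guidance) can be illustrated with a toy sketch. This is a deliberate simplification under assumed parameters, not the authors' implementation: the `PlaceCell` and `NavigationGraph` names, the `radius` recruitment threshold, and the pure 2D odometry input are all hypothetical.

```python
import math

class PlaceCell:
    """A graph node anchored at an estimated (x, y) position."""
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.edges = []  # neighboring PlaceCells reachable from this one

class NavigationGraph:
    """Toy topological map: a new place cell is recruited whenever the
    path-integrated position drifts farther than `radius` from every known
    cell, and consecutively active cells are linked, yielding a route graph."""
    def __init__(self, radius=1.0):
        self.radius = radius
        self.cells = []
        self.active = None        # currently active (closest) place cell
        self._pos = [0.0, 0.0]    # path-integrated position estimate

    def step(self, dx, dy):
        """Integrate one odometry step and update the graph."""
        if self.active is None:   # recruit the first cell at the origin
            self.active = PlaceCell(0.0, 0.0)
            self.cells.append(self.active)
        self._pos[0] += dx
        self._pos[1] += dy
        x, y = self._pos
        nearest = min(self.cells, key=lambda c: math.hypot(c.x - x, c.y - y))
        if math.hypot(nearest.x - x, nearest.y - y) > self.radius:
            new = PlaceCell(x, y)            # recruit a new place cell
            self.cells.append(new)
            self.active.edges.append(new)    # link it to the previous one
            new.edges.append(self.active)
            self.active = new
        else:
            self.active = nearest
        return self.active

    def heading_to(self, cell):
        """Head-direction-style guidance: bearing (radians) from the current
        estimated position toward a target place cell."""
        return math.atan2(cell.y - self._pos[1], cell.x - self._pos[0])
```

For example, walking a straight 5 m path in 0.5 m odometry steps with `radius=1.0` recruits a short chain of place cells, and `heading_to` then gives the bearing back toward any cell on the chain, which is the kind of information a guidance device can convert into a directional cue.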




Acknowledgements

This work is supported by the French National Research Agency (ANR) in the frameworks of “Investissements d’avenir” (ANR-15-IDEX-02) and “Inclusive Museum Guide” (IMG, ANR-20-CE38-0007), and by the Region of Normandy and European Commission in the frame of “Guide Muséal”.

Author information


Corresponding author

Correspondence to Simon L. Gay.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Gay, S.L., Pissaloux, E., Jamont, J.-P. (2023). A Bio-Inspired Model for Robust Navigation Assistive Devices: A Proof of Concept. In: Papadopoulos, G.A., Achilleos, A., Pissaloux, E., Velázquez, R. (eds) ICT for Health, Accessibility and Wellbeing. IHAW 2022. Communications in Computer and Information Science, vol 1799. Springer, Cham. https://doi.org/10.1007/978-3-031-29548-5_2


  • DOI: https://doi.org/10.1007/978-3-031-29548-5_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-29547-8

  • Online ISBN: 978-3-031-29548-5

  • eBook Packages: Computer Science, Computer Science (R0)
