
Metric Localisation for the NAO Robot

  • Conference paper
Pattern Recognition (MCPR 2021)

Abstract

We present a metric localisation approach for the NAO robot based on depth estimation from optical flow computed on a frame-to-frame basis. We propose to convert the optical flow into a 2-channel image from which image patches of \(60\times 60\) pixels are extracted. Each patch is passed as input to a Convolutional Neural Network (CNN) with a regressor as its last layer, which estimates a depth value for that patch. A depth image is formed by assembling the depth estimates obtained for all patches. The depth image is coupled with the RGB image and passed to the well-known ORB-SLAM system in its RGB-D version, that is, a visual simultaneous localisation and mapping approach that uses RGB and depth images to build a 3D map of the scene and localise the camera within it. Because the depth images carry metric scale, the NAO’s position can be estimated in metres. Our approach exploits the robot’s walking motion, which produces image displacements in consecutive frames, and takes advantage of the fact that the NAO’s walking motion can be programmed to be performed at constant speed. We mount a depth camera on the NAO’s head to produce a training dataset that associates image patches with depth values; a CNN can then be trained to learn the relationship between optical flow vectors and scene depth. For evaluation, we use one of the NAO’s built-in cameras. Our experiments show that this approach is feasible and could be exploited in applications where the NAO requires a localisation system that does not depend on additional sensors or external localisation infrastructure.
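To make the pipeline above concrete, the sketch below computes dense Farnebäck optical flow between consecutive frames, tiles the resulting 2-channel flow image into \(60\times 60\) patches, regresses one depth value per patch with a small CNN, and assembles the per-patch estimates into a depth image. This is a minimal illustration under stated assumptions: the network architecture, layer sizes, and all function names (e.g. PatchDepthCNN, depth_from_frames) are hypothetical stand-ins, not the authors' implementation.

```python
# Minimal sketch of the per-patch flow-to-depth stage (hypothetical names).
import cv2
import numpy as np
import torch
import torch.nn as nn

PATCH = 60  # patch size used in the paper


class PatchDepthCNN(nn.Module):
    """Toy CNN regressor: 2-channel 60x60 flow patch -> scalar depth (metres).

    The layer sizes here are illustrative assumptions, not the paper's model.
    """

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, 5, stride=2), nn.ReLU(),   # 60 -> 28
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),  # 28 -> 12
            nn.Conv2d(32, 64, 3, stride=2), nn.ReLU(),  # 12 -> 5
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 5 * 5, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, x):
        return self.regressor(self.features(x))


def depth_from_frames(prev_gray, gray, model):
    """Farneback flow -> 2-channel image -> 60x60 patches -> depth image."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, gray, None, pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)  # H x W x 2 (dx, dy)
    rows, cols = gray.shape[0] // PATCH, gray.shape[1] // PATCH  # border dropped
    patches = [
        flow[r * PATCH:(r + 1) * PATCH, c * PATCH:(c + 1) * PATCH]
        .transpose(2, 0, 1)                       # -> 2 x 60 x 60
        for r in range(rows) for c in range(cols)
    ]
    batch = torch.from_numpy(np.stack(patches)).float()
    with torch.no_grad():
        depths = model(batch).view(rows, cols)    # one depth per patch
    # Tile each per-patch estimate back into a dense depth image.
    return np.kron(depths.numpy(), np.ones((PATCH, PATCH), np.float32))


# Usage (hypothetical): model = PatchDepthCNN().eval()
#                       depth_img = depth_from_frames(prev, cur, model)
```

The returned depth image, paired with its RGB frame, could then be fed to ORB-SLAM2 in RGB-D mode, for instance by writing the pairs to disk in the TUM RGB-D layout that ORB-SLAM2's example programs read; with metric depth in place, the recovered trajectory is expressed in metres.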



Author information

Correspondence to Oswualdo Alquisiris-Quecha.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Alquisiris-Quecha, O., Martinez-Carranza, J. (2021). Metric Localisation for the NAO Robot. In: Roman-Rangel, E., Kuri-Morales, Á.F., Martínez-Trinidad, J.F., Carrasco-Ochoa, J.A., Olvera-López, J.A. (eds) Pattern Recognition. MCPR 2021. Lecture Notes in Computer Science, vol 12725. Springer, Cham. https://doi.org/10.1007/978-3-030-77004-4_12


  • DOI: https://doi.org/10.1007/978-3-030-77004-4_12


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-77003-7

  • Online ISBN: 978-3-030-77004-4

  • eBook Packages: Computer Science, Computer Science (R0)
