
Nao Robot Localization and Navigation Using Fusion of Odometry and Visual Sensor Data

Conference paper presented at Intelligent Robotics and Applications (ICIRA 2012)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7507)


Abstract

The Nao humanoid robot from Aldebaran Robotics is equipped with an odometry sensor that provides rather inaccurate robot pose estimates. We propose using Structure from Motion (SfM) to obtain visual odometry from the Nao camera without adding artificial markers to the scene, and show that the robot pose estimates can be significantly improved by fusing the data from the odometry sensor with the visual odometry. The implementation consists of sensor modules streaming robot data, a mapping module creating a 3D model, a visual localization module estimating the camera pose with respect to the model, and a navigation module planning robot trajectories and performing the actual movement. All of the modules are connected through the RSB middleware, which makes the solution independent of the particular robot type.
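The abstract does not specify how the two pose sources are combined; the full paper should be consulted for the actual fusion scheme. Purely as an illustration of the idea, the Python sketch below blends an odometry pose and a visual-odometry pose in the plane using a fixed confidence weight. The names fuse_pose and wrap_angle, the (x, y, theta) representation, and the weighting are assumptions made for this sketch, not details taken from the paper.

import numpy as np

def wrap_angle(a):
    # Wrap an angle to the interval [-pi, pi).
    return (a + np.pi) % (2.0 * np.pi) - np.pi

def fuse_pose(odom_pose, visual_pose, w_visual=0.7):
    # Blend an odometry pose (x, y, theta) with a visual-odometry pose
    # using a fixed confidence weight w_visual in [0, 1].
    # Positions are blended linearly; the heading is blended on the circle.
    odom = np.asarray(odom_pose, dtype=float)
    vis = np.asarray(visual_pose, dtype=float)
    fused = (1.0 - w_visual) * odom + w_visual * vis
    dtheta = wrap_angle(vis[2] - odom[2])
    fused[2] = wrap_angle(odom[2] + w_visual * dtheta)
    return fused

if __name__ == "__main__":
    odom = (1.02, 0.35, 0.20)    # drifted odometry estimate
    visual = (0.95, 0.30, 0.15)  # estimate from visual localization
    print(fuse_pose(odom, visual))

A covariance-weighted filter (e.g. an extended Kalman filter) would be the more principled choice when per-sensor uncertainties are available; the fixed weight is used here only to keep the example short.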



References

  1. Aldebaran Robotics: Nao Hardware Specification for SDK v1.12 (2012), http://www.aldebaran-robotics.com/documentation/nao/hardware/index.html

  2. Bay, H., Ess, A., Tuytelaars, T., Van Gool, L.: Speeded-up robust features (SURF). Computer Vision and Image Understanding 110(3), 346–359 (2008)


  3. Bielefeld University: Robotics Service Types (2012), https://code.cor-lab.org/projects/rst

  4. Davison, A.J., Reid, I.D., Molton, N.D., Stasse, O.: MonoSLAM: real-time single camera SLAM. IEEE Transactions on Pattern Analysis and Machine Intelligence  29(6), 1052–1067 (2007)


  5. Fischler, M., Bolles, R.: Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM 24(6), 381–395 (1981)


  6. Fojtů, Š.: Nao Localization and Navigation Based on Sparse 3D Point Cloud Reconstruction. Master’s thesis, Czech Technical University in Prague (2011)


  7. Hartley, R., Zisserman, A.: Multiple View Geometry in Computer Vision, 2nd edn. Cambridge University Press (2003)


  8. Hu, H., Gu, D.: Landmark-based Navigation of Industrial Mobile Robots. Industrial Robot: An International Journal 27(6), 458–467 (2000)


  9. Lourakis, M.I.A., Argyros, A.A.: SBA: A software package for generic sparse bundle adjustment. ACM Transactions on Mathematical Software (2009)


  10. Mareček, P.: A Camera Calibration System. Master’s thesis, Center for Machine Perception, K13133 FEE Czech Technical University, Czech Republic (2001)


  11. Muja, M., Lowe, D.: Fast approximate nearest neighbors with automatic algorithm configuration. In: VISAPP 2009 (2009)


  12. Nistér, D.: A minimal solution to the generalized 3-point pose problem. In: CVPR 2004, pp. I:560–I:567 (2004)


  13. Osswald, S., Hornung, A., Bennewitz, M.: Learning reliable and efficient navigation with a humanoid. In: 2010 IEEE International Conference on Robotics and Automation (ICRA), pp. 2375–2380. IEEE (2010)


  14. Quigley, M., Conley, K., Gerkey, B.: ROS: an open-source Robot Operating System. In: Open-Source Software workshop of the International Conference on Robotics and Automation, ICRA 2009 (2009)


  15. Snavely, N., Seitz, S., Szeliski, R.: Modeling the world from internet photo collections. International Journal of Computer Vision 80(2), 189–210 (2008)


  16. Torii, A., Havlena, M., Pajdla, T.: Omnidirectional image stabilization for visual object recognition. International Journal of Computer Vision 91(2), 157–174 (2011)


  17. Umeyama, S.: Least-squares Estimation of Transformation Parameters Between Two Point Patterns. IEEE Transactions on Pattern Analysis and Machine Intelligence 13(4), 376–380 (1991)


  18. Wienke, J., Wrede, S.: A Middleware for Collaborative Research in Experimental Robotics. In: 2011 IEEE/SICE International Symposium on System Integration, SII 2011, Kyoto, Japan, pp. 1183–1190 (2011)




Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Fojtů, Š., Havlena, M., Pajdla, T. (2012). Nao Robot Localization and Navigation Using Fusion of Odometry and Visual Sensor Data. In: Su, CY., Rakheja, S., Liu, H. (eds) Intelligent Robotics and Applications. ICIRA 2012. Lecture Notes in Computer Science (LNAI), vol 7507. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33515-0_43

Download citation

  • DOI: https://doi.org/10.1007/978-3-642-33515-0_43

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33514-3

  • Online ISBN: 978-3-642-33515-0

  • eBook Packages: Computer Science, Computer Science (R0)
