
An Efficient Closed-Form Solution to Probabilistic 6D Visual Odometry for a Stereo Camera

  • Conference paper
Advanced Concepts for Intelligent Vision Systems (ACIVS 2007)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 4678)

Abstract

Estimating the ego-motion of a mobile robot has traditionally been achieved by means of encoder-based odometry. However, this method presents several drawbacks, such as accumulated drift, sensitivity to wheel slippage, and its restriction to planar environments. In this work we present an alternative method for estimating the incremental change in the robot pose from images taken by a stereo camera. In contrast to most previous approaches to 6D visual odometry, which rely on iterative, approximate methods, we propose to employ an optimal closed-form formulation which is more accurate and efficient and does not exhibit convergence problems. We also derive the expression for the covariance associated with this estimate, which enables the integration of our approach into vision-based SLAM frameworks. Additionally, our proposal combines highly distinctive SIFT descriptors with the fast KLT feature tracker, thus achieving robust and efficient execution in real time. To validate our research we provide experimental results for a real robot.
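As a rough illustration of the kind of closed-form alignment the abstract refers to, the sketch below (Python/NumPy, not taken from the paper) estimates an incremental camera pose from two sets of matched 3D points, such as those triangulated from tracked stereo features in consecutive frames. It uses the standard SVD-based Kabsch/Umeyama solution for the optimal rigid transform; the paper's own closed-form (quaternion-based) derivation and its covariance are not reproduced here, and the function name and synthetic data are assumptions made purely for the example.

```python
# Hypothetical sketch: closed-form estimation of the rigid transform (R, t)
# that best aligns two sets of matched 3D points, e.g. landmarks triangulated
# from a stereo pair at times k-1 and k. Uses the SVD-based Kabsch/Umeyama
# solution, not the paper's own quaternion-based derivation.
import numpy as np

def closed_form_rigid_transform(P_prev, P_curr):
    """Return (R, t) minimizing sum_i ||R @ P_prev[i] + t - P_curr[i]||^2.

    P_prev, P_curr: (N, 3) arrays of matched 3D points, N >= 3.
    """
    # Centroids of both point clouds.
    mu_prev = P_prev.mean(axis=0)
    mu_curr = P_curr.mean(axis=0)

    # Cross-covariance of the centered point sets.
    H = (P_prev - mu_prev).T @ (P_curr - mu_curr)

    # Optimal rotation from the SVD of H, with a reflection correction.
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T

    # Optimal translation maps the previous centroid onto the current one.
    t = mu_curr - R @ mu_prev
    return R, t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic landmarks observed at time k-1.
    P_prev = rng.uniform(-2.0, 2.0, size=(50, 3))
    # Ground-truth incremental camera motion (small rotation + translation).
    angle = np.deg2rad(5.0)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0,            0.0,           1.0]])
    t_true = np.array([0.10, 0.02, 0.01])
    # The same landmarks at time k, with mild triangulation noise.
    P_curr = P_prev @ R_true.T + t_true + rng.normal(scale=1e-3, size=P_prev.shape)

    R_est, t_est = closed_form_rigid_transform(P_prev, P_curr)
    print("rotation error:   ", np.linalg.norm(R_est - R_true))
    print("translation error:", np.linalg.norm(t_est - t_true))
```

Because the minimizer is obtained directly from an SVD (or, equivalently, from the eigendecomposition used in quaternion-based formulations), no initial guess or iteration is needed, which is the property the abstract contrasts with iterative, approximate approaches.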



Editor information

Jacques Blanc-Talon, Wilfried Philips, Dan Popescu, Paul Scheunders


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Moreno, F.A., Blanco, J.L., González, J. (2007). An Efficient Closed-Form Solution to Probabilistic 6D Visual Odometry for a Stereo Camera. In: Blanc-Talon, J., Philips, W., Popescu, D., Scheunders, P. (eds) Advanced Concepts for Intelligent Vision Systems. ACIVS 2007. Lecture Notes in Computer Science, vol 4678. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74607-2_85

Download citation

  • DOI: https://doi.org/10.1007/978-3-540-74607-2_85

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74606-5

  • Online ISBN: 978-3-540-74607-2

  • eBook Packages: Computer Science (R0)
