
Visual odometry with a single-camera stereo omnidirectional system

  • Original Paper
  • Published in Machine Vision and Applications

Abstract

This paper presents the advantages of a single-camera stereo omnidirectional system (SOS) for estimating egomotion in real-world environments. The challenge of performing omnidirectional stereo vision with a single camera is what separates our work from others. In practice, dynamic environments, deficient illumination, and poorly textured surfaces leave few features to track in the observable scene, which degrades the pose estimates of visual odometry systems regardless of their field of view. We compare the tracking accuracy and stability of the single-camera SOS against an RGB-D device under various real-world conditions. Our quantitative evaluation is performed with respect to 3D ground-truth data obtained from a motion capture system. The datasets and experimental results we provide are unique owing to the nature of our catadioptric omnistereo rig and the situations in which we captured these motion sequences. We have implemented a tracking system with simple rules that applies to both synthetic and real scenes. Our implementation makes no motion-model assumptions and maintains a fixed configuration across the compared sensors. Our experimental outcomes confirm the robustness in 3D metric visual odometry estimation that the single-camera SOS achieves under normal and special conditions in which narrow-field-of-view perspective systems such as RGB-D cameras fail.
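As a rough illustration of the evaluation described above (not the authors' implementation), the sketch below shows one standard way a metric visual odometry trajectory can be scored against 3D motion-capture ground truth: rigidly align the estimated positions to the ground-truth positions with a least-squares (Kabsch/Umeyama, no scale) fit, then report the absolute trajectory error as an RMSE. The function names `align_rigid` and `ate_rmse` and the array inputs are illustrative assumptions.

```python
import numpy as np

def align_rigid(est, gt):
    """Least-squares rotation R and translation t so that R @ est_i + t ~= gt_i.
    est, gt: (N, 3) arrays of time-synchronized 3D positions in meters."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    U, _, Vt = np.linalg.svd(E.T @ G)              # 3x3 cross-covariance
    S = np.eye(3)
    S[2, 2] = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ S @ U.T
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est, gt):
    """Absolute trajectory error (RMSE) after rigid alignment; no scale correction,
    since both the omnistereo and the RGB-D estimates are metric."""
    R, t = align_rigid(est, gt)
    residuals = (est @ R.T + t) - gt
    return float(np.sqrt(np.mean(np.sum(residuals ** 2, axis=1))))

# Hypothetical usage with two synchronized (N, 3) position arrays:
# print(ate_rmse(vo_positions, mocap_positions))
```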



Notes

  1. http://ubuntuslave.github.io/publication/2018-vo_sos.

  2. http://opencv.org.

  3. http://www.povray.org.



Acknowledgements

We thank the MTA Metro-North Railroad for letting us collect video sequences at the main lobby of the Grand Central Terminal in NYC.

Author information

Corresponding authors

Correspondence to Carlos Jaramillo or Jizhong Xiao.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

C.J., L.Y., and J.X. were supported by U.S. Army Research Office Grant No. W911NF-09-1-0565, U.S. National Science Foundation Grants No. IIS-0644127 and No. CBET-1160046, and Federal Highway Administration (FHWA) Grants No. DTFH61-12-H-00002 and No. DTFH61-17-C-00007.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (mp4 41453 KB)


About this article


Cite this article

Jaramillo, C., Yang, L., Muñoz, J.P. et al. Visual odometry with a single-camera stereo omnidirectional system. Machine Vision and Applications 30, 1145–1155 (2019). https://doi.org/10.1007/s00138-019-01041-9


