
Camera Localization and Pose Estimation Using an RGBD Sensor

  • Conference paper

Intelligence Science and Big Data Engineering (IScIDE 2013)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 8261)


Abstract

Localization and pose estimation are of great importance for robotic applications such as navigation and mapping. In this paper we develop a localization system based on image data alone. Through feature detection and tracking, we first build a sparse feature map of an area and then use data captured from an RGBD sensor to track the position and orientation of the camera relative to that map. Matching frames from a live video stream against the underlying map yields a set of geometric constraints from which we estimate the camera pose. We also compare different calculation methods, combining descriptor distance criteria, window region adjustment and RANSAC to reduce the redundancy of the feature point cloud and improve the resulting precision. Furthermore, we develop a novel GPU-based implementation to meet real-time requirements.
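
The abstract leaves the matching and pose-estimation steps at a high level. As a rough illustration only (not the authors' method or their GPU implementation), the sketch below shows a comparable match-then-PnP pipeline built on OpenCV: descriptors from the live frame are matched against a prebuilt sparse 3D map using a descriptor-distance ratio test, and RANSAC-based PnP then rejects outliers and recovers the camera rotation and translation. The feature type (ORB), the 0.75 ratio threshold, the map format and the use of cv2.solvePnPRansac are all illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: a map-based pose estimator in the spirit of the
# abstract, built on OpenCV. The map format, feature type and thresholds are
# assumptions, not the authors' actual pipeline.
import numpy as np
import cv2


def estimate_pose(frame_gray, map_points_3d, map_descriptors, K, dist_coeffs=None):
    """Estimate the camera pose relative to a sparse feature map from one frame.

    frame_gray      : current grayscale image (H x W, uint8)
    map_points_3d   : (N, 3) float32 array of 3D map points
    map_descriptors : (N, 32) uint8 ORB descriptors attached to the map points
    K               : 3x3 camera intrinsic matrix
    Returns (R, t, inliers) or None if the pose cannot be estimated.
    """
    # Detect and describe features in the live frame.
    orb = cv2.ORB_create(nfeatures=2000)
    keypoints, descriptors = orb.detectAndCompute(frame_gray, None)
    if descriptors is None:
        return None

    # Descriptor-distance criterion: keep only matches passing a ratio test.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    knn = matcher.knnMatch(descriptors, map_descriptors, k=2)
    good = [m for m, n in (p for p in knn if len(p) == 2)
            if m.distance < 0.75 * n.distance]
    if len(good) < 6:
        return None

    img_pts = np.float32([keypoints[m.queryIdx].pt for m in good])
    obj_pts = np.float32([map_points_3d[m.trainIdx] for m in good])

    # RANSAC-based PnP rejects outlier correspondences and recovers the pose.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        obj_pts, img_pts, K, dist_coeffs,
        reprojectionError=3.0, flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        return None

    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec, inliers
```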

An Erratum for this chapter can be found at http://dx.doi.org/10.1007/978-3-642-42057-3_113




Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Chen, H., Yuan, Y. (2013). Camera Localization and Pose Estimation Using an RGBD Sensor. In: Sun, C., Fang, F., Zhou, ZH., Yang, W., Liu, ZY. (eds) Intelligence Science and Big Data Engineering. IScIDE 2013. Lecture Notes in Computer Science, vol 8261. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-42057-3_101


  • DOI: https://doi.org/10.1007/978-3-642-42057-3_101

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-42056-6

  • Online ISBN: 978-3-642-42057-3

  • eBook Packages: Computer Science, Computer Science (R0)
