Abstract
Localization and pose estimation are of great importance for robotic applications such as navigation and mapping. In this paper we develop a localization system based on image data alone. Through feature detection and tracking, we first build a sparse feature map of an area, then use data captured by an RGBD sensor to track the position and orientation of the camera relative to that map. Matching frames from a live video stream against the underlying map yields a set of geometric constraints from which the camera pose is estimated. We also compare different calculation methods, combined with descriptor-distance criteria, window-region adjustment, and RANSAC, to reduce the redundancy of the feature point cloud and improve the resulting precision. Furthermore, we develop a novel GPU-based implementation to meet real-time requirements.
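The pipeline the abstract describes (match live-frame features against the map under a descriptor-distance criterion, then estimate the camera pose from the resulting correspondences) can be sketched in NumPy. This is an illustrative sketch, not the authors' implementation: the ratio threshold, the brute-force L2 matcher, and the SVD-based 3D-3D pose solver are all assumptions, and the paper's RANSAC and GPU stages are omitted.

```python
import numpy as np

def match_descriptors(map_desc, frame_desc, ratio=0.8):
    """Brute-force matching with a ratio test as the descriptor-distance
    criterion (the 0.8 threshold is a hypothetical choice)."""
    # pairwise L2 distances: each frame descriptor vs. every map descriptor
    d = np.linalg.norm(frame_desc[:, None, :] - map_desc[None, :, :], axis=2)
    matches = []
    for i, row in enumerate(d):
        order = np.argsort(row)
        best, second = row[order[0]], row[order[1]]
        if best < ratio * second:          # reject ambiguous matches
            matches.append((i, int(order[0])))
    return matches

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ≈ R @ src + t
    (Kabsch/SVD) -- one common way to recover camera pose from 3D-3D
    correspondences when the RGBD sensor supplies depth for both sets."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t
```

In a full system the inlier set for `rigid_transform` would be selected by RANSAC over the matches, which is where the redundancy reduction described in the abstract comes in.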
An Erratum for this chapter can be found at http://dx.doi.org/10.1007/978-3-642-42057-3_113
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Chen, H., Yuan, Y. (2013). Camera Localization and Pose Estimation Using an RGBD Sensor. In: Sun, C., Fang, F., Zhou, ZH., Yang, W., Liu, ZY. (eds) Intelligence Science and Big Data Engineering. IScIDE 2013. Lecture Notes in Computer Science, vol 8261. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-42057-3_101
DOI: https://doi.org/10.1007/978-3-642-42057-3_101
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-42056-6
Online ISBN: 978-3-642-42057-3
eBook Packages: Computer Science, Computer Science (R0)