
Vision-based simultaneous localization and mapping with two cameras


Abstract:

In this paper, we propose a novel method for the simultaneous localization and mapping (SLAM) problem with two cameras. A single-camera approach suffers from a lack of information for feature initialization and from instability in the covariance of the 3D camera location and feature positions. To solve this problem, we use two cameras that move independently, unlike a stereo camera. We derive new formulations for the extended Kalman filter and map management with two cameras, and we present a method for new-feature initialization and feature matching with two cameras. In our method, the covariance of the camera and feature locations converges more rapidly. This characteristic enables a reduction in computational complexity by fixing the position of any feature whose covariance has converged. Experimental results show that our approach estimates the 3D camera location and feature positions more accurately, and that the covariance of the camera and feature locations converges more rapidly, than in the single-camera case.
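The paper's specific two-camera EKF formulations are not reproduced in this abstract. As a minimal sketch of why a second, independently placed camera speeds covariance convergence, the following example runs bearing-only EKF updates on a static 2D landmark from one versus two known viewpoints; all positions, noise values, and function names here are illustrative assumptions, not the authors' formulation:

```python
import numpy as np

def ekf_bearing_update(mu, P, cam, z, r_var):
    """One EKF update of a static 2D landmark estimate (mu, P) from a
    bearing-only observation z taken at known camera position cam."""
    dx, dy = mu[0] - cam[0], mu[1] - cam[1]
    q = dx**2 + dy**2
    z_hat = np.arctan2(dy, dx)                  # predicted bearing
    H = np.array([[-dy / q, dx / q]])           # Jacobian of atan2 wrt (x, y)
    S = H @ P @ H.T + r_var                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    innov = np.arctan2(np.sin(z - z_hat), np.cos(z - z_hat))  # wrap angle
    mu = mu + (K * innov).ravel()
    P = (np.eye(2) - K @ H) @ P
    return mu, P

true = np.array([5.0, 5.0])                     # true landmark position
cams = [np.array([0.0, 0.0]), np.array([10.0, 0.0])]  # two independent viewpoints
r_var = np.array([[1e-3]])                      # bearing noise variance (rad^2)

# Same prior for both runs: rough guess with large uncertainty.
mu1, P1 = np.array([4.0, 6.0]), np.eye(2) * 4.0   # single-camera run
mu2, P2 = mu1.copy(), P1.copy()                   # two-camera run

for _ in range(20):
    # Single camera: only the first viewpoint observes the landmark,
    # so depth along the viewing ray stays unobservable.
    z = np.arctan2(true[1] - cams[0][1], true[0] - cams[0][0])
    mu1, P1 = ekf_bearing_update(mu1, P1, cams[0], z, r_var)
    # Two cameras: both viewpoints observe it each step, so the
    # bearings triangulate and the covariance collapses quickly.
    for c in cams:
        z = np.arctan2(true[1] - c[1], true[0] - c[0])
        mu2, P2 = ekf_bearing_update(mu2, P2, c, z, r_var)
```

With a single fixed viewpoint the covariance along the viewing ray barely shrinks, while the two-viewpoint run drives both the covariance trace and the position error down rapidly, which is the intuition behind the abstract's convergence claim.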
Date of Conference: 02-06 August 2005
Date Added to IEEE Xplore: 05 December 2005
Print ISBN: 0-7803-8912-3

Conference Location: Edmonton, AB, Canada

