
Visual SLAM for autonomous MAVs with dual cameras



Abstract:

This paper extends a monocular visual simultaneous localization and mapping (SLAM) system to two cameras with non-overlapping fields of view (FOVs), and uses it to enable autonomous navigation of a micro aerial vehicle (MAV) in unknown environments. The methodology behind this system can easily be extended to multi-camera rigs if the onboard computational capability allows. We analyze the iterative optimizations for pose tracking and map refinement of the SLAM system in the multi-camera case, which ensures the soundness and accuracy of each optimization update. Our method is more resistant to tracking failure than conventional monocular visual SLAM systems, especially when MAVs fly in complex environments, and it allows more flexible configurations of the multiple cameras mounted onboard MAVs. We demonstrate its efficiency with both autonomous and manual flights of a MAV. The results are evaluated by comparison with ground-truth data provided by an external tracking system.
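The core idea of tracking one vehicle pose against observations from multiple rigidly mounted cameras can be illustrated by stacking per-camera reprojection residuals into a single error vector over the body pose. The sketch below is an illustrative assumption, not the paper's implementation: camera intrinsics `K`, body-to-camera extrinsics, and the residual stacking are hypothetical placeholders for whatever calibration and optimizer the actual system uses.

```python
import numpy as np

def project(K, T_cam_world, X_world):
    """Project world points (N,3) to pixels via a 3x4 pose [R|t] and intrinsics K."""
    Xc = (T_cam_world[:, :3] @ X_world.T + T_cam_world[:, 3:4]).T  # camera-frame coords
    return (K @ (Xc / Xc[:, 2:3]).T).T[:, :2]                      # perspective divide

def stacked_residuals(T_body_world, cams, landmarks, observations):
    """Stack reprojection residuals from all cameras into one vector.

    cams:         list of (K, T_cam_body) pairs -- fixed, pre-calibrated per camera
    landmarks:    per-camera lists of 3D world points seen by that camera
    observations: per-camera lists of measured 2D pixel coordinates
    """
    res = []
    for (K, T_cam_body), X, z in zip(cams, landmarks, observations):
        # Compose fixed body->camera extrinsics with the shared world->body pose,
        # so every camera's residual constrains the SAME body pose.
        R = T_cam_body[:, :3] @ T_body_world[:, :3]
        t = T_cam_body[:, :3] @ T_body_world[:, 3] + T_cam_body[:, 3]
        T_cam_world = np.hstack([R, t[:, None]])
        res.append((project(K, T_cam_world, X) - z).ravel())
    return np.concatenate(res)
```

Because all cameras share one body pose, a landmark lost in one camera's FOV can still be compensated by features tracked in the other, which is the intuition behind the improved robustness to tracking failure.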
Date of Conference: 31 May 2014 - 07 June 2014
Date Added to IEEE Xplore: 29 September 2014
Electronic ISBN: 978-1-4799-3685-4
Print ISSN: 1050-4729
Conference Location: Hong Kong, China
