Abstract
The main contribution of this paper is a tightly-coupled visual-inertial fusion algorithm for simultaneous localisation and mapping (SLAM) for a quadrotor micro aerial vehicle (MAV). The proposed algorithm is based on an extended Kalman filter that uses a platform-specific dynamic model to integrate information from an inertial measurement unit (IMU) and a monocular camera on board the MAV. The MAV dynamic model exploits the unique characteristics of the quadrotor, making it possible to generate relatively accurate motion predictions. This, together with an undelayed feature initialisation strategy based on inverse depth parametrisation, enables more effective feature tracking and reliable visual SLAM with a small number of features, even during rapid manoeuvres. Experimental results are presented to demonstrate the effectiveness of the proposed algorithm.
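For context, the undelayed initialisation mentioned above builds on the standard inverse depth parametrisation of Civera, Davison and Montiel; the following is a minimal sketch in that standard notation, not taken from the chapter body. A feature first observed from camera optical centre $(x_i, y_i, z_i)$ is stored in the filter state as

\[
\mathbf{y}_i = \left( x_i,\; y_i,\; z_i,\; \theta_i,\; \phi_i,\; \rho_i \right)^{\top},
\qquad
\mathbf{p}_i = \begin{pmatrix} x_i \\ y_i \\ z_i \end{pmatrix}
 + \frac{1}{\rho_i}\, \mathbf{m}(\theta_i, \phi_i),
\qquad
\mathbf{m}(\theta, \phi) = \begin{pmatrix} \cos\phi \sin\theta \\ -\sin\phi \\ \cos\phi \cos\theta \end{pmatrix},
\]

where $\theta_i$ and $\phi_i$ are the azimuth and elevation of the ray to the feature and $\rho_i = 1/d_i$ is its inverse depth. Because a broad Gaussian prior on $\rho_i$ remains well behaved even as the depth tends to infinity, the feature can be inserted into the EKF state from its very first observation, which is what permits undelayed initialisation.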
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this chapter
Abeywardena, D., Dissanayake, G. (2015). Tightly-Coupled Model Aided Visual-Inertial Fusion for Quadrotor Micro Air Vehicles. In: Mejias, L., Corke, P., Roberts, J. (eds) Field and Service Robotics. Springer Tracts in Advanced Robotics, vol 105. Springer, Cham. https://doi.org/10.1007/978-3-319-07488-7_11
DOI: https://doi.org/10.1007/978-3-319-07488-7_11
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-07487-0
Online ISBN: 978-3-319-07488-7