Abstract
Unmanned miniature air vehicles (MAVs) have recently become a focus of much research due to their potential utility in a number of information-gathering applications. MAVs currently carry inertial sensor packages that allow them to perform basic flight maneuvers reliably in a completely autonomous manner. However, MAV navigation requires knowledge of location, which is currently available only through GPS sensors; GPS depends on an external infrastructure and is thus prone to reliability issues. Vision-based methods such as Visual Odometry (VO) can estimate MAV pose purely from vision, and thus have the potential to provide an autonomous alternative to GPS for MAV navigation. Because VO estimates pose by chaining relative pose estimates, constraining relative pose error is the key element of any Visual Odometry system. In this paper, we present a system that fuses measurements from an MAV inertial navigation system (INS) with a novel VO framework based on direct image registration. We use the inertial sensors in the measurement step of an Extended Kalman Filter (EKF) to determine the direction of gravity, and hence provide error-bounded measurements of certain portions of the aircraft pose. Because of the relative nature of VO measurements, we use VO in the EKF prediction step. To allow VO to be used as a prediction, we develop a novel linear approximation to the direct image registration procedure that allows us to propagate the covariance matrix at each time step. We present offline results obtained from our pose estimation system using actual MAV flight data. We show that fusing VO and INS measurements greatly improves the accuracy of pose estimation and reduces drift compared to unaided VO during medium-length (tens of seconds) periods of GPS dropout.
Ready, B.B., Taylor, C.N. Inertially Aided Visual Odometry for Miniature Air Vehicles in GPS-denied Environments. J Intell Robot Syst 55, 203–221 (2009). https://doi.org/10.1007/s10846-008-9294-6