
Optimal-State-Constraint EKF for Visual-Inertial Navigation

Chapter in: Robotics Research

Part of the book series: Springer Proceedings in Advanced Robotics (SPAR, volume 2)

Abstract

As visual-inertial navigation systems (VINS) become prevalent thanks to recent advances in cameras and inertial sensors, optimal sensor fusion algorithms are in high demand. In this paper, we introduce a new optimal-state-constraint (OSC)-EKF for VINS, which performs tightly-coupled visual-inertial sensor fusion over a sliding window of poses only (i.e., without including features in the state vector), and thus has computational complexity independent of the size of the environment. The key idea of the proposed OSC-EKF is a novel measurement model that utilizes all feature measurements available within the sliding window to derive probabilistically optimal constraints between poses, without ever estimating these features as part of the state vector. To this end, for each sliding window, we solve for structure and motion using only the available camera measurements and subsequently marginalize out the structure (features) to obtain the optimal motion constraints that are used in the EKF update. The proposed approach is validated in proof-of-concept, real-world experiments.
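
To make the marginalization step concrete, the following NumPy sketch illustrates the general idea under simplifying assumptions (linearized reprojection residuals and isotropic pixel noise); the names marginalize_features, H_p, and H_f are illustrative only and do not reflect the authors' implementation. Eliminating the feature block of the normal equations via the Schur complement yields an information matrix and vector that constrain only the window poses.

    import numpy as np

    def marginalize_features(H_p, H_f, r, pixel_var):
        """Marginalize feature (structure) parameters out of a linearized
        visual least-squares problem, keeping only constraints on the poses.

        H_p       : (m, d_p) Jacobian of stacked reprojection residuals w.r.t. poses
        H_f       : (m, d_f) Jacobian w.r.t. feature positions
        r         : (m,)     stacked reprojection residuals
        pixel_var : scalar pixel-noise variance (isotropic for simplicity)
        Returns the pose-only information matrix and information vector.
        """
        w = 1.0 / pixel_var                            # scalar measurement weight
        # Normal-equation blocks of the joint (pose + feature) problem
        A_pp = w * H_p.T @ H_p
        A_pf = w * H_p.T @ H_f
        A_ff = w * H_f.T @ H_f
        b_p = w * H_p.T @ r
        b_f = w * H_f.T @ r
        # Schur complement: eliminate the feature (structure) block
        A_ff_inv = np.linalg.inv(A_ff)
        info_pose = A_pp - A_pf @ A_ff_inv @ A_pf.T    # pose-only information matrix
        rhs_pose = b_p - A_pf @ A_ff_inv @ b_f         # pose-only information vector
        return info_pose, rhs_pose

The resulting pose-only constraint, together with the covariance implied by its information matrix, can then be applied as an inferred measurement over the sliding-window poses in the EKF update.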

Notes

  1.

    Throughout this paper, the subscript \(\ell |j\) refers to the estimate of a quantity at time-step \(\ell \), after all measurements up to time-step \(j\) have been processed. \(\hat{x}\) denotes the estimate of a random variable \(x\), while \(\tilde{x} = x-\hat{x}\) is the error in this estimate. \(\mathbf I_n\) and \(\mathbf 0_n\) are the \(n \times n\) identity and zero matrices, respectively. \(\mathbf e_1, \mathbf e_2\) and \(\mathbf e_3 \in \mathbb R^3\) are the unit vectors along the \(x\)-, \(y\)- and \(z\)-axes. Finally, the left superscript denotes the frame of reference with respect to which a vector is expressed.

  2.

    While in principle we can choose an arbitrary coordinate of any relative position to normalize in order to compensate for the unknown scale, in practice we select the one that yields the best numerical stability (a hypothetical sketch of one such heuristic is given below).
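
As an illustrative example (an assumption on our part, not the heuristic used in the paper), one simple choice is to normalize by the relative-position coordinate with the largest estimated magnitude, which keeps the resulting scaled quantities well conditioned:

    import numpy as np

    def pick_normalization_coordinate(rel_positions):
        """Pick which coordinate of which relative position to fix when the
        absolute scale is unobservable (monocular case).

        rel_positions : (n, 3) array of estimated relative positions.
        Returns (position_index, axis_index) of the entry with the largest
        magnitude, used here as a simple proxy for numerical stability.
        """
        rel_positions = np.asarray(rel_positions, dtype=float)
        flat_idx = np.argmax(np.abs(rel_positions))
        return np.unravel_index(flat_idx, rel_positions.shape)

Any other selection criterion could be substituted here without changing the rest of the formulation.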

Acknowledgements

This work was partially supported by the University of Delaware College of Engineering, the Office of Naval Research (ONR) N00014-10-1-0936, N00014-11-1-0688 and N00014-13-1-0588, and the National Science Foundation (NSF) IIS-1318392 and IIS-1566129.

Author information

Corresponding author: Guoquan Huang.

Copyright information

© 2018 Springer International Publishing AG

About this chapter

Cite this chapter

Huang, G., Eckenhoff, K., Leonard, J. (2018). Optimal-State-Constraint EKF for Visual-Inertial Navigation. In: Bicchi, A., Burgard, W. (eds.) Robotics Research. Springer Proceedings in Advanced Robotics, vol. 2. Springer, Cham. https://doi.org/10.1007/978-3-319-51532-8_8

  • DOI: https://doi.org/10.1007/978-3-319-51532-8_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-51531-1

  • Online ISBN: 978-3-319-51532-8
