
An Integrated Navigation Method for UAV Autonomous Landing Based on Inertial and Vision Sensors

  • Conference paper
Artificial Intelligence (CICAI 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13605)


Abstract

During the autonomous landing of an unmanned aerial vehicle (UAV), the vision sensor is constrained by its field of view and by the UAV's maneuvering, so the relative position/attitude parameters it acquires may be unstable or even singular (not unique). In addition, a vision-measurement 'blind area' exists during the rollout stage, where visual navigation capability is lost and landing safety is seriously affected. This paper proposes an autonomous landing navigation method based on inertial/vision sensor information fusion. When the UAV is far from the airport and the runway is imaged completely, the vision sensor determines the landing navigation parameters from the object-image conjugate relationship of the runway sidelines, and these are fused with inertial information to improve measurement performance. When the UAV is close to the airport and the runway is imaged incompletely, the vision measurements become singular, and the landing navigation parameters are instead estimated from inertial information aided by vision. During rollout, the vision sensor enters the 'blind area'; the UAV's motion state is judged from the imaging features of two adjacent frames, and inertial sensor errors are suppressed using the motion-state constraint, achieving high-precision maintenance of the landing navigation parameters. Flight tests show that the lateral relative position error is less than 10 m using a low-accuracy inertial sensor together with the vision sensor, which meets the requirement for safe UAV landing.
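The fusion scheme the abstract describes, combining an inertial estimate with a vision measurement when one is available, and falling back to a motion-state constraint during the vision-blind rollout, can be illustrated with a minimal scalar Kalman update. This is a hedged sketch, not the paper's algorithm: the 1-D lateral-position state, the function names, and all noise values are illustrative assumptions.

```python
# Minimal sketch: 1-D Kalman fusion of an inertial lateral-position
# prediction with a vision-derived runway-relative measurement, plus a
# zero-lateral-deviation pseudo-measurement for the rollout phase.
# All names and noise values below are illustrative assumptions.

def kalman_update(x, p, z, r):
    """Scalar Kalman measurement update.

    x: state estimate, p: its variance, z: measurement, r: measurement noise.
    """
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)      # corrected state
    p = (1.0 - k) * p        # reduced uncertainty
    return x, p

def fuse_landing(x_ins, p_ins, z_vision=None, r_vision=4.0,
                 rollout=False, r_constraint=0.01):
    """Fuse the inertial estimate with vision when available; during
    rollout, when vision is blind, apply a motion-state constraint
    (lateral deviation held near zero along the runway centerline)."""
    x, p = x_ins, p_ins
    if z_vision is not None:
        x, p = kalman_update(x, p, z_vision, r_vision)
    if rollout:
        # Pseudo-measurement: zero lateral deviation, small noise.
        x, p = kalman_update(x, p, 0.0, r_constraint)
    return x, p

# Approach phase: vision available, pulls the drifted INS estimate back.
x, p = fuse_landing(x_ins=8.0, p_ins=25.0, z_vision=2.0)
# Rollout phase: vision blind, constraint suppresses residual INS drift.
x, p = fuse_landing(x_ins=x, p_ins=p, z_vision=None, rollout=True)
```

The same structure extends to the full position/attitude state in the paper, where the vision measurement comes from the runway-sideline geometry rather than a direct position fix.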



Author information


Corresponding author

Correspondence to Kejun Shang.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Shang, K., Li, X., Liu, C., Ming, L., Hu, G. (2022). An Integrated Navigation Method for UAV Autonomous Landing Based on Inertial and Vision Sensors. In: Fang, L., Povey, D., Zhai, G., Mei, T., Wang, R. (eds) Artificial Intelligence. CICAI 2022. Lecture Notes in Computer Science, vol 13605. Springer, Cham. https://doi.org/10.1007/978-3-031-20500-2_15


  • DOI: https://doi.org/10.1007/978-3-031-20500-2_15

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-20499-9

  • Online ISBN: 978-3-031-20500-2

  • eBook Packages: Computer Science (R0)
