Precision UAV Landing in Unstructured Environments

Conference paper
In: Proceedings of the 2018 International Symposium on Experimental Robotics (ISER 2018)

Part of the book series: Springer Proceedings in Advanced Robotics (SPAR, volume 11)

Abstract

Autonomous landing is a necessary part of autonomous drone flight. One way to achieve high confidence in a safe landing is to return to the location the drone took off from. Implementations of return-to-home functionality fall short when they rely solely on GPS or odometry, as measurement inaccuracies and drift in the state estimate guide the drone to a position with a large offset from its initial position. This can be particularly dangerous if the drone took off next to a hazard such as a body of water. Current work on precision landing relies on localizing against a known landing pattern, which requires the pilot to carry that pattern with them. We propose a method that uses a downward-facing fisheye camera to land a UAV accurately on an unstructured surface at the point it took off from, without a landing pattern. Specifically, the approach estimates the drone's position relative to its take-off path and uses this estimate to guide the drone back. Thanks to the large field of view of the fisheye lens, our algorithm can provide visual feedback from a large initial position error at the start of the landing down to 25 cm above the ground at its end. Empirically, the algorithm corrects the drift error in the state estimate and lands with an accuracy of 40 cm.
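
Only the abstract's high-level description is available here, so the sketch below is a loose illustration of the teach-and-repeat idea it describes (record the downward view on ascent, servo against it on descent), not the authors' method. The drone interface (get_altitude, get_image, move_relative, land), the pixel-to-metre gain k_p, the 0.1 m descent step, and the median-pixel-offset estimate are all hypothetical stand-ins; the real system would, among other things, model the fisheye projection when computing the relative-position estimate.

```python
# Illustrative sketch only (not the paper's implementation): a
# teach-and-repeat style landing loop. `drone` is a hypothetical
# interface; ORB matching and a median pixel offset stand in for a
# fisheye-aware relative-position estimator.
import cv2
import numpy as np


def pixel_offset(ref_img, cur_img, orb, matcher, min_matches=10):
    """Median pixel displacement between matched ORB keypoints.

    A crude stand-in for a relative-position estimate; a real system
    would undistort through the fisheye model and use the intrinsics.
    """
    kp_ref, des_ref = orb.detectAndCompute(ref_img, None)
    kp_cur, des_cur = orb.detectAndCompute(cur_img, None)
    if des_ref is None or des_cur is None:
        return None
    matches = matcher.match(des_ref, des_cur)
    if len(matches) < min_matches:
        return None
    disp = [np.subtract(kp_cur[m.trainIdx].pt, kp_ref[m.queryIdx].pt)
            for m in matches]
    return np.median(disp, axis=0)  # median is robust to bad matches


def visual_landing(drone, teach_frames, k_p=0.002, min_alt=0.25):
    """Descend while servoing toward the images stored during take-off.

    teach_frames: list of (altitude_m, grayscale_image) recorded on the
    way up. Below min_alt (25 cm in the abstract) the camera no longer
    gives useful feedback, so the drone commits to the final touchdown.
    """
    orb = cv2.ORB_create(nfeatures=1000)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    while drone.get_altitude() > min_alt:
        alt = drone.get_altitude()
        # Compare against the teach image taken nearest this altitude.
        _, ref_img = min(teach_frames, key=lambda f: abs(f[0] - alt))
        offset = pixel_offset(ref_img, drone.get_image(), orb, matcher)
        if offset is None:
            drone.move_relative(dx=0.0, dy=0.0, dz=-0.1)  # descend blind
            continue
        dx, dy = k_p * offset  # hypothetical pixel-to-metre gain
        drone.move_relative(dx=float(dx), dy=float(dy), dz=-0.1)
    drone.land()
```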

Acknowledgements

This work was supported by Autel Robotics under award number A018532.

Author information

Correspondence to Kevin Pluckter.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (MP4, 8637 KB)

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Pluckter, K., Scherer, S. (2020). Precision UAV Landing in Unstructured Environments. In: Xiao, J., Kröger, T., Khatib, O. (eds) Proceedings of the 2018 International Symposium on Experimental Robotics. ISER 2018. Springer Proceedings in Advanced Robotics, vol 11. Springer, Cham. https://doi.org/10.1007/978-3-030-33950-0_16
