Abstract
Autonomous landing of a drone is a necessary part of autonomous flight. One way to achieve high confidence in a safe landing is to return to the location the drone took off from. Implementations of return-to-home functionality fall short when they rely solely on GPS or odometry, as measurement inaccuracies and drift in the state estimate guide the drone to a position with a large offset from its initial position. This can be particularly dangerous if the drone took off next to a hazard such as a body of water. Current work on precision landing relies on localizing against a known landing pattern, which requires the pilot to carry the pattern with them. We propose a method that uses a downward-facing fisheye camera to accurately land a UAV where it took off on an unstructured surface, without a landing pattern. Specifically, the approach estimates the drone's position relative to its take-off path and uses this estimate to guide the drone back. With the large field of view provided by the fisheye lens, our algorithm can provide visual feedback from a large initial position error at the start of the landing down to 25 cm above the ground at the end of the landing. We show empirically that the algorithm can correct the drift error in the state estimate and land with an accuracy of 40 cm.
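To make the kind of visual feedback described above concrete, the following is a minimal sketch of one way to estimate a lateral offset between the current downward-facing image and a reference image recorded at a similar altitude during take-off. It is not the authors' implementation: the choice of ORB features, the median of match displacements, the flat-ground scaling by altitude over focal length, and the names `lateral_offset` and `focal_px` are all illustrative assumptions.

```python
import cv2
import numpy as np

# Illustrative sketch only: ORB matching against a take-off reference image,
# followed by a crude pixel-to-metre conversion that assumes a roughly planar
# scene and an undistorted (pinhole-like) central crop of the fisheye image.
orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def lateral_offset(reference_img, current_img, altitude_m, focal_px):
    """Return an (x, y) offset estimate in metres, or None if matching fails."""
    kp_ref, des_ref = orb.detectAndCompute(reference_img, None)
    kp_cur, des_cur = orb.detectAndCompute(current_img, None)
    if des_ref is None or des_cur is None:
        return None

    matches = matcher.match(des_ref, des_cur)
    if len(matches) < 10:
        return None

    # Pixel displacement of each matched feature from reference to current view.
    shifts = np.array([
        np.array(kp_cur[m.trainIdx].pt) - np.array(kp_ref[m.queryIdx].pt)
        for m in matches
    ])
    median_shift_px = np.median(shifts, axis=0)

    # Approximate ground-plane scale: metres per pixel at the current altitude.
    metres_per_px = altitude_m / focal_px
    return median_shift_px * metres_per_px
```

In a closed loop, an estimate like this would be fed to a position controller that nudges the drone back over its take-off path as it descends; the paper's actual pipeline and controller may differ.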
Acknowledgements
This work was supported by Autel Robotics under award number A018532.
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Pluckter, K., Scherer, S. (2020). Precision UAV Landing in Unstructured Environments. In: Xiao, J., Kröger, T., Khatib, O. (eds) Proceedings of the 2018 International Symposium on Experimental Robotics. ISER 2018. Springer Proceedings in Advanced Robotics, vol 11. Springer, Cham. https://doi.org/10.1007/978-3-030-33950-0_16
DOI: https://doi.org/10.1007/978-3-030-33950-0_16
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-33949-4
Online ISBN: 978-3-030-33950-0
eBook Packages: Intelligent Technologies and Robotics (R0)