Abstract
We present a vision system for autonomously landing an unmanned aerial vehicle on a ship's flight deck. Edge information from the international landing target is used to perform line segment detection, feature point mapping, and clustering, after which a cascade filtering scheme is applied for target recognition. Meanwhile, the four-degree-of-freedom (4-DoF) pose of the vehicle with respect to the target is estimated. The vision system has been implemented on an AscTec Pelican quadrotor in conjunction with a state estimator to perform real-time target recognition and tracking, and an onboard controller is designed to close the control loop. Experiments show that the vision system is accurate, robust, and able to handle an incomplete landing target, while the overall implementation demonstrates the practicability of real-time onboard target tracking and closed-loop control.
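The feature-point clustering step mentioned in the abstract can be illustrated with a minimal, dependency-free sketch of density-based (DBSCAN-style) clustering of 2-D points. The `dbscan` function and the `eps`/`min_pts` parameter names below are illustrative assumptions, not the paper's actual implementation or tuning:

```python
import math


def dbscan(points, eps, min_pts):
    """Density-based clustering of 2-D points (DBSCAN-style).

    Returns one integer label per point; -1 marks noise.
    A point is a "core" point if it has at least min_pts
    neighbours (itself included) within radius eps.
    """
    labels = [None] * len(points)
    cluster = -1

    def neighbors(i):
        px, py = points[i]
        return [j for j, (qx, qy) in enumerate(points)
                if math.hypot(px - qx, py - qy) <= eps]

    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1          # provisional noise
            continue
        cluster += 1                # start a new cluster from this core point
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point reclaimed from noise
            if labels[j] is not None:
                continue             # already assigned; do not expand again
            labels[j] = cluster
            more = neighbors(j)
            if len(more) >= min_pts:
                queue.extend(more)   # expand only through core points
    return labels
```

On a cluttered deck image, grouping detected feature points this way separates the landing-target markings from scattered background edges without fixing the number of clusters in advance, which is why a density-based method suits this setting.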
Ethics declarations
Conflict of Interest
The authors declare that they have no conflict of interest.
About this article
Cite this article
Lin, S., Garratt, M.A. & Lambert, A.J. Monocular vision-based real-time target recognition and tracking for autonomously landing an UAV in a cluttered shipboard environment. Auton Robot 41, 881–901 (2017). https://doi.org/10.1007/s10514-016-9564-2