
Monocular vision-based real-time target recognition and tracking for autonomously landing an UAV in a cluttered shipboard environment

Published in: Autonomous Robots

Abstract

We present a vision system for landing an unmanned aerial vehicle on a ship’s flight deck autonomously. Edge information from the international landing target is used to perform line segment detection, feature point mapping, and clustering, after which a cascade filtering scheme is applied for target recognition. In parallel, the 4-DoF pose of the vehicle with respect to the target is estimated. The vision system has been implemented on the Asctec Pelican quadrotor in conjunction with a state estimator to perform real-time target recognition and tracking, and an onboard controller closes the control loop. Experiments show that the vision system is accurate, robust, and capable of handling a partially visible landing target, while the overall implementation demonstrates the practicability of real-time onboard target tracking and closed-loop control.
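The 4-DoF pose estimation mentioned in the abstract can be illustrated with a minimal pinhole-camera sketch. This is not the paper's implementation: the intrinsics (`fx`, `fy`, `cx`, `cy`) and the physical marking width are hypothetical calibration values, and the yaw is taken directly from the marking's in-image orientation.

```python
def estimate_4dof_pose(u, v, pixel_width, angle_rad,
                       fx, fy, cx, cy, target_width_m):
    """Sketch of a 4-DoF (x, y, z, yaw) pose estimate of a camera
    relative to a planar landing marking of known physical width,
    using a simple pinhole model. Intrinsics (fx, fy, cx, cy) and
    target_width_m are assumed known from prior calibration; this
    is an illustrative placeholder, not the paper's method.

    u, v        : image coordinates of the marking centre (pixels)
    pixel_width : apparent width of the marking in the image (pixels)
    angle_rad   : in-image orientation of the marking (radians)
    """
    # Depth from the apparent size of the marking (similar triangles).
    z = fx * target_width_m / pixel_width
    # Lateral offsets from principal-point-relative image coordinates.
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Yaw read off the marking's orientation in the image plane.
    yaw = angle_rad
    return x, y, z, yaw

# Example: a 1 m-wide marking appearing 100 px wide at (420, 240)
# with fx = fy = 500 px and principal point (320, 240) puts the
# camera 5 m above the target, offset 1 m laterally.
print(estimate_4dof_pose(420, 240, 100, 0.3, 500, 500, 320, 240, 1.0))
```

This recovers only the four degrees of freedom the abstract names; roll and pitch would be taken from the vehicle's inertial state estimator rather than from the image.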




Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Shanggang Lin.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.


About this article


Cite this article

Lin, S., Garratt, M.A. & Lambert, A.J. Monocular vision-based real-time target recognition and tracking for autonomously landing an UAV in a cluttered shipboard environment. Auton Robot 41, 881–901 (2017). https://doi.org/10.1007/s10514-016-9564-2


Keywords

Navigation