
Drone Detection Using Deep Learning: A Benchmark Study

  • Conference paper
Computer Aided Systems Theory – EUROCAST 2022 (EUROCAST 2022)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 13789))


Abstract

Since Unmanned Aerial Vehicles (UAVs) became available to the civilian public, they have spread dramatically and grown exponentially in popularity. This escalation has given rise to privacy and security concerns at both the recreational and institutional levels. Although UAVs are mainly used for leisure and productivity, it is evident that they can also be used for malicious purposes. Because legislation and law enforcement agencies can hardly control every incident, many institutions today resort to surveillance systems to prevent hostile drone intrusions.

Although drone detection can be carried out with different technologies, such as radar or ultrasonic sensing, visual detection is arguably the most efficient method. Besides being cheap and readily available, cameras are typically already part of any surveillance system. Moreover, the rise of deep learning and neural network models has made visual recognition highly reliable [9, 21].

In this work, three state-of-the-art object detectors, namely YOLOv4, SSD-MobileNetv1, and SSD-VGG16, are tested and compared to find the best-performing detector on our drone dataset of 23,863 collected and annotated images. The main work covers a detailed report of the results of each model, as well as a comprehensive comparison between them. In terms of accuracy and real-time capability, the best performance was achieved by the SSD-VGG16 model, which scored an average precision (AP50) of 90.4%, an average recall (AR) of 72.7%, and an inference speed of 58 frames per second on the NVIDIA Jetson Xavier kit.
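The AP50 and AR figures above are standard detection metrics; the references include the COCO evaluation protocol [10]. As a rough, hypothetical illustration (not the authors' actual evaluation pipeline), such numbers can be computed with the pycocotools package, assuming ground-truth annotations and detector outputs are stored as COCO-format JSON files; the file names below are placeholders:

    # Minimal sketch of COCO-style evaluation (AP50, AR) using pycocotools.
    # Ground truth and detections are assumed to be COCO-format JSON files;
    # the file names are placeholders, not the paper's actual data.
    from pycocotools.coco import COCO
    from pycocotools.cocoeval import COCOeval

    coco_gt = COCO("drone_val_annotations.json")             # ground-truth boxes
    coco_dt = coco_gt.loadRes("detector_predictions.json")   # detector output

    evaluator = COCOeval(coco_gt, coco_dt, iouType="bbox")
    evaluator.evaluate()
    evaluator.accumulate()
    evaluator.summarize()   # prints AP, AP50 (IoU = 0.50), AR, etc.

Inference speed, by contrast, is typically measured by timing forward passes directly on the target device, here the NVIDIA Jetson Xavier kit [14].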


References

  1. Bitcraze: Home – Bitcraze. https://www.bitcraze.io/

  2. Bochkovskiy, A., Wang, C.Y., Liao, H.Y.M.: YOLOv4: optimal speed and accuracy of object detection, April 2020. http://arxiv.org/abs/2004.10934

  3. Everingham, M., Gool, L.V., Williams, C.K.I., Winn, J., Zisserman, A.: The PASCAL Visual Object Classes Homepage. http://host.robots.ox.ac.uk/pascal/VOC/

  4. Google: Google Trends. https://trends.google.com/trends/?geo=US

  5. Gruber, I.: The Evolution of Drones: From Military to Hobby & Commercial - Percepto. https://percepto.co/the-evolution-of-drones-from-military-to-hobby-commercial/

  6. Howard, A.G., et al.: MobileNets: efficient convolutional neural networks for mobile vision applications, April 2017. https://arxiv.org/abs/1704.04861v1

  7. Intel: Depth Camera D455 - Intel® RealSense™ Depth and Tracking Cameras. https://www.intelrealsense.com/depth-camera-d455/

  8. Intel: openvinotoolkit/cvat: Powerful and efficient Computer Vision Annotation Tool (CVAT). https://github.com/openvinotoolkit/cvat

  9. Lee, D.R., La, W.G., Kim, H.: Drone detection and identification system using artificial intelligence. In: Proceedings of the 9th International Conference on Information and Communication Technology Convergence: ICT Convergence Powered by Smart Intelligence, pp. 1131–1133, November 2018. https://doi.org/10.1109/ICTC.2018.8539442

  10. Lin, T.Y., et al.: COCO - Common Objects in Context. https://cocodataset.org/#detection-eval

  11. Liu, W., et al.: SSD: single shot MultiBox detector. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) ECCV 2016. LNCS, vol. 9905, pp. 21–37. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46448-0_2. https://arxiv.org/abs/1512.02325v5

  12. NVIDIA: GeForce GTX 1060 Graphics Cards – NVIDIA GeForce. https://www.nvidia.com/en-in/geforce/products/10series/geforce-gtx-1060/

  13. NVIDIA: GeForce RTX 2080 TI-Grafikkarte – NVIDIA. https://www.nvidia.com/de-at/geforce/graphics-cards/rtx-2080-ti/

  14. NVIDIA: Jetson AGX Xavier Developer Kit – NVIDIA Developer. https://developer.nvidia.com/embedded/jetson-agx-xavier-developer-kit

  15. NVIDIA: Programming Tensor Cores in CUDA 9 – NVIDIA Developer Blog. https://developer.nvidia.com/blog/programming-tensor-cores-cuda-9/

  16. Parrot: Parrot Mambo drone downloads – Parrot Support Center. https://www.parrot.com/us/support/documentation/mambo-range

  17. Pawełczyk, M., Wojtyra, M.: Real world object detection dataset for quadcopter unmanned aerial vehicle detection. IEEE Access 8, 174394–174409 (2020). https://doi.org/10.1109/ACCESS.2020.3026192


  18. PyTorch: PyTorch. https://pytorch.org/

  19. PyTorch: PyTorch documentation - PyTorch 1.9.1 documentation. https://pytorch.org/docs/stable/index.html

  20. Syma: SYMA X5SC EXPLORERS 2 - Drone - SYMA Official Site. http://www.symatoys.com/goodshow/x5sc-syma-x5sc-explorers-2.html

  21. Taha, B., Shoufan, A.: Machine learning-based drone detection and classification: state-of-the-art in research. IEEE Access 7, 138669–138682 (2019). https://doi.org/10.1109/ACCESS.2019.2942944



Author information


Corresponding author

Correspondence to Thomas Schlechter.



Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Hashem, A., Schlechter, T. (2022). Drone Detection Using Deep Learning: A Benchmark Study. In: Moreno-Díaz, R., Pichler, F., Quesada-Arencibia, A. (eds) Computer Aided Systems Theory – EUROCAST 2022. EUROCAST 2022. Lecture Notes in Computer Science, vol 13789. Springer, Cham. https://doi.org/10.1007/978-3-031-25312-6_55


  • DOI: https://doi.org/10.1007/978-3-031-25312-6_55


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-25311-9

  • Online ISBN: 978-3-031-25312-6

  • eBook Packages: Computer Science, Computer Science (R0)
