Precise drone location and tracking by adaptive matched filtering from a top-view ToF camera

https://doi.org/10.1016/j.eswa.2019.112989

Highlights

  • Robust detection and tracking of drones are essential for precise UAV maneuvers.

  • New local positioning system using a top-view ToF camera.

  • Precise drone location based on adaptive matched filtering.

  • Model of a drone disturbance caused in a depth map.

  • Tracking and anti-occlusion algorithm working in real time.

Abstract

The use of drones is growing rapidly, and their autonomous navigation capability depends on knowing their position at all times. The precision and robustness of a local positioning system for these devices are of particular importance in takeoff and landing maneuvers, especially in GPS-denied environments. The main contribution of this work lies in the development of a precise local positioning and tracking system for drones. For this purpose, a ToF camera has been installed on the ceiling in order to make use of its depth maps. Taking as a reference the disturbance caused by a quadrotor in one of these maps, a novel 2D matched filter has been designed based on a Gaussian wavelet. This filter allows the system to quickly detect all drones flying in the scene. Moreover, it is dynamically adapted to the portion of the image that the drones occupy, taking into account the variation of this parameter with their flying altitude, a dependence that has also been theoretically determined. The whole algorithm leads to precise 3D drone positioning.

In addition, the proposed system is also robust against short-time occlusions, since the measurement history can be used to predict future positions, thus avoiding the complete loss of tracking.
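The text does not detail the predictor at this point; as an illustration only, the following minimal Python sketch (a hypothetical helper, assuming constant-velocity extrapolation over the last few fixes) shows how a measurement history can bridge a short occlusion:

    import numpy as np

    def predict_position(history, dt):
        """Extrapolate the next 3D position from a non-empty list of
        (x, y, z) fixes sampled at a constant frame period dt; used to
        bridge short-time occlusions instead of dropping the track."""
        pts = np.asarray(history[-5:], dtype=float)     # last few measurements
        if len(pts) < 2:
            return pts[-1]                              # cannot estimate velocity yet
        v = (pts[-1] - pts[0]) / ((len(pts) - 1) * dt)  # mean velocity over the window
        return pts[-1] + v * dt                         # constant-velocity prediction

    # e.g. predict_position([(0.00, 0.0, 1.00), (0.02, 0.0, 1.00), (0.04, 0.0, 1.01)], dt=1/30)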

Introduction

In the last few years, numerous works on UAVs (Unmanned Aerial Vehicles), also known as drones, have arisen due to the increasing use of these devices in both military and civilian applications. One of the most important tasks consists in automating flights, since the pilot usually controls the vehicle from far away (Kendoul, 2012). Ironically, the main concern currently lies in the lack of security entailed by the presence of an unmanned vehicle, as analyzed in Samland, Fruth, Hildebrandt, Hoppe, and Dittmann (2012). Hence, knowing its position becomes essential to build either an autonomous system or a security system able to make decisions based on that information.

The global positioning problem was solved some time ago with the inclusion of GNSS (Global Navigation Satellite System) receivers in the most advanced drones, as can be seen in some off-the-shelf UAVs (DeDrone, 2019, DJI, 2019, ErleRobotics, 2019, Parrot, 2019), or even in works focused on improving GPS (Global Positioning System) measurements (Tahar & Kamarudin, 2016). Other works have implemented detection and tracking systems, even determining whether a drone is hostile or not. For instance, this is done manually in Boddhu, McCartney, Ceccopieri, and Williams (2013), where users send photos of sighted UAVs to a remote server together with hand-typed information about the approximate device size, noise intensity and possible distance at which the drone is located. All this information is then processed and sent to other users; at least three observers are needed for the system to work properly. Another example can be found in Multerer et al. (2017), which addresses the issue with a 3D MIMO (Multiple Input Multiple Output) radar that detects potentially hazardous drones and jams their control signals, forcing them to land.

However, none of the works described above is concerned with knowing the precise position of a drone, but only with global detection and tracking. Applications that allow the device to be as autonomous as possible require precise and robust knowledge of its position in local environments. Obviously, these local positioning systems, intended for either indoor or outdoor use, should improve on the precision achieved by the former. This work falls within this framework: it is aimed at achieving centimeter-level positioning in local environments, determining the coordinates of all drones flying in the scene via a Time-of-Flight (ToF) camera while allowing real-time processing. The proposed algorithm has been tuned for better performance in an indoor environment, although the method can be extrapolated to different surroundings.

The rest of the paper is organized as follows: first, the literature on drone positioning and on ToF-camera-based positioning is reviewed in Section 2; secondly, the working principles of these cameras are described in Section 3; thirdly, the whole algorithm is presented in Section 4; then, experimental tests and results are shown in Section 5; and finally, conclusions are drawn in Section 6.

Drone detection and tracking

The performance of 3D positioning systems has improved notably since the number of drones in the logistics industry has rocketed. These improvements have been achieved by means of different technologies. For instance, RF (Radio Frequency), widely applied in the past to local positioning in general, is now being adapted to detect UAVs. There exist passive techniques, which take advantage of signals emitted by the UAV, like eavesdropping, reaching 50 m, or vibration pattern analysis…

Time-of-flight cameras

A ToF camera is an active range imaging device that employs time-of-flight techniques. This kind of camera resolves the distance between itself and the scene by measuring the round-trip time of an infrared light signal provided by a laser or an LED. Hence, it provides not only 2D information in a single image, but also information about the third dimension. It is noteworthy that these devices do not furnish volumetric data but surfaces in 3D, and for this reason they are also known as 2.5D sensors…
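As a concrete illustration of the ranging principle (these are the standard ToF relations, not parameters of the specific camera used in this work), pulsed devices obtain the distance from the round-trip time $\Delta t$, whereas continuous-wave devices measure the phase shift $\Delta\varphi$ of a signal modulated at frequency $f_{mod}$:

$$d = \frac{c\,\Delta t}{2}, \qquad d = \frac{c\,\Delta\varphi}{4\pi f_{mod}}$$

For example, with $f_{mod} = 30$ MHz the unambiguous range is $c/(2f_{mod}) \approx 5$ m, so the modulation frequency bounds the usable measurement volume.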

Algorithm description

This section presents the details of the proposed detection algorithm, which basically consists in computing the 2D correlation between a processed version $F_{NM}$ (Section 4.1) of the depth matrix $D_{NM}$ taken from the ToF camera installed on the ceiling, and a 2D wavelet $\Psi_{PQ}$ representing the disturbance caused by a UAV:

$$\Theta_{KL} = F_{NM} \otimes \Psi_{PQ}$$

where $\otimes$ represents the discrete correlation operator. Every element is given by

$$\Theta_{k,l} = \sum_{i=1}^{M}\sum_{j=1}^{N} F_{i,j}\,\left(\Psi_{i-k,\,j-l}\right)^{*}, \qquad \begin{cases} -P+1 \le k \le M-1 \\ -Q+1 \le l \le N-1 \end{cases}$$

The operator $^{*}$ denotes complex conjugation.
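As an illustration of this step, the sketch below (hypothetical template size and threshold; a zero-mean Gaussian stands in for the paper's Gaussian wavelet, whose shape and altitude-dependent size are derived analytically in the full text) correlates the template with the depth map and takes correlation peaks as drone candidates:

    import numpy as np
    from scipy.signal import correlate2d

    def gaussian_wavelet(P, Q, sigma):
        """Zero-mean 2D Gaussian template Psi_PQ standing in for the bump
        a drone leaves in the top-view depth map."""
        y, x = np.mgrid[-(P // 2):P - P // 2, -(Q // 2):Q - Q // 2]
        psi = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
        return psi - psi.mean()       # zero mean: flat background yields ~0 response

    def detect_drones(F, psi, threshold):
        """Discrete 2D correlation of the processed depth map F with psi,
        returning the correlation map and candidate peak coordinates."""
        theta = correlate2d(F, psi, mode='same')
        peaks = np.argwhere(theta > threshold)
        return theta, peaks

    # Hypothetical usage: 15x15 template whose sigma follows the flying altitude
    # theta, peaks = detect_drones(F, gaussian_wavelet(15, 15, 3.0), threshold=40.0)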

Performance analysis and results

As the final step in this development, this section presents the analysis of the proposed system in a real scenario. The experimental setup is described next, followed by the extraction of the pattern size function and the evaluation of the global system performance.
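Under a simple pinhole-projection assumption, the pattern size function mentioned above can be expected to grow as the drone climbs toward the ceiling-mounted camera; the sketch below (hypothetical numbers, not the experimentally extracted function of the paper) captures this dependence:

    def pattern_size_px(drone_width_m, cam_height_m, altitude_m, focal_px):
        """Pinhole-model estimate of the pixels spanned by a drone of width
        drone_width_m flying at altitude_m below a ceiling camera mounted
        at cam_height_m (focal length expressed in pixels)."""
        distance = cam_height_m - altitude_m     # camera-to-drone range
        return focal_px * drone_width_m / distance

    # Hypothetical example: a 0.35 m quadrotor under a camera at 3 m, flying
    # at 1.5 m with a 210 px focal length, spans about 49 px across.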

Conclusion

Due to the necessity of having precise coordinates in GPS-denied environments, this paper has developed a precise and robust local positioning and tracking system for drones. For that purpose, a ToF camera has been used. This camera, installed on the ceiling for a top view, provides a depth map of the environment, the corresponding X and Y data, and a confidence map with the reliability of the measurement in every pixel.

One of the main challenges tackled in this work has been the…

Acknowledgments

This work has been supported in part by the Spanish Government and the European Regional Development Fund (ERDF) through Project MICROCEBUS under Grant RTI2018-095168-B-C54, and in part by the Regional Government of Extremadura and ERDF-ESF under Project GR18038 and through the Pre-Doctoral Scholarship under Grant 45/2016 Exp. PD16030.

CRediT authorship contribution statement

José A. Paredes: Conceptualization, Methodology, Software, Writing - original draft. Fernando J. Álvarez: Formal analysis, Supervision, Writing - review & editing. Teodoro Aguilera: Validation, Writing - review & editing. Fernando J. Aranda: Visualization, Writing - review & editing.

References (45)

  • P. Corke et al.

    Height estimation for an autonomous helicopter

    Experimental robotics VII (ISER 2000)

    (2000)
  • M. Curetti et al.

    Use of inertial and altimeter information for rectified searches in image target tracking for drone applications

    XVI Workshop on information processing and control (RPIC)

    (2015)
  • DeDrone

    RF3000

    (2019)
  • DJI

Phantom 4

    (2019)
  • J. Drozdowicz et al.

    35 GHz FMCW drone detection system

    17th International radar symposium (IRS)

    (2016)
  • J. Engel et al.

    Camera-based navigation of a low-cost quadrocopter

IEEE/RSJ International conference on intelligent robots and systems

    (2012)
  • ErleRobotics

    ErleBrain

    (2019)
  • S.B. Gokturk et al.

3D head tracking based on recognition and interpolation using a time-of-flight depth sensor

    IEEE Computer society conference on computer vision and pattern recognition (CVPR)

    (2004)
  • S. Hussmann et al.

    Real-time processing of 3D-TOF data in machine vision applications

  • M. Iacono et al.

    Path following and obstacle avoidance for an autonomous UAV using a depth camera

    Robotics and Autonomous Systems

    (2018)
  • T. Kahlmann et al.

Calibration for increased accuracy of the range imaging camera SwissRanger

ISPRS Commission V symposium 'Image engineering and vision metrology'

    (2006)
  • F. Kendoul

    Survey of advances in guidance, navigation, and control of unmanned rotorcraft systems

    Journal of Field Robotics

    (2012)