Precise drone location and tracking by adaptive matched filtering from a top-view ToF camera
Introduction
In the last few years, numerous works on UAVs (Unmanned Aerial Vehicles), also known as drones, have arisen due to the increasing use of these devices in both military and civilian contexts. One of the most important tasks is flight automation, since the pilot usually controls the vehicle from a remote location (Kendoul, 2012). Ironically, the main current concern is the lack of security implied by the presence of an unmanned vehicle, as analyzed in Samland, Fruth, Hildebrandt, Hoppe, and Dittmann (2012). Hence, knowing its position becomes essential to build either an autonomous system or a security system able to make decisions based on that information.
The global positioning problem was solved some time ago with the inclusion of GNSS (Global Navigation Satellite System) receivers in the most advanced drones, as can be seen in several off-the-shelf UAVs (DeDrone, 2019, DJI, 2019, ErleRobotics, 2019, Parrot, 2019), or even in works focused on improving GPS (Global Positioning System) measurements (Tahar & Kamarudin, 2016). Other works have implemented detection and tracking systems, some even determining whether a drone is hostile or not. For instance, detection is performed manually in Boddhu, McCartney, Ceccopieri, and Williams (2013), where users send photos they take of sighted UAVs to a remote server, together with hand-typed information about the approximate device size, noise intensity and possible distance at which the drone is located. All this information is then processed and sent to other users; at least three observers are needed for the system to run properly. Another example can be found in Multerer et al. (2017), which addresses the issue with a 3D MIMO (Multiple Input Multiple Output) radar that jams the signals of potentially hazardous drones, forcing them to land.
However, the works described above are not concerned with knowing the precise position of the drone, but only with global detection and tracking. Applications that allow the device to be as autonomous as possible require precise and robust knowledge of its position in local environments. Obviously, these local positioning systems, intended for either indoor or outdoor use, should improve on the precision achieved by the former ones. This work falls within this framework and is aimed at achieving centimetric positioning in local environments, determining the coordinates of all drones flying in the scene via a Time-of-Flight (ToF) camera while allowing real-time processing. The proposed algorithm has been tuned for better performance in an indoor environment, although the method can be extrapolated to different surroundings.
The rest of the paper is organized as follows: first, literature reviews on drone positioning as well as on ToF camera based positioning are introduced in Section 2; secondly, the working principles of these cameras are described in Section 3; thirdly, the whole algorithm is presented in Section 4; then, experimental tests and results are shown in Section 5; and finally, conclusions are drawn in Section 6.
Section snippets
Drone detection and tracking
The performance of 3D positioning systems has notably improved since the number of drones in the logistics industry rocketed. These improvements have been achieved by means of different technologies. For instance, RF (Radio Frequency), widely applied in the past to local positioning in general, has now been adapted to detect UAVs. There exist passive techniques, which take advantage of signals emitted by the UAV, like eavesdropping, reaching 50 m, or vibration pattern analysis (
Time-of-flight cameras
A ToF camera is an active range imaging device that employs time-of-flight techniques. This kind of camera resolves distances between the scene and itself by measuring the round-trip time of an infrared light signal provided by a laser or an LED. Hence, this kind of camera provides not only 2D information in a single image, but also information about the third dimension. It is worth noting that they do not furnish volumetric data but 3D surfaces, and for this reason these devices are also known
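The ranging principle described above can be sketched in a few lines. The snippet below is a minimal illustration, not the camera's actual firmware: it shows the pulsed case (distance from the round-trip time, d = c·t/2) and the continuous-wave case commonly used by ToF sensors, where the phase shift of a modulated signal is estimated from four samples (the 4-phase demodulation scheme and all variable names are assumptions for illustration).

```python
import math

C = 299_792_458.0  # speed of light in vacuum (m/s)

def tof_distance_pulsed(round_trip_s: float) -> float:
    """Pulsed ToF: distance from a direct round-trip time measurement, d = c*t/2."""
    return C * round_trip_s / 2.0

def tof_distance_cw(a0: float, a1: float, a2: float, a3: float,
                    f_mod: float) -> float:
    """Continuous-wave ToF: distance from four phase-shifted samples
    (taken at 0, 90, 180 and 270 degrees of the modulation period).

    phase = atan2(a3 - a1, a0 - a2);  d = c * phase / (4 * pi * f_mod)
    """
    phase = math.atan2(a3 - a1, a0 - a2) % (2.0 * math.pi)
    return C * phase / (4.0 * math.pi * f_mod)

# Example: a 20 ns round trip corresponds to roughly 3 m
d_pulsed = tof_distance_pulsed(20e-9)
# Example: a half-period phase shift at 20 MHz modulation (~3.75 m)
d_cw = tof_distance_cw(-1.0, 0.0, 1.0, 0.0, 20e6)
```

Note that the CW scheme also implies an unambiguous range of c / (2·f_mod), which is why commercial ToF cameras trade modulation frequency against maximum measurable distance.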
Algorithm description
This section presents the details of the proposed detection algorithm, which basically consists in computing the 2D correlation between a processed version F_NM (Section 4.1) of the depth matrix D_NM taken from the ToF camera installed on the ceiling, and a 2D wavelet Ψ_PQ representing the disturbance caused by a UAV:

C = F_NM ⊗ Ψ_PQ

where ⊗ represents the discrete correlation operator. Every element is given by

C(i, j) = Σ_m Σ_n F*(m, n) Ψ(m + i, n + j)

where the operator * denotes complex conjugation.
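The correlation step above amounts to matched filtering the depth frame with the drone template and taking the peak of the result as the drone's pixel position. The sketch below illustrates this idea under stated assumptions: a symmetric Gaussian bump stands in for the paper's wavelet Ψ_PQ, and the frame size and drone position are synthetic. It is an illustration of the technique, not the authors' implementation.

```python
import numpy as np
from scipy.signal import correlate2d

def gaussian_template(p: int, q: int, sigma: float = 2.0) -> np.ndarray:
    """A P x Q Gaussian bump standing in for the drone disturbance wavelet."""
    y, x = np.mgrid[:p, :q]
    cy, cx = (p - 1) / 2.0, (q - 1) / 2.0
    return np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2.0 * sigma ** 2))

def locate_drone(depth_frame: np.ndarray, template: np.ndarray):
    """Matched filter: 2D cross-correlation, then the peak's (row, col) index.

    mode="same" keeps the output in the coordinates of the input frame.
    """
    corr = correlate2d(depth_frame, np.conj(template), mode="same")
    return np.unravel_index(np.argmax(corr), corr.shape)

# Synthetic test: plant the disturbance at (30, 45) in a 64 x 96 frame
frame = np.zeros((64, 96))
psi = gaussian_template(11, 11)
frame[30 - 5:30 + 6, 45 - 5:45 + 6] += psi
row, col = locate_drone(frame, psi)  # the peak recovers the planted position
```

For a symmetric template, the correlation peak coincides with the center of the disturbance; in a real frame the depth map would first be processed (background removal, confidence masking) before correlating, as Section 4.1 indicates.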
Performance analysis and results
As final step in this development, this section presents the analysis of the proposed system in a real scenario. The experimental setup will next be described, as well as the pattern size function extraction and the global system performance.
Conclusion
Due to the necessity of having precise coordinates in GPS-denied environments, this paper has developed a precise and robust local positioning and tracking system for drones. For that purpose, a ToF camera has been used. This camera, installed on the ceiling for a top view, provides a depth map of the environment, the corresponding X and Y data, as well as a confidence map with the reliability of the measurement at every pixel.
One of the main challenges tackled in this work has been the
Acknowledgments
This work has been supported in part by the Spanish Government and the European Regional Development Fund (ERDF) through Project MICROCEBUS under Grant RTI2018-095168-B-C54, and in part by the Regional Government of Extremadura and ERDF-ESF under Project GR18038 and through the Pre-Doctoral Scholarship under Grant 45/2016 Exp. PD16030.
CRediT authorship contribution statement
José A. Paredes: Conceptualization, Methodology, Software, Writing - original draft. Fernando J. Álvarez: Formal analysis, Supervision, Writing - review & editing. Teodoro Aguilera: Validation, Writing - review & editing. Fernando J. Aranda: Visualization, Writing - review & editing.
References (45)
- et al. (1999). A visual odometer for autonomous helicopter flight. Robotics and Autonomous Systems.
- et al. (2010). Real-time indoor positioning using range imaging sensors. SPIE - Real-Time Image and Video Processing 2010, Brussels, Belgium.
- et al. (2010). An omnidirectional time-of-flight camera and its application to indoor SLAM. 11th International Conference on Control Automation Robotics & Vision (ICARCV 2010).
- et al. (2015). Position tracking system using single RGB-D camera for evaluation of multi-rotor UAV control and self-localization. IEEE International Conference on Advanced Intelligent Mechatronics (AIM).
- et al. (2014). Acoustic detection and tracking of a class I UAS with a small tetrahedral microphone array. Technical Report.
- et al. (2006). People tracking using a time-of-flight depth sensor. IEEE International Conference on Video and Signal Based Surveillance.
- et al. (2015). UAS detection, classification and neutralization: Market survey 2015. Technical Report.
- et al. A collaborative smartphone sensing platform for detecting and tracking hostile drones.
- et al. Detection and tracking of drones using advanced acoustic cameras.
- et al. (2015). Millimeter wave radar for perimeter surveillance and detection of MAVs (micro aerial vehicles). 16th International Radar Symposium (IRS).