Abstract
In this paper, we introduce a novel vision-based framework for tracking multiple dynamic objects using guidance laws based on a rendezvous cone method. These guidance laws enable an unmanned aircraft system (UAS), equipped with a monocular camera, to continuously keep a set of moving objects within the field of view of its sensor. During the multi-object tracking process, we detect and categorize occlusions in a comprehensive fashion and manage the feature point estimators accordingly. Furthermore, we extend our open-source simulation environment and perform a series of simulations to demonstrate the efficacy of our proposed approach.
Data Availability
All simulation data in this study is included in the article. A preliminary version of this paper was presented in ICUAS 2021 [12].
References
Kanellakis, C, Nikolakopoulos, G: Survey on computer vision for UAVs: Current developments and trends. J. Intell. Robot. Syst. 87(1), 141–168 (2017)
Al-Kaff, A, Martin, D, Garcia, F, de la Escalera, A, Armingol, J M: Survey of computer vision algorithms and applications for unmanned aerial vehicles. Expert Syst. Appl. 92, 447–463 (2018)
Kanistras, K, Martins, G, Rutherford, M J, Valavanis, K P: A survey of unmanned aerial vehicles (UAVs) for traffic monitoring. In: Proceedings of the International Conference on Unmanned Aircraft Systems, pp 221–234 (2013)
Zhou, H, Kong, H, Wei, L, Creighton, D, Nahavandi, S: Efficient road detection and tracking for unmanned aerial vehicle. IEEE Trans. Intell. Transp. Syst. 16(1), 297–309 (2014)
Yeong, SP, King, LM, Dol, SS: A review on marine search and rescue operations using unmanned aerial vehicles. Int. J. Marine Environ. Sci. 9(2), 396–399 (2015)
Van Tilburg, C: First report of using portable unmanned aircraft systems (drones) for search and rescue. Wilderness Environ. Med. 28(2), 116–118 (2017)
Samad, T, Bay, J S, Godbole, D: Network-centric systems for military operations in urban terrain: The role of uavs. Proc. IEEE 95(1), 92–107 (2007)
Manyam, S G, Rasmussen, S, Casbeer, D W, Kalyanam, K, Manickam, S: Multi-uav routing for persistent intelligence surveillance & reconnaissance missions. In: Proceedings of the International Conference on Unmanned Aircraft Systems, pp 573–580 (2017)
Lee, J H, Millard, J D, Lusk, P C, Beard, R W: Autonomous target following with monocular camera on UAS using Recursive-RANSAC tracker. In: Proceedings of the International Conference on Unmanned Aircraft Systems, pp 1070–1074 (2018)
Savkin, A V, Huang, H: Navigation of a UAV network for optimal surveillance of a group of ground targets moving along a road. IEEE Trans. Intell. Transp. Syst. (2021)
Li, X, Savkin, A V: Networked unmanned aerial vehicles for surveillance and monitoring: A survey. Future Internet 13(7), 174 (2021)
Karmokar, P, Dhal, K, Beksi, W J, Chakravarthy, A: Vision-based guidance for tracking dynamic objects. In: Proceedings of the International Conference on Unmanned Aircraft Systems, pp 1106–1115 (2021)
Chakravarthy, A, Ghose, D: Obstacle avoidance in a dynamic environment: A collision cone approach. IEEE Trans. Syst. Man Cybern.-Part A: Syst. Humans 28(5), 562–574 (1998)
Goss, J, Rajvanshi, R, Subbarao, K: Aircraft conflict detection and resolution using mixed geometric and collision cone approaches. In: Proceedings of the AIAA Guidance, Navigation, and Control Conference and Exhibit, p 4879 (2004)
Watanabe, Y, Calise, A, Johnson, E, Evers, J: Minimum-effort guidance for vision-based collision avoidance. In: Proceedings of the AIAA Atmospheric Flight Mechanics Conference and Exhibit, p 6641 (2006)
Watanabe, Y, Calise, A, Johnson, E: Vision-based obstacle avoidance for UAVs. In: Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, p 6829 (2007)
Ferrara, A, Vecchio, C: Second order sliding mode control of vehicles with distributed collision avoidance capabilities. Mechatronics 19(4), 471–477 (2009)
Dhal, K, Kashyap, A, Chakravarthy, A: Collision avoidance and rendezvous of quadric surfaces moving on planar environments. In: 2021 60th IEEE Conference on Decision and Control (CDC), pp 3569–3575 (2021)
Gopalakrishnan, B, Singh, A K, Krishna, K M: Time scaled collision cone based trajectory optimization approach for reactive planning in dynamic environments. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp 4169–4176 (2014)
Brace, N L, Hedrick, T L, Theriault, D H, Fuller, N W, Wu, Z, Betke, M, Parrish, J K, Grünbaum, D, Morgansen, K A: Using collision cones to assess biological deconfliction methods. J. R. Soc. Interface. 13(122), 20160502 (2016)
Chakravarthy, A, Ghose, D: Collision cones for quadric surfaces in n-dimensions. IEEE Robot. Autom. Lett. 3(1), 604–611 (2017)
Zuo, W, Dhal, K, Keow, A, Chakravarthy, A, Chen, Z: Model-based control of a robotic fish to enable 3d maneuvering through a moving orifice. IEEE Robot. Autom. Lett. 5(3), 4719–4726 (2020)
Yilmaz, A: Object tracking and activity recognition in video acquired using mobile cameras. Ph.D. Thesis, University of Central Florida (2004)
Yilmaz, A, Javed, O, Shah, M: Object tracking: A survey. ACM Comput. Surv. 38(4), 13–es (2006)
Porikli, F, Yilmaz, A: Object detection and tracking. In: Video Analytics for Business Intelligence, pp 3–41. Springer (2012)
Carelli, R, Soria, C M, Morales, B: Vision-based tracking control for mobile robots. In: Proceedings of the International Conference on Advanced Robotics, pp 148–152. IEEE (2005)
Lee, H, Jung, S, Shim, D H: Vision-based UAV landing on the moving vehicle. In: Proceedings of the International Conference on Unmanned Aircraft Systems, pp 1–7 (2016)
Henriques, J F, Caseiro, R, Martins, P, Batista, J: High-speed tracking with kernelized correlation filters. IEEE Trans. Pattern Anal. Mach. Intell. 37(3), 583–596 (2014)
Bergmann, P, Meinhardt, T, Leal-Taixe, L: Tracking without bells and whistles. In: Proceedings of the International Conference on Computer Vision, pp 941–951. IEEE (2019)
Zhou, X, Koltun, V, Krähenbühl, P: Tracking objects as points. In: Proceedings of the European Conference on Computer Vision, pp 474–490. Springer (2020)
Pan, J, Hu, B: Robust occlusion handling in object tracking. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp 1–8 (2007)
Shen, S, Mulgaonkar, Y, Michael, N, Kumar, V: Vision-based state estimation and trajectory control towards high-speed flight with a quadrotor. In: Proceedings of Robotics: Science and Systems, vol. 1, p 32. Citeseer (2013)
Madasu, V K, Hanmandlu, M: Estimation of vehicle speed by motion tracking on image sequences. In: Proceedings of the IEEE Intelligent Vehicles Symposium, pp 185–190 (2010)
Li, B, Wu, W, Wang, Q, Zhang, F, Xing, J, Yan, J: SiamRPN++: Evolution of siamese visual tracking with very deep networks. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp 4282–4291 (2019)
Wang, Q, Zhang, L, Bertinetto, L, Hu, W, Torr, P H S: Fast online object tracking and segmentation: A unifying approach. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp 1328–1338 (2019)
Dobrokhodov, V N, Kaminer, I I, Jones, K D, Ghabcheloo, R: Vision-based tracking and motion estimation for moving targets using small UAVs. In: Proceedings of the American Control Conference, 6 pp. IEEE (2006)
Jeon, B, Baek, K, Kim, C, Bang, H: Mode changing tracker for ground target tracking on aerial images from unmanned aerial vehicles. In: Proceedings of the International Conference on Control, Automation and Systems, pp 1849–1853. IEEE (2013)
Kim, Y, Jung, W, Bang, H: Visual target tracking and relative navigation for unmanned aerial vehicles in a gps-denied environment. Int. J. Aeronaut. Space Sci. 15(3), 258–266 (2014)
Liu, S, Wang, S, Shi, W, Liu, H, Li, Z, Mao, T: Vehicle tracking by detection in UAV aerial video. Sci. Chin. Inform. Sci. 62(2), 24101 (2019)
Lee, B Y, Liew, L H, Cheah, W S, Wang, Y C: Occlusion handling in video object tracking: A survey. In: Proceedings of the IOP Conference Series: Earth and Environmental Science, vol. 18, p 012020. IOP Publishing (2014)
Galton, A: Lines of sight. In: Proceedings of the AISB Workshop on Spatial and Spatio-Temporal Reasoning, vol. 35, pp 37–39 (1994)
Köhler, C: The occlusion calculus. In: Proceedings of the Cognitive Vision Workshop, pp 420–450. Citeseer (2002)
Zhang, T, Jia, K, Xu, C, Ma, Y, Ahuja, N: Partial occlusion handling for visual tracking via robust part matching. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp 1258–1265 (2014)
Bouachir, W, Bilodeau, G-A: Structure-aware keypoint tracking for partial occlusion handling. In: Proceedings of the Winter Conference on Applications of Computer Vision, pp 877–884. IEEE (2014)
Han, B, Paulson, C, Lu, T, Wu, D, Li, J: Tracking of multiple objects under partial occlusion. In: Proceedings of Automatic Target Recognition XIX, vol. 7335, p 733515. International Society for Optics and Photonics (2009)
Pang, C C C, Lam, W W L, Yung, N H C: A novel method for resolving vehicle occlusion in a monocular traffic-image sequence. IEEE Trans. Intell. Transp. Syst. 5(3), 129–141 (2004)
Guha, P, Mukerjee, A, Subramanian, V K: Formulation, detection and application of occlusion states (OC-7) in the context of multiple object tracking. In: Proceedings of the International Conference on Advanced Video and Signal Based Surveillance, pp 191–196. IEEE (2011)
Zhou, B, Krähenbühl, P, Koltun, V: Does computer vision matter for action? arXiv:1905.12887 (2019)
Moshtagh, N: Minimum volume enclosing ellipsoid. Convex Optim. 111, 1–9 (2005)
Khachiyan, L G: Rounding of polytopes in the real number model of computation. Math. Oper. Res. 21(2), 307–320 (1996)
Chakravarthy, A, Ghose, D: Collision cones for quadric surfaces. IEEE Trans. Robot. 27(6), 1159–1166 (2011)
Singer, R A: Estimating optimal tracking filter performance for manned maneuvering targets. IEEE Trans. Aerosp. Electron. Syst. 4, 473–483 (1970)
Mahapatra, P R, Mehrotra, K: Mixed coordinate tracking of generalized maneuvering targets using acceleration and jerk models. IEEE Trans. Aerosp. Electron. Syst. 36(3), 992–1000 (2000)
Shi, J, Tomasi, C: Good features to track. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp 593–600 (1994)
Lucas, B D, Kanade, T: An iterative image registration technique with an application to stereo vision. In: Proceedings of the International Joint Conference on Artificial Intelligence, Vancouver (1981)
Tomasi, C, Kanade, T: Detection and tracking of point features. CMU-CS-91-132, Carnegie Mellon University (1991)
Horn, B K P, Schunck, B G: Determining optical flow. Artif. Intell. 17(1–3), 185–203 (1981)
Memin, E, Perez, P: A multigrid approach for hierarchical motion estimation. In: Proceedings of the International Conference on Computer Vision, pp 933–938. IEEE (1998)
Brox, T, Bruhn, A, Papenberg, N, Weickert, J: High accuracy optical flow estimation based on a theory for warping. In: Proceedings of the European Conference on Computer Vision, pp 25–36. Springer (2004)
Bruhn, A, Weickert, J: Towards ultimate motion estimation: Combining highest accuracy with real-time performance. In: Proceedings of the International Conference on Computer Vision, vol. 1, pp 749–755. IEEE (2005)
Lowe, D G: Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 60(2), 91–110 (2004)
Open-Source Community: Pygame 2. https://www.pygame.org (2022)
Wojke, N, Bewley, A, Paulus, D: Simple online and realtime tracking with a deep association metric. In: Proceedings of the IEEE International Conference on Image Processing, pp 3645–3649 (2017)
Zhang, Y, Wang, C, Wang, X, Zeng, W, Liu, W: FairMOT: On the fairness of detection and re-identification in multiple object tracking. arXiv:2004.01888 (2020)
Dendorfer, P, Rezatofighi, H, Milan, A, Shi, J, Cremers, D, Reid, I, Roth, S, Schindler, K, Leal-Taixé, L: MOT20: A benchmark for multi-object tracking in crowded scenes. arXiv:2003.09003 (2020)
Funding
This material is based in part upon work supported by the National Science Foundation through grant #IIS-1851817 and a University of Texas at Arlington Research Enhancement Program grant #270079.
Author information
Authors and Affiliations
Contributions
All authors contributed to the study conception and design. The first draft of the manuscript was jointly written by Kashish Dhal and Pritam Karmokar, and all authors commented on previous versions of the manuscript. All authors have read and approved the final manuscript.
Corresponding author
Ethics declarations
Ethics approval
All of the authors confirm that there are no potential acts of misconduct in this work and approve of the journal upholding the integrity of the scientific record.
Consent to participate
The authors consent to participate.
Consent for Publication
The authors consent to publish.
Conflict of Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Kashish Dhal and Pritam Karmokar contributed equally to this work.
Cite this article
Dhal, K., Karmokar, P., Chakravarthy, A. et al. Vision-Based Guidance for Tracking Multiple Dynamic Objects. J Intell Robot Syst 105, 66 (2022). https://doi.org/10.1007/s10846-022-01657-6