
Vision-Based Guidance for Tracking Multiple Dynamic Objects

  • Short Paper
  • Published:
Journal of Intelligent & Robotic Systems

Abstract

In this paper, we introduce a novel vision-based framework for tracking multiple active objects using guidance laws based on a rendezvous cone method. These guidance laws enable an unmanned aircraft system, equipped with a monocular camera, to continuously observe a set of moving objects within the field of view of its sensor. During the multi-object tracking process, we detect and categorize occlusions and employ feature point estimators to handle their occurrence in a comprehensive fashion. Furthermore, we extend our open-source simulation environment and perform a series of simulations to show the efficacy of our proposed approach.
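The rendezvous cone guidance builds on the collision cone ideas of [13, 51], in which the relative velocity between the pursuer and an object is compared against the cone of headings that lead toward (or away from) that object, and the guidance law steers the relative velocity into the desired set. As an illustration only, the Python sketch below implements the simplest planar form of such a cone test, for a point pursuer and a circular object of known radius; the function name, its inputs, and the point-pursuer simplification are ours and are not taken from the paper.

```python
import math

def inside_engagement_cone(rel_pos, rel_vel, radius):
    """Planar cone test (illustrative, not the paper's formulation).

    rel_pos : (x, y) position of the object's center relative to the pursuer
    rel_vel : (x, y) velocity of the pursuer relative to the object
    radius  : radius of the circular object

    Returns True if the relative velocity points into the cone of
    directions that intercept the object.
    """
    dist = math.hypot(*rel_pos)
    speed = math.hypot(*rel_vel)
    if dist <= radius:      # already overlapping the object
        return True
    if speed == 0.0:        # no relative motion, no interception
        return False

    # Half-angle of the cone spanned by the two tangent lines
    # from the pursuer to the circle.
    half_angle = math.asin(radius / dist)

    # Angle between the relative velocity and the line of sight.
    dot = rel_vel[0] * rel_pos[0] + rel_vel[1] * rel_pos[1]
    cos_los = max(-1.0, min(1.0, dot / (speed * dist)))
    los_angle = math.acos(cos_los)

    # Intercept course: closing on the object and heading within the cone.
    return dot > 0.0 and los_angle <= half_angle


# Example: object 10 m ahead with radius 1 m (cone half-angle ~5.7 deg).
# A ~3 deg heading error still intercepts; a ~10 deg error does not.
print(inside_engagement_cone((10.0, 0.0), (2.0, 0.1), 1.0))   # True
print(inside_engagement_cone((10.0, 0.0), (2.0, 0.35), 1.0))  # False
```

In the paper's setting, the analogous cones are presumably constructed for the camera's field of view and for enclosing shapes of the tracked objects (cf. the quadric-surface cones of [21, 51] and the enclosing ellipsoids of [49, 50]) rather than for a point pursuer, so the sketch above only conveys the underlying planar geometry.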


Data Availability

All simulation data in this study are included in the article. A preliminary version of this paper was presented at ICUAS 2021 [12].

References

  1. Kanellakis, C, Nikolakopoulos, G: Survey on computer vision for UAVs: Current developments and trends. J. Intell. Robot. Syst. 87(1), 141–168 (2017)

  2. Al-Kaff, A, Martin, D, Garcia, F, de la Escalera, A, Armingol, J M: Survey of computer vision algorithms and applications for unmanned aerial vehicles. Expert Syst. Appl. 92, 447–463 (2018)

  3. Kanistras, K, Martins, G, Rutherford, M J, Valavanis, K P: A survey of unmanned aerial vehicles (UAVs) for traffic monitoring. In: Proceedings of the International Conference on Unmanned Aircraft Systems, pp 221–234 (2013)

  4. Zhou, H, Kong, H, Wei, L, Creighton, D, Nahavandi, S: Efficient road detection and tracking for unmanned aerial vehicle. IEEE Trans. Intell. Transp. Syst. 16(1), 297–309 (2014)

  5. Yeong, S P, King, L M, Dol, S S: A review on marine search and rescue operations using unmanned aerial vehicles. Int. J. Marine Environ. Sci. 9(2), 396–399 (2015)

  6. Van Tilburg, C: First report of using portable unmanned aircraft systems (drones) for search and rescue. Wilderness Environ. Med. 28(2), 116–118 (2017)

  7. Samad, T, Bay, J S, Godbole, D: Network-centric systems for military operations in urban terrain: The role of UAVs. Proc. IEEE 95(1), 92–107 (2007)

  8. Manyam, S G, Rasmussen, S, Casbeer, D W, Kalyanam, K, Manickam, S: Multi-UAV routing for persistent intelligence surveillance & reconnaissance missions. In: Proceedings of the International Conference on Unmanned Aircraft Systems, pp 573–580 (2017)

  9. Lee, J H, Millard, J D, Lusk, P C, Beard, R W: Autonomous target following with monocular camera on UAS using recursive-RANSAC tracker. In: Proceedings of the International Conference on Unmanned Aircraft Systems, pp 1070–1074 (2018)

  10. Savkin, A V, Huang, H: Navigation of a UAV network for optimal surveillance of a group of ground targets moving along a road. IEEE Trans. Intell. Transp. Syst. (2021)

  11. Li, X, Savkin, A V: Networked unmanned aerial vehicles for surveillance and monitoring: A survey. Future Internet 13(7), 174 (2021)

  12. Karmokar, P, Dhal, K, Beksi, W J, Chakravarthy, A: Vision-based guidance for tracking dynamic objects. In: Proceedings of the International Conference on Unmanned Aircraft Systems, pp 1106–1115 (2021)

  13. Chakravarthy, A, Ghose, D: Obstacle avoidance in a dynamic environment: A collision cone approach. IEEE Trans. Syst. Man Cybern.-Part A: Syst. Humans 28(5), 562–574 (1998)

  14. Goss, J, Rajvanshi, R, Subbarao, K: Aircraft conflict detection and resolution using mixed geometric and collision cone approaches. In: Proceedings of the AIAA Guidance, Navigation, and Control Conference and Exhibit, p 4879 (2004)

  15. Watanabe, Y, Calise, A, Johnson, E, Evers, J: Minimum-effort guidance for vision-based collision avoidance. In: Proceedings of the AIAA Atmospheric Flight Mechanics Conference and Exhibit, p 6641 (2006)

  16. Watanabe, Y, Calise, A, Johnson, E: Vision-based obstacle avoidance for UAVs. In: Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, p 6829 (2007)

  17. Ferrara, A, Vecchio, C: Second order sliding mode control of vehicles with distributed collision avoidance capabilities. Mechatronics 19(4), 471–477 (2009)

  18. Dhal, K, Kashyap, A, Chakravarthy, A: Collision avoidance and rendezvous of quadric surfaces moving on planar environments. In: Proceedings of the 60th IEEE Conference on Decision and Control, pp 3569–3575 (2021)

  19. Gopalakrishnan, B, Singh, A K, Krishna, K M: Time scaled collision cone based trajectory optimization approach for reactive planning in dynamic environments. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp 4169–4176 (2014)

  20. Brace, N L, Hedrick, T L, Theriault, D H, Fuller, N W, Wu, Z, Betke, M, Parrish, J K, Grünbaum, D, Morgansen, K A: Using collision cones to assess biological deconfliction methods. J. R. Soc. Interface 13(122), 20160502 (2016)

  21. Chakravarthy, A, Ghose, D: Collision cones for quadric surfaces in n-dimensions. IEEE Robot. Autom. Lett. 3(1), 604–611 (2017)

  22. Zuo, W, Dhal, K, Keow, A, Chakravarthy, A, Chen, Z: Model-based control of a robotic fish to enable 3D maneuvering through a moving orifice. IEEE Robot. Autom. Lett. 5(3), 4719–4726 (2020)

  23. Yilmaz, A: Object tracking and activity recognition in video acquired using mobile cameras. Ph.D. Thesis, University of Central Florida (2004)

  24. Yilmaz, A, Javed, O, Shah, M: Object tracking: A survey. ACM Comput. Surv. 38(4), 13–es (2006)

  25. Porikli, F, Yilmaz, A: Object detection and tracking. In: Video Analytics for Business Intelligence, pp 3–41. Springer (2012)

  26. Carelli, R, Soria, C M, Morales, B: Vision-based tracking control for mobile robots. In: Proceedings of the International Conference on Advanced Robotics, pp 148–152. IEEE (2005)

  27. Lee, H, Jung, S, Shim, D H: Vision-based UAV landing on the moving vehicle. In: Proceedings of the International Conference on Unmanned Aircraft Systems, pp 1–7 (2016)

  28. Henriques, J F, Caseiro, R, Martins, P, Batista, J: High-speed tracking with kernelized correlation filters. IEEE Trans. Pattern Anal. Mach. Intell. 37(3), 583–596 (2014)

  29. Bergmann, P, Meinhardt, T, Leal-Taixé, L: Tracking without bells and whistles. In: Proceedings of the International Conference on Computer Vision, pp 941–951. IEEE (2019)

  30. Zhou, X, Koltun, V, Krähenbühl, P: Tracking objects as points. In: Proceedings of the European Conference on Computer Vision, pp 474–490. Springer (2020)

  31. Pan, J, Hu, B: Robust occlusion handling in object tracking. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp 1–8 (2007)

  32. Shen, S, Mulgaonkar, Y, Michael, N, Kumar, V: Vision-based state estimation and trajectory control towards high-speed flight with a quadrotor. In: Proceedings of Robotics: Science and Systems, vol. 1, p 32. Citeseer (2013)

  33. Madasu, V K, Hanmandlu, M: Estimation of vehicle speed by motion tracking on image sequences. In: Proceedings of the IEEE Intelligent Vehicles Symposium, pp 185–190 (2010)

  34. Li, B, Wu, W, Wang, Q, Zhang, F, Xing, J, Yan, J: SiamRPN++: Evolution of Siamese visual tracking with very deep networks. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp 4282–4291 (2019)

  35. Wang, Q, Zhang, L, Bertinetto, L, Hu, W, Torr, P H S: Fast online object tracking and segmentation: A unifying approach. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp 1328–1338 (2019)

  36. Dobrokhodov, V N, Kaminer, I I, Jones, K D, Ghabcheloo, R: Vision-based tracking and motion estimation for moving targets using small UAVs. In: Proceedings of the American Control Conference, 6 pp. IEEE (2006)

  37. Jeon, B, Baek, K, Kim, C, Bang, H: Mode changing tracker for ground target tracking on aerial images from unmanned aerial vehicles. In: Proceedings of the International Conference on Control, Automation and Systems, pp 1849–1853. IEEE (2013)

  38. Kim, Y, Jung, W, Bang, H: Visual target tracking and relative navigation for unmanned aerial vehicles in a GPS-denied environment. Int. J. Aeronaut. Space Sci. 15(3), 258–266 (2014)

  39. Liu, S, Wang, S, Shi, W, Liu, H, Li, Z, Mao, T: Vehicle tracking by detection in UAV aerial video. Sci. China Inf. Sci. 62(2), 24101 (2019)

  40. Lee, B Y, Liew, L H, Cheah, W S, Wang, Y C: Occlusion handling in videos object tracking: A survey. In: Proceedings of the IOP Conference Series: Earth and Environmental Science, vol. 18, p 012020. IOP Publishing (2014)

  41. Galton, A: Lines of sight. In: Proceedings of the AISB Workshop on Spatial and Spatio-Temporal Reasoning, vol. 35, pp 37–39 (1994)

  42. Köhler, C: The occlusion calculus. In: Proceedings of the Cognitive Vision Workshop, pp 420–450. Citeseer (2002)

  43. Zhang, T, Jia, K, Xu, C, Ma, Y, Ahuja, N: Partial occlusion handling for visual tracking via robust part matching. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp 1258–1265 (2014)

  44. Bouachir, W, Bilodeau, G-A: Structure-aware keypoint tracking for partial occlusion handling. In: Proceedings of the Winter Conference on Applications of Computer Vision, pp 877–884. IEEE (2014)

  45. Han, B, Paulson, C, Lu, T, Wu, D, Li, J: Tracking of multiple objects under partial occlusion. In: Proceedings of Automatic Target Recognition XIX, vol. 7335, p 733515. International Society for Optics and Photonics (2009)

  46. Pang, C C C, Lam, W W L, Yung, N H C: A novel method for resolving vehicle occlusion in a monocular traffic-image sequence. IEEE Trans. Intell. Transp. Syst. 5(3), 129–141 (2004)

  47. Guha, P, Mukerjee, A, Subramanian, V K: Formulation, detection and application of occlusion states (OC-7) in the context of multiple object tracking. In: Proceedings of the International Conference on Advanced Video and Signal Based Surveillance, pp 191–196. IEEE (2011)

  48. Zhou, B, Krähenbühl, P, Koltun, V: Does computer vision matter for action? arXiv:1905.12887 (2019)

  49. Moshtagh, N: Minimum volume enclosing ellipsoid. Convex Optim. 111, 1–9 (2005)

  50. Khachiyan, L G: Rounding of polytopes in the real number model of computation. Math. Oper. Res. 21(2), 307–320 (1996)

  51. Chakravarthy, A, Ghose, D: Collision cones for quadric surfaces. IEEE Trans. Robot. 27(6), 1159–1166 (2011)

  52. Singer, R A: Estimating optimal tracking filter performance for manned maneuvering targets. IEEE Trans. Aerosp. Electron. Syst. 4, 473–483 (1970)

  53. Mahapatra, P R, Mehrotra, K: Mixed coordinate tracking of generalized maneuvering targets using acceleration and jerk models. IEEE Trans. Aerosp. Electron. Syst. 36(3), 992–1000 (2000)

  54. Shi, J, Tomasi, C: Good features to track. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp 593–600 (1994)

  55. Lucas, B D, Kanade, T: An iterative image registration technique with an application to stereo vision. In: Proceedings of the International Joint Conference on Artificial Intelligence, Vancouver (1981)

  56. Tomasi, C, Kanade, T: Detection and tracking of point features. Technical Report CMU-CS-91-132, Carnegie Mellon University (1991)

  57. Horn, B K P, Schunck, B G: Determining optical flow. Artif. Intell. 17(1–3), 185–203 (1981)

  58. Memin, E, Perez, P: A multigrid approach for hierarchical motion estimation. In: Proceedings of the International Conference on Computer Vision, pp 933–938. IEEE (1998)

  59. Brox, T, Bruhn, A, Papenberg, N, Weickert, J: High accuracy optical flow estimation based on a theory for warping. In: Proceedings of the European Conference on Computer Vision, pp 25–36. Springer (2004)

  60. Bruhn, A, Weickert, J: Towards ultimate motion estimation: Combining highest accuracy with real-time performance. In: Proceedings of the International Conference on Computer Vision, vol. 1, pp 749–755. IEEE (2005)

  61. Lowe, D G: Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 60(2), 91–110 (2004)

  62. Open-Source Community: Pygame 2. https://www.pygame.org (2022)

  63. Wojke, N, Bewley, A, Paulus, D: Simple online and realtime tracking with a deep association metric. In: Proceedings of the IEEE International Conference on Image Processing, pp 3645–3649 (2017)

  64. Zhang, Y, Wang, C, Wang, X, Zeng, W, Liu, W: FairMOT: On the fairness of detection and re-identification in multiple object tracking. arXiv:2004.01888 (2020)

  65. Dendorfer, P, Rezatofighi, H, Milan, A, Shi, J, Cremers, D, Reid, I, Roth, S, Schindler, K, Leal-Taixé, L: MOT20: A benchmark for multi object tracking in crowded scenes. arXiv:2003.09003 (2020)


Funding

This material is based in part upon work supported by the National Science Foundation through grant #IIS-1851817 and a University of Texas at Arlington Research Enhancement Program grant #270079.

Author information


Contributions

All authors contributed to the study conception and design. The first draft of the manuscript was jointly written by Kashish Dhal and Pritam Karmokar, and all authors commented on previous versions of the manuscript. All authors have read and approved the final manuscript.

Corresponding author

Correspondence to Animesh Chakravarthy.

Ethics declarations

Ethics approval

All of the authors confirm that there are no potential acts of misconduct in this work and approve of the journal upholding the integrity of the scientific record. 

Consent to participate

The authors consent to participate.

Consent for Publication

The authors consent to publish.

Conflict of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Kashish Dhal and Pritam Karmokar contributed equally to this work.


About this article


Cite this article

Dhal, K., Karmokar, P., Chakravarthy, A. et al. Vision-Based Guidance for Tracking Multiple Dynamic Objects. J Intell Robot Syst 105, 66 (2022). https://doi.org/10.1007/s10846-022-01657-6


Keywords

Navigation