
Fuzzy-aided solution for out-of-view challenge in visual tracking under IoT-assisted complex environment

  • Original Article
  • Published in Neural Computing and Applications

Abstract

With the rapid development of the computer vision domain, research on object tracking has attracted increasing attention from scholars. Out of view (OV) is an important challenge frequently encountered during object tracking, especially in Internet of Things (IoT) surveillance. This paper therefore proposes a fuzzy-aided solution for the OV challenge. The solution uses a fuzzy-aided system to detect whether the target is poorly tracked, based on the response matrix of the samples. When poor tracking occurs, the target is relocated according to the stored template. The proposed solution is tested on the OTB100 dataset, and the experimental results show that the auxiliary solution is effective against the OV challenge. The proposed solution also preserves the tracking speed and overall success rate of visual tracking, and improves robustness to a certain extent in IoT-assisted complex environments.
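The abstract describes the mechanism only at a high level: a fuzzy-aided check on the sample response matrix flags poor tracking, and the stored template is then used to relocate the target. The following is a minimal sketch of one plausible realization of that pipeline, assuming a correlation-filter-style response map; the peak-to-sidelobe-ratio cue, the membership function shapes, the min-based aggregation, the thresholds, and the OpenCV template-matching relocation are illustrative assumptions, not the authors' exact method.

```python
# Hedged sketch: fuzzy assessment of a correlation-filter response map and
# template-based relocation when tracking quality is judged poor.
# Membership shapes, thresholds, and the PSR cue are assumptions for
# illustration, not the paper's exact design.
import numpy as np
import cv2


def peak_to_sidelobe_ratio(response, exclude=5):
    """PSR of a 2-D response map; higher values mean a sharper, more confident peak."""
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, py - exclude):py + exclude + 1,
         max(0, px - exclude):px + exclude + 1] = False
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-8)


def fuzzy_quality(peak, psr):
    """Map peak value and PSR to a [0, 1] tracking-quality score with simple
    ramp memberships (assumed shapes), combined with a min t-norm."""
    mu_peak = np.clip((peak - 0.2) / 0.4, 0.0, 1.0)  # low below 0.2, full above 0.6
    mu_psr = np.clip((psr - 4.0) / 6.0, 0.0, 1.0)    # low below 4, full above 10
    return min(mu_peak, mu_psr)


def relocate_by_template(frame_gray, template_gray):
    """Re-detect the target over the whole frame by normalized cross-correlation
    against the stored appearance template."""
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)
    h, w = template_gray.shape
    return (top_left[0], top_left[1], w, h), score


def track_step(response, frame_gray, template_gray, quality_thresh=0.3):
    """One decision step: trust the filter's peak if quality is acceptable,
    otherwise fall back to template-based relocation."""
    peak = float(response.max())
    psr = peak_to_sidelobe_ratio(response)
    if fuzzy_quality(peak, psr) >= quality_thresh:
        py, px = np.unravel_index(response.argmax(), response.shape)
        return "filter", (px, py)
    return "relocated", relocate_by_template(frame_gray, template_gray)
```

In a full tracker, the stored template would typically be refreshed only on frames the fuzzy check judges reliable, so that the relocation step is not contaminated by out-of-view frames.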



Acknowledgements

This work was supported in part by the Key Scientific Research Projects of the Department of Education of Hunan Province (19A312), the Hunan Provincial Science & Technology Project Foundation (2018TP1018, 2018RS3065), and the National Natural Science Foundation of China under Grant 61502254.

Author information


Corresponding author

Correspondence to Khan Muhammad.

Ethics declarations

Conflict of interest

In compliance with Springer policy and our ethical obligations as researchers, the authors report no potential conflicts of interest. The authors certify that they have no affiliation with or involvement in any organization or entity with any financial or non-financial interest in the subject matter discussed in this manuscript.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Liu, S., Liu, X., Wang, S. et al. Fuzzy-aided solution for out-of-view challenge in visual tracking under IoT-assisted complex environment. Neural Comput & Applic 33, 1055–1065 (2021). https://doi.org/10.1007/s00521-020-05021-3
