
Real-Time Grayscale-Thermal Tracking via Laplacian Sparse Representation

  • Conference paper
MultiMedia Modeling (MMM 2016)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 9517)


Abstract

Grayscale and thermal data can complement each other to improve tracking performance in challenging scenarios. In this paper, we propose a real-time online grayscale-thermal tracking method based on Laplacian sparse representation within a Bayesian filtering framework. Specifically, a generative multimodal feature model is induced by the Laplacian sparse representation, which exploits the similarities among local patches to refine their sparse codes, so that data from different sources can be seamlessly fused for object tracking. In particular, the multimodal feature model encodes both spatial local information and occlusion handling to improve robustness. With this feature representation, the confidence of each candidate is computed from its sparse-feature similarity with the object template. Given the motion model, object tracking is then carried out in the Bayesian filtering framework by maximizing the observation likelihood, i.e., finding the candidate with the highest confidence. In addition, to meet the real-time demands of related visual information processing systems, we adopt reverse representation and parallel computation to improve tracking efficiency. Extensive experiments on both public and newly collected grayscale-thermal video sequences demonstrate the accuracy and efficiency of the proposed method against other state-of-the-art sparse-representation-based trackers.
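
To make the fusion step concrete, the display below sketches the general form of a Laplacian sparse coding objective of the kind the method builds on. The notation (\(X\), \(D\), \(S\), \(W\), \(\lambda\), \(\beta\)) is ours and serves only as an illustration of the technique, not the paper's exact formulation.

\[
\min_{S}\; \|X - DS\|_F^2 \;+\; \lambda \sum_i \|s_i\|_1 \;+\; \frac{\beta}{2} \sum_{i,j} W_{ij}\, \|s_i - s_j\|_2^2
\]

Here \(X\) stacks the local-patch features (grayscale and thermal) of a candidate, \(D\) is the template dictionary, \(s_i\) is the sparse code of the \(i\)-th patch, and \(W_{ij}\) measures the similarity between patches \(i\) and \(j\). The last term, equivalently \(\beta\,\mathrm{tr}(SLS^{\top})\) with graph Laplacian \(L\), pulls the codes of similar patches together, which is how similarities among local patches refine the sparse codes. A candidate's confidence is then a similarity score between its sparse features and those of the object template, and the tracked state is the candidate that maximizes the observation likelihood \(p(z_t \mid x_t)\) in the Bayesian filter.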



Acknowledgement

This work was supported by the National High Technology Research and Development Program of China (863 Program) (No. 2014AA015104) and the National Natural Science Foundation of China (No. 61472002).

Author information

Corresponding author

Correspondence to Jin Tang.



Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Li, C., Hu, S., Gao, S., Tang, J. (2016). Real-Time Grayscale-Thermal Tracking via Laplacian Sparse Representation. In: Tian, Q., Sebe, N., Qi, G.-J., Huet, B., Hong, R., Liu, X. (eds.) MultiMedia Modeling. MMM 2016. Lecture Notes in Computer Science, vol. 9517. Springer, Cham. https://doi.org/10.1007/978-3-319-27674-8_6


  • DOI: https://doi.org/10.1007/978-3-319-27674-8_6


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-27673-1

  • Online ISBN: 978-3-319-27674-8

  • eBook Packages: Computer Science, Computer Science (R0)
