
A Performance of Low-Cost NVIDIA Jetson Nano Embedded System in the Real-Time Siamese Single Object Tracking: A Comparison Study

  • Conference paper
  • First Online:
Computing Science, Communication and Security (COMS2 2022)

Abstract

Object tracking is a challenging task in computer vision and machine learning (ML). Tracking tasks fall into two categories: single object tracking, which follows one target across video frames, and multiple object tracking, which follows several targets in the same video. Single object tracking is usually implemented with either correlation filter-based or Siamese network-based methods. The Siamese network, a state-of-the-art approach, is an active research area today owing to its strong results in real-time localization and accuracy, especially in new UAV-based surveillance systems that require unbounded tracking. GPU-based embedded systems outperform CPU-based systems in ML inference speed. In this paper, the performance of the low-cost NVIDIA Jetson Nano embedded system was evaluated for real-time Siamese single object tracking. Fourteen Siamese single object tracking algorithms were tested on the NVIDIA Jetson Nano board. The results show that the board achieves its best performance with the LightTrack algorithm, reaching a speed of 8.3 frames per second. Such performance is sufficient for real-time tracking applications.
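
The abstract reports tracker speed in frames per second (FPS) measured on the Jetson Nano. The following is a minimal sketch, not the authors' benchmark code, of how per-frame tracking speed can be timed on such a board: the tracker is initialized on a bounding box in the first frame and then timed on every subsequent frame. MyTracker, measure_fps, and the file name sequence.mp4 are hypothetical placeholders; a real Siamese tracker (e.g. a LightTrack wrapper exposing init/update) would replace the dummy class.

import time
import cv2  # opencv-python


class MyTracker:
    """Hypothetical stand-in for a Siamese single object tracker.

    It only stores the initial box and returns it unchanged, so the
    script runs end to end; a real tracker would localize the target
    in update().
    """

    def init(self, frame, bbox):
        self.bbox = bbox

    def update(self, frame):
        return self.bbox  # real tracker: run the Siamese network here


def measure_fps(video_path, init_bbox, tracker):
    """Time only the per-frame update step and return average FPS."""
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError(f"cannot read first frame of {video_path}")
    tracker.init(frame, init_bbox)

    n_frames, elapsed = 0, 0.0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        t0 = time.perf_counter()
        tracker.update(frame)  # per-frame inference only
        elapsed += time.perf_counter() - t0
        n_frames += 1
    cap.release()
    return n_frames / elapsed if elapsed > 0 else 0.0


if __name__ == "__main__":
    fps = measure_fps("sequence.mp4", (10, 10, 50, 50), MyTracker())
    print(f"average tracking speed: {fps:.1f} FPS")

Averaging over whole sequences in this way, rather than over a few warm frames, is what makes an 8.3 FPS figure comparable across the fourteen trackers evaluated in the paper.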




Author information


Corresponding author

Correspondence to Dalal Abdulmohsin Hammood.



Copyright information

© 2022 Springer Nature Switzerland AG

About this paper


Cite this paper

Kareem, A.A., Hammood, D.A., Alchalaby, A.A., Khamees, R.A. (2022). A Performance of Low-Cost NVIDIA Jetson Nano Embedded System in the Real-Time Siamese Single Object Tracking: A Comparison Study. In: Chaubey, N., Thampi, S.M., Jhanjhi, N.Z. (eds) Computing Science, Communication and Security. COMS2 2022. Communications in Computer and Information Science, vol 1604. Springer, Cham. https://doi.org/10.1007/978-3-031-10551-7_22


  • DOI: https://doi.org/10.1007/978-3-031-10551-7_22

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-10550-0

  • Online ISBN: 978-3-031-10551-7

  • eBook Packages: Computer Science, Computer Science (R0)
