
Identifying human activities in megastores through postural data to monitor shoplifting events

  • Original Article
  • Published in Neural Computing and Applications

Abstract

In recent years, modeling activity patterns to understand events and human behavior has attracted considerable research attention. Multiple methods have been proposed for building automated vision systems capable of inferring accurate semantics from motion dynamics. The multi-disciplinary nature of Human Activity Recognition (HAR) and the expanding technologies in this field inspire continual updates to existing methods. However, a cost-effective solution is still needed to recognize human activities such as shoplifting in occluded environments. With this motivation, we present a novel approach to identifying shoplifting actions by analyzing the postural information of the human body. The approach extracts the 2D postural body joints of a person from each captured frame. Pose encoding and postural feature generation in parameter space are the principal contributions of this work, and they also handle occluded actions. Feature reduction then projects the features into a lower-dimensional space, keeping the solution computationally efficient and suitable for real time. Activity classification is performed on the reduced feature sets to detect shoplifting actions in real-time scenarios. Experiments on a synthesized shoplifting dataset show the method to be more promising than other state-of-the-art methods, achieving an accuracy of 96.87%. In addition, the method exhibits commendable real-time performance on actual store camera footage.
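The abstract describes a pipeline of 2D joint extraction, pose encoding, feature reduction, and classification. The following is a minimal sketch of that kind of pipeline, not the authors' actual method: the pairwise-distance pose encoding, the synthetic stand-in poses, the SVD-based reduction, and the nearest-centroid classifier are all illustrative assumptions made here for a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)

def pose_features(joints):
    """Encode a 2D pose (n_joints x 2) as its upper-triangular pairwise joint
    distances, normalized by the bounding-box diagonal for scale invariance."""
    diag = np.linalg.norm(joints.max(axis=0) - joints.min(axis=0)) + 1e-8
    i, j = np.triu_indices(len(joints), k=1)
    return np.linalg.norm(joints[i] - joints[j], axis=1) / diag

def sample_pose(label):
    """Synthetic stand-in poses with 18 joints (an OpenPose-style count).
    Class 1 pulls four 'arm' joints toward the body centre, a crude proxy
    for a concealment posture; this is illustrative data only."""
    joints = rng.normal(size=(18, 2))
    if label == 1:
        joints[4:8] *= 0.2
    return joints

X = np.array([pose_features(sample_pose(l)) for l in (0, 1) for _ in range(100)])
y = np.repeat([0, 1], 100)

# Feature reduction: project the 153-D distance features onto the top-10
# principal components (PCA via SVD of the centred data matrix).
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
Xr = (X - mu) @ Vt[:10].T          # Xr has shape (200, 10)

# Minimal classifier: nearest class centroid in the reduced space.
centroids = np.array([Xr[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(np.linalg.norm(Xr[:, None] - centroids, axis=2), axis=1)
print(Xr.shape, (pred == y).mean())
```

Because the distance encoding is translation- and scale-invariant, the classifier responds to body configuration rather than where the person stands in the frame, which is the property a posture-based shoplifting detector relies on.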


Figures 1–10 appear in the full article.


Data availability

The datasets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.


Funding

None.

Author information


Corresponding author

Correspondence to Mohd. Aquib Ansari.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Ansari, M.A., Singh, D.K. Identifying human activities in megastores through postural data to monitor shoplifting events. Neural Comput & Applic 35, 6515–6528 (2023). https://doi.org/10.1007/s00521-022-08028-0

