A hand rubbing classification model based on image sequence enhanced by feature-based confidence metric

  • Original Paper
  • Published in: Signal, Image and Video Processing

Abstract

Hand hygiene is critical for reducing the spread of viruses and diseases, and in recent years it has been widely recognized as one of the most effective measures against the COVID-19 outbreak. The World Health Organization (WHO) recommends a 12-step guideline for hand rubbing. Given the importance of this guideline, several studies have used computer vision to measure compliance with it. However, almost all of them process single images as input; such approaches are referred to as baseline models in this paper. This study proposes a sequence model that processes sequences of consecutive images as input. The model combines an Inception-ResNet architecture for spatial feature extraction with an LSTM for capturing temporal information. After training on a comprehensive dataset, the model achieved an accuracy of 98.99% on the test set. Compared to the best baseline models, the proposed sequence model is about 1% better in accuracy and 4% better in confidence, although roughly 3 times slower at inference. Furthermore, this study demonstrates that accuracy alone is not always adequate for comparing models or optimizing their hyperparameters. Accordingly, the feature-based confidence metric was employed to provide a more informative comparison between the proposed sequence model and the best baseline model and to optimize its hyperparameters.
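
As an illustration of the architecture described above, the following is a minimal sketch in TensorFlow/Keras of an Inception-ResNet + LSTM sequence classifier. It is not the authors' published implementation; the sequence length, frame size, LSTM width, dropout rate, and the assumption of 12 output classes (one per hand-rubbing step) are illustrative choices.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionResNetV2

# Illustrative assumptions: 8 frames per clip, 224x224 RGB frames,
# 12 classes corresponding to the WHO hand-rubbing steps.
SEQ_LEN, IMG_SIZE, NUM_CLASSES = 8, 224, 12

# Pre-trained Inception-ResNet backbone used as a per-frame spatial
# feature extractor (transfer learning: convolutional weights frozen).
backbone = InceptionResNetV2(include_top=False, weights="imagenet",
                             input_shape=(IMG_SIZE, IMG_SIZE, 3),
                             pooling="avg")
backbone.trainable = False

model = models.Sequential([
    layers.Input(shape=(SEQ_LEN, IMG_SIZE, IMG_SIZE, 3)),
    # Apply the backbone to every frame of the sequence independently.
    layers.TimeDistributed(backbone),
    # The LSTM aggregates the per-frame features over time.
    layers.LSTM(256),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

During training, each input would be a tensor of shape (batch, SEQ_LEN, IMG_SIZE, IMG_SIZE, 3), i.e., a short clip of consecutive frames rather than a single image, which is the key difference from the baseline single-image models discussed in the abstract.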

Data availability

All data and materials are owned by the authors and are available; no permissions are required.

Acknowledgements

The authors would sincerely like to acknowledge the financial support of Tavanresan Company.

Author information

Contributions

Mohammad Amin Haghpanah wrote the main manuscript text and prepared the figures. Mehdi Tale Masouleh reviewed the manuscript and modified the figures. Ahmad Kalhor reviewed the manuscript. Ehsan Akhavan Sarraf financially supported the project.

Corresponding author

Correspondence to Mehdi Tale Masouleh.

Ethics declarations

Competing interests

The authors declare that they have no competing interests as defined by Springer, or other interests that might be perceived to influence the results and/or discussion reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Haghpanah, M.A., Tale Masouleh, M., Kalhor, A. et al. A hand rubbing classification model based on image sequence enhanced by feature-based confidence metric. SIViP 17, 2499–2509 (2023). https://doi.org/10.1007/s11760-022-02467-x
