Monitoring Human Performance Through Deep Learning and Computer Vision in Industry 4.0

  • Conference paper
  • First Online:
17th International Conference on Soft Computing Models in Industrial and Environmental Applications (SOCO 2022)

Abstract

The advent of Industry 4.0 is revolutionizing manufacturing processes through techniques that optimize decision-making based on manufacturing data. Monitoring the whole production process, from raw material input to the final product, covers both the production process itself and the human resources that carry it out, and one of the key aspects of this decision-making is the monitoring of human performance. This paper presents an architecture for real-time monitoring of manufacturing activities, including operator performance. As a case study, the assembly of an electro-pneumatic circuit was chosen, and a deep learning model was trained to identify both the hand trajectory and the positions of the objects involved, using assemblies performed by experts, together with their standard times, as references. Deviations from these references can be attributed to the operator's experience or fatigue.
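
As a rough illustration of the idea (a minimal sketch, not the authors' implementation), the code below compares an operator's hand-centre trajectory, as it might be extracted frame by frame from a hand/object detection model, against an expert reference trajectory and the standard assembly time. All function names, thresholds, and units are illustrative assumptions.

```python
# Sketch only: compare an operator's per-frame hand trajectory against an
# expert reference trajectory and the standard assembly time, flagging large
# deviations as possible signs of inexperience or fatigue.
# Names and thresholds are illustrative assumptions, not from the paper.
import numpy as np

def trajectory_deviation(operator_xy: np.ndarray, expert_xy: np.ndarray) -> float:
    """Mean Euclidean distance between two (N, 2) hand-centre trajectories.

    Both trajectories are resampled to a common length so that assemblies
    performed at different speeds remain comparable.
    """
    n = min(len(operator_xy), len(expert_xy))
    idx_op = np.linspace(0, len(operator_xy) - 1, n).astype(int)
    idx_ex = np.linspace(0, len(expert_xy) - 1, n).astype(int)
    return float(np.mean(np.linalg.norm(operator_xy[idx_op] - expert_xy[idx_ex], axis=1)))

def assess_operator(operator_xy, expert_xy, cycle_time_s, standard_time_s,
                    max_deviation_px=40.0, max_time_ratio=1.25):
    """Return simple spatial and temporal deviation flags against the expert reference."""
    spatial_dev = trajectory_deviation(np.asarray(operator_xy), np.asarray(expert_xy))
    time_ratio = cycle_time_s / standard_time_s
    return {
        "spatial_deviation_px": spatial_dev,
        "time_ratio": time_ratio,
        "deviates_spatially": spatial_dev > max_deviation_px,
        "deviates_temporally": time_ratio > max_time_ratio,
    }
```

Resampling both trajectories to a common length is a deliberately simple alignment; a real system might instead use dynamic time warping or per-action segmentation before comparing against the expert reference.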

Author information

Corresponding author

Correspondence to Mauricio-Andres Zamora-Hernandez.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Alfaro-Viquez, D., Zamora-Hernandez, MA., Benavent-Lledo, M., Garcia-Rodriguez, J., Azorín-López, J. (2023). Monitoring Human Performance Through Deep Learning and Computer Vision in Industry 4.0. In: García Bringas, P., et al. 17th International Conference on Soft Computing Models in Industrial and Environmental Applications (SOCO 2022). SOCO 2022. Lecture Notes in Networks and Systems, vol 531. Springer, Cham. https://doi.org/10.1007/978-3-031-18050-7_30
