Abstract
Time data management for assembly tasks in manufacturing traditionally relies on stopwatches and video review, making it both labour-intensive and error-prone. This paper explores an approach that uses machine learning (ML) and human pose estimation technologies to automate and enhance the classification and management of time blocks for manual assembly tasks in manufacturing environments. We developed and tested ML models capable of classifying manual assembly actions by converting video clips into time-series coordinate datasets via a human pose estimation library. The research highlights the potential of these technologies to significantly reduce reliance on manual methods by providing a more adaptable, efficient, and scalable system for time data management. Our findings show variation in classification accuracy across different actions, underscoring both the challenges and the potential of integrating ML in real-world manufacturing settings. This study provides a promising direction towards revolutionizing traditional practices and enhancing operational efficiency in manufacturing.
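To make the described pipeline concrete, the following is a minimal sketch of converting a video clip into a pose-coordinate time series and classifying the resulting clip. It assumes a MediaPipe-style pose landmark extractor and a k-nearest-neighbour classifier; the file paths, action labels, fixed-length feature summary, and neighbour setting are illustrative assumptions rather than the paper's exact implementation.

import cv2
import numpy as np
import mediapipe as mp
from sklearn.neighbors import KNeighborsClassifier

mp_pose = mp.solutions.pose

def video_to_pose_series(path):
    # Convert one video clip into a (frames x 66) array of 2D landmark coordinates
    # (33 MediaPipe pose landmarks, x and y per landmark).
    coords_per_frame = []
    cap = cv2.VideoCapture(path)
    with mp_pose.Pose(static_image_mode=False) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.pose_landmarks:  # skip frames with no detected person
                coords_per_frame.append(
                    [c for lm in result.pose_landmarks.landmark for c in (lm.x, lm.y)]
                )
    cap.release()
    return np.array(coords_per_frame)

def clip_features(series):
    # Collapse the variable-length time series into a fixed-length vector
    # (per-coordinate mean and standard deviation); a simplifying assumption.
    return np.concatenate([series.mean(axis=0), series.std(axis=0)])

# Hypothetical training data: clip paths and their assembly-action labels.
train_paths = ["clips/pick.mp4", "clips/place.mp4", "clips/screw.mp4"]
train_labels = ["pick", "place", "screw"]

X = np.stack([clip_features(video_to_pose_series(p)) for p in train_paths])
clf = KNeighborsClassifier(n_neighbors=1).fit(X, train_labels)
print(clf.predict([clip_features(video_to_pose_series("clips/unknown.mp4"))]))

Summary statistics over the coordinate series are only one way to obtain fixed-length features; sequence models operating on the full time series are a natural alternative for actions whose ordering matters.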
Acknowledgments
The authors would like to acknowledge the support of the Swedish Innovation Agency (Vinnova). This study is part of the Time Data Management Automation for Manual Assembly (TIMEBLY) project.
Copyright information
© 2024 IFIP International Federation for Information Processing
Cite this paper
Jeong, Y., Park, D., Gans, J., Wiktorsson, M. (2024). Advanced Time Block Analysis for Manual Assembly Tasks in Manufacturing Through Machine Learning Approaches. In: Thürer, M., Riedel, R., von Cieminski, G., Romero, D. (eds) Advances in Production Management Systems. Production Management Systems for Volatile, Uncertain, Complex, and Ambiguous Environments. APMS 2024. IFIP Advances in Information and Communication Technology, vol 731. Springer, Cham. https://doi.org/10.1007/978-3-031-71633-1_28