Abstract
The rise of Industry 4.0 and its convergence with business process management provide new potential for the automatic gathering of process-related sensor information. In manufacturing, information about human behavior in manual assembly tasks is rare when no interaction with machines is involved. We suggest technologies to automatically detect material picking and placement in the assembly workflow, gathering accurate data about human behavior and enabling flexible support of human–process interaction. The detection of material picking is achieved by using background subtraction in combination with scales. For placement detection, two approaches are tested: image classification using convolutional neural networks and object detection using Haar wavelets. The detected fine-grained worker activities are then correlated with a hybrid model of the assembly workflow using the Business Process Model and Notation (BPMN) and the Case Management Model and Notation (CMMN), enabling the measurement of production time (time per state) and quality (frequency of errors) on the shop floor as an entry point for conformance checking and process optimization. The approach has been evaluated in a quantitative case study recording the assembly process 30 times in a laboratory setup within 4 h. Under these conditions, the classification of assembly states using a neural network achieves a test accuracy of 99.25% across 38 possible assembly states. Material picking based on background subtraction has been evaluated in an informal user study with six participants performing 16 picks each, yielding an accuracy of 99.48%. The suggested method offers a promising approach to easily assess fine-grained timings and error rates of assembly steps, which can be used to optimize the corresponding process.
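To illustrate the picking-detection idea described above, the following is a minimal sketch of background subtraction over a material-box region of interest. It is not the authors' implementation (which combines vision with scales and, per the references, builds on the adaptive per-pixel model of Zivkovic and van der Heijden); here a simple running-average background model in NumPy stands in, and the class name, ROI layout, and thresholds are illustrative assumptions.

```python
import numpy as np

class BackgroundSubtractor:
    """Toy running-average background model (stand-in for e.g. MOG2):
    pixels deviating from the learned background beyond a threshold
    are marked as foreground."""

    def __init__(self, alpha=0.05, threshold=25.0):
        self.alpha = alpha          # learning rate of the background model
        self.threshold = threshold  # per-pixel intensity deviation for foreground
        self.background = None

    def apply(self, frame):
        """Return a boolean foreground mask for a grayscale frame."""
        frame = frame.astype(np.float32)
        if self.background is None:
            self.background = frame.copy()
        mask = np.abs(frame - self.background) > self.threshold
        # Update the model only where the scene is judged static,
        # so a hand reaching into a box is not absorbed into the background.
        self.background = np.where(
            mask,
            self.background,
            (1 - self.alpha) * self.background + self.alpha * frame,
        )
        return mask

def pick_detected(mask, roi, ratio=0.2):
    """Signal a pick when enough foreground pixels fall inside the
    material-box region of interest (x, y, width, height)."""
    x, y, w, h = roi
    window = mask[y:y + h, x:x + w]
    return np.count_nonzero(window) / window.size > ratio
```

In the paper's setup, such a vision-based signal is additionally cross-checked against scales under the material boxes, so a pick is only registered when both the foreground activity and a weight change agree.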
Acknowledgements
This research was funded in part by the German Federal Ministry of Education and Research under grant number 01IS16022E (project BaSys4.0). The responsibility for this publication lies with the authors. The authors thank Mettler Toledo for providing the hardware setup used for inventory control in this research.
Cite this article
Knoch, S., Herbig, N., Ponpathirkoottam, S. et al. Sensor-based Human–Process Interaction in Discrete Manufacturing. J Data Semant 9, 21–37 (2020). https://doi.org/10.1007/s13740-019-00109-z