ABSTRACT
With the recent revival of human labour in industry, and the accompanying push to optimally combine the strengths of humans and machines in industrial processes, there is a growing need for methods that allow machines to understand and interpret the actions of their users. An important aspect of this is understanding and evaluating the progress of the workflows to be executed. Such methods require both an appropriate choice of sensors and algorithms capable of quickly and efficiently evaluating activity and workflow progress.
In this paper we present such an algorithm, which performs activity and workflow recognition using both depth and RGB cameras as input. The algorithm's main purpose is to serve in an industrial training station, allowing novice workers to learn the steps of assembling Nordic ski products without the need for human supervision. We describe how the algorithm recognizes predefined workflows in the sensor data, and present a comprehensive evaluation of its performance on real recordings of operators performing their work in an industrial setting. We show that the algorithm fulfills the necessary requirements and is ready to be deployed in the training station application.
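The paper does not publish its implementation, but the core idea of tracking a predefined workflow from two camera-based classifiers can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the authors' algorithm: the class and method names (`WorkflowTracker`, `observe`), the confidence-weighted voting used to fuse the RGB and depth channels, and the evidence threshold are all hypothetical.

```python
"""Illustrative sketch: fusing per-frame RGB and depth activity labels
to track progress through an ordered, predefined workflow.
All names, thresholds, and the voting scheme are assumptions."""

from dataclasses import dataclass, field


@dataclass
class WorkflowTracker:
    """Tracks progress through a predefined, ordered list of workflow steps."""

    steps: list[str]                       # e.g. the assembly steps of a ski product
    current: int = 0                       # index of the step expected next
    votes: dict[str, float] = field(default_factory=dict)
    threshold: float = 1.5                 # accumulated confidence needed to accept a step

    def observe(self, rgb_label: str, rgb_conf: float,
                depth_label: str, depth_conf: float) -> str | None:
        """Fuse one RGB and one depth classification for the same frame.

        Each sensor casts a confidence-weighted vote; once the expected
        step has accumulated enough evidence, the workflow advances.
        Returns the completed step, or None if no step completed.
        """
        if self.done:
            return None
        for label, conf in ((rgb_label, rgb_conf), (depth_label, depth_conf)):
            self.votes[label] = self.votes.get(label, 0.0) + conf

        expected = self.steps[self.current]
        if self.votes.get(expected, 0.0) >= self.threshold:
            self.votes.clear()
            self.current += 1
            return expected
        return None

    @property
    def done(self) -> bool:
        return self.current >= len(self.steps)


if __name__ == "__main__":
    tracker = WorkflowTracker(steps=["pick_base", "glue_layers", "press", "trim_edges"])
    # Simulated per-frame classifier outputs: (rgb label, rgb conf, depth label, depth conf).
    frames = [
        ("pick_base", 0.9, "pick_base", 0.8),
        ("glue_layers", 0.4, "pick_base", 0.3),   # sensors disagree; evidence accumulates
        ("glue_layers", 0.9, "glue_layers", 0.7),
        ("glue_layers", 0.8, "glue_layers", 0.9),
    ]
    for rgb_l, rgb_c, d_l, d_c in frames:
        if (step := tracker.observe(rgb_l, rgb_c, d_l, d_c)):
            print(f"completed step: {step}")
```

Requiring accumulated evidence before advancing is one simple way to make such a tracker robust to single-frame misclassifications; the paper's actual fusion and segmentation strategy may differ.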