
Bento Packaging Activity Recognition from Motion Capture Data

  • Conference paper
  • In: Sensor- and Video-Based Activity and Behavior Computing
  • Part of the book series: Smart Innovation, Systems and Technologies (SIST, volume 291)

Abstract

Human activity recognition (HAR) has been an important research field for more than a decade owing to its versatile applications in different areas, and it has gained significant attention in the health-care domain. Although recognizing food-preparation activities shares similarities with other forms of activity recognition, it poses a unique set of challenges: body movements in a food-preparation environment are considerably smaller than in many other activities of interest in the real world. In this paper, we present a comprehensive approach to recognizing packaging activities for the Bento Packaging Activity Recognition Challenge. We use a dataset obtained from a motion capture system in which each subject wore 13 markers on the upper body, recorded with dedicated cameras and a body suit, yielding around 50,000 samples per activity. We reduce the dimensionality of the data and make it suitable for classification by extracting reliable and efficient features. After feature extraction, three classifiers (random forest, extra trees, and gradient boosting) are compared. On this challenging dataset, the random forest classifier with hyperparameter tuning performs best.
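The comparison described in the abstract can be sketched with scikit-learn (the library the paper itself uses, per the appendix). The synthetic features, the illustrative hyperparameter grid, and the train/test split below are assumptions standing in for the paper's actual windowed MoCap features and search space:

```python
# Sketch: comparing the three ensemble classifiers named in the abstract.
# The features are synthetic stand-ins for the paper's real MoCap features.
from sklearn.datasets import make_classification
from sklearn.ensemble import (RandomForestClassifier, ExtraTreesClassifier,
                              GradientBoostingClassifier)
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           n_classes=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "random forest": RandomForestClassifier(random_state=0),
    "extra trees": ExtraTreesClassifier(random_state=0),
    "gradient boosting": GradientBoostingClassifier(random_state=0),
}
scores = {}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    scores[name] = accuracy_score(y_te, model.predict(X_te))

# Hyperparameter tuning for the best-performing family (random forest in
# the paper); this grid is illustrative, not the paper's actual one.
grid = GridSearchCV(RandomForestClassifier(random_state=0),
                    {"n_estimators": [100, 300], "max_depth": [None, 20]},
                    cv=3)
grid.fit(X_tr, y_tr)
print(scores, grid.best_params_)
```

Cross-validated grid search is the standard way to make the three ensembles comparable on equal footing before selecting one.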



Author information

Correspondence to Jahir Ibna Rafiq.

6 Appendix

Used Sensor Modalities

Motion capture (MoCap)

Features Used

As described in Sect. 3.3 and summarized in Table 1.
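A minimal sketch of how per-window statistical features can be extracted from MoCap marker streams, using NumPy from the library list below. The window length, stride, and choice of statistics here are assumptions for illustration; the paper's actual feature set is the one in its Table 1:

```python
import numpy as np

def window_features(signal, win=100, stride=50):
    """signal: (n_samples, n_channels) array of marker coordinates.
    Returns one feature row per window: mean, std, min, max per channel."""
    feats = []
    for start in range(0, len(signal) - win + 1, stride):
        w = signal[start:start + win]
        feats.append(np.concatenate([w.mean(0), w.std(0), w.min(0), w.max(0)]))
    return np.asarray(feats)

# 13 upper-body markers x 3 coordinates = 39 channels, as in the challenge.
mocap = np.random.default_rng(0).normal(size=(1000, 39))
F = window_features(mocap)
print(F.shape)  # (19, 156): 19 windows, 4 statistics x 39 channels
```

Windowing turns a long, high-dimensional time series into a fixed-width feature matrix that tree-based classifiers can consume directly.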

Programming Language and Libraries

Programming language: Python

Libraries: NumPy, Pandas, Matplotlib, scikit-learn, SciPy

Machine Specification

  • RAM: 8 GB

  • Processor: 2.2 GHz Dual-core Intel Core i7

  • GPU: N/A

Training and Testing Time

Training: 10.8 min

Testing: 3 min


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Rafiq, J.I., Nabi, S., Amin, A., Hossain, S. (2022). Bento Packaging Activity Recognition from Motion Capture Data. In: Ahad, M.A.R., Inoue, S., Roggen, D., Fujinami, K. (eds) Sensor- and Video-Based Activity and Behavior Computing. Smart Innovation, Systems and Technologies, vol 291. Springer, Singapore. https://doi.org/10.1007/978-981-19-0361-8_15
