DOI: 10.1145/3197768.3201542 · PETRA Conference Proceedings · Research Article

A Hierarchical Approach in Food and Drink Intake Recognition Using Wearable Inertial Sensors

Published: 26 June 2018

Abstract

Despite the increasing attention given to inertial sensors for Human Activity Recognition (HAR), efforts have principally focused on fitness applications, where quasi-periodic activities such as walking or running are studied. In contrast, activities like eating or drinking are neither periodic nor quasi-periodic; they consist of sporadically occurring gestures within continuous data streams. This paper presents an approach to gesture recognition for an Ambient Assisted Living (AAL) environment, focusing on food and drink intake gestures. First, waist-worn tri-axial accelerometer data is used to develop a low-computational-cost model that recognizes whether a person is in a moving, sitting, or standing state. With this information, data from a wrist-worn tri-axial Micro-Electro-Mechanical Systems (MEMS) sensor is used to recognize a set of similar eating and drinking gestures. Promising preliminary results show that the three states can be recognized with 100% classification accuracy using a low-computational-cost model on a reduced 4-dimensional feature vector, while the recognition rate achieved for eating and drinking gestures exceeds 99%. Altogether, this suggests that a continuous monitoring system based on a bi-nodal inertial unit is feasible. This work is part of a larger project that aims to develop a continuous monitoring system for detecting self-neglect in older adults living independently.
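The hierarchical idea in the abstract — a cheap waist-sensor state classifier gating a wrist-sensor gesture classifier — can be sketched as below. This is a minimal illustration, not the paper's actual model: the four features, the threshold rules, and the names `motion_thresh`, `tilt_thresh`, and `gesture_model` are all assumptions introduced here for clarity.

```python
import numpy as np

def waist_features(window):
    """4-D feature vector from a waist accelerometer window of shape (N, 3).
    Hypothetical feature choice; the abstract only states the dimensionality."""
    mag = np.linalg.norm(window, axis=1)           # per-sample acceleration magnitude
    return np.array([
        mag.mean(),                                # overall intensity
        mag.std(),                                 # motion variability
        window[:, 2].mean(),                       # vertical-axis (posture) component
        np.abs(np.diff(mag)).mean(),               # mean magnitude change ("jerkiness")
    ])

def classify_state(features, motion_thresh=0.5, tilt_thresh=0.7):
    """Low-cost threshold rules standing in for the paper's state model."""
    if features[1] > motion_thresh:                # high variability -> locomotion
        return "moving"
    # stationary: separate standing from sitting by the vertical component
    return "standing" if features[2] > tilt_thresh else "sitting"

def hierarchical_pipeline(waist_window, wrist_window, gesture_model):
    """Stage 1 gates stage 2: gesture spotting on the wrist signal is only
    attempted when the subject is stationary (sitting or standing)."""
    state = classify_state(waist_features(waist_window))
    if state == "moving":
        return state, None                         # suppress gesture recognition
    return state, gesture_model(wrist_window)
```

For example, a still standing window (constant 1 g on the vertical axis) passes through to the gesture classifier, while a high-variance window is labeled "moving" and gesture spotting is skipped.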




Published In

PETRA '18: Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference
June 2018
591 pages
ISBN:9781450363907
DOI:10.1145/3197768

In-Cooperation

  • NSF: National Science Foundation

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Ambient Assisted Living
  2. Continuous Monitoring
  3. Deep Learning
  4. Gesture Recognition
  5. HAR
  6. Human Activity Recognition
  7. Wearable Inertial Sensors

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

PETRA '18


Cited By

  • (2024) iFace: Hand-Over-Face Gesture Recognition Leveraging Impedance Sensing. Proceedings of the Augmented Humans International Conference 2024, 131--137. DOI: 10.1145/3652920.3652923. 4 Apr 2024.
  • (2024) Integrated image and sensor-based food intake detection in free-living. Scientific Reports 14, 1. DOI: 10.1038/s41598-024-51687-3. 18 Jan 2024.
  • (2023) A Cross-Day Analysis of EMG Features, Classifiers, and Regressors for Swallowing Events Detection and Fluid Intake Volume Estimation. Sensors 23, 21 (8789). DOI: 10.3390/s23218789. 28 Oct 2023.
  • (2023) Technology to Automatically Record Eating Behavior in Real Life: A Systematic Review. Sensors 23, 18 (7757). DOI: 10.3390/s23187757. 8 Sep 2023.
  • (2023) Passive Sensors for Detection of Food Intake. Encyclopedia of Sensors and Biosensors, 218--234. DOI: 10.1016/B978-0-12-822548-6.00086-8. 2023.
  • (2022) Calico. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6, 3, 1--32. DOI: 10.1145/3550323. 7 Sep 2022.
  • (2022) Let's Grab a Drink: Teacher-Student Learning for Fluid Intake Monitoring using Smart Earphones. 2022 IEEE/ACM Seventh International Conference on Internet-of-Things Design and Implementation (IoTDI), 55--66. DOI: 10.1109/IoTDI54339.2022.00014. May 2022.
  • (2021) Fluid Intake Monitoring Systems for the Elderly: A Review of the Literature. Nutrients 13, 6 (2092). DOI: 10.3390/nu13062092. 19 Jun 2021.
  • (2020) Fluid Intake Monitoring System Using a Wearable Inertial Sensor for Fluid Intake Management. Sensors 20, 22 (6682). DOI: 10.3390/s20226682. 22 Nov 2020.
  • (2020) Deep Learning for Intake Gesture Detection From Wrist-Worn Inertial Sensors: The Effects of Data Preprocessing, Sensor Modalities, and Sensor Positions. IEEE Access 8, 164936--164949. DOI: 10.1109/ACCESS.2020.3022042. 2020.
