DOI: 10.1145/3288599.3288635

HydraDoctor: real-time liquids intake monitoring by collaborative sensing

Published: 04 January 2019

Abstract

Water is widely acknowledged as essential to all living things: it is a fundamental necessity for life's activities, and most biochemical reactions in the human body take place in water. The type and quantity of liquids a person consumes each day therefore have a critical impact on health. In this paper, we demonstrate HydraDoctor, a real-time liquid intake monitoring system that detects drinking activities, classifies the category of liquid, and estimates the amount of intake. The system spans multiple platforms: a smartwatch detects hand motion, a smartglass captures images of mugs, a smartphone serves as an edge computing platform, and a remote server handles computationally intensive image processing. HydraDoctor applies several state-of-the-art machine learning techniques. A Support Vector Machine (SVM)-based classifier, trained to detect the hand-raising motion, provides accurate and efficient intake monitoring, and the detection pipeline is optimized to run in situ on the smartwatch. To provide more robust and detailed monitoring, the smartglass is triggered to capture a short video clip of the scene in front of the user whenever a potential drinking activity is detected. The smartglass sends the clip to the remote server via its companion smartphone, where a Faster R-CNN confirms the detected drinking activity and identifies the type of liquid. In real-world experiments, HydraDoctor achieves high accuracy in both drinking activity detection (85.64%) and liquid type classification (84%).
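As a rough illustration of the watch-side stage described in the abstract, the sketch below trains an SVM on hand-crafted features from short wrist-accelerometer windows and uses it to flag candidate hand-raising gestures. The window representation, feature set, and scikit-learn usage here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the smartwatch stage (not the authors' code):
# an SVM flags candidate hand-raising gestures from wrist-accelerometer windows.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def window_features(acc_xyz: np.ndarray) -> np.ndarray:
    """Hand-crafted features for one accelerometer window of shape (T, 3).

    The feature set (per-axis mean/std plus magnitude statistics) is an
    assumption chosen for illustration.
    """
    mean = acc_xyz.mean(axis=0)
    std = acc_xyz.std(axis=0)
    magnitude = np.linalg.norm(acc_xyz, axis=1)
    return np.concatenate(
        [mean, std, [magnitude.mean(), magnitude.std(), magnitude.max() - magnitude.min()]]
    )


def train_detector(windows, labels):
    """windows: list of (T, 3) arrays; labels: 1 = hand raising, 0 = other motion."""
    feats = np.stack([window_features(w) for w in windows])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    clf.fit(feats, labels)
    return clf


def is_candidate_drink(clf, window: np.ndarray) -> bool:
    """Run per incoming window on the watch; True would trigger the smartglass."""
    return bool(clf.predict(window_features(window).reshape(1, -1))[0])
```

In HydraDoctor's design, a positive watch-side prediction is what triggers the smartglass to record a short clip for server-side confirmation. The second sketch shows how that confirmation step might look using an off-the-shelf Faster R-CNN from torchvision; the COCO-pretrained weights and the score threshold are stand-ins, since the paper's detector is trained to recognize the specific containers and liquid types of interest.

```python
# Hypothetical sketch of the server-side stage: a Faster R-CNN checks frames
# from the smartglass clip for drink-related objects. Uses torchvision's
# COCO-pretrained detector as a stand-in for the paper's trained model.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()


def detect_objects(frame_rgb, score_thresh=0.7):
    """frame_rgb: H x W x 3 uint8 array for one frame; returns (labels, boxes)."""
    with torch.no_grad():
        out = model([to_tensor(frame_rgb)])[0]
    keep = out["scores"] > score_thresh
    return out["labels"][keep].tolist(), out["boxes"][keep].tolist()
```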


Cited By

  • (2023) Technology to Automatically Record Eating Behavior in Real Life: A Systematic Review. Sensors 23(18), 7757. DOI: 10.3390/s23187757. Online publication date: 8-Sep-2023
  • (2023) Vision-Based Methods for Food and Fluid Intake Monitoring: A Literature Review. Sensors 23(13), 6137. DOI: 10.3390/s23136137. Online publication date: 4-Jul-2023
  • (2021) Fluid Intake Monitoring Systems for the Elderly: A Review of the Literature. Nutrients 13(6), 2092. DOI: 10.3390/nu13062092. Online publication date: 19-Jun-2021


Published In

ICDCN '19: Proceedings of the 20th International Conference on Distributed Computing and Networking
January 2019
535 pages
ISBN:9781450360944
DOI:10.1145/3288599
  • General Chairs:
  • R. C. Hansdah,
  • Dilip Krishnaswamy,
  • Nitin Vaidya

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. activity recognition
  2. liquid identification
  3. liquid intake monitor

Qualifiers

  • Research-article

Conference

ICDCN '19
Sponsor:
  • SIGOPS
  • Indian Institute of Science

Bibliometrics

Article Metrics

  • Downloads (last 12 months): 20
  • Downloads (last 6 weeks): 3

Reflects downloads up to 01 Jan 2025

