Abstract:
Fluid intake is important information for many health and assisted-living applications, yet it is inherently difficult to monitor. Existing reliable solutions require augmented drinking containers, which severely limits their applicability. In this paper we investigate two key components of an unobtrusive, wearable solution that is independent of any particular drinking container or environment. We first describe a system for spotting individual instances of drinking (lifting a container to the mouth and taking a single sip) in a continuous stream of data from a wrist-worn acceleration sensor. We show that the drinking motion can be detected across different containers (glass, cup, large beer mug, bottle) on a large dataset (560 drinking motion instances from six users, embedded in 5.84 hours of complex natural activities). Drinking motion spotting achieved an average recall of 84% at 94% precision. Building on the spotted events, we show how additional information can be obtained: specifically, we demonstrate recognition of container type and fluid level from upper-body postures during drinking events. With three users, nine container types and three fluid levels were evaluated; the recognition rate was 75% for container type and 72% for fluid level.
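The abstract does not specify the spotting algorithm itself. As a rough illustration of the idea (detecting a container being lifted to the mouth from wrist-worn acceleration), the sketch below thresholds the wrist pitch angle estimated from the gravity component of a 3-axis accelerometer; all function names, thresholds, and the synthetic trace are assumptions for illustration, not the paper's method.

```python
import math

def pitch_deg(ax, ay, az):
    # Forearm tilt estimated from the gravity direction (degrees).
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

def spot_drinking(samples, rate_hz=50, min_dur_s=1.0, pitch_thresh=45.0):
    """Return (start, end) sample indices of candidate drinking motions:
    segments of sustained high wrist pitch, as when a container is
    tilted toward the mouth. Thresholds are illustrative, not tuned."""
    events, start = [], None
    for i, (ax, ay, az) in enumerate(samples):
        tilted = pitch_deg(ax, ay, az) > pitch_thresh
        if tilted and start is None:
            start = i
        elif not tilted and start is not None:
            if (i - start) / rate_hz >= min_dur_s:
                events.append((start, i))
            start = None
    if start is not None and (len(samples) - start) / rate_hz >= min_dur_s:
        events.append((start, len(samples)))
    return events

# Synthetic trace at 50 Hz: 2 s rest, 2 s tilted (a sip), 2 s rest.
rest = [(0.0, 0.0, 1.0)] * 100   # gravity along z: wrist level
sip = [(0.9, 0.0, 0.44)] * 100   # gravity mostly along x: wrist tilted
trace = rest + sip + rest
print(spot_drinking(trace))  # → [(100, 200)]
```

A real system would have to reject the many everyday gestures that also raise the wrist (eating, phone use), which is why the paper evaluates spotting inside hours of complex natural activity rather than on isolated gestures.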
Published in: 2010 8th IEEE International Conference on Pervasive Computing and Communications Workshops (PERCOM Workshops)
Date of Conference: 29 March 2010 - 02 April 2010
Date Added to IEEE Xplore: 24 May 2010