Situvis: A sensor data analysis and abstraction tool for pervasive computing systems
Introduction
Context-aware pervasive systems are designed to support a user’s goals by adapting their behaviour in response to the user’s activities or circumstances. An example of such a system is that of Miele et al., where context can be associated with user preferences for information retrieval, meaning that those preferences are only applied in the specified contexts [1]. The accuracy and utility of these adaptations are predicated on the system’s ability to capture and recognise these circumstances as they occur. To achieve this, a system designer must characterise these adaptation opportunities by collecting context data from multiple heterogeneous sensors, which may be networked physical instruments in the environment (measuring factors like temperature, noise volume or humidity), software sensors retrieving information from the web or various data feeds, or wearable sensors measuring factors such as acceleration or object use. These context data are voluminous, highly multivariate, and constantly updated as new readings are recorded.
To better manage this complexity, we can use data abstraction to shield developers from having to deal with raw sensor data—data that often requires a steep learning curve to interpret, such as accelerometer data or 3D-coordinate location data. One active research area in this direction uses machine learning techniques to perform activity recognition—activities being higher-level interpretations of raw sensor data, representing objective actions such as cooking and walking. Situations have been proposed as another high-level abstraction of context data [2]. They symbolically define commonly-experienced occurrences such as a user “taking a coffee break”, or being “in a research meeting”, without requiring the user to understand any of the dozens of distinct sensor readings which may have gone into making up these situations. Situations are thus a natural view of a context-aware system, whereas the individual pieces of context are each “a measurable component of a given situation” [3]. From a software engineering perspective, we define situations in terms of high-level context: data that has been encapsulated to a level of understanding appropriate for a developer specifying a situation (e.g., symbolic locations), as opposed to a physics expert (e.g., 3D-coordinates).
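A situation specification of this kind can be read as a conjunction of constraints over high-level context attributes. The following minimal Python sketch illustrates the idea; the class and attribute names are our own illustrative choices, not part of Situvis.

```python
from dataclasses import dataclass

@dataclass
class Constraint:
    attribute: str        # a high-level context attribute, e.g. "location"
    allowed: object       # a set of symbolic values, or a (low, high) range

    def satisfied_by(self, value) -> bool:
        # Range constraints use a (low, high) tuple; symbolic constraints
        # use a set of permitted values.
        if isinstance(self.allowed, tuple):
            low, high = self.allowed
            return value is not None and low <= value <= high
        return value in self.allowed

@dataclass
class Situation:
    name: str
    constraints: list

    def matches(self, context: dict) -> bool:
        # The situation holds when every constraint is satisfied by the
        # current context reading.
        return all(c.satisfied_by(context.get(c.attribute))
                   for c in self.constraints)

# "In a research meeting", expressed over three context attributes:
meeting = Situation("in a research meeting", [
    Constraint("location", {"meeting room"}),
    Constraint("people_present", (2, 12)),
    Constraint("noise_level", (30.0, 70.0)),
])

print(meeting.matches({"location": "meeting room",
                       "people_present": 5,
                       "noise_level": 55.0}))   # True
```

The point is that a developer reasons over symbolic locations and counts, never over the raw 3D coordinates or waveforms from which those attributes were derived.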
Thomson et al. observe that there are two approaches to situation determination: manual specification and machine learning-based approaches [4]. The manual specification approach suffers from complexity. Because the context information available to a context-aware system at any moment is so extensive, dynamic and high-dimensional, it is a significant challenge for a system observer to ascribe significance to changes in the data or identify emergent trends, much less capture the transient situations that are occurring amid the churn of data.
Machine learning-based approaches are insufficient due to the extensive training data required. Many situations are subjective and hence require a degree of personalisation. We believe that it is unrealistic to assume that every user of a context-aware system will go through the long and tedious training process required for supervised learning techniques. Here, we propose a hybrid approach that utilises minimal training data to frame a situation specification, combined with relevant visualisations that simplify the manual process of fine-tuning.
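A rough sketch of this hybrid idea, under the assumption that a specification is a set of per-attribute value ranges (the function and attribute names are hypothetical, not the Situvis API):

```python
def bootstrap_spec(traces):
    """Frame a first-cut specification as the min/max envelope of each
    attribute observed while the annotated situation was occurring."""
    spec = {}
    for trace in traces:
        for attribute, value in trace.items():
            low, high = spec.get(attribute, (value, value))
            spec[attribute] = (min(low, value), max(high, value))
    return spec

# Three short traces annotated by the user as "taking a coffee break":
coffee_traces = [
    {"noise_level": 48.0, "people_present": 2},
    {"noise_level": 55.0, "people_present": 3},
    {"noise_level": 51.0, "people_present": 2},
]

spec = bootstrap_spec(coffee_traces)
print(spec)  # {'noise_level': (48.0, 55.0), 'people_present': (2, 3)}

# The developer then fine-tunes by hand, e.g. widening bounds where the
# few samples are too tight:
spec["people_present"] = (1, 4)
```

The minimal training data frames the specification; the visual, manual step then corrects for the small sample rather than demanding weeks of labelled data.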
Existing work has applied the coupling of data- and user-driven processes to carry out difficult tasks. In particular, the general Interactive Machine Learning (IML) model consists of iterations of classical machine learning followed by refinement through interactive use. In the Crayons project [5], for example, users build classifiers for image-based perceptual user interfaces using a novel IML model that involves iterative user interaction to minimise the feature set and build a decision tree. Moreover, Dey’s a CAPpella is a prototyping environment, aimed at end-users, for context-aware applications [6]. It uses a programming-by-demonstration approach, combining machine learning and user interaction, to allow end-users to build complex context-aware applications without having to write any code.
The visualisation of large and complex multivariate data sets, such as those that context-aware system developers work with, is becoming increasingly crucial in helping those developers to organise and distill data into usable information [7]. Interactive visualisation tools help the viewer perform visual data analysis tasks: exploring patterns and highlighting and defining filters over interesting data.
Situvis is our scalable interactive visualisation tool for pervasive systems [8]. By presenting sensor data through effective, intuitive visualisations combined with simple interactive controls, Situvis allows users to quickly identify interesting features of the data. By incorporating real situation traces and annotations as ground truth, Situvis assists system developers in constructing and evaluating accurate situation specifications, essentially bootstrapping the manual process. This affords them a better understanding of the situation space and of the reliability of modelling with situations based on real, recorded sensor data. It is a framework that allows developers to understand, at a high level, how their system will behave given certain inputs.
The following section provides some background for our work. In Section 3, we describe the details of the Situvis tool. Section 4 features an evaluation of Situvis through a user-study in which it is compared to an improvised alternative toolset. Finally, we conclude in Section 5 and present some potential future work.
Section snippets
Abstract models and activity recognition
Abstract models are often constructed in order to generalise concrete concepts by capturing their common properties and structure. Besides abstracting situations from context data, other abstract modelling research occurs in the literature. Penta et al. [9] introduce ontologies for representing and reasoning about the high-level content of multimedia data, particularly images. High-level knowledge, such as the fact that an image illustrates a jockey riding a racehorse, can be inferred from
Situvis
The first version of Situvis [8] was developed in Processing [27], a Java-based visualisation framework that supports rapid prototyping of visualisation techniques. The current version was re-developed in Java to make it more extensible. Situvis is open-source software and can be downloaded from situvis.com.
Situvis uses two visualisation techniques, a Parallel Coordinates
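The core of the Parallel Coordinates technique is that each multivariate sensor reading becomes a polyline crossing one vertical axis per attribute, with each axis normalised to its own value range. A minimal Python sketch of that normalisation step (illustrative only; Situvis itself is a Java tool, and these names are not its API):

```python
def normalise(readings):
    """Map each reading to polyline heights in [0, 1], one height per
    attribute axis, so readings with different units share one plot."""
    attributes = sorted(readings[0])
    lo = {a: min(r[a] for r in readings) for a in attributes}
    hi = {a: max(r[a] for r in readings) for a in attributes}
    polylines = []
    for r in readings:
        heights = []
        for a in attributes:
            span = hi[a] - lo[a]
            # Constant attributes are drawn at mid-axis.
            heights.append((r[a] - lo[a]) / span if span else 0.5)
        polylines.append(heights)
    return attributes, polylines

readings = [
    {"temperature": 19.5, "noise": 40.0, "people": 1},
    {"temperature": 22.0, "noise": 65.0, "people": 6},
    {"temperature": 21.0, "noise": 52.0, "people": 3},
]
axes, lines = normalise(readings)
print(axes)      # ['noise', 'people', 'temperature']
print(lines[0])  # [0.0, 0.0, 0.0]
```

Overlaying many such polylines is what lets clusters of similar readings, and hence candidate situations, stand out visually.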
Experiments & results
To the best of the authors’ knowledge, no tools exist that are purpose-built to assist users in analysing and creating abstractions of pervasive system sensor data. We are aware of tools that allow data to be played back in real-time, such as PlaceLab’s Handlense tool and Dey’s a CAPpella [6]; however, these tools are purpose-built for other tasks and do not allow the user to explore the data set as a whole in an efficient manner. An improvised toolset could consist of the following: a tabular
Conclusion and future work
Situvis provides context-aware application developers with a means to understand vast datasets of sensor data in order to efficiently construct specifications of situations to be used as cues for context-aware applications. An integrated programming environment and sensor data representation means that a developer can create specifications by constraining sensors in response to real data traces produced by the system. Results show, with statistical significance, that Situvis can be used to
Acknowledgements
This work is partially funded under The Embark Initiative of the Irish Research Council for Science, Engineering and Technology, and by Science Foundation Ireland under grant number 03/CE2/I303-1, “LERO: the Irish Software Engineering Research Centre”.
References (29)
- Miele et al., A methodology for preference-based personalization of contextual data
- K. Henricksen, A framework for context-aware pervasive computing applications, Ph.D. thesis, The School of Information...
- S. Knox, A.K. Clear, R. Shannon, L. Coyle, S. Dobson, A. Quigley, P. Nixon, Towards Scatterbox: a context-aware message...
- Thomson et al., A self-managing infrastructure for ad-hoc situation determination
- et al., A design tool for camera-based interaction
- Dey et al., A cappella: programming by demonstration of context-aware applications
- Data visualization: picture this, Nature (2002)
- A.K. Clear, R. Shannon, T. Holland, A. Quigley, S. Dobson, P. Nixon, Situvis: a visual tool for modeling a user’s...
- Penta et al., Multimedia knowledge management using ontologies
- C. Ghezzi, A. Mocci, M. Monga, Synthesizing intensional behavior models by graph transformation, in: ICSE’09, 2009, pp....