1 Introduction and Related Work

Situation- and context-aware computing has an established tradition in the areas of distributed mobile computing [14] and interactive systems [6]. Challenging new application domains range from sport and fitness apps, driver assistance systems, e-commerce, digital marketing, and other business applications to computer-based learning environments. Such applications require a new type of software engineering support, as postulated in [3]. We present the SitAdapt 2.0 modeling and design approach for such situation-aware software systems, which has been refined over the years in an evolutionary process [13].

The current implementation of our approach is mainly directed towards web applications and deals with the construction of user-centered interactive software that can dynamically adapt to the situations of its users and guide them toward meeting their objectives and successfully finishing their tasks. The main focus of this paper is the adaptation process and the detailed modeling of situation-aware adaptations.

1.1 Different Types of Adaptations

The PaMGIS framework [5] that is used by the SitAdapt 2.0 system offers development and runtime support for multiple-adaptive migratory user interfaces. It can react in real time to context changes in the user’s environment as well as to situational changes that include the user’s behavior and emotional state. In the field of user interfaces, three different categories of adaptations have traditionally been distinguished [1, 15]:

  • Adaptable user interfaces. The user customizes the user interface to his or her personal preferences.

  • Semi-automated adaptive user interfaces. The user interface provides recommendations for adaptations. The user then decides whether or not to accept a recommendation.

  • Automated adaptive user interfaces. The user interface automatically reacts to changes in the context-of-use.

Adding the concept of situations, which captures and exploits the variability of the user’s cognitive and emotional states, to the adaptation modeling process makes a new category of individual adaptations possible. With SitAdapt 2.0, such situation-aware adaptations can be modeled by the developer and generated at runtime.

The SitAdapt 2.0 runtime system can adapt the user interface in different ways:

  • Changes in content. The system can adapt button labels and offer help texts or a chat window to different types of users. In the e-commerce domain, specific advertising and offers, as well as additional elements or popups such as a voucher, can be presented to the user if the evaluation of the situation recommends such adaptations. Different users can also be offered different content, for example more or less detailed text information or product images, or a more or less detailed content presentation achieved by adding or removing content attributes such as product comments.

  • Changes in design. SitAdapt 2.0 can adapt the colors, images, and contrast settings of a program or website, or change the text layout, font size, and typeface. The sequence or layout of the website elements can also be changed for different users.

  • Changes of media and interaction object types. Such changes modify the way information is presented. Situation-aware adaptations can choose between different types of charts (e.g., mosaic or bar charts) or introduce personalized areas or functions (areas with functions frequently used by the individual). Other possible adaptations are the replacement of a form with a wizard or changing the input modality from text to speech.
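To make the three kinds of runtime changes concrete, they could be represented as typed adaptation actions. The following Python sketch is purely illustrative; the class names and fields are our own invention, not part of the SitAdapt 2.0 implementation:

```python
from dataclasses import dataclass
from enum import Enum, auto

class AdaptationKind(Enum):
    CONTENT = auto()  # e.g., help texts, offers, product attributes
    DESIGN = auto()   # e.g., colors, contrast, font size, element order
    MEDIA = auto()    # e.g., bar chart -> mosaic chart, form -> wizard

@dataclass
class Adaptation:
    kind: AdaptationKind
    target: str    # the UI element the change applies to
    change: dict   # concrete modification parameters

# A situation rule could emit adaptations of any of the three kinds:
show_help = Adaptation(AdaptationKind.CONTENT, "cvc_field",
                       {"helptext": "3-digit code on the back of the card"})
enlarge = Adaptation(AdaptationKind.DESIGN, "body", {"font_size": "120%"})
```

Typing the adaptations this way would let a runtime component dispatch each change to the appropriate rendering mechanism.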

Depending on the adaptation rules and the data available about the user in the recorded situation profile, the SitAdapt 2.0 system can modify the user interface at different times:

  • Before the first display of the user interface,

  • while interacting with the user interface, or,

  • when the user accesses the interface again, at a later time.

1.2 Adaptive User Interface Approaches

Some related approaches for adaptive user interfaces focus either on adaptation or migration to other devices and platforms, or also cover user-related context adaptations. Two interesting related approaches are the SUPPLE and Adapt-UI systems, which are discussed in the following paragraphs.

SUPPLE was developed at the University of Washington in Seattle; the first publication dates from 2004 [9]. The approach evaluates functional interface specifications as well as device and user models [9]. The generation and adaptation of user interfaces is treated as a decision-theoretic optimization problem.

SUPPLE searches for optimal renditions considering any relevant device constraints and minimizing the user’s effort required to carry out the necessary user interface actions [9].

SUPPLE adapts the user interface to the individual work style [11] as well as to the habits of the user, for example by providing an extra individual area for frequently used functions of a pocket calculator [7]. The necessary generation and adjustment of the user interface happens at runtime [11]. SUPPLE++, a variant of the SUPPLE system developed in 2007 [10], supports automatic customization for users with visual and motor impairments [8]. SUPPLE needs three input sources for the creation of adaptive user interfaces: an interface specification (I), a device specification (D), and a user model represented by user traces (T) [8].

Within SUPPLE, a functional interface specification is defined as a set of interface elements and a set of interface constraints. The elements are specified in terms of their data types, which can be either primitive or complex. The constraints are expressed as functions mapping renderings to a Boolean value and make it possible, for instance, to map certain elements to the same widget. The device model comprises the available widgets, device-related constraints, and two device-specific functions for evaluating the adequacy of the widgets to be used.

One function measures the appropriateness of the widgets for interacting with the variables of the given types, while the other calculates the user’s effort required for navigating through the user interface. The user model is defined by means of user traces, which are logs of user actions recorded at runtime. SUPPLE aims at finding the most appropriate rendering for each individual abstract interface element. This is achieved by means of a branch-and-bound algorithm that minimizes a cost function composed of the previously mentioned functions and information from the device and user models [9]. The cost function consists of more than 40 interrelated parameters and cannot easily be determined manually; therefore, a tool named ARNAULD has been developed to facilitate this process [12]. SUPPLE++ utilizes even more complex cost functions in order to take the motor and visual impairments of its users into account. In analogy to ARNAULD, SUPPLE++ is supported by a tool named Activity Modeler [7].
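As a rough illustration of the decision-theoretic idea behind SUPPLE (not its actual implementation), the following Python sketch assigns one widget to each interface element via branch-and-bound, minimizing a toy cost function under a screen-size budget. All element names, widgets, costs, and sizes are invented; the real SUPPLE cost function has more than 40 parameters:

```python
def best_rendition(elements, widgets, cost, size, budget):
    """Assign one widget per element, minimizing total cost while the
    summed widget sizes stay within the device's size budget."""
    best = {"cost": float("inf"), "assignment": None}

    def bound(i, partial_cost):
        # Lower bound: cost so far plus the cheapest widget for each
        # remaining element, ignoring the size constraint.
        return partial_cost + sum(min(cost[e][w] for w in widgets[e])
                                  for e in elements[i:])

    def search(i, partial, partial_cost, used_size):
        if bound(i, partial_cost) >= best["cost"]:
            return  # prune: this branch cannot beat the incumbent
        if i == len(elements):
            best["cost"], best["assignment"] = partial_cost, dict(partial)
            return
        e = elements[i]
        for w in sorted(widgets[e], key=lambda w: cost[e][w]):
            if used_size + size[w] <= budget:
                partial[e] = w
                search(i + 1, partial, partial_cost + cost[e][w],
                       used_size + size[w])
                del partial[e]

    search(0, {}, 0.0, 0)
    return best["assignment"], best["cost"]
```

For example, with two elements ("volume", "power"), candidate widgets with per-element costs, and a size budget, the search returns the cheapest feasible combination while pruning branches whose lower bound already exceeds the best solution found.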

Adapt-UI [16], like SitAdapt 2.0, offers runtime adaptation of the user interface. The system focuses on context-of-use adaptations when migrating to other devices and platforms, but also manages some user-related aspects. It includes models of the user interface, the context, and the possible adaptations. Context changes trigger adaptation rules. Such rules can show or hide pre-modeled UI elements, change navigation paths in pre-modeled ways, and react to a few simple user-related aspects that can be provided by a face detection library.

2 The PaMGIS Framework

The PaMGIS (Pattern-Based Modeling and Generation of Interactive Systems) framework architecture (Fig. 1) combines a situation analytics platform with pattern- and model-based user interface construction tools in order to build runtime-adaptive interactive applications with enhanced user experience and task-accomplishment characteristics [5]. The framework uses the ontological domain and context models proposed by the CAMELEON reference framework (CRF) [2], the most common standard architecture for the model-driven construction of interactive systems. The CRF also includes structural guidelines for adapting the target software in predefined ways, mainly for responding to the requirements of different platforms and for migrating an application from one device to another.

Fig. 1.
figure 1

Overview of the PaMGIS models and their interrelations

2.1 PaMGIS Models

The abstract user interface model (AUI) is generated from the information contained in the domain model of the application, which includes both a task model and a concept model. The AUI mainly includes the specifications of the abstract user interface objects.

In the domain model and the originally rendered AUI the user interface is still independent of the usage context.

After the completion of AUI modeling, the AUI model can be transformed into a concrete user interface model (CUI). This process exploits the information of the context model and the structure of the dialog model. For defining the dynamic aspects of the user interface, PaMGIS uses a dialog model based on dialog graphs that were originally introduced by the TADEUS system [4].

In the next step, the final user interface model (FUI) is generated automatically from the CUI model. Depending on the target implementation language, the FUI must either be compiled or can be executed directly by an interpreter (Execute UI). The models are specified in conformance with the Extensible Markup Language (XML) [5].
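The AUI-to-CUI-to-FUI chain can be illustrated with a deliberately simplified Python sketch. PaMGIS itself specifies its models in XML; the model structures, mapping table, and rendering templates below are invented for illustration only:

```python
# Abstract UI: interaction objects, still independent of the usage context.
aui = [
    {"id": "departure", "type": "abstract_input", "datatype": "date"},
    {"id": "search", "type": "abstract_trigger"},
]

# Context model: here reduced to the target platform.
context = {"platform": "desktop"}

# AUI -> CUI: choose a concrete interaction object per abstract object,
# guided by the context model.
CUI_MAPPING = {
    ("abstract_input", "date", "desktop"): "date_picker",
    ("abstract_input", "date", "mobile"): "native_date_dialog",
    ("abstract_trigger", None, "desktop"): "button",
    ("abstract_trigger", None, "mobile"): "button",
}

def to_cui(aui, context):
    return [{"id": o["id"],
             "widget": CUI_MAPPING[(o["type"], o.get("datatype"),
                                    context["platform"])]}
            for o in aui]

# CUI -> FUI: emit markup in the target implementation language
# (here plain HTML strings, standing in for the generated final UI).
def to_fui(cui):
    render = {"date_picker": '<input type="date" id="{id}">',
              "button": '<button id="{id}">{id}</button>'}
    return "\n".join(render[o["widget"]].format(id=o["id"]) for o in cui)
```

The point of the sketch is the separation of concerns: the abstract model stays context-free, the CUI step consumes the context model, and the FUI step commits to a concrete implementation language.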

2.2 SitAdapt 2.0 Components

SitAdapt 2.0 (Fig. 2) is an integrated software system for enabling situation-aware real-time adaptations for web and mobile applications. The Interpreter is included in the PaMGIS framework (Fig. 1).

Fig. 2.
figure 2

Overview of the SitAdapt 2.0 system architecture and components

SitAdapt 2.0 consists of the following parts:

  • The data interfaces use the different APIs of the devices (eye tracker, wristband, facial expression recognition software, application metadata) to collect data about the user. SitAdapt 2.0 uses two different data types, received from the different input devices (Fig. 2), for the generation and adaptation of the user interface: atomic data types for constant attribute values (e.g., age = 30 or glasses = true) and temporal data types, for instance blood pressure or eye positions. Temporal data management makes it possible to document and analyze changes in the recorded data by time-stamping them, so that it can be reconstructed which value was valid at what time. With the aid of the SitAdapt 2.0 rule editor (Fig. 2), these atomic and temporal data can be used to create rules that affect the adaptation of the user interface.

  • The recording component synchronizes the different input records with a timestamp. Table 1, for instance, lists the attribute value ranges that can be received from the eye tracking system API.

    Table 1. Data input from the eye tracking system
  • The database writer stores the data from the recording component and from the browser in the database, where the raw situations and situation profiles are managed. It also controls the communication with the rule editor.

  • The rule editor allows the definition and modification of situation rules, e.g., for specifying the different user states and the resulting actions. The rule editor is very flexible and can use all input data types and attribute values, as well as their temporal changes, for formulating rule conditions. At runtime, rules are triggered by the situation analytics component for adapting the user interface if the conditions of one or more rules apply. However, situation rules can also activate HCI patterns in the pattern repository. These patterns come at different levels of abstraction: they may contain concrete templates for generating interaction objects or information that describes how low-level user interface attributes can be modified.

  • The situation analytics component analyzes and assesses situations by exploiting the observed data. Situation rules are triggered by the situation analytics component when the rule conditions are satisfied. Situation rules interact with the situation profiles stored in the SitAdapt 2.0 database. The rule actions either directly trigger simple adaptations or interact with the PaMGIS resources as described above.

  • The evaluation and decision component uses the data provided by the situation analytics component to decide whether an adaptation of the user interface is currently meaningful and necessary. For this purpose, the component evaluates one or more applicable situation rules and has to resolve possible conflicts between them. Whether an adaptation is meaningful depends on the predefined purpose of the situation-aware target application. Such goals can be detected if one or more situations in the situation profile trigger an application-dependent or domain-independent situation rule. Situation rules are related to patterns: they define behavioral and context-related situational patterns. If the decision component decides that a complex adaptation is necessary, it provides the artifacts from the PaMGIS pattern and model repositories to allow for the modification of the target application by the adaptation component.

  • The adaptation component generates the necessary modifications of the interactive target user application.
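The interplay of atomic and temporal data with situation rules, as described above, can be sketched in a few lines of Python. The attribute names and the rule API are hypothetical, not SitAdapt 2.0’s actual interfaces:

```python
import time
from dataclasses import dataclass, field

@dataclass
class SituationProfile:
    atomic: dict = field(default_factory=dict)    # constant attribute values
    temporal: dict = field(default_factory=dict)  # name -> [(timestamp, value)]

    def record(self, name, value, ts=None):
        # Time-stamp each incoming value (appended in chronological order).
        self.temporal.setdefault(name, []).append((ts or time.time(), value))

    def value_at(self, name, ts):
        """Reconstruct which value was valid at a given point in time."""
        valid = [v for t, v in self.temporal.get(name, []) if t <= ts]
        return valid[-1] if valid else None

@dataclass
class SituationRule:
    name: str
    condition: callable  # SituationProfile -> bool
    action: callable     # () -> adaptation to perform

def evaluate(rules, profile):
    # Situation analytics: trigger every rule whose condition applies.
    return [r.action() for r in rules if r.condition(profile)]
```

A rule defined in the rule editor would then correspond to a condition over atomic values and temporal histories, paired with an adaptation action.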

2.3 SitAdapt 2.0 at Work

For the SitAdapt 2.0 system implementation, we developed a prototypical travel-booking website to highlight and evaluate some of the system’s capabilities. The application features elements typical of e-commerce applications, including entering query parameters, viewing and selecting query results, viewing product details, registering, logging in, modifying a selected product, and a payment process. The ability to collect information about a user’s physical, physiological, and emotional properties allows application designers, e.g., to help the user fill out forms (Fig. 3).

Fig. 3.
figure 3

Regular user interface without adaptation

In our example (Fig. 3), the system, by exploiting the collected eye-tracking data, can recognize whether the user has a problem with a form field. Via the API of the eye tracking system, SitAdapt 2.0 receives the X and Y coordinates (Table 1) of the user’s gaze. The recording component synchronizes these data with other data, for example facial expression recognition data, and adds a timestamp. In our example, the eye tracking data are temporal data. The database writer stores this information from the recording component in the RethinkDB database. In the rule editor, the application developer can create a rule (Fig. 4) for displaying a help text for a form field (e.g., the card security code). A new rule is created in the rule editor by specifying a range for the X and Y coordinates. If the gaze coordinates stay within this area (around the form field) for 5 s, a certain action is triggered; in this example, the display of a help text (Fig. 5).
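The dwell-based help-text rule from this example can be sketched as follows; the region coordinates, the dwell handling, and the function names are illustrative assumptions, not the actual output of the rule editor:

```python
DWELL_SECONDS = 5.0
CVC_REGION = (820, 880, 430, 460)  # x_min, x_max, y_min, y_max (hypothetical)

def in_region(x, y, region):
    x_min, x_max, y_min, y_max = region
    return x_min <= x <= x_max and y_min <= y <= y_max

def helptext_cvc(gaze_samples):
    """gaze_samples: time-ordered list of (timestamp, x, y) tuples from
    the eye tracker. Returns True when the rule fires."""
    dwell_start = None
    for ts, x, y in gaze_samples:
        if in_region(x, y, CVC_REGION):
            dwell_start = ts if dwell_start is None else dwell_start
            if ts - dwell_start >= DWELL_SECONDS:
                return True  # action: display the help text (Fig. 5)
        else:
            dwell_start = None  # gaze left the region, reset the timer
    return False
```

Note that the dwell timer resets as soon as the gaze leaves the region, so only an uninterrupted 5-second fixation around the form field triggers the help text.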

Fig. 4.
figure 4

SitAdapt 2.0 Rule editor

Fig. 5.
figure 5

Adapted user interface

The situation analytics component analyzes and assesses the situation by exploiting the observed data and triggers the rule (helptext CVC). The evaluation and decision component exploits the collected data and checks for possible conflicts if more than one rule is triggered at the same time. The adaptation component finally generates the necessary modifications of the displayed user interface (Fig. 5).
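The conflict check in the evaluation and decision component can be sketched minimally, assuming a simple priority scheme (the actual SitAdapt 2.0 conflict resolution strategy may differ):

```python
def resolve_conflicts(triggered):
    """triggered: list of (priority, target_element, adaptation) tuples
    produced by simultaneously firing rules. Keeps, per target element,
    only the highest-priority adaptation."""
    chosen = {}
    for priority, target, adaptation in triggered:
        if target not in chosen or priority > chosen[target][0]:
            chosen[target] = (priority, adaptation)
    # Return one conflict-free adaptation per UI element.
    return {t: a for t, (_, a) in chosen.items()}
```

For instance, if a help-text rule and a form-to-wizard rule both target the CVC field, only the higher-priority of the two adaptations is forwarded to the adaptation component.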

3 Conclusions and Future Work

This paper has discussed the adaptation modeling process and briefly introduced the available components and resources of the combined SitAdapt 2.0/PaMGIS environment. The system is currently being evaluated for several business applications in a usability and situation-analytics lab environment. In order to also record and assess the cognitive load, the attention level, and stress-related parameters, we have recently added the g.Tec EEG measurement and brain-computer interface (BCI) hardware and software to our lab equipment and integrated them into the SitAdapt 2.0 architecture.

We are currently carrying out a large study in the customer experience domain, in which we are interested in comparing and aggregating the results obtained with visual emotion recognition software with the data gathered with the brain-computer interface equipment. If the recorded data can be kept securely and privately on the users’ devices, users will experience the benefits of individualized adaptations without compromising their privacy. For this purpose, we are also looking into mobile hardware and software for situation analytics that will allow us to build a lightweight version of the SitAdapt 2.0 system. Such a system can be integrated into mobile platforms and bring situation-aware applications to the end user.