
1 Introduction

Disability and demographic change are challenging societies around the world and EU member states in particular [1, 2]. ICT-based Assistive Technologies (AT), eHealth services and modern smart home infrastructures can significantly support independent living of people with disabilities, allowing them to stay longer in their own living environment. Non-standard Human-Computer Interfaces (HCIs) are required to support people with severely reduced motor capabilities, such as those with late-stage Multiple Sclerosis (MS), Amyotrophic Lateral Sclerosis (ALS) and Hemi- or Tetraplegia. Multiple disabilities – for example a combination of motor restriction and low vision – lead to specific challenges for the selection of adequate assistive tools. Users' specific needs regarding motor, auditory or visual capabilities, but also their preferences and experience in using a computing platform, have to be respected. Use cases may include environmental control, Augmentative and Alternative Communication (AAC), control of specific devices, non-standard user interfaces for computer or smartphone control, and special methods for accessing the World Wide Web or social media platforms. The current market for smart home appliances and AT offers many different products which are not standardized and lack interoperability. This significantly increases the workload for implementing tailored assistive solutions, which is one reason why people with highly specific needs still lack access to tailored AT products.

In this work, we present system components and design strategies facilitating user-driven, tailored assistive technology solutions. First, we outline the research methodology and the frameworks and tools used, in particular the AsTeRICS construction set and the FlipMouse special input device. Second, we introduce the Two-Level Personalization method for implementing personalized solutions, where an iterative user-centered design process supports the reusability of implemented solutions for persons with comparable capabilities. Third, we apply and evaluate the method in a single-subject study in which we develop a set of tailored assistive solutions for computer access together with a client with multiple disabilities. In the final section we discuss our results and compare them with similar research. The involved software modules and hardware designs are available under an open source license, inviting reuse or modification at low cost.

2 Methods and Tools for Personalization

2.1 User-Centered Design: Participatory Action and Single-Subject Research

For the creation and evaluation of the assistive tools developed in this study, we applied user-centered design (UCD) methodologies, in particular the Single-Subject Research and Participatory Action Research (PAR) paradigms. UCD is often used when ICT solutions are modeled according to individual needs; see for example [19, 20]. The principles of PAR imply that all participants of a study or project are considered stakeholders or experts and are directly involved in the formative evaluation and indirectly involved in the system/software development process. Researchers and clients are engaged in an equitable relationship where the aim is to solve a problem and generate new knowledge [17, 18]. Single-Subject Research is a method for developing evidence-based practice in fields with highly individual challenges and research questions, where quantitative measures involving a high number of participants are not possible or feasible [16].

2.2 Component-Based Development for Assistive Technology: AsTeRICS

The Assistive Technology Rapid Integration and Construction Set (AsTeRICS, [3]) was co-developed by the authors together with nine partner institutions in the course of a collaborative research project funded by the European Commission under the 7th Framework Programme (FP7). The AsTeRICS framework consists of a Java/OSGi middleware (the AsTeRICS Runtime Environment, ARE), a graphical editor (the AsTeRICS Configuration Suite, ACS) and a set of more than 160 bundles (“plugin” components) for the flexible creation of assistive solutions [13]. These components are classified into three categories: sensors, processors and actuators. Sensors monitor the environment and transmit input information to other components of a model. Processors are responsible for receiving, processing and forwarding this information. Finally, actuators receive data and carry out the desired actions [5]. The components expose input and output ports for data and events and can be connected via the ACS. In the AsTeRICS framework, a model is the container that holds connected components and produces a specific functionality. The runtime environment features dedicated services and communication interfaces (the so-called ASAPI protocol and a RESTful API) which expose functionality for model management, for example transferring models from the ACS to the ARE and vice versa, starting and stopping models, modifying parameters of running models, storage and more. The main components of the AsTeRICS architecture are shown in Fig. 1. For a full description of the capabilities of the API and the XML schema of the model files, please refer to [3].
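To make the sensor/processor/actuator wiring more concrete, the following minimal Java sketch mimics how data could flow through connected ports of a model. All interfaces and class names here are purely hypothetical illustrations of the concept and do not correspond to the actual AsTeRICS plugin API (see [3] for the real component interfaces).

```java
// Conceptual sketch of the sensor -> processor -> actuator data flow of an
// AsTeRICS model. Interfaces and class names are hypothetical and do NOT
// correspond to the real AsTeRICS plugin API.
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

interface OutputPort {
    void connect(Consumer<Double> inputPort);
    void send(double value);
}

class SimpleOutputPort implements OutputPort {
    private final List<Consumer<Double>> targets = new ArrayList<>();
    public void connect(Consumer<Double> inputPort) { targets.add(inputPort); }
    public void send(double value) { targets.forEach(t -> t.accept(value)); }
}

public class ModelSketch {
    public static void main(String[] args) {
        // "Sensor": emits raw head-movement values
        SimpleOutputPort sensorOut = new SimpleOutputPort();

        // "Processor": scales the value and forwards it
        SimpleOutputPort processorOut = new SimpleOutputPort();
        sensorOut.connect(raw -> processorOut.send(raw * 2.5)); // gain = 2.5

        // "Actuator": would move the mouse cursor; here it just prints
        processorOut.connect(scaled -> System.out.println("move cursor by " + scaled));

        sensorOut.send(4.0); // simulated sensor reading -> "move cursor by 10.0"
    }
}
```

In the real framework this wiring is not programmed by hand but drawn graphically in the ACS and executed by the ARE.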

Fig. 1. System architecture, key components and interfaces of the AsTeRICS framework

Examples of sensor components include computer vision (head and eye tracking), interfaces to bioelectric amplifiers or simple momentary switches. Actuators include mouse/keyboard/joystick emulation and interfaces to home automation equipment, for example KNX, EnOcean or FS20. An example of the graphical creation of models using the AsTeRICS Configuration Suite (ACS) is depicted in Fig. 2.

Fig. 2. Construction of an AsTeRICS model using the ACS editor

In the course of the ongoing Prosperity4All (P4All) project [4], the AsTeRICS framework was extended with the RESTful interface [6] for controlling the model lifecycle, adjusting plugin properties and receiving events or live data. Furthermore, the AsTeRICS Packaging Environment (APE) has been developed, which provides the infrastructure to extract a particular set of AsTeRICS-based AT solutions as a dedicated source code repository. Additionally, stripped-down deployment packages, including native installers, can be generated for a given set of solutions. This allows a much better management of different customized versions of the framework which have been tailored to individual users, as described in Sect. 3.
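As a rough illustration of how an external tool or GUI could use this RESTful interface, the following Java sketch starts the currently deployed model and queries its state. The base URL, port and endpoint paths are assumptions for illustration and may differ from the actual AsTeRICS REST API; please consult the API documentation [6] for the real routes.

```java
// Minimal sketch of controlling the ARE via its RESTful interface.
// NOTE: base URL, port and endpoint paths are illustrative assumptions;
// consult the AsTeRICS REST API documentation for the actual routes.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AreRestSketch {
    private static final String BASE = "http://localhost:8081/rest"; // assumed default

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // Start the currently deployed model (assumed endpoint)
        HttpRequest start = HttpRequest.newBuilder()
                .uri(URI.create(BASE + "/runtime/model/state/start"))
                .PUT(HttpRequest.BodyPublishers.noBody())
                .build();
        System.out.println(client.send(start, HttpResponse.BodyHandlers.ofString()).body());

        // Query the current model state (assumed endpoint)
        HttpRequest state = HttpRequest.newBuilder()
                .uri(URI.create(BASE + "/runtime/model/state"))
                .GET()
                .build();
        System.out.println(client.send(state, HttpResponse.BodyHandlers.ofString()).body());
    }
}
```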

2.3 The FlipMouse – A Universal Input Device

For people who cannot move their upper limbs and are thus restricted in using standard input devices for computers, tablets or smartphones, the FlipMouse special input device has been developed by the authors of this paper [7, 8]. The FlipMouse can be actuated via low-force and low-range finger or lip movements (using a dedicated mouthpiece) and creates standard USB Human Interface Device (HID) reports, acting as mouse, keyboard and joystick (composite device). As shown in Fig. 3, additional momentary switches can be attached to the FlipMouse, and a standard mounting option allows combination with off-the-shelf mounting solutions.

Fig. 3. The FlipMouse alternative input device with attached external switches and mounting option

The FlipMouse offers configurable sensitivity for cursor movement and sip and puff activities. Up to seven configurations can be stored in the internal EEPROM of the device, which allows the user to switch, for example, from mouse to keyboard mode via a chosen action. This offers increased flexibility compared to similar devices available on the market (for example the LifeTool IntegraMouse [14] or the QuadJoy mouth control device [15]). The configuration editor for the FlipMouse settings is depicted in Fig. 4. Furthermore, the device can learn and replay infrared remote commands. Via a Bluetooth Low Energy (BLE) add-on module, phones or tablets (including the iPad) can be controlled wirelessly. The FlipMouse can be combined with low-cost eye tracking systems (Tobii EyeX [9, 10], EyeTribe [11]), allowing a significant reduction of the jitter and inaccuracy of the eye tracking [5]. For a detailed description of the FlipMouse system and the configuration options, please refer to [7].

Fig. 4. The FlipMouse configuration editor (detail)

2.4 The Two-Level Personalization Method for SW/HW Engineering

Using the AsTeRICS framework, the FlipMouse module and, where reasonable, additional AT tools or software, tailored assistive solutions can be created rapidly. For efficient reuse and sharing of these solutions, we suggest a Two-Level Personalization engineering method, which fosters a stepwise refinement and customization of an assistive setup in the course of participatory single-subject studies, and a final generalization step where successful setups are adapted for generic parameterization. These generic versions of tailored solutions can then be re-applied for persons with similar capabilities and needs.

Level 1: Development of Configurable and Shareable AT Solutions

The development of an AT solution starts with an initial assessment with the end user (client), querying the individual capabilities and needs of the person and deriving the goals for the intervention. In an iterative process of refinement and evaluation, which is performed in subsequent meetings together with the client, a tailored assistive solution is developed using the AsTeRICS framework and other rapid prototyping tools (see Fig. 5). This process is similar to agile software development and follows the user-centered design and PAR paradigms.

Fig. 5. Level 1: Iterative development of personalized AT solutions, deduction of generic solutions

The Level-1 process yields a set of personalized AT solutions, which are evaluated against the defined goals. When a satisfying result is achieved, the solution is equipped with additional features which make it more flexible and potentially useful for other clients. For example, a small GUI or web-based user interface for changing particular settings of an existing model via the RESTful interface could be added, which allows caretakers and non-experts to change essential parameters as well. This yields the second outcome of the Level-1 process: a generic version of the tailored solution, which can be stored in a database of reference solutions and reused by the community.

Level 2: On-Demand Personalization of Generic AT Solutions

Existing AsTeRICS-based solutions which resulted from the Level-1 development process can be picked up on demand when a tailored solution is required for a new client with similar needs and capabilities, making the personalization steps easier and more efficient. Figure 6 illustrates the ideal case where a desired personalization can be achieved solely by changing the provided system properties of a generic solution (via GUI or web interface). If this is not sufficient, model and system modifications are applied to the generic solution, re-entering the Level-1 process.

Fig. 6. Level 2: Personalization of existing generic solutions

2.5 The AsTeRICS Packaging Environment (APE)

The AsTeRICS Packaging Environment (APE) enables the deployment of stand-alone AsTeRICS-based AT solutions by extracting the necessary resources for particular use cases from the complete framework. The resulting “stripped-down” version of the framework contains only the necessary system components (e.g. plugin .jar files, the Java Runtime Environment, model and configuration files) and other specific resources (e.g. images, icons, documentation and license files). This collection can then be hosted as a separate source code repository. Finally, deployment packages for the solution can be created. Several deployment types are supported, including a compressed archive (.zip) and native installers for specific target platforms (Windows, Linux, macOS). The installer also contains a native launcher which is integrated into the menu and desktop of the target platform. The installer properties (e.g. application name, version) can be defined, and further customization can be done by adding a post-install script. (Please note that some plugins/components depend on platform-specific code, and thus not every AsTeRICS-based solution can be deployed to any operating system or target platform.)

Several APE-based example projects for different use cases, including bioelectric signal processing, camera-based input modalities, Smart Home integration and speech recognition, have been provided in the P4AllBuildingBlocks GitHub repository Footnote 1. They can be forked and further customized by other developers or used directly by caretakers or end users with an IT background. Additionally, the examples are listed on the Developer Space platform Footnote 2, which was created in the course of the Prosperity4All project [4] and allows developers to find AT-related technologies easily.

The AsTeRICS Ergo repository Footnote 3 is an APE-based solution targeted at occupational therapists with a low technical background. The provided simplified web-based user interface enables occupational therapists to create environmental control setups (e.g. for controlling TV, lights, etc.) for people with disabilities. Using a step-by-step wizard, the user can add remote control commands for infrared- or radio-controlled devices.

2.6 Auto-Personalization Using the GPII

The Global Public Inclusive Infrastructure (GPII) is a cloud-based infrastructure for enabling accessibility of devices and ICT through auto-personalization based on user preferences and device characteristics [12]. Figure 7 describes the auto-personalization flow in the GPII: first, a user approaches a device. The user listener notifies the flow manager, which retrieves the user preferences stored in the preferences server and asks the device reporter for device characteristics and installed applications. The matchmaker matches preferences, device characteristics and available solutions (installed or available in the solutions registry). Finally, the lifecycle manager applies the solutions and settings using an appropriate settings handler (e.g. for the Windows registry or via a JSON configuration file).
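The following short Java sketch mirrors this flow in a purely conceptual way (user key, preference lookup, matching, applying settings). All class names, keys and values are hypothetical and do not represent the real GPII components or APIs; it is only intended to make the sequence of steps tangible.

```java
// Purely conceptual mirror of the GPII auto-personalization flow.
// All identifiers and data are hypothetical, not the real GPII API.
import java.util.Map;

public class AutoPersonalizationSketch {

    // "Preferences server": maps a user key to stored preference settings
    static Map<String, Map<String, Object>> preferencesServer = Map.of(
            "user-123", Map.of("screenReader", true, "magnification", 5.0));

    // "Device reporter": which solutions are installed on this device
    static Map<String, Boolean> installedSolutions = Map.of("nvda", true, "zoomtext", false);

    public static void main(String[] args) {
        String userKey = "user-123";                                  // user listener detects the user
        Map<String, Object> prefs = preferencesServer.get(userKey);   // flow manager fetches preferences

        // "Matchmaker": pick an installed solution that satisfies the preference
        if (Boolean.TRUE.equals(prefs.get("screenReader")) && installedSolutions.get("nvda")) {
            // "Lifecycle manager" + "settings handler": apply the settings
            System.out.println("Launching NVDA with magnification " + prefs.get("magnification"));
        }
    }
}
```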

Fig. 7. The GPII auto-personalization flow (https://wiki.gpii.net/w/GPII_Architecture_Overview#The_Auto-Personalization_Flow)

Fig. 8. P.K. using the FlipMouse input device for computer control

AT solutions created with the Two-Level Personalization method and deployed via APE can be registered in the solutions registry so that they can be installed and configured by the lifecycle manager, thereby setting model parameters according to the user’s preferences. For more information please refer to the GPII Wiki pages [12].

3 Single-Subject Research – Evaluation and Results

In this section, the application and evaluation of the Two-Level Personalization concept for tailored AT solutions involving AsTeRICS and the FlipMouse are described. The applied methodology follows Single-Subject Research and Participatory Action Research criteria, with the involvement of one client with multiple disabilities and two Assistive Technology system designers (the authors), who were responsible for the implementation of the AT solutions.

3.1 Initial Assessment

In a first meeting with the client, a holistic understanding of the person and a plan of the interventions were established. For this purpose, an interview was performed in which biographic information and details about the kind of disability, personal background, motivation and goals for the ICT-based intervention and other data were collected. Furthermore, the client read and signed the “informed consent” document, which contains information about the nature of the study, his rights (including the right to withdraw from the study at any time) and a data protection agreement. (The photos used in this publication were explicitly authorized by the client.) Based on this information, a plan for the implementation of a first set of assistive solutions and a time schedule for the next meetings were created, initiating the Level-1 Personalization phase. The key information from this first meeting is summarized in the following section.

3.2 Introduction of Client, Setting and Use Cases

P.K. was born in 1964 and was diagnosed with multiple sclerosis in 1985. He has lived in a care center in Vienna since 2008. He can move his head and actuate his facial muscles, but no other conscious motor activity is possible. The motor control of his head is limited but precise (lateral movement about 10 cm, dorsal/caudal movement about 5 cm). There are no tremors or involuntary movements. P.K. speaks very quietly but intelligibly. He has very low vision (left eye: 7%, right eye: 12%, diagnosed in 2014; his vision has probably deteriorated since then).

P.K. uses a new version of the Sicare Pilot Footnote 4 environmental control system. The voice recognition works quite well despite P.K.’s whispering voice (commands for changing the TV channel must be repeated several times but are eventually recognized). The primary device controlled by the Sicare Pilot is a widescreen LCD TV. P.K. also uses an Android tablet for music playback. The tablet is paired with a stereo music player via Bluetooth. P.K. has an extensive music collection on Google Music and occasionally buys additional albums. The music playback via tablet and Sicare Pilot can only be used to start/stop playlists (no selection of individual albums or tracks).

Goals of the Intervention

P.K. wants to use a computer for newspaper reading and for listening to music and browsing his music collection (these are the primary use cases). This would considerably increase his autonomy. If these goals can be accomplished, P.K. would also like to perform internet searches via Google and browse arbitrary web pages.

3.3 Iterative Development of Level-1 Solution

Several interaction strategies were prototyped and tested together with P.K. in the course of the first three personal meetings at the care center. Each meeting took about 2–3 h and consisted of an initial interview, followed by an explanation of the planned intervention, the application of the solution and a discussion/evaluation of its advantages and disadvantages. To explore reasonable interaction methods, the focus was first put on the less complex music player use case.

To facilitate the desired selection of music from P.K.’s music archive, a new AsTeRICS model with a basic selection interface for albums (subfolders in a music directory) was created. In the music player model, a simple graphical user interface with big buttons provides the primary functions: next/previous folder, enter/exit subfolder and play/stop album. The folder names and album titles are spoken by a synthetic voice. Furthermore, the third-party screen magnification and screen reader software ZoomText Footnote 5 (commercial software, trial version) was installed on the client’s computer. Figure 9 shows the resulting GUI:

Fig. 9. First version of a GUI-based music player

The initial strategies for enabling P.K. to interact with the user interface were all based on cursor control and derived from existing generic AsTeRICS models. These strategies were evaluated in the first three meetings and involved the following sensor modules:

  • Face Tracking for cursor control, using the generic camera mouse model

  • Blob Tracking using infrared reflection and head marker (IR-sticker)

  • The FlipMouse universal input device, using standard settings for mouth control of the mouse cursor. For this purpose, the FlipMouse was mounted on a Manfrotto articulated arm Footnote 6 with Superclamp Footnote 7 so that it could be positioned precisely. The client could reach the mouthpiece with his lips; the distance of the lips to the mouthpiece was about 5 mm when the client remained in his resting position. The client’s bed was adjusted to allow a convenient sitting position (see Fig. 8).

The ZoomText screen magnification and screen reader software was configured with a very high screen magnification factor of 5 and high-contrast settings. The “mouse hover” function was activated so that the ZoomText screen reader speaks the content/captions of UI elements located under the mouse cursor. Additionally, a big crosshair tool enhanced the visibility of the current cursor position, and a 22″ computer monitor was used as the primary display. The results were evaluated using qualitative and quantitative methods (observation, recording of results and informal interviews); see Sect. 3.7. The results of the first meetings can be summarized as follows:

  • The Face Tracking and Blob Tracking based input methods for cursor control could not be used efficiently: although conscious control of the cursor could be achieved, the range of movement was too small and the problems that occurred (drifting cursor, lost positioning, etc.) were significant. The Blob Tracking method was also experienced as inconvenient because of the marker that needs to be placed on the forehead.

  • Controlling the cursor with the FlipMouse worked well, by touching the mouthpiece with the lips to obtain cursor movement and using sip/puff for clicking. The cursor moved too fast in the first trials, but after reducing sensitivity/acceleration, directional movement was possible without limitation.

  • The greatest problem was P.K.’s low vision: despite the high magnification factor, contrast settings and crosshair, P.K.’s visual limitations were too severe for using the GUI as the primary means of navigation. The cursor-based navigation was not feasible at all.

These results of the first design iterations led to the conclusion to omit mouse cursor control completely and to create another input interface for P.K. which relies solely on keyboard input. For this purpose, the FlipMouse was reconfigured for cursor key control, so that lip and sip/puff interactions with the mouthpiece create the desired keyboard activity via the FlipMouse device.

3.4 Refinement and Finalization of Level-1 Solution

In the subsequent meeting and customization steps, a suitable Level-1 Personalization was created: The AsTeRICS model for the music player functionality was modified so that it uses the KeyCapture plugin to detect key presses. Six numeric keys (1/2/3/4/5/6) were mapped to the music player functions, and a dedicated FlipMouse configuration slot was created to send these keys directly from the FlipMouse when an interaction with the mouthpiece takes place (see Table 1 and the sketch below).

Table 1. FlipMouse configuration slot for music player operation via ARE/keyboard emulation
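As an illustration of this mapping, the following minimal Java sketch shows how six captured numeric keys could be translated into the music-player actions listed above. The concrete key-to-action assignment is an assumption for illustration; in the actual setup the mapping is defined by the FlipMouse configuration slot and the AsTeRICS KeyCapture model rather than hand-written code.

```java
// Illustrative mapping of the six captured numeric keys to music-player
// actions. The concrete key-to-action assignment is an assumption; the real
// setup uses the AsTeRICS KeyCapture plugin inside the deployed model.
public class MusicPlayerKeyHandler {

    enum Action { PREVIOUS_FOLDER, NEXT_FOLDER, ENTER_FOLDER, EXIT_FOLDER, PLAY, STOP }

    static Action actionForKey(char key) {
        switch (key) {
            case '1': return Action.PREVIOUS_FOLDER;
            case '2': return Action.NEXT_FOLDER;
            case '3': return Action.ENTER_FOLDER;
            case '4': return Action.EXIT_FOLDER;
            case '5': return Action.PLAY;
            case '6': return Action.STOP;
            default:  return null; // key not assigned to the music player
        }
    }

    public static void main(String[] args) {
        // A sip/puff interaction on the FlipMouse emits e.g. the key '5'
        System.out.println(actionForKey('5')); // -> PLAY
    }
}
```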

Integrating the Newsreader Use Case

To address the second use case (browser control for reading newspapers on the internet), a second FlipMouse configuration slot was created. This configuration maps the mouthpiece interactions of the user to cursor keys and other special keys which are useful for navigating web pages via a standard web browser with screen reader support, for example skipping to the next link/sentence or displaying the bookmark menu (which also places the keyboard focus into the bookmark menu); see Table 2. The FlipMouse configurations can be switched (from music player to news reader function and vice versa) via a strong puff into the mouthpiece. The music player function remains active in the background during newspaper reading. Furthermore, the AsTeRICS model was modified to automatically start the Firefox web browser after system boot. As the screen reader solution, the free NVDA Footnote 8 application replaced the commercial ZoomText software, which was no longer necessary because the visual feedback and magnification settings had not yielded useful results in the prior trials.

Table 2. FlipMouse configuration slot for newspaper reader/browser control

Summary of the Finalized Tailored Setup:

  • The FlipMouse is utilized as the primary interaction aid (two configuration slots for keyboard key generation, which can be switched via a strong puff action).

  • The NVDA screen reader is used for auditory feedback.

  • The ARE and NVDA are started automatically after login. There is no login password so that the menu appears immediately after system startup.

  • The primary task of the AsTeRICS model is to capture the keys which are necessary for the music player.

  • Only one AsTeRICS model is used. The application launcher plugin automatically starts the Firefox browser when AsTeRICS is loaded at system startup.

Utilized Hardware for this Personalized Solution:

  • Windows 10 laptop (Medion low-cost laptop with Intel Celeron CPU)

  • FlipMouse v2.4 with acrylic case Footnote 9

  • Mounting solution: Manfrotto articulated arm + Superclamp

  • Adjustable table with wheels for laptop and FlipMouse

  • Total hardware cost for this solution: about € 650

3.5 Deduction of Generalization Parameters for Level-2 Solution

To make the music player applicable in other contexts and for other users, specific parameters were exposed so that they are adjustable via a dedicated web-based GUI. This GUI can be accessed from any ICT device with an internet browser. The GUI allows changing the desired keys for using the media player and the root folder of the music archive (see Fig. 10). Thus, different alternative input devices (e.g. momentary switch interface boxes or commercial mouth-control devices) can be supported.
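The following Java sketch illustrates how such a web GUI could push a changed setting (here the music root folder) to the running model via the REST interface. The endpoint pattern, component id and property name are illustrative assumptions and do not necessarily match the actual generic solution.

```java
// Sketch: the settings GUI pushes a changed parameter (e.g. the music root
// folder) to the running model via the ARE REST interface. Endpoint pattern,
// component id and property name are illustrative assumptions.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class UpdateModelProperty {
    public static void main(String[] args) throws Exception {
        String base = "http://localhost:8081/rest";   // assumed ARE REST base URL
        String component = "MusicPlayer.1";           // hypothetical component id
        String property  = "musicRootFolder";         // hypothetical property name
        String newValue  = "D:/Music";                // value entered in the web GUI

        HttpRequest put = HttpRequest.newBuilder()
                .uri(URI.create(base + "/runtime/model/components/"
                        + component + "/" + property)) // assumed endpoint pattern
                .header("Content-Type", "text/plain")
                .PUT(HttpRequest.BodyPublishers.ofString(newValue))
                .build();

        HttpResponse<String> resp = HttpClient.newHttpClient()
                .send(put, HttpResponse.BodyHandlers.ofString());
        System.out.println("Response: " + resp.statusCode());
    }
}
```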

Fig. 10. Settings dialog for generic parameters of the media player use case (left), FlipMouse editor: sip/puff settings for key emulation (right)

For the newsreader use case, a flexible mapping of user activity to system action is already possible via the FlipMouse GUI configuration editor. Several different mappings can be stored in the form of FlipMouse configuration files, and these files can be provided via the public source code repository together with the Level-2 AsTeRICS models and other resources. Thus, the keyboard actions utilized for the newsreader use case (e.g. Ctrl+B to display or hide the bookmark menu of the Firefox browser) can also be provided for other web browsers or operating systems.

3.6 Utilized Quality Indicators for Single-Subject Research

Horner, Carr and colleagues defined a number of quality indicators for Single-Subject Research in special education (see [16]). These quality indicators concern the description of participants and settings, the characterization of variables, baseline measures, the validity of experimental control and other factors. Although not all quality criteria are applicable to the given study, Table 3 presents the criteria which have been addressed in the evaluation of the tailored assistive solution.

Table 3. Quality indicators for the applied Single-Subject Research/evaluation of solutions

3.7 Task Evaluation and Usability Rating

In the course of seven meetings with P.K. at the care center, the assistive solutions were evaluated along different criteria. Task attainment was derived as the ratio of successful to unsuccessful trials (a trial was counted as successful if the desired task could be performed without help at the first attempt). The tasks were randomly selected from a list of interactions necessary for the music player and newspaper reader AT solutions (in the course of one session, every task was performed):

  • select artist folder, change into artist folder

  • select album folder, change into album folder

  • play/stop music playback

  • exit folder

  • switch to newspaper reader functions/switch back to music player

  • set input focus to bookmark menu (web browser)

  • select desired link in bookmark menu

  • skip to next heading/link in news article

  • follow link (open new page)/go to previous page

After a trial session, the user was interviewed and rated usability criteria of the assistive solution. The rated criteria included “ease of use”, “efficiency” and “would like to keep solution”. In the first three sessions, different input variants were evaluated; as expected, the usability was rated low in the first trials, where the client could not utilize the provided interaction solutions in a reasonable way. During the third session, the FlipMouse-based key input variant in combination with the NVDA screen reader (as described in Sect. 3.3) was introduced, and this method proved to be usable. In sessions 4 to 7, significant improvements in the client’s efficiency in using this solution were achieved, and the client reported improved usability ratings. In the last evaluation session, about 75% of the tasks could be performed correctly at the first attempt, and the task completion time decreased to several seconds, indicating that the client learned to use the interaction method more and more efficiently (see Table 4).

Table 4. Evaluation of different solutions in course of 7 user meetings, usability rating

4 Discussion

In this article, we demonstrate the Two-Level Personalization concept for the participatory design of assistive solutions in cooperation with end users, which utilizes open source tools and rapid prototyping strategies. In the course of a single-subject study, the AsTeRICS graphical construction set was combined with the FlipMouse special input device and third-party software to support a client with severely reduced visual and motor capabilities. Several alternative HCI variants for controlling music player and newspaper reader applications were prototyped and evaluated, including computer-vision-based cursor control and cursor control via lip movements. An efficient solution, which allows the client to control the music player and news reader via key input created with the FlipMouse, was found after three meetings and was further evaluated.

In related literature, several systems for computer vision-based mouse cursor control involving head or eye tracking are described [5, 10, 21, 22]. Such approaches are hardly applicable for people with low vision. For the client who participated in this study, even high screen magnification and contrast settings did not enable reasonable cursor control via visual feedback, which also prohibited the use of commercial mouth- or lip-controlled input devices [14, 15]. On the other hand, screen readers and supportive tools for blind people or people with low vision often rely on complex keyboard interaction and various keyboard shortcuts for navigation. This makes it difficult for people with limited motor control to use such interfaces.

We demonstrated the iterative customization of a suitable, unique interaction strategy for our client, which could be achieved with minimal effort and without the need to write additional programming code, by combining AsTeRICS and the FlipMouse with the NVDA screen reader and a standard internet browser. In a final generalization step, useful parameters of the music player solution were exposed via a web-based configuration GUI, which allows this use case to be applied to other users and in combination with other special input devices.

In the future, further improvements of the tailored solution are planned, for example adding the possibility to perform a restart or shutdown of the computer via dedicated interaction with the mouthpiece. Quantitative measures of the task completion time will be recorded and evaluated. Furthermore, we would like to test the solutions with other clients and evaluate the parametrization and usability in more detail together with persons who have a non-technical background (e.g. caretakers). Finally, we plan to share the generic solutions in a public repository.

In general, this research demonstrates a path for implementing personalization of AT at a larger scale by (a) ensuring user involvement and user-centered design (Participatory Action Research), (b) reducing technical complexity and effort by employing component-based development, (c) fostering reusability through a Two-Level Personalization approach that allows the generalization of components and AT solutions through parametrization, and (d) sharing components, know-how and solutions. The use cases we worked on proved the viability of the approach but also made clear that more research is needed to improve usability for practitioners without a strong IT background. More training for therapists and care personnel is needed so that personalization can become an integral part of everyday service provision. This demands an organizational shift of focus towards exploiting the potential of personalized AT at a larger scale. AT should not be seen as an external resource to be ordered, but rather as a solution to be built with the users in the service process, based on shared components and know-how. This shift would in return also lead to more and better components featuring increased adaptability through parametrization.