Publicly available. Published by Oldenbourg Wissenschaftsverlag, April 16, 2019.

Requirements and Interaction Patterns for a Novel Interaction Device for Patients in Intensive Care

  • Börge Kordts

    Börge Kordts (PhD student, M.Sc. in Computer Science) is a researcher and member of the Ambient Computing Group at the Institute of Telematics (ITM) of the University of Lübeck. His research interests include human-computer interaction in smart spaces, accessibility, and digital healthcare.

  • Jan Patrick Kopetz

    Jan Patrick Kopetz (PhD student, M.Sc. in Media Informatics) is a researcher at the Institute for Multimedia and Interactive Systems (IMIS) of the University of Lübeck. His research interests include human-computer interaction, new interaction devices, mobile media, augmented reality, and digital healthcare.

  • Adrienne Henkel

    Adrienne Henkel (M.Sc. Health and Nursing Science) is a research assistant at the Institute for Social Medicine and Epidemiology (ISE), University of Lübeck. Her research interests include clinical nursing research and digital technologies in healthcare, especially in nursing.

  • Andreas Schrader

    Andreas Schrader (PhD in Computer Science) is a Professor for Ambient Computing and head of the Ambient Computing Working Group at the Institute of Telematics (ITM) at the University of Lübeck. His research interests include human interaction in smart spaces with a special focus on frameworks for the development of ambient health systems.

  • Nicole Jochems

    Nicole Jochems (PhD in Engineering) is a Professor for Media Informatics and Head of the Media Informatics Programme at the Institute for Multimedia and Interactive Systems (IMIS) of the University of Lübeck. Her research interests include methods in the area of human-centered design, age-specific design, virtual and augmented reality, and digital healthcare.

From the journal i-com

Abstract

Intensive care patients who are being weaned from mechanical ventilation face substantial physical and psychological stress. Due to the breathing tube, they often cannot voice their basic needs adequately. Possible consequences, among other complications, are a prolonged healing process and delirium. To address this issue and support patient communication in intensive care, we provide a solution tailored to patients who are dealing with limited cognitive and physiological abilities that prevent them from using traditional devices efficiently. For this purpose, we develop a novel interaction device tailored to the special situation of in-bed interaction.

In this paper, we present key requirements for the device, which are relevant to the interaction itself, as well as possible interaction gestures that may be performed with the device. The basis for this is a human-centered design process consisting of a comprehensive user and context analysis as well as a requirements analysis. As a result, we identified three categories relevant to the interaction, namely look and feel, sensors, and actuators. The results of the requirements analysis were precise enough to start the actual development process of the device.

1 Introduction

Intensive care is a highly specialized setting for the treatment of the most seriously and critically ill patients in a hospital. It is characterized by a high degree of complexity due to the severity of diseases and the degree of technologization. This places a high burden on everyone involved, particularly the patients themselves, their relatives, and health professionals. The intensive care unit (ICU) setting has a higher nurse-patient ratio than general wards, justified by the high degree of responsibility and activities. But staffing is precarious: the recommended ratio of one nurse to two patients is not consistently feasible in Germany, and serious deviations are reported especially for night shifts [23].

Mechanically ventilated patients represent a considerable and growing patient population in intensive care units [23]. In 2017, 2.1 million cases of intensive care treatment were reported in Germany. About 430,000 of them were mechanically ventilated [10, p. 68]. A special and vulnerable phase is the weaning from the ventilator (‘weaning’). During this phase, the body has to take over physiological functions that were previously executed by a respirator. The first step in this process is to reduce the sedative medication that makes ventilation tolerable to the patient. The process of weaning is completed when the patient is able to breathe sufficiently and autonomously and the artificial airway has been safely removed. The duration of this procedure is difficult to predict and can last from minutes to days or even weeks. By definition, it is divided into simple weaning, difficult weaning, and prolonged weaning [4]. In summary, the weaning process causes high physical and psychological stress for patients and entails great responsibility for health professionals. Owing to their serious illness, the influence of sedating medication, and the artificial airway (e. g., tracheostoma or endotracheal tube), patients are severely restricted in communicating and interacting with their social environment. Frequently reported strains are insufficiently treated pain [6], an increased risk of suffering delirium [4], fear, as well as feelings of unfamiliarity and loss of identity [1], [28]. These factors can cause complications or delay the healing process.

In this context, unambiguous communication with patients is indispensable, but there is a lack of effective supporting methods [16]. Existing aids and strategies to support communication are perceived as inappropriate [1]. Communication between patients and health professionals is essential to support the recovery of physical and psychological capabilities and must be established as early as possible. Currently, decisions made by caregivers do not exactly match patients’ actual needs [30]. The literature shows associations between positive communicative activities with ventilated patients and improved patient-related outcomes [25]. A lack of communication ability leads to an increased risk of poorer treatment, which reinforces the need for augmentative and alternative communication (AAC) [15]. However, existing devices have shown the necessity to distinguish between high- and low-technology AAC in practice. Low-tech AAC devices do not consider the holistic impact of the impairments of patients in intensive care units, which makes their usage inappropriate [8].

High-tech AAC devices, on the other hand, provide a very complex and potentially expensive solution tailored to very specific needs. This can introduce a steep learning curve, both for healthcare staff and for patients. In addition, neither low-tech nor high-tech devices usually support comprehensive integration into the existing technical infrastructure on the premises without dedicated software development or sometimes even hardware adjustments.

In the last decades, information and communication technology (ICT) has undergone a rapid evolution. The ongoing miniaturization of microelectronics allows for an integration of ICT components into almost any daily object. Whereas the integration into (smart) phones and (smart) watches has already produced widespread portable computer systems to interact with, a plethora of further smart mobile and wearable devices is in development and will soon be used in hospital settings as well (e. g., smart glasses or smart textiles). Equipped with sensors to detect the current state of the environment and the (inter-)action of users, as well as wireless networks to connect with each other in the Internet of Things (IoT), these device ensembles can soon assist caregivers’ tasks and patients’ needs in a holistic ambient interaction space.

In contrast to using only specialized interaction devices (micro adaptation / one-size-fits-all), our approach of dynamically integrating interaction devices into adaptive interaction ensembles (see [3]) will allow future smart hospital environments to support individualized services (macro adaptation / come-as-you-are) for their stakeholders depending on user abilities and task context.

In the BMBF project ACTIVATE, we develop an ambient interaction space for intensive care, consisting of adaptive interaction ensembles and ambient displays to support communication, (re-)orientation, and control in intensive care. In principle, our system architecture supports the integration of any (wireless) interaction device (e. g., eye trackers or camera-based gesture detection). For example, healthcare staff could use touch-enabled mobile devices or speech recognition to interact with the system. The ACTIVATE platform supports automatic ensemble creation on the fly, triggered by the user or based on in-situ context decisions.
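The idea of assembling an ensemble from whatever devices are currently available can be illustrated with a minimal sketch. The class and function names below are hypothetical and not part of the ACTIVATE platform; the sketch merely shows devices being picked until the required input modalities are covered:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Device:
    name: str
    modalities: frozenset  # e.g. frozenset({"touch", "speech"})

def create_ensemble(available, required):
    """Greedily pick devices until every required modality is covered.

    Returns the chosen devices and whether coverage is complete.
    """
    chosen, covered = [], set()
    for device in available:
        gain = device.modalities & (required - covered)
        if gain:
            chosen.append(device)
            covered |= gain
    return chosen, covered >= required
```

For instance, a BIRDY-like device contributing a "gesture" modality and a staff tablet contributing "touch" would together form a complete ensemble for the requirement {"gesture", "touch"}.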

Since patients bound to the bed cannot efficiently use traditional interaction devices like keyboard and mouse, or common alternatives like gesture and speech control, dedicated support for in-bed interaction becomes necessary. Unfortunately, we were not able to identify any appropriate device in our state-of-the-art analysis (see Section 2) and therefore aim at developing a novel device for this special usage context.

BIRDY (Ball-shaped Interactive Rehabilitation Device) is intended to be used with the hand while lying in bed, and consequently it does not pose strong or special restrictions on the context of use. Since balls afford playing and interacting with them, we chose a spherical shape to stimulate users to explore possible interactions with the device and the connected system. In principle, BIRDY is intended to act as a gesture detection device for the graphical system offering pre-defined, typical services of intensive care units (e. g., service requests, information needs, room controls).

A possible setup for the system can be seen in Figure 1. In this example, a patient is using BIRDY to communicate his pain to the nurse. She receives the message on her smartphone, which supports her in preparing the medication. This may reduce walking distances and save time.

Figure 1: A possible scenario of our system where a patient interacts with it using BIRDY. This image was drawn by Kai Simons.

In this article, we present key requirements for the novel interaction device BIRDY that are relevant to the interaction itself. Based on Altakrouri’s taxonomy [2] of basic input tasks in ambient spaces (i. e., selection, position, orientation, path, modification, semiotic), we restrict the scope of BIRDY gestures to selection commands of predefined menu elements. This avoids overwhelming and unnecessarily complex operation for the weaning patient while still offering complete control and personalisation, e. g., by selecting letters on a virtual keyboard to phrase personal requests not included in the offered services. The requirements for ball-shaped selection gestures were gathered in a human-centered design (HCD) approach including two studies, several workshops, and a comprehensive user and context analysis. In the following, our applied methods and resulting insights are described. In particular, consequences for the hardware aspects relevant to the interaction were identified and addressed, namely look and feel, sensors, and actuators.

2 Related Work

Various work on novel interaction devices has been done in the past. Yet, there is only limited work targeting the support of communication in intensive care with a specialized device, particularly during the weaning phase. Recently, several requirements for such systems were presented [13]. Based on the results, a controller similar to a joystick was developed [14]. Its design is focused on domain-specific communication needs such as suitable content, infection control, simple design, and the capitalization of motor movements easily performable by ICU patients. Furthermore, the device is adaptable to the patients’ physical deficits and impairments and provides feedback based on a vibration motor. The work is early-stage research and only little information has been published so far. Furthermore, the interaction with it is only roughly comparable to that with our planned ball-shaped interaction device. Nevertheless, the work provides valuable insights relevant to the ICU setting.

First workshops with domain experts showed that an interaction device used by ICU patients to communicate their basic needs should be controlled haptically with the hand, since a tactile response is the first one given by weaning patients awakening from sedation. For this reason, and due to cost, complexity, the time required for calibration, and more complex interaction patterns, gaze control systems, tablet-based solutions, and brain-computer interfaces, although successfully tested in the ICU [9], [27], [21], [12], were not taken into further account.

The ball shape was a consensus in our first workshops, where we discussed the need for a novel interaction device and how it should be designed. Thus, we further reviewed the state of the art of ball-shaped interaction devices to take insights from these into account. It should be noted that none of these devices is suitable for the ICU context, mainly due to sanitary regulations or the fact that they do not address the limited cognitive and physiological abilities of ICU patients.

Several ball-shaped interaction devices have already been described. They are typically designed to control a traditional desktop computer. For instance, the Roly-Poly Mouse (RPM) was inspired by a roly-poly toy, which keeps its upright position despite having a round bottom [26]. RPM combines the advantages of a standard mouse and of 3D devices by allowing translation, roll, and rotation. However, RPM does not provide any actuators and therefore lacks feedback for the user.

Another comparable device is PALLA, a spherical input device that also provides simple feedback [34]. It is designed for games and leisure activities and is equipped with a set of sensors and actuators allowing user and environmental input (e. g., device motion, mechanical stimuli, or changes in external conditions) as well as feedback based on its LED and vibration motor. Due to its robustness, users may interact with it without concern. Neither RPM nor PALLA is squeezable.

Another interesting approach is the mid-air pointing device SOAP [5], shaped similarly to a piece of soap. It consists of the wireless optical sensor core of a computer mouse moving freely in a hull of elastic fabric. Applying pressure to the hull makes the core evade the pressure and start rotating. The motion between the hull and the core is captured by the optical sensor. Besides joystick interaction, SOAP also supports both clutching and infinite motion (e. g., for scrolling through large lists), called belt and soap interaction. It has been successfully tested in different scenarios like desktop interaction, gaming, and as a presenter. The device is squeezable and mouse clicks can be performed; nevertheless, it does not provide any integrated actuators.

Figure 2: Human-Centered Design process following DIN EN ISO 9241-210 [18].

Qoom is an interactive omnidirectional ball display using a foam rubber ball and consisting of a sensor module, electrodes for touch sensing, and a display [24]. At roughly the size of a handball, it is too large for one-handed interaction in bed. Furthermore, it does not provide further actuators.

Some other similar, partially spherical, interaction devices are TDome [29], combining a semi-spherical shape with a touch display, Mouse 2.0 [35], a mouse enriched with multi-touch sensing on a hemispherical surface, and Globefish [11], an embedded 3-DOF trackball in an elastic frame.

Recently, the development of the ichó ball has started [31]. The motivation behind ichó is to support dementia patients. The ichó ball detects input gestures and provides feedback using light, sound, and vibration. In contrast to our work, the focus of ichó is rather on mental impairments, assuming normal hand operation in daily-life settings.

Since none of the above approaches fulfills all the special requirements of the given context of use, we aimed to develop a device tailored to it.

3 Method – Human-Centered Design

One major aspect of the interaction device will be the acceptance of potential users, namely patients and the nursing staff. Towards this end, considering their needs already in the development process and focusing on the interaction device’s usability are crucial factors. Consequently, the development is based on the HCD process for interactive systems as specified in ISO 9241-210 (see Figure 2).

An important part of the HCD process is understanding and specifying the context of use and the users’ needs and requirements. Based on these requirements, design solutions are developed and formatively evaluated. Once they meet the requirements in a summative evaluation, the process cycle ends.

In the case of BIRDY, the requirement specification was realized using five partially parallel sub-processes. Besides requirements relevant to human-computer interaction, additional technical requirements (among others regarding battery, energy consumption, wireless communication and charging) were collected. However, these additional requirements are not discussed in this article.

The sub-processes performed were a user and context analysis (including literature research, qualitative interview studies, and the construction of personas and problem scenarios) and a requirements analysis including user preference studies and workshops with stakeholders to discuss insights, identify potential interactions, and derive concrete requirements. These sub-processes, their methods, and their results are described in the following sections.

3.1 User and Context Analysis

For a better understanding of the user groups, their needs, and the context of use, we conducted a user and context analysis.

3.1.1 Literature Research

As a first step towards an analysis, a comprehensive literature search was conducted to identify similar work in the field and key findings from it. The search was focused on socio-technical systems to support communication in intensive care, especially the need for AAC in this context. The results show a demand for AAC and only limited work on technical solutions in intensive care (see Section 2). The patients’ first reaction typically is attempting to communicate and receive information. Hence, one suggestion is to help patients "develop usable communication methods in connecting with their surroundings and enabling patients to express their feelings" [32, p. 532]. Furthermore, information on the context was gathered from the literature. Comprehensive information on the context, the persons involved, and the surroundings of ICUs is given, for example, by Marx et al. [22].

3.1.2 Individual and Focus Group Interviews

Next, qualitative individual interviews were conducted. In addition to 16 patients and 16 relatives, we interviewed 6 members of the medical ICU staff of the nearby hospital and clinical ACTIVATE project partner Universitätsklinikum Schleswig-Holstein. Additionally, three focus group interviews with 18 members of the therapeutic ICU staff were carried out. Besides patients’ needs from their own and from the nursing staff’s perspective, the staff’s and relatives’ own needs were documented. On top of that, facilitating and obstructive factors for the use of a system to support information, communication, and control in intensive care were identified.

3.1.3 Personas and Scenarios

Based on these two steps, our interdisciplinary team iteratively developed personas to represent the target user groups. They were derived from the collected data and carefully modeled according to key characteristics, namely (un-)planned hospitalization, fitness, medical discipline, (non-)native speaker and (non-)delirious condition. Next, we modeled a typical weaning process to later construct persona-based problem scenarios. In further discussions, the scenarios were refined multiple times.

Personas and scenarios were designed (see Section 4.1) and used successfully for further discussion in our workshops which are described in the following.

3.2 Requirements Analysis

Following the user and context analysis, we conducted the requirements analysis to identify requirements not only based on users and context but also by taking user preferences, experts, and possible ways to interact with BIRDY into account.

3.2.1 User Preference Study

Important factors of an interaction device are its look and feel as well as high usability. Considering our vulnerable target user group, defining appropriate characteristic attributes is very important, and choices must be well-founded. Thus, we conducted a user study to collect participants’ preferences regarding device properties, especially referring to the optimal size, weight, shape, surface properties, and deformability.

A two-step process with a preliminary study and the main study was conducted. We acquired 30 commercially available objects (balls of different sizes, colors, materials, weights, etc.) with various characteristic attributes potentially suitable for BIRDY, which were tested in the preliminary study with 12 participants at the health fair Gesundheit Morgen in Kiel, Germany, in 2017. Participants randomly chose objects, tested them, and finally ranked them, resulting in a list of preferred objects. Based on these results and the criterion of the highest variety in the characteristic attributes, a subset of eight objects was determined for our main study.

For this study, we recruited 40 participants, divided into two age groups: adults and senior adults (20 each). The adults were aged from 18 to 40 years (M = 23.45, SD = 3.03), 11 of them female; the senior adults were aged over 58 years (M = 67.25, SD = 6.6), 12 of them female. Participants were primarily asked about their object preferences. First, they conducted a pair-wise comparison and then ranked their favorites regarding predefined characteristics, namely size, weight, shape, surface properties, and deformability. Finally, they chose their overall favorite object regardless of a given characteristic. We documented the choice and the underlying reason for all participants. We also recorded how they interacted with their favorite object.

During the study, we created a setting that resembles realistic conditions in a clinic. Participants wore special gloves simulating swollen hands (and thereby reduced hand mobility) and lay in a hospital bed in a 30° upper-body position [36].

3.2.2 Interdisciplinary Workshops

The members of our research project work in several different disciplines, namely nursing research, hospital IT, hospital nursing, hardware engineering, software engineering, usability, psychology as well as AAC. All previous results were discussed and refined in joint workshops to determine concrete requirements for BIRDY. In particular, personas and scenarios were enhanced and then used for further considerations finally leading to requirements.

The next step in our process was conducting several workshops with the stakeholders in our project to define actual requirements addressing the look and feel, sensors, and actuators of BIRDY. During these workshops, we shared the results of previous work, clarified unresolved aspects of the preference studies, and discussed design options. The scenarios and personas, as well as the results of the user preference studies, were used to show potential use cases and for the discussion of various concepts.

In total, 20 experts were involved in the workshops. We conducted one workshop focusing on the application of the HCD process, four workshops to discuss technical details, scenarios, and personas, three workshops to consider preliminary requirements, two workshops to realize safety and security by design, and one workshop to analyze state-of-the-art devices for AAC. Finally, we had six telephone conferences to finalize technical details and requirements with our hardware engineers Cognimed GmbH, who will develop and build the final product.

3.2.3 Interaction Workshops

We used the results and insights of our user preference study, more precisely those regarding the participants’ interaction patterns, as a starting point for further research on possible gestures. In particular, we were looking for gestures that can be performed with BIRDY and that are suitable for ICU patients. Hence, following the technical details, we identified potential gestures and later chose a subset of them to evaluate their suitability with five non-ventilated ICU patients with respect to medical accesses, the patients’ mobility, and the position of the device. Besides, we investigated which gestures could be performed ambidextrously. Next, we discussed these gestures with our experts in hardware engineering, software engineering, and HCI. For this purpose, we conducted further workshops focusing on this matter and on the software specifications of BIRDY, as well as several telephone conferences to discuss the feasibility of the identified gestures based on possible sensors and their respective data.

4 Results

In this section, we describe results of our subprocesses and how we used them to specify our requirements.

4.1 User and Context Analysis

The procedure resulted in an elaborated user analysis including descriptions of user groups along with their characteristics and personas of different types (primary, negative, served and customer). Furthermore, a detailed context analysis was done [17]. An organization and a task analysis gave additional information on the context.

Several barriers and enablers could be identified. Here, we focus on those relevant to the interaction with BIRDY. Besides the time and expertise required for installation and use, slow system performance was named as obstructive to a possible application. On the other hand, intuitive or natural operation, stability, and simple usage were found to be enablers.

One key aspect is intuitiveness, since weaning patients cannot be expected to learn complex interactions. In fact, we plan to apply the system at an early stage of the weaning phase. The interaction and its effect must be immediately clear because of the patients’ short awareness phases. Most of the time, patients must be instructed by the system itself. Furthermore, typical first interactions with our device must be taken into account within the context of intensive care. A second key aspect is usability despite various impairments caused by being bound to the bed, the medical condition, swollen hands, reduced manual force (due to age and condition), and reduced arm mobility due to invasive accesses. The interaction must not cause any injuries, requiring the device to be robust, to have no rough edges, and to have a good weight balance.

These data have been used to design twelve personas representing the main ACTIVATE target user groups, namely patients, therapeutic staff, and relatives. Based on these personas, 16 scenarios were created describing typical problems of the target groups (deficits in information, insufficient communication, missing feedback, delegation, etc.) that should be addressed by the ACTIVATE system.

4.2 Requirements Analysis

The results of our requirements analysis are described in the following. In addition to the results of our user and context analysis, the results of the user preference study and our workshops were used to specify the requirements for the BIRDY device.

4.2.1 User Preference Study

After analyzing the gathered data, we obtained a set of potential interaction gestures as well as several rankings based on the pair-wise comparison, preferences regarding fixed characteristics, and the overall favorite object. In combination with the results from the other analyses, this gives a comprehensive overview of user preferences regarding ball-shaped objects. We have already published the results of the study [19], an analysis of the choice of an overall favorite ball-shaped object and the underlying reason [7], as well as an analysis of the first impulse in spontaneous interaction with an overall favorite spherical object [33].

These obtained rankings and known characteristic attributes of the objects were discussed in workshops with the project’s stakeholders to finally determine the look and feel requirements for BIRDY. The results of these workshops are described in the following.

4.2.2 Interdisciplinary Workshops

Our detailed discussions focused on the results of the previous studies, on different designs and concepts, their feasibility, and technical details regarding, for example, connectivity and wireless charging. In the end, we reached a consensus among all experts and were able to define technical details, possible solutions, and designs. Furthermore, the stakeholders confirmed our design choices. Since the therapeutic staff wishes to detect movement of sleeping patients as early as possible, BIRDY should be able to detect motions and pressure changes. Additionally, since movements might be the most obvious interaction patterns, they also have to be measurable and distinguishable. Key insights are the confirmation of the spherical shape, sanitariness, a translucent and white design, a softness range, and sensors to detect motion and pressure.

We discussed and agreed to provide feedback in different modalities. Vibration might be the most reasonable one for direct feedback on recognized interaction gestures. Visual feedback, especially colorful light, might be helpful to support calibrating or pairing. Hence, for colorful light to be recognizable, the outer shell must be translucent. Acoustic feedback could be used for warnings, for example about a low battery level or critical issues.
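The agreed-upon assignment of feedback modalities to situations can be summarized in a small sketch. The event names and the dispatch function below are illustrative assumptions, not part of any actual BIRDY firmware:

```python
# Illustrative mapping of system events to BIRDY's feedback actuators,
# following the modality choices discussed above (event names are assumptions).
FEEDBACK_MAP = {
    "gesture_recognized": "vibration",  # direct haptic confirmation
    "calibration":        "light",      # colorful light through the translucent shell
    "pairing":            "light",
    "low_battery":        "sound",      # acoustic warning
    "critical_issue":     "sound",
}

def feedback_for(event: str) -> str:
    # Fall back to vibration, the most direct modality for in-bed interaction.
    return FEEDBACK_MAP.get(event, "vibration")
```

Such a table keeps the modality decisions in one place, so a later change (e. g., disabling acoustic warnings at night) touches only the mapping.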

The next step was identifying and discussing appropriate BIRDY interaction gestures in order to specify requirements for how they could be detected.

4.2.3 Interaction Workshops

The interaction workshops led to an identification of possible gestures that may be used to perform a Selection input task (as defined in [2]). Next, we grouped the gestures into two subcategories, namely navigation and selection gestures (performing a discrete action, e. g. confirming a choice). As a result of our interaction workshops, we decided to rely only on basic interactions to avoid complex input tasks that would not reflect the cognitive and physical skills of ICU patients. Additional factors that led to our decision to rely only on this category (and these subcategories) are hardware limitations (device’s size, weight) and the cost per device.

The identified navigation gestures are turning (counter-)clockwise, tilting (in the four directions forward, backward, left, and right), touching or pressing and holding (until the desired element is focused), rolling on a surface (e. g., clockwise on a circle), pushing or moving, and mid-air movements (in the four directions up, down, left, and right). For the selection category, pressing, (double) tapping or clicking, knocking (on the surface), lifting, as well as shaking were identified. They are displayed in Figure 3. It should be noted that mid-air gestures may cause arm fatigue and should generally be used only sparingly. They are most likely not suitable for patients in intensive care. Regarding all other identified gestures, there is no restriction that would hinder their use.

Figure 3
The identified gesture set consisting of navigation (a–f) and selection (g–l) gestures. They include turning (counter-)clockwise (a), tilting (b), touching or pressing and holding (c), rolling on a surface (d), pushing or moving (e), mid-air movements (f), pressing (g), (double) tapping or clicking (h, i), knocking (j), lifting (k) and shaking (l).
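For illustration, the two subcategories can be captured in a small lookup. The gesture identifiers are hypothetical shorthand for the gestures in Figure 3, not identifiers used by the actual system:

```python
# Hypothetical identifiers for the gestures in Figure 3 (a–l),
# grouped into the two subcategories described above.
NAVIGATION_GESTURES = {          # a–f
    "turn_cw", "turn_ccw", "tilt", "touch_hold",
    "roll", "push_move", "mid_air",
}
SELECTION_GESTURES = {           # g–l
    "press", "tap", "double_tap", "knock", "lift", "shake",
}

def category(gesture: str) -> str:
    """Return the subcategory of an identified gesture."""
    if gesture in NAVIGATION_GESTURES:
        return "navigation"
    if gesture in SELECTION_GESTURES:
        return "selection"
    raise ValueError(f"unknown gesture: {gesture}")
```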

Patients potentially lacking fine motor skills should still be able to use the proposed gestures to interact with the UI of the ACTIVATE system. Therefore, BIRDY does not need to allow fine-grained, mouse-like control. In other contexts, where users can be expected to have these skills, precise control may be required.

The final requirements are described in Section 4.3.

4.3 Requirements

Our analysis and specification process yielded several requirements concerning the interaction. We identified three relevant categories, namely look and feel, sensors, and actuators.

4.3.1 Look and Feel

With BIRDY, we envision interaction of an encouraging and carefree nature. In particular, we chose a playful design to stimulate users to explore not only the device but also the interactive ACTIVATE system. As previously described, the shape of BIRDY was specified to be spherical. Throughout the analysis phase, this was one of the most controversially discussed properties. An egg-shaped variant was also considered, which might simplify identifying the orientation and thereby also wireless charging. Besides, it might decrease the risk of the device falling out of the bed. Ultimately, a decisive factor in favor of a spherical shape was its affordance to encourage interaction with the device, while our experts assumed that an egg-shaped device would be perceived as fragile. Additionally, the participants of our study preferred spherical objects.

Relevant look-and-feel characteristics to be specified for our device were size, weight, surface properties, and deformability. BIRDY was defined to have a diameter of 64 to 90 mm, since the study objects favored with regard to size had these physical dimensions. Based on the study results, we specified a weight ranging from 40 to 150 g.

The weight plays a minor role in the planned interactions on a bed surface. For potential mid-air gestures, however, the weight becomes relevant: a lower weight means less physical effort while interacting, but a very low weight provides little tactile feedback. In summary, for mid-air gestures, the weight should be close to the lower bound but not below it.

The surface was specified to have a rough texture so that it is palpable by touch. Deep, detailed textures would make the device hard to disinfect and would thus contradict hygiene guidelines; a small number of tiny nubs could be a pragmatic compromise. The majority of participants preferred objects with rough surfaces. Besides, ICU patients’ medication might cause swollen hands and decreased tactile perception. Therefore, a palpable tactile surface was chosen, also to encourage interaction. Furthermore, the surface was set to have high elasticity due to the decreased gripping force of weakened patients and the usage on a mattress. The device was determined to have a translucent, white design to keep the internal light undistorted while making the internal components invisible.

4.3.2 Sensors

In our study and our interaction workshops, we observed that turning, tilting, and rolling the objects were most often used for a navigation task. As a consequence, we propose turning, tilting, and rolling interactions with BIRDY on the mostly planar bed surface or, as observed with study participants, on the user’s belly. Additionally, we observed that squeezing, tapping, shaking, and lifting the object were predominantly used to perform a discrete action, e. g. making a selection or confirming a choice. To enable all these interactions, pressure, orientation, and acceleration sensors are required and therefore specified within the requirements.
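A raw reading combining the three required sensor types could be modeled as follows. The field names, units, and the squeeze threshold are assumptions for illustration only:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class SensorSample:
    """One raw BIRDY reading (hypothetical layout).

    pressure: squeeze pressure in arbitrary ADC units
    accel:    acceleration in m/s^2 (includes gravity)
    gyro:     angular rate in rad/s (orientation changes)
    """
    pressure: float
    accel: Tuple[float, float, float]
    gyro: Tuple[float, float, float]

def is_squeeze(sample: SensorSample, threshold: float = 500.0) -> bool:
    # A squeeze (selection action) could be detected as the pressure
    # reading exceeding a calibrated threshold.
    return sample.pressure > threshold
```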

4.3.3 Actuators

Feedback has a major impact on the user experience, especially when interacting with a physical device. Feedback for interactions between humans and machines can be of a passive nature or controlled by actuators. Besides passive feedback based on elasticity and texture, BIRDY should provide tactile and visual feedback to encourage further interactions. Additionally, acoustic feedback should be implemented for possible error warnings. Furthermore, to realize the calibration required for some of the gestures, the LEDs need to have a defined position and should be placed so that they can display the initial orientation. In summary, BIRDY needs a vibration motor, several LEDs, and a buzzer.

5 Discussion

In this paper, we presented design choices and concrete requirements for a novel interaction device supporting ICU patients during the weaning phase. To realize an actual device and to evaluate its impact on the target user group, additional work has to be done.

The qualitative individual interviews and focus group interviews showed that the participants had limited imagination regarding possible solutions for communication barriers. Hence, we organized the described workshops, in which a team of domain and engineering experts proposed and discussed solutions.

Our user preferences study was limited to acquirable objects; thus, in particular, the favorite objects were real objects rather than combinations of favored attributes. Yet, concerning those characteristics also investigated by Perelman et al. [26], the results of our study are comparable to the results of their studies and informal tests, justifying our approach.

Next, we assessed the gestures’ feasibility with different sensors. Turning may be realized using an accelerometer to determine the direction of gravity and a gyroscope to estimate the angle. For tilting, the orientation of the device with respect to the user’s hand is required, making this kind of gesture particularly challenging. One possible approach is to use an inertial measurement unit (IMU) containing a compass sensor to determine the earth frame (north and down direction and their cross product) and to calibrate the device using the fixed position of the bed (assuming it is rarely moved; a move would necessitate re-calibration). The calibration can be realized using LEDs on the device to display its initial orientation and thereby record the position of the bed. Touching or pressing and then holding may be implemented based on a pressure sensor or an accelerometer detecting the touch. Rolling on a surface is more complex than turning but may be implemented using the data of an accelerometer and a gyroscope. The last two navigation gestures, pushing or moving and mid-air movements, are the most challenging to detect using only sensors in the device and also require knowledge about the position of the device relative to the user. They may roughly be detected using the same approach as described for tilting, with the accelerometer determining movements. For higher precision, these two gestures necessitate determining the position relative to the user. Finally, the selection gestures are easier to detect and may be realized using a pressure sensor for pressing and an accelerometer for all other selection gestures (using orientation-invariant features). In summary, all proposed gestures may be detected more or less precisely using an IMU that also contains a compass. These sensors are sufficient to determine the earth frame and hence to provide the orientation of the device relative to it.
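Two of the simpler detections mentioned above can be sketched as follows. The direction labels and thresholds are illustrative assumptions, and the tilt classification presumes the device frame was aligned with the bed frame by the described calibration:

```python
import math

GRAVITY = 9.81  # m/s^2

def tilt_direction(accel, threshold_deg=15.0):
    """Classify a tilt from a gravity-dominated accelerometer reading.

    Sketch only: assumes the device is held (near-)still so `accel`
    is dominated by gravity, and that the device frame was aligned
    with the bed frame during calibration.
    """
    ax, ay, az = accel
    # Angles of the gravity vector relative to the device's z-axis.
    pitch = math.degrees(math.atan2(ax, math.sqrt(ay**2 + az**2)))
    roll = math.degrees(math.atan2(ay, math.sqrt(ax**2 + az**2)))
    if pitch > threshold_deg:
        return "forward"
    if pitch < -threshold_deg:
        return "backward"
    if roll > threshold_deg:
        return "left"
    if roll < -threshold_deg:
        return "right"
    return "neutral"

def is_shake(accel_samples, threshold=4.0):
    """Orientation-invariant shake check: large deviations of the
    acceleration magnitude from gravity, regardless of device pose."""
    return any(abs(math.sqrt(ax**2 + ay**2 + az**2) - GRAVITY) > threshold
               for ax, ay, az in accel_samples)
```

The shake detector relies only on the acceleration magnitude, i. e. an orientation-invariant feature as suggested above, so it works without knowing the device’s pose.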

The key challenge of gesture recognition using BIRDY is the identification of the orientation relative to the user. This is particularly challenging when relying only on hardware inside the device. One possible approach based on calibration was already described above. This approach depends on the magnetometer, which may be distorted, especially in a clinic, resulting in decreased accuracy. An evaluation of the actual distortion and the precision must be conducted to compare the different gestures and to finally assess their suitability for the ICU situation. Other approaches to detect the device’s orientation relative to the user may rely on various sensors. Conceivable are touch sensors that identify the orientation of the user’s hand. This approach was originally planned within the project ACTIVATE but was discarded due to technical challenges and the high cost of such a device. Furthermore, approaches relying on the triangulation of Wi-Fi signals, on radio frequency or generated magnetic fields using wireless trackers, and on camera tracking are also imaginable but have their own downsides.
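The calibration-based approach can be illustrated with the standard tilt-compensated compass computation. Magnetic distortion, flagged above as a real concern in a clinic, is ignored in this sketch, and the sign conventions are one common choice among several:

```python
import math

def heading_deg(mag, accel):
    """Tilt-compensated compass heading in degrees (0..360).

    Standard IMU technique: estimate pitch/roll from gravity, rotate
    the magnetic vector into the horizontal plane, then take the
    heading. Magnetometer distortion is not corrected here.
    """
    mx, my, mz = mag
    ax, ay, az = accel
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay**2 + az**2))
    mx_h = mx * math.cos(pitch) + mz * math.sin(pitch)
    my_h = (mx * math.sin(roll) * math.sin(pitch)
            + my * math.cos(roll)
            - mz * math.sin(roll) * math.cos(pitch))
    return math.degrees(math.atan2(-my_h, mx_h)) % 360.0

def heading_relative_to_bed(mag, accel, bed_heading_deg):
    """Device heading relative to the bed frame recorded during the
    LED-guided calibration described in the text."""
    return (heading_deg(mag, accel) - bed_heading_deg) % 360.0
```

Subtracting the bed heading recorded at calibration time yields the orientation relative to the bed, which stands in for the orientation relative to the (bedridden) user.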

In the future, we plan to evaluate design choices and technical details of BIRDY in practice in a field study. However, the next step is the realization of the device in terms of hard- and software engineering based on the presented requirements. Moreover, we will implement the overall system based on the defined requirements [20] within the scope of our research project to finally be able to evaluate the effects of supported communication on weaning patients.

We also plan to validate our choice of gestural input by evaluating users’ first impulses in spontaneous interaction with the actual device. Thereby, we intend to settle on suitable gestures and interaction options for weaning patients. Furthermore, we plan to evaluate different options for social interaction with therapeutic staff as well as relatives and for the expression of patients’ needs.

Award Identifier / Grant number: 16SV7689

Funding statement: This publication is part of the research project “ACTIVATE” financed by the Federal Ministry of Education and Research (BMBF, under Grant No. 16SV7689) and promoted by VDI/VDE Innovation + Technik GmbH.

Acknowledgment

We thank all participants who took part in our studies.

References

[1] Alham Abuatiq. Patients’ and health care providers’ perception of stressors in the intensive care units. Dimensions of Critical Care Nursing, 34(4):205–214, 2015. DOI: 10.1097/DCC.0000000000000121.

[2] Bashar Altakrouri. Ambient assisted living with dynamic interaction ensembles. Doctoral thesis, Department of Computer Sciences/Engineering, University of Luebeck, Luebeck, Germany, August 31, 2014.

[3] Bashar Altakrouri and Andreas Schrader. Towards dynamic natural interaction ensembles. In Fourth International Workshop on Physicality, page 1, 2012. DOI: 10.14236/ewic/HCI2012.85.

[4] Arbeitsgemeinschaft der Wissenschaftlichen Medizinischen Fachgesellschaften. S3-Leitlinie: Analgesie, Sedierung und Delirmanagement in der Intensivmedizin 08/2015 – DAS-Leitlinie, August 2015. URL https://www.awmf.org/uploads/tx_szleitlinien/001-012l_S3_Analgesie_Sedierung_Delirmanagement_Intensivmedizin_2015-08_01.pdf.

[5] Patrick Baudisch, Mike Sinclair, and Andrew Wilson. Soap: a pointing device that works in mid-air. In Proceedings of the 19th Annual ACM Symposium on User Interface Software and Technology, pages 43–46. ACM, 2006. DOI: 10.1145/1166253.1166261.

[6] T. Bohrer, M. Koller, T. Neubert, A. Moldzio, O. Beaujean, A. Hellinger, W. Lorenz, and M. Rothmund. Wie erleben allgemeinchirurgische Patienten die Intensivstation? Ergebnisse einer prospektiven Beobachtungsstudie. Der Chirurg, 73(5):443–450, 2002. DOI: 10.1007/s00104-002-0471-z.

[7] Svenja Burgsmüller, Ann-Katrin Vandereike, Jan Patrick Kopetz, Michael Sengpiel, and Nicole Jochems. Study of Desirable Characteristics of a Communication Device for Intensive Care Patients. In Student Conference Proceedings 2018. Infinite Science Publishing, 2018. ISBN 978-3-945954-47-8.

[8] Helen Carruthers, Felicity Astin, and Wendy Munro. Which alternative communication methods are effective for voiceless patients in Intensive Care Units? A systematic review. Intensive & Critical Care Nursing, 42:88–96, October 2017. ISSN 1532-4036. DOI: 10.1016/j.iccn.2017.03.003.

[9] Omid Dehzangi and Muhamed Farooq. Portable Brain-Computer Interface for the Intensive Care Unit Patient Communication Using Subject-Dependent SSVEP Identification, 2018. URL https://www.hindawi.com/journals/bmri/2018/9796238/abs/. DOI: 10.1155/2018/9796238.

[10] Destatis. Gesundheit: Grunddaten der Krankenhäuser 2017. Number 2120611177004 in Fachserie 12 Reihe 6.1.1. Statistisches Bundesamt (Destatis), 2018.

[11] Bernd Froehlich, Jan Hochstrate, Verena Skuk, and Anke Huckauf. The GlobeFish and the GlobeMouse: Two New Six Degree of Freedom Input Devices for Graphics Applications. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI ’06, pages 191–199. ACM, 2006. ISBN 978-1-59593-372-0.

[12] Jonah Garry, Kelly Casey, Therese Kling Cole, Angela Regensburg, Colleen McElroy, Eric Schneider, David Efron, and Albert Chi. A pilot study of eye-tracking devices in intensive care. Surgery, 159(3):938–944, March 2016. ISSN 1532-7361. DOI: 10.1016/j.surg.2015.08.012.

[13] Miriam Goldberg, Leigh Hochberg, Dawn Carpenter, Johnny Isenberger, Stephen Heard, and J. Walz. Principles of augmentative and alternative communication system design in the ICU setting. UMass Center for Clinical and Translational Science Research Retreat, May 2017. URL https://escholarship.umassmed.edu/cts_retreat/2017/posters/27.

[14] Miriam Goldberg, Leigh Hochberg, Dawn Carpenter, Johnny Isenberger, Stephen Heard, and J. Walz. Testing a Novel Manual Communication System for Mechanically Ventilated ICU Patients. UMass Center for Clinical and Translational Science Research Retreat, May 2017. URL https://escholarship.umassmed.edu/cts_retreat/2017/posters/28.

[15] Charlotte Handberg and Anna Katarina Voss. Implementing augmentative and alternative communication in critical care settings: Perspectives of healthcare professionals. Journal of Clinical Nursing, 27(1–2):102–114, January 2018. ISSN 1365-2702. DOI: 10.1111/jocn.13851.

[16] Mary Beth Happ, Kathryn Garrett, Dana DiVirgilio Thomas, Judith Tate, Elisabeth George, Martin Houze, Jill Radtke, and Susan Sereika. Nurse-patient communication interactions in the intensive care unit. American Journal of Critical Care, 20(2):e28–e40, March 2011. ISSN 1937-710X. DOI: 10.4037/ajcc2011433.

[17] Adrienne Henkel, Björn Hussels, Jan Patrick Kopetz, Susanne Krotsetis, Nicole Jochems, and Katrin Balzer. Nutzer- und Aufgabenanalyse für ein sozio-technisches System zur Unterstützung der Kommunikation und Reorientierung beatmeter Patientinnen und Patienten in Intensivstationen: Ergebnisse und methodische Herausforderungen. In Zukunft der Pflege: Tagungsband der 1. Clusterkonferenz 2018 – Innovative Technologien für die Pflege, 2018.

[18] ISO 9241-210:2010. Ergonomics of human-system interaction – Part 210: Human-centred design for interactive systems. International Organization for Standardization (ISO), Switzerland, 2009.

[19] Jan Patrick Kopetz, Ann-Katrin Vandereike, Svenja Burgsmüller, Michael Sengpiel, Daniel Wessel, and Nicole Jochems. How should “BIRDY” look and feel? Designing a new interaction device for intensive care patients based on user preferences. In 20th Congress of the International Ergonomics Association – IEA 2018, Florence, 2018.

[20] Börge Kordts, Jan Patrick Kopetz, Katrin Balzer, and Nicole Jochems. Requirements for a System Supporting Patient Communication in Intensive Care in Germany. In Zukunft der Pflege: Tagungsband der 1. Clusterkonferenz 2018 – Innovative Technologien für die Pflege, 2018.

[21] Rebecca S. Koszalinski, Ruth M. Tappen, and David Viggiano. Evaluation of Speak for Myself with Patients Who Are Voiceless. Rehabilitation Nursing, 40(4):235–242, August 2015. ISSN 0278-4807. DOI: 10.1002/rnj.186.

[22] Gernot Marx, Elke Muhl, Kai Zacharowski, and Stefan Zeuzem. Die Intensivmedizin. Springer-Verlag, 2014. DOI: 10.1007/978-3-642-54953-3.

[23] Michael Isfort. Personalsituation in der Intensivpflege – Permanent “am Limit”. BibliomedPflege, (04-2017):22–27, November 2017. URL https://www.bibliomed-pflege.de/zeitschriften/pflegenintensiv/heftarchiv/ausgabe/artikel/pi-4-2017-das-intensivtagebuch-bruecke-zur-erinnerung/33646-permanent-am-limit/.

[24] Shio Miyafuji, Toshiki Sato, Zhengqing Li, and Hideki Koike. Qoom: An interactive omnidirectional ball display. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, UIST ’17, pages 599–609. ACM, 2017. ISBN 978-1-4503-4981-9. DOI: 10.1145/3126594.3126607.

[25] Marci Nilsen, Susan M. Sereika, Leslie A. Hoffman, Amber Barnato, Heidi Donovan, and Mary Beth Happ. Nurse and Patient Interaction Behaviors’ Effects on Nursing Care Quality for Mechanically Ventilated, Older Adults in the ICU. Research in Gerontological Nursing, 7(3):113–125, May 2014. ISSN 1940-4921. DOI: 10.3928/19404921-20140127-02. URL https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4040337/.

[26] Gary Perelman, Marcos Serrano, Mathieu Raynal, Celia Picard, Mustapha Derras, and Emmanuel Dubois. The Roly-Poly Mouse: Designing a Rolling Input Device Unifying 2D and 3D Interaction. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, CHI ’15, pages 327–336. ACM, New York, NY, USA, 2015. ISBN 978-1-4503-3145-6.

[27] Carmen S. Rodriguez, Meredeth Rowe, Brent Koeppel, Loris Thomas, Michelle S. Troche, and Glenna Paguio. Development of a communication intervention to assist hospitalized suddenly speechless patients. Technology and Health Care, 20(6):489–500, 2012. ISSN 1878-7401. DOI: 10.3233/THC-2012-0695.

[28] Louise Rose, Katie N. Dainty, Joanne Jordan, and Bronagh Blackwood. Weaning from mechanical ventilation: a scoping review of qualitative studies. American Journal of Critical Care, 23(5):e54–e70, 2014. DOI: 10.4037/ajcc2014539.

[29] Houssem Saidi, Marcos Serrano, Pourang Irani, and Emmanuel Dubois. TDome: A Touch-Enabled 6DOF Interactive Device for Multi-Display Environments. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, CHI ’17, pages 5892–5904. ACM, 2017. ISBN 978-1-4503-4655-9. DOI: 10.1145/3025453.3025661.

[30] A. W. Schindler, N. Schindler, F. Enz, A. Lueck, T. Olderog, and D. A. Vagts. ICU personnel have inaccurate perceptions of their patients’ experiences. Acta Anaesthesiologica Scandinavica, 57(8):1032–1040, September 2013. ISSN 1399-6576. DOI: 10.1111/aas.12148.

[31] Steffen Preuss. ichó – interaktive Förderung für Menschen mit Demenz, 2019. URL https://icho-systems.de/.

[32] Shwu-Feng Tsay, Pei-Fan Mu, Shirling Lin, Kai-Wei Katherine Wang, and Yu-Chih Chen. The experiences of adult ventilator-dependent patients: A meta-synthesis review. Nursing & Health Sciences, 15(4):525–533, 2013. DOI: 10.1111/nhs.12049.

[33] Ann-Katrin Vandereike, Svenja Burgsmüller, Jan Patrick Kopetz, Michael Sengpiel, and Nicole Jochems. Interaction Paradigms of a Ball-Shaped Input Device for Intensive Care Patients. In Student Conference Proceedings 2018. Infinite Science Publishing, 2018. ISBN 978-3-945954-47-8.

[34] Fabio Varesano and Fabiana Vernero. Introducing PALLA, a Novel Input Device for Leisure Activities: A Case Study on a Tangible Video Game for Seniors. In Proceedings of the 4th International Conference on Fun and Games, FnG ’12, pages 35–44. ACM, 2012. ISBN 978-1-4503-1570-8. DOI: 10.1145/2367616.2367621.

[35] Nicolas Villar, Shahram Izadi, Dan Rosenfeld, Hrvoje Benko, John Helmes, Jonathan Westhues, Steve Hodges, Eyal Ofek, Alex Butler, Xiang Cao, and Billy Chen. Mouse 2.0: Multi-touch Meets the Mouse. In Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology, UIST ’09, pages 33–42. ACM, 2009. ISBN 978-1-60558-745-5. DOI: 10.1145/1622176.1622184.

[36] Li Wang, Xiao Li, Zongxia Yang, Xueli Tang, Qiang Yuan, Lijing Deng, and Xin Sun. Semi-recumbent position versus supine position for the prevention of ventilator-associated pneumonia in adults requiring mechanical ventilation. The Cochrane Database of Systematic Reviews, (1):CD009946, January 2016. ISSN 1469-493X. DOI: 10.1002/14651858.CD009946.pub2.

Published Online: 2019-04-16
Published in Print: 2019-04-26

© 2019 Walter de Gruyter GmbH, Berlin/Boston
