1 Introduction

Most application programs and operating systems simply do not consider users’ emotions, such as frustration, anger, confusion, and helplessness, while the computer is in use. Casual users of application programs such as word processors, spreadsheets, and presentation programs, and of operating systems such as Windows or Linux, may experience frustration and anger until they are able to use the systems comfortably, or when they encounter functionality that is undesirable or unfriendly. Some users with many complaints even vent their rage on the computer by punching or kicking it.

There is a need to improve the man-machine interface to make the UI more user-friendly and convenient, especially for users who are unskilled or hot-tempered when using an OS or application program. With the recent trend of affective computing, emotional communication between computers and users has been suggested, but very few practically useful UIs exist for this goal. Picard [1] suggested four goals for an improved UI for this purpose:

  • Reducing user frustration;

  • Enabling comfortable communication of user emotion;

  • Developing infrastructure and applications to handle affective information; and

  • Building tools that help develop social-emotional skill.

By fulfilling the four goals above, we can raise the level of emotional communication between computing machines and human beings.

When we try to communicate with a machine, there are two channels, explicit and implicit [2]. The explicit channel carries characters and language, while the implicit channel carries emotion. For complete understanding between a sender and a receiver, both channels are needed. UI technology, however, has developed around the machine’s explicit communication channel, e.g., message exchange, without considering human emotion.

There is a variety of implicit emotion data, as depicted in Fig. 1 [2]. Typical examples include facial expression, body gesture, head movement, gaze direction, and voice [3,4,5], which may express various emotional states, e.g., anger, joy, sorrow, or other negative or positive emotions. Voice in particular conveys subtle changes of inner emotion. Body signals such as the electrocardiogram (ECG), skin temperature (SKT), galvanic skin response (GSR), and photoplethysmography (PPG) have also been used to measure implicit emotions [6,7,8].

Fig. 1. Means of expressing emotion and body signals being sent to computer

Expression of affect by a computer has many meanings depending on the roles and functions of affective consideration in HCI [9]:

  • Recognition of the user’s affective expression;

  • Adaptation to the user’s affective state to generate ‘affective’ behavior by the machine;

  • Modelling of user’s affective states; or

  • Generation of affective states by the computer’s cognitive architecture.

One interpretation of affective computing is depicted in Fig. 2, in which an affective computing system perceives body signals from users and provides them with a UI, help functions, guidance functions, and tutoring.

Fig. 2. Affective communication with a machine

2 Emotional Response from a Computer

The UI has evolved to provide users with a friendlier interface ever since the personal computer was first introduced. However, most communication between a user and a computing machine has remained a mechanical, unidirectional UI that does not consider the user’s emotions. An emotional interface does not necessarily mean that the computer should fix or solve the user’s problem instantly. As Picard [1] indicated, calming down an upset user is quite effective in improving the usability of computers.

2.1 Constant Measurement of User’s Emotion

For effective emotional communication between a computer user and a computer, this study suggests constant measurement of the user’s emotion using a mouse and/or keyboard fitted with sensors that monitor body signals. Figure 3 depicts the configuration of data communication among the affective devices. Skin temperature and skin resistance are measured with the attached sensors, and the measured values are sent to the computer. The body signals are used to estimate the user’s emotion, and the computer may display friendly messages on the monitor as the user’s emotion changes. All sensor data may be collected and accumulated on a server to record the history of the user’s emotional changes.
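The measure-then-respond cycle described above can be sketched as a small detection routine. Everything here is illustrative: the `Reading` structure, the threshold values, and the detection rule are assumptions for the sketch, not calibrated parameters from this study.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    temperature: float  # skin temperature in degrees Celsius
    gsr: float          # skin conductance in microsiemens

# Illustrative thresholds; real values would come from per-user calibration.
GSR_AROUSAL_THRESHOLD = 4.0   # rise in microsiemens above baseline
TEMP_DROP_THRESHOLD = 0.5     # fall in degrees Celsius below baseline

def emotional_change(current: Reading, baseline: Reading) -> bool:
    """Flag a possible negative emotional change from sensor values.

    A GSR spike or a skin-temperature drop relative to the user's
    baseline is treated as a sign of arousal or stress.
    """
    gsr_spike = current.gsr - baseline.gsr > GSR_AROUSAL_THRESHOLD
    temp_drop = baseline.temperature - current.temperature > TEMP_DROP_THRESHOLD
    return gsr_spike or temp_drop
```

An application would call `emotional_change` on each new reading and, when it returns true, display a calming message as described below.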

Fig. 3. Monitoring of user’s emotional change with a mouse or a keyboard

When the application program in use senses some change in the user’s body signals, it may display appropriate messages to calm the user down [1]. The program may also play some of the user’s favorite music along with the message. Example messages are as follows:

  • Calm down, a little bit more patience makes you an expert;

  • Why don’t you use the help service in the menu?;

  • Could you speak your problem into the microphone?; and

  • Could you mail your problem to this address, helpdesk@…?

To fully help users keep using an application program without frustration, the program should be built with a genuinely useful help or guidance facility. If the help facility can be used with a microphone and a speaker, so much the better. The affective component that deals with users’ emotions may also include functions for playing the user’s favorite music or checking the user’s health based on physiological signals.
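Serving the example messages listed above can be sketched as a simple rotation that avoids repeating the same message twice in a row. This is a minimal sketch; a real affective component would choose messages based on the analyzed emotional state.

```python
import random

# The example messages from the list above.
CALMING_MESSAGES = [
    "Calm down, a little bit more patience makes you an expert.",
    "Why don't you use the help service in the menu?",
    "Could you speak your problem into the microphone?",
    "Could you mail your problem to the helpdesk address?",
]

def pick_message(previous: str = "") -> str:
    """Pick a calming message, avoiding immediate repetition."""
    candidates = [m for m in CALMING_MESSAGES if m != previous]
    return random.choice(candidates)
```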

2.2 Use of a GSR Sensor to Check User’s Emotional Change

GSR (galvanic skin response) measures the conductivity of the skin. When we feel a strong emotion, the sympathetic nervous system is stimulated, the sweat glands emit sweat, and the resistance of the skin changes.
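If the GSR electrodes are read through a 10-bit ADC in a simple voltage-divider circuit, a common hobbyist configuration that this study does not specify, the raw reading can be converted to skin resistance and conductance roughly as follows. The supply voltage and series resistor values are assumptions for the sketch.

```python
VCC = 5.0           # supply voltage in volts (assumed)
ADC_MAX = 1023      # full scale of a 10-bit ADC
R_FIXED = 10_000.0  # series resistor in ohms (assumed)

def adc_to_skin_resistance(raw: int) -> float:
    """Convert a raw ADC reading to skin resistance in ohms,
    assuming the skin forms the upper leg of a voltage divider
    and R_FIXED the lower leg."""
    if not 0 < raw < ADC_MAX:
        raise ValueError("reading outside measurable range")
    v_out = VCC * raw / ADC_MAX
    # v_out = VCC * R_FIXED / (R_skin + R_FIXED)  =>  solve for R_skin
    return R_FIXED * (VCC - v_out) / v_out

def resistance_to_conductance_us(r_ohms: float) -> float:
    """Skin conductance in microsiemens (the usual GSR unit)."""
    return 1e6 / r_ohms
```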

Figures 4 and 5 show a user’s change of emotion or mood measured with a GSR sensor.

Fig. 4. GSR measurement showing emotional state A

Fig. 5. GSR measurement showing changed emotional state B

Figures 4 and 5 show how the GSR sensor values change with the emotional state of the users under test. The measurement of emotional signals needs to be standardized in order to obtain generally accepted emotional states. The GSR sensor is meant to be attached to the mouse or keyboard as depicted in Fig. 3. One serious problem is overcoming noise when collecting sensor values from input devices, because input devices generate many noise sources such as vibration and impacts from the hand or fingers.
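One simple way to suppress the impact and vibration noise mentioned above is a sliding-window median filter, which discards short spikes while tracking the slower emotional signal. This is a generic sketch of the technique, not the filtering actually used in this study; the window length is an assumption.

```python
from collections import deque
from statistics import median

class MedianFilter:
    """Sliding-window median filter for a noisy sensor stream."""

    def __init__(self, window: int = 5):
        self.buf = deque(maxlen=window)  # keeps only the last `window` samples

    def update(self, value: float) -> float:
        """Accept one raw sample and return the filtered value."""
        self.buf.append(value)
        return median(self.buf)
```

A single keystroke impact that produces one outlier sample is removed entirely, while a sustained GSR rise still passes through after a short delay.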

2.3 Use of Arduino and Smartphone to Simulate Monitoring Body Signal Change

Figure 6 shows the Arduino connected to a temperature sensor and a GSR sensor, a prototype built before attaching the sensors to a mouse or keyboard. The sensor values are transmitted to a smartphone via a Bluetooth module, the HC-06. In this experiment, the Arduino stands in for a mouse or keyboard, and the smartphone display stands in for a specific application program or OS. The dotted line indicates the intended attachment of the sensors to the mouse and the keyboard.
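On the receiving side, the smartphone app must parse the records arriving over the Bluetooth serial link. The wire format is not specified in this study; assuming the Arduino sends simple comma-separated lines such as `T:36.4,G:512` (field names chosen here for illustration), parsing might look like:

```python
def parse_packet(line: str) -> dict:
    """Parse one sensor record of the assumed form 'T:36.4,G:512'.

    'T' carries the temperature in degrees Celsius and 'G' the raw
    GSR reading; both field names are illustrative assumptions.
    """
    values = {}
    for field in line.strip().split(","):
        key, _, raw = field.partition(":")
        if key == "T":
            values["temperature"] = float(raw)
        elif key == "G":
            values["gsr_raw"] = int(raw)
    if "temperature" not in values or "gsr_raw" not in values:
        raise ValueError(f"malformed packet: {line!r}")
    return values
```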

Fig. 6. Arduino connection with temperature and GSR sensors

Figures 7 and 8 show the display of temperature and GSR values transmitted from the Arduino. The setup simply shows the temperature and GSR values measured by the sensors.

Fig. 7. Smartphone showing temperature change transmitted from Arduino

Fig. 8. Smartphone showing GSR value change transmitted from Arduino

There is much room for improvement in this prototype, which uses an Arduino with low-quality sensors. A temperature sensor is needed that can measure the narrow range of 35~38 °C very sensitively. Proper messages keyed to sensor values, and to steep changes in those values, also need to be prepared.

Figure 9 shows the data transmission between a mouse/keyboard and a smartphone, and the storage of the data on a cloud computer. The cloud computer may be replaced by a local server or a notebook PC, or a smartphone with SQLite may take over the role of the cloud computer when there is no need to store a large volume of data.
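The SQLite alternative mentioned here can be sketched with Python’s built-in sqlite3 module. The table layout is an assumption made for illustration; the study does not specify a schema.

```python
import sqlite3
import time

def open_store(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the local store of sensor readings."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS readings (
               ts          REAL NOT NULL,  -- Unix timestamp
               temperature REAL,           -- degrees Celsius
               gsr         REAL            -- raw GSR value
           )"""
    )
    return conn

def store_reading(conn: sqlite3.Connection,
                  temperature: float, gsr: float) -> None:
    """Append one timestamped reading."""
    conn.execute("INSERT INTO readings VALUES (?, ?, ?)",
                 (time.time(), temperature, gsr))
    conn.commit()
```

With `path=":memory:"` the store lives only for the session; pointing it at a file on the phone, or replacing it with uploads to the cloud server, gives the persistent variants described above.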

Fig. 9. Communication among mouse, smartphone, and cloud server

3 Conclusions

In this study, a mouse or keyboard fitted with sensors, together with an Arduino prototype, has been suggested to provide an emotionally user-friendly UI for computer users who may feel frustration, anger, or other negative emotions while using a computer. Whenever users have any kind of complaint, the body data collected through the mouse and/or keyboard signal the users’ emotional change to the OS or to a specific application program. The OS or application program that caused the frustration or anger may then handle the user’s emotion properly, so that the usability of the program improves and users can keep using the computer with emotional composure.

There are many technical difficulties in realizing the suggested ideas:

  • Embedding of temperature and skin resistance sensors into the mouse and keyboard;

  • The preparation of really helpful and soothing messages from the computer;

  • Analysis of body signals for determining a user’s emotions; and

  • Major modification of popular OSs or application programs to communicate with users based on their emotions.

Currently, the suggested ideas may be difficult to implement given the above difficulties. It is not easy to install a sensitive temperature sensor at a suitable location on a mouse or keyboard. Preparing proper messages for users with complaints requires a combined analysis of temperature and GSR values to determine the emotional status of computer users accurately. Support from the makers of OSs and application programs, in the form of an API for generating appropriate emotional messages from sensor values, is also required. Even with the difficulties mentioned above, this study re-emphasizes the importance of an emotional UI, and it suggests an idea that may be implemented with widely used input devices, a mouse and a keyboard, to improve the emotional UI for comfortable and continuous use of computer systems.