Abstract
Despite major progress in computing devices, including notebook PCs, tablet computers, and smartphones, application programs and operating systems still take no account of users' emotions. In this study, a preliminary idea and supporting techniques are suggested for monitoring users' emotional changes while they use computing devices. With the help of physiological signals collected from sensors attached to a mouse or keyboard, application programs or the OS can be made to respond with appropriate actions depending on the user's emotional state. The two body signals used in this study are finger temperature and skin resistance, which may be measured with a keyboard and a mouse fitted with sensors. To respond to the user's emotions, an application program should include components that can detect abrupt emotional changes from the sensor inputs. To fully exploit affective techniques, both application software and the operating system should include functionality for handling body signals, and sensing hardware for measuring the user's temperature and skin resistance should be built into the input devices. The main difficulties of such an affective UI are deriving a generally valid indication of emotion across many users, and extracting particular emotions from the body temperature and skin resistance measured at the input devices; both relate to the individual variability of body signals.
1 Introduction
Most application programs and operating systems simply do not consider the emotions users experience while using them, such as frustration, anger, confusion, and helplessness. Casual users of word processors, spreadsheets, presentation programs, and other applications in every field, or of Windows or Linux systems, may experience frustration and anger until they can use the systems comfortably, or whenever they encounter functionality that is undesirable or unfriendly. Some exasperated users even vent their rage physically by punching or kicking their computers.
There is a need to improve the human-machine interface to make the UI more user-friendly and convenient, especially for users who are unskilled or quick-tempered when using an OS or application program. With the recent trend toward affective computing, emotional communication between computers and users has been suggested, but very few practically useful UIs exist for this goal. Picard [1] suggested four goals for an improved UI for this purpose:
- Reducing user frustration;
- Enabling comfortable communication of user emotion;
- Developing infrastructure and applications to handle affective information; and
- Building tools that help develop social-emotional skill.
By fulfilling the four goals above, we can raise the level of emotional communication between computing machines and human beings.
When we communicate with a machine, there are two channels: explicit and implicit [2]. The explicit channel carries characters and language, while the implicit channel carries emotions. Complete understanding between a sender and a receiver requires both channels. UI technology, however, has developed around the machine's explicit channel, e.g., message communication, without considering human emotion.
There is a variety of implicit emotion data, as depicted in Fig. 1 [2]. Typical examples include facial expression, body gesture, head movement, gaze direction, and voice [3,4,5], which can express various emotional states, e.g., anger, joy, sorrow, and other negative or positive emotions. Voice in particular conveys delicate changes of inner emotion. Body signals such as the electrocardiogram (ECG), skin temperature (SKT), galvanic skin response (GSR), and photoplethysmography (PPG) have been used to measure implicit emotions [6,7,8].
Expression of affect by a computer can mean several things, depending on the role and function of affective consideration in HCI [9]:
- Recognition of the user's affective expression;
- Adaptation to the user's affective state to generate 'affective' behavior by the machine;
- Modelling of the user's affective states; or
- Generation of affective states by the computer's cognitive architecture.
One interpretation of affective computing is depicted in Fig. 2, in which an example affective system provides users with a UI, help functions, guidance functions, and tutoring while perceiving their body signals.
2 Emotional Response from a Computer
The UI has evolved to provide users with an ever friendlier interface since the personal computer was first introduced. However, most communication between a user and a computing machine remains mechanical and unidirectional, without considering the user's emotions. An emotional interface does not necessarily mean that the computer should fix or solve the user's problem instantly. As Picard [1] indicated, simply calming an upset user is quite effective in improving the usability of computers.
2.1 Constant Measurement of User’s Emotion
For effective emotional communication between a user and a computer, this study suggests constantly measuring the user's emotion with a mouse and/or keyboard fitted with sensors that monitor body signals. Figure 3 depicts the configuration of the data communication between the affective devices. Temperature and skin resistance are measured by the attached sensors, and the measured values are sent to the computer. The body signals are used to assess the user's emotion, and the computer may display friendly messages on the monitor when the user's emotion changes. All sensor data may also be collected and accumulated on a server as a record of the changes in the user's emotional state.
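The polling loop implied above can be sketched as follows. This is a minimal illustration, not the study's implementation: the functions `read_temperature()` and `read_gsr()` are hypothetical stand-ins for the sensor hardware described in the text, and the rolling-history length is an assumption.

```python
import time
from collections import deque

# Hypothetical sensor reads; in a real device these values would come
# from the sensors attached to the mouse or keyboard.
def read_temperature():
    return 36.4   # degrees Celsius (stub value)

def read_gsr():
    return 510    # raw sensor units (stub value)

# Keep a short rolling history so later stages can examine trends,
# not just instantaneous values (length of 60 is an assumption).
history = deque(maxlen=60)

def poll_once():
    sample = {"t": time.time(),
              "temp": read_temperature(),
              "gsr": read_gsr()}
    history.append(sample)
    return sample

sample = poll_once()
print(sample["temp"], sample["gsr"])
```

In a deployed system this loop would run continuously in the background, and the accumulated history could be forwarded to the server shown in Fig. 3.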
When an application program senses a change in the user's body signals, it may display messages intended to calm the user down [1]. The program may also play some of the user's favorite music along with a message. Example messages might be:
- Calm down, a little bit more patience makes you an expert;
- Why don't you use the help service in the menu?;
- Could you speak your problem into the microphone?; and
- Could you mail your problem to this address, helpdesk@…?
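A rule that maps signal changes to messages like those above could be sketched as follows. The thresholds and the mapping itself are illustrative assumptions, not values from the study; in practice they would come from the per-user calibration the paper calls for.

```python
# Illustrative thresholds and message mapping; both are assumptions.
MESSAGES = [
    "Calm down, a little bit more patience makes you an expert.",
    "Why don't you use the help service in the menu?",
    "Could you speak your problem into the microphone?",
]

def pick_message(gsr_jump, temp_drop):
    """Return a soothing message when the signal changes suggest
    frustration; return None when no intervention seems needed."""
    if gsr_jump > 50 and temp_drop > 0.5:   # strong reaction on both signals
        return MESSAGES[2]
    if gsr_jump > 50:                        # skin-resistance jump only
        return MESSAGES[1]
    if temp_drop > 0.5:                      # finger-temperature drop only
        return MESSAGES[0]
    return None

print(pick_message(gsr_jump=80, temp_drop=0.7))
```

A real affective component would combine such a rule with the user's favorite music, as suggested above.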
To help users keep using an application program without frustration, the program should provide a genuinely useful help or guidance facility; a facility that can be operated with a microphone and a speaker would be even better. The affective component that deals with the user's emotions may also include functionality for playing the user's favorite music or checking the user's health based on the physiological signals.
2.2 Use of a GSR Sensor to Check User’s Emotional Change
GSR (galvanic skin response) measures the conductivity of the skin. When we feel a strong emotion, the sympathetic nervous system is stimulated, the sweat glands secrete sweat, and the resistance of the skin changes.
Figures 4 and 5 show changes in GSR sensor values corresponding to the emotional changes of users under test. The measurement of emotional signals needs to be standardized to obtain a generally acceptable indication of emotional state. The GSR sensor is intended to be attached to the mouse or keyboard as depicted in Fig. 3. One serious problem is overcoming noise when collecting sensor values from input devices, because input devices generate many noise sources such as vibration and impacts from the hand or fingers.
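One simple way to suppress the impact noise mentioned above is a median filter, which rejects short spikes (e.g., from a mouse click) while preserving the slower GSR trend. The window size below is an assumption; the study does not specify a filtering method.

```python
from statistics import median

def despike(samples, window=5):
    """Median-filter raw GSR readings to suppress short spikes caused
    by typing impacts and mouse clicks (window size is an assumption)."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window // 2)
        hi = min(len(samples), i + window // 2 + 1)
        out.append(median(samples[lo:hi]))
    return out

raw = [510, 512, 900, 511, 513, 514]   # 900 is a click artifact
print(despike(raw))
```

The spike at 900 is replaced by a value near its neighbors, while the slow drift of the signal survives; heavier filtering would risk masking the genuine emotional changes the system is trying to detect.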
2.3 Use of Arduino and Smartphone to Simulate Monitoring Body Signal Change
Figure 6 shows the Arduino connected to a temperature sensor and a GSR sensor, a prototype built before attaching the sensors to a mouse or a keyboard. The sensor values are transmitted to a smartphone via a Bluetooth module, the HC-06. In the envisioned system, the Arduino would be replaced by a mouse or keyboard, and the smartphone display would be implemented by an application program or the OS. The dotted line indicates the intended attachment of the sensors to the mouse and keyboard.
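On the receiving side, the smartphone application must parse the readings arriving over the HC-06, which acts as a transparent serial bridge. The line format `T:<temp>,G:<gsr>` below is an assumption for illustration; the study does not specify its wire format.

```python
def parse_packet(line):
    """Parse one reading sent over the Bluetooth serial link.
    The packet format 'T:<temp>,G:<gsr>' is an assumed convention."""
    fields = dict(part.split(":") for part in line.strip().split(","))
    return float(fields["T"]), int(fields["G"])

# Example line as the Arduino sketch might emit it.
temp, gsr = parse_packet("T:36.5,G:512\n")
print(temp, gsr)
```

Keeping the format this simple makes the Arduino sketch trivial (a `Serial.print` per field) and lets the same parser run unchanged whether the data later comes from a sensor-equipped mouse or keyboard.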
Figures 7 and 8 show the display of temperature and GSR values transmitted from the Arduino. The prototype simply shows the values measured by the sensors.
There is much room for improvement in the prototype, which uses an Arduino with low-quality sensors. A temperature sensor is needed that measures the narrow range of 35-38 °C with high sensitivity. Proper messages keyed to sensor values, and to steep changes in those values, also need to be prepared.
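Detecting the steep changes mentioned above can be as simple as comparing consecutive samples against a per-sample limit. The threshold value is an assumption; in practice it would have to be tuned per user, as the paper notes for body signals generally.

```python
def steep_change(values, threshold):
    """Return True if any two consecutive samples differ by more than
    `threshold` (an assumed per-sample limit for a 'steep' change)."""
    return any(abs(b - a) > threshold
               for a, b in zip(values, values[1:]))

print(steep_change([512, 514, 580, 582], threshold=30))  # large GSR jump
```

Running this detector on the despiked signal, rather than the raw one, avoids triggering calming messages on mere click artifacts.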
Figure 9 shows the data transmission between a mouse/keyboard and a smartphone, and the storage of data in a cloud computer. The cloud computer may be replaced by a local server or a notebook PC, or a smartphone running SQLite may take over the role of the cloud computer when there is no need to store a large volume of data.
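The SQLite fallback described above is straightforward to sketch. The schema below is an assumption for illustration; an in-memory database stands in for the smartphone's local store.

```python
import sqlite3
import time

# In-memory database stands in for the smartphone's SQLite store.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE readings (
                    ts   REAL,     -- Unix timestamp of the sample
                    temp REAL,     -- finger temperature, degrees Celsius
                    gsr  INTEGER   -- raw GSR sensor value
                )""")

def store_reading(temp, gsr):
    """Append one sensor sample to the local emotion-history table."""
    conn.execute("INSERT INTO readings VALUES (?, ?, ?)",
                 (time.time(), temp, gsr))
    conn.commit()

store_reading(36.4, 512)
count = conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]
print(count)
```

The same insert logic would apply unchanged if the store were a local server or cloud database; only the connection object would differ.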
3 Conclusions
In this study, a mouse and keyboard fitted with sensors, together with an Arduino prototype, have been suggested as a way to build an emotionally user-friendly UI for users who experience frustration, anger, or other negative emotions while using a computer. Whenever users feel any kind of complaint, the body data collected through the mouse and/or keyboard signal the emotional change to the OS or the application program. The OS or application program that caused the frustration or anger can then handle the user's emotion appropriately, so that the usability of the program improves and the user keeps working with emotional composure.
There are many technical difficulties in realizing the suggested ideas:

- Embedding temperature and skin-resistance sensors into the mouse and keyboard;
- Preparing genuinely helpful and soothing messages from the computer;
- Analyzing body signals to determine a user's emotions; and
- Making major modifications to popular OSs or application programs so they can communicate with users based on the users' emotions.
At present, the suggested ideas may be difficult to implement given the difficulties above. It is not easy to install a sensitive temperature sensor at a suitable location on a mouse or keyboard. Preparing proper messages for frustrated users requires the combined analysis of temperature and GSR values to determine the user's emotional state accurately. Support from the makers of OSs and application programs, in the form of APIs for generating appropriate emotional messages from sensor values, is also required. Even with these difficulties, this study re-emphasizes the importance of an emotional UI, and suggests an approach that may be implemented with the most widely used input devices, a mouse and a keyboard, toward comfortable and continuous use of computer systems.
References
Picard, R.W.: Affective computing for HCI. In: Proceedings of HCI International (1999)
Kim, N.S.: Trends and perspectives on emotion recognition technologies. Telecommun. Rev. 19(5) (2009)
Cowie, R., et al.: Emotion recognition in human computer interaction. IEEE Signal Process. Mag. 18(1), 32–80 (2001)
Zeng, Z., Pantic, M., Roisman, G.I., Huang, T.S.: A survey of affect recognition methods: audio, visual, and spontaneous expressions. IEEE Trans. Pattern Anal. Mach. Intell. 31(1), 39–58 (2009)
Lee, C.M., Narayanan, S.S.: Toward detecting emotions in spoken dialogs. IEEE Trans. Speech Audio Process. 13(2), 293–303 (2005)
Katsis, C.D., Katertsidis, N., Ganiatsas, G., Fotiadis, D.I.: Toward emotion recognition in car-racing drivers: a biosignal processing approach. IEEE Trans. Syst. Man Cybern. Part A 38(3), 502–512 (2008)
De Silva, L.C., Miyasato, T., Nakatsu, R.: Facial emotion recognition using multi-modal information. In: Proceedings of the IEEE International Conference on Information, Communications and Signal Processing (ICICS), pp. 397–401 (1997)
Cacioppo, J.T., Tassinary, L.G.: Inferring psychological significance from physiological signals. Amer. Psychol. 45(1), 16–28 (1990)
Hudlicka, E.: To feel or not to feel: the role of affect in human-computer interaction. Int. J. Hum.-Comput. Stud. 59, 1–32 (2003)
© 2017 Springer International Publishing AG
Sung, K.Y. (2017). A Suggestion to Improve User-Friendliness Based on Monitoring Computer User’s Emotions. In: Marcus, A., Wang, W. (eds) Design, User Experience, and Usability: Designing Pleasurable Experiences. DUXU 2017. Lecture Notes in Computer Science(), vol 10289. Springer, Cham. https://doi.org/10.1007/978-3-319-58637-3_10