Abstract
This paper presents an online chat interface that displays users' mixed emotions as gradient color during online chatting. The design is based on affective recognition and aims to provide an effective channel for emotional communication. The system consists of two parts: emotion recognition and emotion display. The recognition function relies on the Microsoft cognitive service, whose APIs (Application Programming Interfaces) make recognition quick and accurate. Mixed emotions are shown as gradient color, and an emotion indicator has been added to create an emotional chatting environment. A preliminary test with eight participants (four couples) was conducted to evaluate the design, and the results show that this online chat interface, which aims to support mixed-emotion communication, is usable for lovers' chatting. The study opens opportunities for further improvement and exploration of this approach and for applications in more fields.
1 Introduction
Nowadays an increasing number of people chat online in various situations in their daily lives. With this growing use, people are eager to exchange not only content but also emotions during online chat. Existing methods for exchanging emotions include emoji, kaomoji (a popular Japanese emoticon style made up of Japanese characters and punctuation marks) and personalized stickers. However, these methods have shortcomings. Emotional communication through them is neither as natural as face-to-face conversation nor real-time, since users must trigger them actively after the emotion has occurred. Additionally, these expressions interrupt the flow of the chat because they occupy the same channel as the text. More importantly, they cannot express mixed emotions. Therefore, a new method is needed so that mixed-emotion communication can be conducted more naturally.
Color emotion refers to the semantic words describing the characteristics of colors and humans' emotional responses to them [1]. Colors have a strong impact on people's emotions [2]. Research shows that many kinds of emotions can be sensed from specific colors [1,2,3]. Related psychological studies [4, 5] show marked associations between colors and feelings, and people can be emotionally sensitive to many specific colors. For example, people can feel an intense emotion such as anger when they see the color red.
Based on these studies, it is reasonable to use colors to display people's emotions. The main aim of this online chat interface is to add a mixed-emotion indicator using gradient color to create more natural emotional expression during everyday online text chat. However, research [4, 5] shows that color preference varies among people and is culturally based, and gender, age and many other factors affect people's feelings toward colors [6]. Considering this, the colors of the indicator were designed to be adaptable.
2 Related Work
2.1 Affective Recognition
Affective recognition is a branch of affective computing, an interdisciplinary field that deals with how computers or devices interact with human affect. Many studies address emotion recognition. Different methods [7,8,9] have been used to detect affect, such as detecting humans' physiological responses to emotions: facial expressions, tone changes and other physiological variations. Analyzing semantic content [10] is another method researchers use to determine emotions. Moreover, many new algorithms have been proposed to improve the accuracy of affective recognition [11].
2.2 Application of Affective Recognition
In addition to treating mental illnesses [12], there have been many attempts to apply emotion recognition, especially in gaming [13] and education [14, 15], where users' emotions matter significantly. For online chat, many studies use semantic recognition [10, 16], and some identify emotions from other features [17], but the devices involved are complicated and cannot be applied to daily chatting. Only limited studies [18] have applied facial expression recognition to online chat, and scarcely any have investigated a way to display mixed emotions.
We propose an online chat interface that displays mixed emotions based on affective recognition. Uniquely, it can exchange mixed emotions in daily life without complicated devices, and it works by adding a new communication channel without interrupting the chat flow.
3 Design Description
3.1 System Overview
Figure 1 shows an overview of the workflow of this online chat interface. With the permission of both users, the cameras of their computers or mobile phones continuously capture their facial expressions and transmit the facial images to the Microsoft Emotion Service [20] server to identify the expressions. After analyzing an image, the server returns the results to the partner's computer or mobile phone, and the interface displays the chatting partner's mixed emotions as gradient color.
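The round trip in Fig. 1 can be sketched as a simple capture-analyze-display cycle. The function names below (`capture_frame`, `analyze`, `render_gradient`) and their return values are hypothetical stand-ins for the real camera, network and UI calls; only the shape of the loop follows the paper.

```python
def capture_frame():
    """Stand-in for grabbing one webcam frame (real code would return JPEG bytes)."""
    return b"<jpeg bytes>"

def analyze(frame):
    """Stand-in for POSTing the frame to the Microsoft Emotion Service;
    a real call returns a confidence value per emotion."""
    return {"neutral": 0.7, "happiness": 0.3}

def render_gradient(scores):
    """Stand-in for drawing the gradient bar; here we just format the mix."""
    return "|".join(f"{name}:{value:.0%}" for name, value in sorted(scores.items()))

def one_cycle():
    """One capture -> analyze -> display round trip, scheduled every few seconds."""
    return render_gradient(analyze(capture_frame()))
```

In the real system this cycle runs on a timer (every three seconds, per Sect. 3.2) and the rendered result is sent to the partner's interface rather than displayed locally.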
3.2 Emotional Recognition
For online chat, some researchers [10] use semantic recognition, identifying users' emotions by analyzing the chat content. We believe, however, that users themselves can read emotions from content better than machines can. Facial expression is the part users cannot see during normal online chat, so its recognition can serve as an added reference for understanding a partner's emotions. There are many studies [18, 19] on facial emotion recognition using machine learning, and many companies provide APIs (Application Programming Interfaces) or SDKs (Software Development Kits) for easy, quick use. We chose the Microsoft cognitive service [20] for its powerful functionality and relatively convenient accessibility.
When using the Microsoft service, we set the camera to capture users' facial images 20 times per minute, so an image is transmitted to the server for analysis every three seconds. The Emotion API takes a facial image as input and returns confidence values for eight emotions: anger, contempt, disgust, fear, happiness, neutral, sadness and surprise. These emotions are understood cross-culturally, are universally communicated through particular facial expressions, and cover most of the emotions expressed in daily life. Figure 2 shows an example of the results returned from the Microsoft Emotion API. It draws a rectangle around the detected face and returns the face's location as coordinates. On the left of the picture, the confidence of each emotion for this face is shown; the larger the value for a particular emotion, the more strongly that emotion was detected in the image. From Fig. 2 we can see that neutral and happiness are the two most strongly detected emotions on the face.
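Handling such a response reduces to sorting the confidence values. The sample scores below are hypothetical, chosen only to resemble the mostly-neutral, partly-happy face described for Fig. 2; they are not the actual figures from the paper.

```python
def top_emotions(scores, n=2):
    """Return the n emotion names with the highest confidence, best first."""
    return sorted(scores, key=scores.get, reverse=True)[:n]

# Hypothetical confidences for a mostly neutral, somewhat happy face.
sample = {
    "anger": 0.001, "contempt": 0.004, "disgust": 0.001, "fear": 0.0,
    "happiness": 0.260, "neutral": 0.730, "sadness": 0.003, "surprise": 0.001,
}
print(top_emotions(sample))  # ['neutral', 'happiness']
```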
3.3 Emotional Display
Based on the result returned from the server, a bar with gradient color is added to the chat interface to indicate the partner's real-time emotion, and the gradient changes in real time as the partner's emotions change. The gradient contains different proportions of basic colors, each representing a type of emotion. Because the eight emotion types that the Microsoft service defines are hard for users to get used to, we divided them into four groups based on J. A. Russell's theory [21], which measures emotion along two dimensions: arousal and pleasantness. Figure 3 illustrates how we group the eight basic emotions on Russell's emotional coordinate system.
We matched the four emotion groups to four basic colors: red, orange, blue and green. The selection of the basic colors was based on related studies [2] so that the original color assignments could be recognized widely. Since each person may prefer a different color-emotion match, we made the correspondence between emotion and color adjustable in the test. We do not recommend this design for people with defective color vision; if they want to use it, the name of the emotion type can be shown on the gradient bar. Figure 4 shows a sample of how we "translate" the returned results into the gradient color: the confidence of each emotion group is represented by a correspondingly sized proportion of its color. Figure 5 shows how the gradient bar is placed on the interface. In this way, the interface shows the complexity of users' emotions. We built our design on the basis of WeChat, the most popular chat tool in China, so that later comparative tests could be done without interference from an unfamiliar user interface.
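The translation from eight confidences to gradient proportions can be sketched as group-and-normalize. Which emotion falls into which of the four groups is our reading of Fig. 3 and is not spelled out in the text, so the `GROUPS` table below is an assumption, as is the default color assignment (the paper makes it user-adjustable).

```python
# Assumed grouping of the eight API emotions into Fig. 3's four quadrants.
GROUPS = {
    "red":    ("anger", "disgust"),       # unpleasant, high arousal
    "orange": ("happiness", "surprise"),  # pleasant, high arousal
    "blue":   ("sadness", "fear"),        # unpleasant, lower arousal
    "green":  ("neutral", "contempt"),    # remaining, calmer emotions
}

def gradient_stops(scores):
    """Sum the confidences per group and normalize them into color proportions."""
    totals = {color: sum(scores.get(e, 0.0) for e in emotions)
              for color, emotions in GROUPS.items()}
    total = sum(totals.values()) or 1.0   # guard against an all-zero response
    return {color: t / total for color, t in totals.items()}
```

Each returned proportion then becomes the width of that color's band in the gradient bar, so a half-neutral, half-happy face renders as a bar that is half green and half orange.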
4 Preliminary Test and Results
4.1 Test Description
A preliminary test was carried out to evaluate the usability of the interface. People do not want to see the emotions of everyone they chat with, and the design is unnecessary for chatting with less intimate people. But when people chat with someone who is emotionally important to them and video chat is inconvenient, the design can be quite meaningful, for example for long-distance lovers chatting online from public places.
Eight participants took part in the test: four pairs of lovers or couples who had been together for at least six months, long enough to guarantee a need for emotional communication between them. All participants ranged from 20 to 29 years old and were familiar with online chatting. None had defective color vision.
Each pair was asked to conduct three rounds of 10-minute online chatting. Before each round, each participant watched a prepared emotional video alone; the videos evoked their emotions and helped provide chatting content. During each round, participants stayed in different rooms (so their emotions were not visible to each other) and chatted online with each other. Figure 6 shows part of the test process.
Across the rounds, each pair used three different chatting methods: everyday emoji, stickers and the like; plain text only; and the new interface. The order of the three methods differed among the four pairs to minimize the influence of method order and chat timing. Additionally, before using the new interface, participants were asked to match the four colors to the four emotion groups by instinct, and we adjusted the system's color-emotion matching accordingly. Finally, a short training session before the test familiarized participants with the system.
After each round of chat, one participant was selected to choose the ten most emotional sentences from the chat content, and both participants then chose one or two types of emotion expressed in each sentence. Choosing two emotions meant the emotion was mixed, and they then had to indicate which emotion was primary and which secondary. After all rounds, every participant completed a questionnaire and took part in a short interview to give feedback on the design.
4.2 Evaluation
We evaluated the usability of the design by its effectiveness, efficiency and users' satisfaction. To assess effectiveness, we compared the scores of the three chatting methods. The score represents how well mixed emotions were communicated, measured by how well the lovers' answers matched. A match on the primary emotion was worth 2 points; a match on the secondary emotion, or the right pair of emotions in the wrong order, was worth 1 point; otherwise no points were given. This yields four possibilities per sentence: 3, 2, 1 or 0 points. For each method we summed the points awarded, and for each pair we compared the effectiveness of the different methods for communicating complex emotion. The left part of Fig. 7 shows the total score of each round and the proportion each score represents; the right part shows the average score of the four pairs, indicating the effectiveness of the design among these participants.
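The scoring rule above can be sketched as follows. The handling of sentences where a participant picked only one emotion is our assumption (the secondary slot is simply empty), as the paper does not detail that case.

```python
def score_pair(a, b):
    """Score one sentence: a and b are (primary, secondary) emotion tuples
    from the two partners; secondary may be None if only one emotion was chosen."""
    pts = 2 if a[0] == b[0] else 0            # primary emotions match: 2 points
    if a[1] is not None and a[1] == b[1]:
        pts += 1                              # secondary emotions match: +1 point
    if pts == 0 and a[1] is not None and {a[0], a[1]} == {b[0], b[1]}:
        pts = 1                               # same two emotions, only the order wrong
    return pts
```

A full match therefore scores 3, a primary-only match 2, a swapped pair 1, and anything else 0, matching the four possibilities described above.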
The questionnaire investigated the ease of use of the design, how involved participants were in the chat, whether the interface made communication more interesting, and whether they were satisfied with the design. Participants scored the statements below from 0 (totally disagree) to 5 (totally agree):

Compared to existing online chat methods, using the interface with the emotion bar made

- it easy to recognize my partner's emotions;
- me more involved in the online chat;
- our conversation more interesting;
- me satisfied with the design of the emotion bar.
In the interview, users could give personal comments on the design so that we could identify further problems. The research collected quantitative and qualitative data to judge the efficiency of the interface and users' satisfaction with it. Together, the three parts of the evaluation analyzed the usability of the design.
4.3 Results
In Fig. 7, Method 1 stands for normal chatting in which emoji, kaomoji and stickers are allowed, Method 2 represents chatting in plain text only, and Method 3 uses the new interface. Comparing the test data, we find that for lovers' chatting the new interface shows a slight advantage over the current everyday chatting method in the effectiveness of conveying mixed emotions. Additionally, chatting with the new interface was more effective than chatting in plain text for emotional communication. Therefore, for conveying mixed emotions among lovers, the design has been shown to be effective.
Figure 8 shows data on efficiency and satisfaction. For ease of use, the design scored 3.25 out of 5, suggesting it is relatively easy to use but still has weaknesses. One participant complained: "The placement of the bar makes me have to raise my head to see my darling's feelings." Another pointed out that it is a little confusing before one is familiar with the correspondence between colors and emotion types. These issues may make the design feel less easy to use. The involvement score was 3.5 out of 5, indicating that the design made users more involved in their conversations. The interest score was 4.13 out of 5, indicating that almost all participants thought the emotion bar made chatting more interesting; one said: "It is really fun to chat emotionally in that way!" Finally, the satisfaction score was 3.5 out of 5, showing that participants were generally satisfied with the design.
From the test, we found that users were keen to understand their partners' emotions, and most reported feeling better understood. They found the design acceptably easy to use and the emotion bar very interesting. They were more involved in conversations using the new interface because complex emotions could be expressed, and most were satisfied with the design. This preliminary test shows that the interface, which aims to support mixed-emotion communication, can be useful for lovers' chatting.
4.4 Limitations
Although the initial testing indicated that the concept is useful, there are some limitations of the current design, and the test could also be improved.
For the design, there are three problems. First, the current design only lets users match four colors to their emotions, which limits their choices; one user said that all four colors we provided looked soft and she could hardly feel a strong emotion like anger in them. Second, the frequency at which we detected users' emotions was too low because the send-and-return round trip took at least 2 s, delaying the real-time emotion display. Third, some participants argued that the emotion types provided were not suitable: for example, curiosity is expressed a lot in online chat but cannot be recognized by our system, whereas fear is rare in daily conversation.
The test also had limitations. The participants were limited to couples or lovers, and their number was small; it will be increased once the aforementioned issues are solved. Most importantly, emotion is a very complex object that is hard to judge accurately, so it was challenging to identify how well an emotion, or a mixed emotion, had been expressed. The usability test therefore still has room for improvement.
5 Future Vision
Although the test only shows usability among lovers and couples, we believe this interactive online chat interface can be applied in many scenarios and among different kinds of people, especially where emotion plays a vital role in chat, such as long-distance relationships, online education, online healthcare and online trading.
In future studies, more work will be carried out to fix the problems mentioned above, and more methods will be explored for expressing people's mixed emotions, making online chat more effective and interesting.
References
Nijdam, N.A.: Mapping emotion to color, pp. 2–9 (2009)
Naz, K., Epps, H.: Relationship between color and emotion: a study of college students. Coll. Student J. 38, 396 (2004)
Gao, X.P., Xin, J.H.: Investigation of human’s emotional responses on colors. Color Res. Appl. 31, 411–417 (2006)
Buechner, V.L., Maier, M.A., Lichtenfeld, S., Elliot, A.J.: Emotion expression and color: their joint influence on perceived attractiveness and social position. Curr. Psychol. 34, 422–433 (2015)
Valdez, P., Mehrabian, A.: Effects of color on emotions. J. Exp. Psychol. Gen. 123, 394 (1994)
Gao, X.P., Xin, J.H., Sato, T., Hansuebsai, A., Scalzo, M., Kajiwara, K., Guan, S.S., Valldeperas, J., Lis, M.J., Billger, M.: Analysis of cross-cultural color emotion. Color Res. Appl. 32, 223–229 (2007)
Kim, J., André, E.: Emotion recognition based on physiological changes in music listening. IEEE Trans. Pattern Anal. Mach. Intell. 30, 2067–2083 (2008)
Zeng, Z., Pantic, M., Roisman, G.I., Huang, T.S.: A survey of affect recognition methods: Audio, visual, and spontaneous expressions. IEEE Trans. Pattern Anal. Mach. Intell. 31, 39–58 (2009)
Edwards, J., Pattison, P.E., Jackson, H.J., Wales, R.J.: Facial affect and affective prosody recognition in first-episode schizophrenia. Schizophr. Res. 48, 235–253 (2001)
Khan, F.M., Fisher, T.A., Shuler, L., Wu, T., Pottenger, W.M.: Mining chat-room conversations for social and semantic interactions. Computer Science and Engineering, Lehigh University (2002)
Lee, C.-C., Katsamanis, A., Black, M.P., Baucom, B.R., Georgiou, P.G., Narayanan, S.S.: Affective state recognition in married couples’ interactions using PCA-based vocal entrainment measures with multiple instance learning. In: D’Mello, S., Graesser, A., Schuller, B., Martin, J.-C. (eds.) ACII 2011. LNCS, vol. 6975, pp. 31–41. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-24571-8_4
Dalili, M., Penton-Voak, I., Harmer, C., Munafò, M.: Meta-analysis of emotion recognition deficits in major depressive disorder. Psychol. Med. 45, 1135–1144 (2015)
Finkelstein, S.L., Nickel, A., Harrison, L., Suma, E.A., Barnes, T.: cMotion: A new game design to teach emotion recognition and programming logic to children using virtual humans. In: 2009 Virtual Reality Conference, VR 2009, pp. 249–250. IEEE (2009)
Ashwin, T., Jose, J., Raghu, G., Reddy, G.R.M.: An e-learning system with multifacial emotion recognition using supervised machine learning. In: 2015 IEEE Seventh International Conference on Technology for Education (T4E), pp. 23–26. IEEE (2015)
Yoon, H., Park, S.-W., Lee, Y.-K., Jang, J.-H.: Emotion recognition of serious game players using a simple brain computer interface. In: 2013 International Conference on ICT Convergence (ICTC), pp. 783–786. IEEE (2013)
Resch, B., Summa, A., Sagl, G., Zeile, P., Exner, J.-P.: Urban emotions—geo-semantic emotion extraction from technical sensors, human sensors and crowdsourced data. In: Gartner, G., Huang, H. (eds.) Progress in Location-Based Services 2014. LNGC, pp. 199–212. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-11879-6_14
Wang, H., Prendinger, H., Igarashi, T.: Communicating emotions in online chat using physiological sensors and animated text. In: CHI 2004 Extended Abstracts on Human Factors in Computing Systems, pp. 1171–1174. ACM (2004)
Bouguerra, B.: Real-time animations of emoticons using facial recognition during a video chat. Google Patents (2012)
Chen, S., Pande, A., Mohapatra, P.: Sensor-assisted facial recognition: an enhanced biometric authentication system for smartphones. In: Proceedings of the 12th Annual International Conference on Mobile Systems, Applications, and Services, pp. 109–122. ACM (2014)
Microsoft cognitive service. https://azure.microsoft.com/en-us/services/cognitive-services/
Russell, J.A.: A circumplex model of affect. J. Pers. Soc. Psychol. 39, 1161 (1980)
Acknowledgement
This paper was supported by Zhejiang Provincial Key Laboratory of Integration of Healthy Smart Kitchen System (Grant No: 2017F02) and the Fundamental Research Funds for the Central Universities of Shanghai Jiao Tong University (Grant No: 17JCYB07).
© 2018 Springer International Publishing AG, part of Springer Nature
Tang, N., Dong, Z., Liu, L. (2018). Expressing Mixed Emotions via Gradient Color: An Interactive Online Chat Interface Design Based on Affective Recognition. In: Kurosu, M. (eds) Human-Computer Interaction. Interaction Technologies. HCI 2018. Lecture Notes in Computer Science(), vol 10903. Springer, Cham. https://doi.org/10.1007/978-3-319-91250-9_15