Abstract
In this paper, we investigate an object-oriented (OO) architecture for multimodal emotion recognition in interactive applications running on mobile phones or handheld devices. Unlike desktop computers, mobile phones in our approach do not perform any emotion-recognition processing themselves; instead, they transmit all collected interaction data to a server, which carries out the recognition. The object-oriented architecture we have created combines evidence from multiple modalities of interaction, namely the mobile device's keyboard and microphone, as well as data from emotion stereotypes, and organizes this evidence into well-structured objects with their own properties and methods. The resulting emotion detection server can receive and handle multimodal data transmitted from different mobile sources during human-computer interaction. As a test bed for affective mobile interaction, we have used an educational application incorporated into the mobile system.
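The abstract describes the client-server split and the OO organization of modality evidence and stereotype data, but gives no implementation details. The following is a minimal illustrative sketch of how such a server-side object model might look; all class names, the linear weighted fusion rule, and the multiplicative use of stereotype priors are our own assumptions, not the authors' actual method.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional


@dataclass
class ModalityEvidence:
    """Evidence from one interaction modality, e.g. keyboard or microphone."""
    modality: str                # hypothetical identifier, e.g. "keyboard"
    scores: Dict[str, float]     # emotion -> confidence in [0, 1]


@dataclass
class UserStereotype:
    """Prior likelihoods of emotions for a category of users."""
    priors: Dict[str, float]     # emotion -> prior likelihood


class EmotionDetectionServer:
    """Sketch of a server object that fuses transmitted multimodal evidence."""

    def __init__(self, stereotype: UserStereotype,
                 weights: Optional[Dict[str, float]] = None):
        self.stereotype = stereotype
        self.weights = weights or {}   # per-modality weight, default 1.0

    def classify(self, evidence: List[ModalityEvidence]) -> str:
        """Return the emotion with the highest combined score."""
        emotions = set(self.stereotype.priors)
        for ev in evidence:
            emotions.update(ev.scores)
        combined = {}
        for emotion in emotions:
            # Weighted sum across modalities (illustrative fusion rule).
            fused = sum(self.weights.get(ev.modality, 1.0)
                        * ev.scores.get(emotion, 0.0)
                        for ev in evidence)
            # Scale by the stereotype prior (assumed combination scheme).
            combined[emotion] = self.stereotype.priors.get(emotion, 1.0) * fused
        return max(combined, key=combined.get)


# Example: the mobile client would transmit evidence like this to the server.
server = EmotionDetectionServer(UserStereotype({"anger": 0.5, "neutral": 0.5}))
result = server.classify([
    ModalityEvidence("keyboard", {"anger": 0.7, "neutral": 0.3}),
    ModalityEvidence("microphone", {"anger": 0.6, "neutral": 0.4}),
])
```

Here the server classifies the combined keyboard and microphone evidence as "anger" (0.5 × 1.3 = 0.65 versus 0.35 for "neutral"); in the actual system the fusion and stereotype models are more elaborate than this linear sketch.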
Alepis, E., Virvou, M. Multimodal object oriented user interfaces in mobile affective interaction. Multimed Tools Appl 59, 41–63 (2012). https://doi.org/10.1007/s11042-011-0744-y