
Multimodal object oriented user interfaces in mobile affective interaction

Published in: Multimedia Tools and Applications

Abstract

In this paper, we investigate an object-oriented (OO) architecture for multimodal emotion recognition in interactive applications on mobile phones or handheld devices. Mobile phones differ from desktop computers in that they do not perform any emotion-recognition processing themselves, whereas desktop computers can. In our approach, mobile phones therefore transmit all collected data to a server, which then performs the emotion recognition. The object-oriented architecture we have created combines evidence from multiple modalities of interaction, namely the mobile device's keyboard and microphone, as well as data from emotion stereotypes, and classifies this evidence into well-structured objects with their own properties and methods. The resulting emotion detection server is capable of using and handling information transmitted from different mobile sources of multimodal data during human-computer interaction. As a test bed for affective mobile interaction, we have used an educational application that is incorporated into the mobile system.
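The client-server split described above can be illustrated with a minimal Python sketch: modality evidence and stereotype data are modelled as objects with their own properties and methods, and a server-side classifier fuses them. All class and attribute names here (`ModalityEvidence`, `StereotypeProfile`, `EmotionDetectionServer`) and the simple weighted fusion rule are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class ModalityEvidence:
    """Evidence gathered on the mobile device from one modality
    (e.g. keyboard or microphone) and transmitted to the server."""
    modality: str
    emotion_scores: dict  # emotion label -> confidence in [0, 1]

@dataclass
class StereotypeProfile:
    """Prior weights over emotions derived from a user stereotype."""
    priors: dict  # emotion label -> prior weight

class EmotionDetectionServer:
    """Server-side component: the mobile device does no recognition
    itself; it only forwards evidence objects here for classification."""

    def classify(self, evidence_list, stereotype):
        # Sum per-emotion confidence across all modalities.
        combined = {}
        for ev in evidence_list:
            for emotion, score in ev.emotion_scores.items():
                combined[emotion] = combined.get(emotion, 0.0) + score
        # Weight the fused scores by the stereotype priors.
        for emotion in combined:
            combined[emotion] *= stereotype.priors.get(emotion, 1.0)
        # Return the most likely emotion label.
        return max(combined, key=combined.get)

# Example: keyboard and microphone evidence fused with a stereotype.
keyboard = ModalityEvidence("keyboard", {"anger": 0.6, "neutral": 0.4})
microphone = ModalityEvidence("microphone", {"anger": 0.7, "neutral": 0.3})
stereotype = StereotypeProfile({"anger": 1.2, "neutral": 0.8})
server = EmotionDetectionServer()
print(server.classify([keyboard, microphone], stereotype))  # -> anger
```

The design choice mirrors the paper's architecture at a high level: each modality is a self-contained object, so new modalities can be added without changing the server's fusion logic.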





Author information

Correspondence to Maria Virvou.


Cite this article

Alepis, E., Virvou, M. Multimodal object oriented user interfaces in mobile affective interaction. Multimed Tools Appl 59, 41–63 (2012). https://doi.org/10.1007/s11042-011-0744-y