DOI: 10.1145/2809643.2809647

A Multimodal Framework for Recognizing Emotional Feedback in Conversational Recommender Systems

Published: 16 September 2015

ABSTRACT

A conversational recommender system should interactively assist users in order to understand their needs and preferences and produce personalized recommendations accordingly. While traditional recommender systems use a single-shot approach, conversational ones refine their suggestions over the course of the conversation as they gain more knowledge about the user. This approach can be particularly useful when the recommender is embodied in a conversational agent acting as a shopping assistant in a smart retail context. In this case, knowledge about the user's preferences may be acquired both during the conversation and by observing the user's behavior. In such a setting, besides "rational" information, the agent may also capture extra-rational factors such as attitudes, emotions, likes and dislikes. This paper describes the study performed to develop a multimodal framework for recognizing the user's shopping attitude during the interaction with DIVA, a Dress-shopping InteractiVe Assistant. In particular, speech prosody, gestures and facial expressions are taken into account to provide feedback to the system and refine the recommendations accordingly.
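The framework itself is detailed in the paper; to make the fusion idea concrete, the short Python sketch below illustrates one possible late-fusion scheme under stated assumptions. The modality labels, valence and confidence scores, re-ranking rule and item fields are all hypothetical placeholders for illustration only, not the DIVA implementation.

    # Illustrative late-fusion sketch (hypothetical, not the DIVA implementation):
    # each modality analyser is assumed to return a valence score in [-1, 1] for
    # the current user reaction; a confidence-weighted average yields an overall
    # "shopping attitude" signal that is then used to re-rank candidate items.

    from dataclasses import dataclass

    @dataclass
    class ModalityReading:
        name: str          # e.g. "prosody", "gesture", "face" (assumed labels)
        valence: float     # estimated valence in [-1, 1]
        confidence: float  # reliability of this reading, in [0, 1]

    def fuse_attitude(readings):
        """Confidence-weighted average of per-modality valence scores."""
        total_weight = sum(r.confidence for r in readings)
        if total_weight == 0:
            return 0.0  # no reliable signal: treat attitude as neutral
        return sum(r.valence * r.confidence for r in readings) / total_weight

    def rerank(candidates, attitude, similarity_to_current):
        """Boost items similar to the currently shown one when attitude is
        positive, demote them when it is negative (hypothetical rule)."""
        return sorted(
            candidates,
            key=lambda item: item["score"] + attitude * similarity_to_current(item),
            reverse=True,
        )

    if __name__ == "__main__":
        readings = [
            ModalityReading("prosody", valence=0.4, confidence=0.7),
            ModalityReading("gesture", valence=-0.2, confidence=0.3),
            ModalityReading("face", valence=0.6, confidence=0.8),
        ]
        attitude = fuse_attitude(readings)
        items = [{"id": "dress-a", "score": 0.8, "style": "casual"},
                 {"id": "dress-b", "score": 0.7, "style": "formal"}]
        # Hypothetical similarity: items sharing the currently shown style get a boost.
        sim = lambda item: 1.0 if item["style"] == "casual" else 0.0
        print(attitude, [i["id"] for i in rerank(items, attitude, sim)])

In practice the per-modality scores would come from the prosody, gesture and facial-expression recognizers described in the paper, and the refinement step would act on the recommender's own candidate list rather than on the toy items shown here.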


Published in

EMPIRE '15: Proceedings of the 3rd Workshop on Emotions and Personality in Personalized Systems 2015
September 2015, 45 pages
ISBN: 978-1-4503-3615-4
DOI: 10.1145/2809643
Copyright © 2015 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher: Association for Computing Machinery, New York, NY, United States

Publication History: Published 16 September 2015