Abstract
Emotions play a central role in our daily lives, influencing the way we think and act and our health and sense of well-being, and film is the art form par excellence for engaging our affective, perceptual and intellectual activity, with the potential for significant impact. Video is becoming a dominant and pervasive medium, and online video a growing entertainment activity on the web and iTV, driven mainly by technological developments and trends toward media convergence. At the same time, new techniques for gathering emotional information about videos, whether through content analysis or through implicit feedback from users' physiological signals, complemented by manual labeling, are revealing new ways to explore emotional information in videos, films or TV series, and open new perspectives for enriching and personalizing video access. In this work, we reflect on the power that emotions have in our lives and on the emotional impact of movies, and we address this emotional dimension in the way we classify and access movies by exploring and evaluating the design of iFelt, in its different ways to classify, access, browse and visualize movies based on their emotional impact.
Acknowledgments
This work is partially supported by FCT through LASIGE Multiannual Funding and VIRUS research project (PTDC/EIA–EIA/101012/2008).
Cite this article
Oliveira, E., Martins, P. & Chambel, T. Accessing movies based on emotional impact. Multimedia Systems 19, 559–576 (2013). https://doi.org/10.1007/s00530-013-0303-7