ABSTRACT
Historically, research in Affective Computing has focused on recognizing emotions expressed by humans. In our work, we show that it is possible to predict the central tendency of emotional reactions to a stimulus directly, prior to its exposure to any human. This is achieved by training new Affect-Predictive machine learning models, which leverage the large volume of weak emotional signals in aggregated and fully anonymized reactions of online users to a wide variety of textual and visual stimuli. Based on our Affect-Predictive computer vision model, we (a) set a new benchmark for evaluating its predictive power on an open-access affective image set, (b) generate affective saliency maps, and (c) discuss a few instances of peculiar visual patterns learned by the model. We theorize that Affect-Predictive models can be used to learn implicit patterns that allow AI agents to see the world and react in a more human-like way: imagine an autonomous vehicle that slows down automatically when detecting something highly surprising or negative. Using our Affect-Predictive natural language model, we demonstrate that it is possible to predict the general emotional response to a piece of text from the reader's perspective, and we show how this can be used in practice to improve social listening. We conclude with a discussion of the broader implications of Affect-Predictive models: human emotional reactions can be treated as natural encoders of multimodal stimuli, capturing just enough semantics to allow instantaneous decision-making; the ability to automatically predict such reactions directly to stimuli opens up many new opportunities in the field of Affective Computing.
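The core idea above — treating aggregated, fully anonymized user reactions as weak labels whose central tendency a model learns to predict — can be sketched as follows. This is a minimal illustration only: the reaction vocabulary and the valence weights below are hypothetical assumptions, not the signal set or weighting actually used in the work.

```python
from collections import Counter

# Hypothetical mapping from reaction types to scalar valence weights;
# the abstract does not specify the actual reaction signals used.
REACTION_VALENCE = {"love": 1.0, "haha": 0.6, "wow": 0.2, "sad": -0.7, "angry": -1.0}

def central_tendency(reaction_counts):
    """Collapse raw, anonymized reaction counts for one stimulus into a
    single count-weighted mean valence — the kind of 'central tendency'
    target an Affect-Predictive model could be trained to regress."""
    total = sum(reaction_counts.values())
    if total == 0:
        return 0.0
    return sum(REACTION_VALENCE[r] * n for r, n in reaction_counts.items()) / total

def reaction_distribution(reaction_counts):
    """Normalized reaction distribution over one stimulus, usable as a
    soft classification target instead of a single scalar."""
    total = sum(reaction_counts.values())
    return {r: n / total for r, n in reaction_counts.items()}

# Example: aggregated reactions of many users to one stimulus.
counts = Counter({"love": 120, "haha": 40, "sad": 30, "angry": 10})
target = central_tendency(counts)  # count-weighted mean valence in [-1, 1]
```

Either target can then be paired with the stimulus itself (image or text) as a training example, which is what lets the model predict a reaction before any human has seen the stimulus.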
- Affect-Predictive Models: Predicting Emotional Responses Directly to Stimuli