Abstract
This paper proposes a method for quantitatively extracting human emotions by analyzing images of the surrounding environment captured in real time by a smartphone camera. In psychology, visual elements such as color and complexity are known to affect human emotions. Based on this knowledge, we developed an Android application that extracts emotions in real time from the colors and spatial complexity of camera images. Among the color components of an image, the hue component, which indicates the color itself, is extracted as the color feature, and spatial complexity is quantified from the high- and low-frequency components visible in the image. The corresponding emotion on a two-dimensional plane is then estimated using two support vector regression modules. The results show that the root-mean-square error between the estimated emotion and the subjectively evaluated emotion is approximately 0.36 on a plane whose axes range from −1 to +1.
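The two image features named in the abstract (hue as the color element, frequency content as spatial complexity) can be sketched roughly as follows. This is a minimal pure-Python illustration, not the paper's implementation: the paper works with OpenCV routines on live camera frames, whereas here `rgb_to_hue` reimplements the standard RGB-to-hue conversion and `spatial_complexity` uses a direct 2-D DFT with an assumed high/low frequency split and an energy-ratio score that are illustrative choices only.

```python
import math

def rgb_to_hue(r, g, b):
    """Convert an RGB triple (0-255) to a hue angle in degrees (0-360)."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    mx, mn = max(r, g, b), min(r, g, b)
    if mx == mn:          # achromatic pixel: hue is undefined, return 0
        return 0.0
    d = mx - mn
    if mx == r:
        return (60.0 * ((g - b) / d)) % 360.0
    elif mx == g:
        return 60.0 * ((b - r) / d) + 120.0
    return 60.0 * ((r - g) / d) + 240.0

def spatial_complexity(gray):
    """Score image complexity as the share of DFT magnitude in high frequencies.

    `gray` is a 2-D list of grayscale values. Frequencies beyond a quarter
    of the band are treated as "high" -- a hypothetical threshold, not the
    paper's exact criterion.
    """
    n, m = len(gray), len(gray[0])
    total = high = 0.0
    for u in range(n):
        for v in range(m):
            re = im = 0.0
            # Direct (slow) 2-D DFT; real systems would use an FFT.
            for x in range(n):
                for y in range(m):
                    ang = -2.0 * math.pi * (u * x / n + v * y / m)
                    re += gray[x][y] * math.cos(ang)
                    im += gray[x][y] * math.sin(ang)
            mag = math.hypot(re, im)
            total += mag
            fu, fv = min(u, n - u), min(v, m - v)  # fold to true frequency
            if fu > n // 4 or fv > m // 4:
                high += mag
    return high / total if total else 0.0
```

As a sanity check, a flat image scores near 0 (all energy at DC), while a checkerboard, whose energy sits at the highest spatial frequency, scores much higher; the two scores per frame would then feed the regression stage alongside the hue feature.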
Acknowledgements
This work was supported by an Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIT) (No. 2015-0-00312, The development of technology for social life logging based on analyzing social emotion and intelligence of convergence contents).
Cite this article
Hwang, H., Lee, E.C. Determining the parameters of emotion by analyzing environmental images captured by a mobile device. Multimed Tools Appl 78, 28375–28389 (2019). https://doi.org/10.1007/s11042-017-5342-1