
Determining the parameters of emotion by analyzing environmental images captured by a mobile device

Published in: Multimedia Tools and Applications

Abstract

This paper proposes a method to quantitatively extract human emotions by analyzing images of the surrounding environment captured by a smartphone camera in real time. In the field of psychology, visual elements such as color and complexity are known to affect human emotions. Based on this knowledge, we developed an application that extracts emotions in real time from the colors and spatial complexity of images obtained with Android smartphones. From the color components of each image, the hue component, which indicates the color itself, is extracted as the color feature, and spatial complexity is quantified from the relative amounts of high- and low-frequency components visible in the image. The corresponding two-dimensional emotion is then estimated with two support vector regression modules. The results show that the root-mean-square error between the estimated emotion and the subjectively evaluated emotion is approximately 0.36 on a plane whose axes range from −1 to +1.
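As a minimal sketch of the pipeline the abstract describes, the Python code below extracts a hue histogram (via the OpenCV cvtColor and split routines cited in the references) and a DFT-based spatial-complexity value, which would then feed two support vector regression modules, one per emotion axis. The 18-bin histogram, the radius_ratio cutoff, the RBF kernel, and the placeholder names X_train, y_arousal, and y_valence are illustrative assumptions, not the authors' reported parameters.

```python
import numpy as np
import cv2
from sklearn.svm import SVR


def extract_features(bgr_image, radius_ratio=0.25):
    """Hue histogram + spatial-complexity feature for one environment image.

    Spatial complexity is approximated as the share of DFT spectral energy
    lying outside a central low-frequency disc; the bin count and
    radius_ratio are illustrative choices, not the paper's parameters.
    """
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    hue, _, _ = cv2.split(hsv)
    hue_hist = cv2.calcHist([hue], [0], None, [18], [0, 180]).flatten()
    hue_hist /= hue_hist.sum() + 1e-9  # normalize so image size does not matter

    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY).astype(np.float32)
    dft = cv2.dft(gray, flags=cv2.DFT_COMPLEX_OUTPUT)   # (h, w, 2) real/imag planes
    dft = np.fft.fftshift(dft, axes=(0, 1))             # move the DC term to the center
    magnitude = np.sqrt(dft[..., 0] ** 2 + dft[..., 1] ** 2)

    h, w = magnitude.shape
    yy, xx = np.ogrid[:h, :w]
    low_freq = (yy - h / 2) ** 2 + (xx - w / 2) ** 2 <= (radius_ratio * min(h, w) / 2) ** 2
    complexity = magnitude[~low_freq].sum() / (magnitude.sum() + 1e-9)

    return np.concatenate([hue_hist, [complexity]])


# Two independent SVR modules, one per emotion axis, fitted on subjectively
# rated images whose scores lie on the -1..+1 axes. X_train, y_arousal, and
# y_valence are hypothetical placeholders for the training data.
# svr_arousal = SVR(kernel="rbf").fit(X_train, y_arousal)
# svr_valence = SVR(kernel="rbf").fit(X_train, y_valence)
```

Using one regressor per axis mirrors the two-dimensional emotion model; predictions from the two modules can then be compared against subjective ratings with the root-mean-square error reported above.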




Acknowledgements

This work was supported by an Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIT) (No. 2015-0-00312, The development of technology for social life logging based on analyzing social emotion and intelligence of convergence contents).

Author information

Corresponding author: Eui Chul Lee.


Cite this article

Hwang, H., Lee, E.C. Determining the parameters of emotion by analyzing environmental images captured by a mobile device. Multimed Tools Appl 78, 28375–28389 (2019). https://doi.org/10.1007/s11042-017-5342-1
