Abstract
In daily life, people are continuously exposed to their surrounding environment, and their emotions are strongly affected by its visual, spatial, and auditory information. In this paper, we propose a method for analyzing how sound information in the surrounding environment, specifically amplitude and frequency, affects human emotion. For the experiments, a total of 1,500 video clips of surrounding environments were acquired by 15 subjects using the built-in cameras of their smartphones, and a subjective evaluation was performed after each recording. Two features, the amplitude and the frequency of the sound data, were extracted from each clip. We then designed an SVR-based inference model, dividing the data into training and test sets, and trained the SVR to map the extracted two-dimensional features to the subjective evaluation scores for the pleasure and arousal levels. As a result, we confirmed that the estimated two-dimensional emotions were similar to the subjectively evaluated ones, with errors of about 0.27 for the pleasure level (−1 to +1) and 0.32 for the arousal level (−1 to +1), respectively.
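The pipeline described above (two sound features per clip, one SVR per emotion dimension, a train/test split) can be sketched as follows. This is a minimal illustration, not the authors' code: the data here are synthetic placeholders standing in for the 1,500 annotated clips, and the feature normalization, kernel, and hyperparameters are assumptions.

```python
# Sketch of the paper's approach: regress pleasure and arousal scores
# (each in [-1, +1]) from 2-D sound features [amplitude, frequency]
# with support vector regression. Synthetic data for illustration only.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.uniform(size=(1500, 2))  # normalized [amplitude, frequency] per clip

# Placeholder "subjective" scores: any noisy function of the features.
y_pleasure = np.clip(X[:, 0] - X[:, 1] + rng.normal(0, 0.1, 1500), -1, 1)
y_arousal = np.clip(X[:, 0] + X[:, 1] - 1 + rng.normal(0, 0.1, 1500), -1, 1)

# Divide the data into training and test sets, as in the paper.
n_train = 1000
models, errors = {}, {}
for name, y in [("pleasure", y_pleasure), ("arousal", y_arousal)]:
    m = SVR(kernel="rbf", C=1.0, epsilon=0.1)  # assumed hyperparameters
    m.fit(X[:n_train], y[:n_train])
    errors[name] = mean_absolute_error(y[n_train:], m.predict(X[n_train:]))
    models[name] = m
    print(f"{name} test MAE: {errors[name]:.2f}")
```

One independent SVR per emotion dimension is the simplest reading of "two-dimensional" estimation; a joint multi-output regressor would be an equally plausible design.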
Acknowledgements
This work was supported by Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIT) (No. 2015-0-00312, The development of technology for social life logging based on analyzing social emotion and intelligence of convergence contents).
Copyright information
© 2018 Springer Nature Singapore Pte Ltd.
Cite this paper
Park, M.W., Hwang, H., Lee, E.C. (2018). Correlation Analysis Between Environmental Sound and Human Emotion. In: Park, J., Loia, V., Yi, G., Sung, Y. (eds) Advances in Computer Science and Ubiquitous Computing. CUTE 2017, CSA 2017. Lecture Notes in Electrical Engineering, vol 474. Springer, Singapore. https://doi.org/10.1007/978-981-10-7605-3_214
Print ISBN: 978-981-10-7604-6
Online ISBN: 978-981-10-7605-3