
Learning user-emotion and user-feature couplings for image emotion classification

Published in: Multimedia Tools and Applications

Abstract

Over the past few years, image emotion classification (IEC) has received increasing research interest. Existing works usually formulate IEC as a multi-class classification problem that maps features to emotions, while the subjectivity of user perception is often ignored. However, our experimental study shows that there are coupling relationships between users and emotions, as well as between users and features. To address this issue, we propose a new IEC model, called CoupledIEC. In CoupledIEC, to capture the user-emotion coupling, a clustering-based embedding model is proposed that encodes users with similar emotion preferences into close representations. To model the user-feature coupling, a convolutional neural network-based coupling learning model is developed, in which the Hadamard product and the matrix product are employed to capture the explicit and implicit user-feature coupling information, respectively. The two models are then integrated into a unified neural network. Experimental results on a real-world image collection demonstrate that IEC performance can be improved significantly by taking user-emotion and user-feature couplings into account.
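The distinction the abstract draws between explicit and implicit user-feature coupling can be illustrated with a small sketch. This is not the paper's implementation; the dimensions, variable names, and the final flatten-and-concatenate fusion step are illustrative assumptions. The point is only the algebraic contrast: the Hadamard product keeps per-dimension user-feature interactions, while the outer (matrix) product exposes all cross-dimension interactions.

```python
import numpy as np

# Hypothetical embedding size for illustration (not taken from the paper)
d = 8

rng = np.random.default_rng(0)
user_emb = rng.normal(size=(d,))   # user embedding (clustering-based in CoupledIEC)
img_feat = rng.normal(size=(d,))   # CNN image feature vector

# Explicit coupling: element-wise (Hadamard) product pairs each user
# dimension only with the matching feature dimension.
explicit = user_emb * img_feat           # shape (d,)

# Implicit coupling: outer (matrix) product pairs every user dimension
# with every feature dimension, capturing cross-dimension interactions.
implicit = np.outer(user_emb, img_feat)  # shape (d, d)

# In a full model both signals would feed a classifier head; here we
# simply flatten and concatenate them as a stand-in fusion step.
fused = np.concatenate([explicit, implicit.ravel()])  # shape (d + d*d,)
```

Note that the Hadamard product is just the diagonal of the outer product, which is one way to see why the paper treats the matrix product as carrying the additional, implicit interaction information.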


[Figures 1–6 appear in the full text of the article.]


Notes

  1. Synonyms were obtained from http://www.thesaurus.com/browse/synonym


Acknowledgements

This work was supported by the National Natural Science Foundation of China (No. 61972035 and No. U19B2020).

Corresponding author

Correspondence to Yonggang Huang.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Huang, Y., Zheng, Y. & Wu, H. Learning user-emotion and user-feature couplings for image emotion classification. Multimed Tools Appl 81, 32739–32754 (2022). https://doi.org/10.1007/s11042-022-12867-3
