
Employing Contrastive Strategies for Multi-label Textual Emotion Recognition

  • Conference paper
Intelligent Information Processing XI (IIP 2022)

Abstract

Textual emotion recognition is an important part of human-computer interaction. Current approaches mainly fine-tune large-scale pre-trained language models, but the sentence representations these methods produce are not semantically accurate enough. Contrastive learning has been shown to improve how vectors are arranged in the feature space, so we introduce contrastive strategies into the textual emotion recognition task. We propose two approaches: applying self-supervised contrastive learning before fine-tuning the pre-trained model, and applying contrastive training on the same inputs during fine-tuning. We experiment on two multi-label emotion classification datasets, Ren-CECps and NLPCC2018. The experimental results demonstrate that the latter contrastive approach effectively improves the accuracy of emotion recognition.
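To make the second strategy concrete, the sketch below shows one plausible way to implement contrastive training on the same inputs during fine-tuning: each batch is passed through the encoder twice, dropout makes the two stochastic passes differ, and a symmetric KL consistency term pulls the two predicted multi-label distributions together (in the spirit of R-Drop and SimCSE, which use dropout as the augmentation). This is a hedged illustration, not the authors' released code; the model name (bert-base-chinese), label count, and loss weight are assumptions for demonstration only.

```python
"""Hypothetical sketch of R-Drop-style contrastive fine-tuning for
multi-label emotion recognition. Not the paper's official code."""
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

NUM_LABELS = 8     # e.g. the emotion categories of Ren-CECps (assumption)
KL_WEIGHT = 1.0    # weight of the consistency term (assumption)

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
encoder = AutoModel.from_pretrained("bert-base-chinese")
classifier = torch.nn.Linear(encoder.config.hidden_size, NUM_LABELS)


def forward_logits(batch):
    """One stochastic forward pass; dropout is active in train mode."""
    hidden = encoder(**batch).last_hidden_state[:, 0]  # [CLS] vector
    return classifier(hidden)


def contrastive_finetune_loss(batch, labels):
    """Multi-label BCE on both passes plus a symmetric KL consistency term."""
    logits1 = forward_logits(batch)
    logits2 = forward_logits(batch)  # same input, different dropout mask

    bce = (F.binary_cross_entropy_with_logits(logits1, labels)
           + F.binary_cross_entropy_with_logits(logits2, labels))

    p1, p2 = torch.sigmoid(logits1), torch.sigmoid(logits2)

    def bern_kl(p, q, eps=1e-7):
        # KL divergence between per-label Bernoulli distributions.
        p, q = p.clamp(eps, 1 - eps), q.clamp(eps, 1 - eps)
        return (p * (p / q).log() + (1 - p) * ((1 - p) / (1 - q)).log()).mean()

    kl = 0.5 * (bern_kl(p1, p2) + bern_kl(p2, p1))
    return bce + KL_WEIGHT * kl


# Minimal usage example with a toy batch and toy multi-label targets.
texts = ["今天的心情非常好", "这部电影让人又气又伤心"]
batch = tokenizer(texts, padding=True, return_tensors="pt")
labels = torch.zeros(len(texts), NUM_LABELS)
labels[0, 0] = 1.0
encoder.train()
classifier.train()
loss = contrastive_finetune_loss(batch, labels)
loss.backward()
```

The first strategy described in the abstract (self-supervised contrastive learning before fine-tuning) would instead contrast the two dropout views of each sentence against other sentences in the batch, as in SimCSE, before the classifier head is attached.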



Acknowledgements

This research has been supported by JSPS KAKENHI Grant Number 19K20345 and Grant Number 19H04215.

Author information


Corresponding author

Correspondence to Fuji Ren.


Copyright information

© 2022 IFIP International Federation for Information Processing

About this paper

Cite this paper

Zhou, Y., Kang, X., Ren, F. (2022). Employing Contrastive Strategies for Multi-label Textual Emotion Recognition. In: Shi, Z., Zucker, JD., An, B. (eds) Intelligent Information Processing XI. IIP 2022. IFIP Advances in Information and Communication Technology, vol 643. Springer, Cham. https://doi.org/10.1007/978-3-031-03948-5_24

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-03948-5_24

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-03947-8

  • Online ISBN: 978-3-031-03948-5

  • eBook Packages: Computer Science, Computer Science (R0)
