DOI: 10.1145/3555776.3577689

REFORMIST: Hierarchical Attention Networks for Multi-Domain Sentiment Classification with Active Learning

Published: 07 June 2023

Abstract

In multi-domain sentiment classification, the classifier is trained on a source domain that comprises multiple domains and is tested on a target domain, where the target domain is one of the domains included in the source. The primary assumption is that none of the domains has sufficient labeled data, a realistic scenario, and that knowledge is transferred among the domains. In real applications, domains are unbalanced: some have much less labeled data than others, and labeling them manually would require domain experts and considerable time, which can induce tremendous costs. This work proposes the REFORMIST approach, which uses transfer learning and is based on hierarchical attention with BiLSTMs while incorporating FastText word embeddings. The transfer learning approach in this work assumes that much of the available data is unlabeled, selecting only a portion of the domain-specific training set. Two approaches were followed for data sampling: in the first, data is randomly sampled from the data pool, while the second applies active learning to query the most informative observations. First, a general classifier is trained on all domains. Second, the general model transfers knowledge to the domain-specific classifiers, using the general model's trained weights as a starting point. Three different transfer approaches were evaluated, and the experiments showed that the sentence-level transfer learning approach yields the best results. In this approach, the transferred weights of the word-level layers are not updated during training, as opposed to the weights of the sentence-level layers.
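
The abstract describes two mechanisms: (i) a hierarchical attention network over BiLSTMs whose word-level layers are frozen when the general model's weights are transferred to a domain-specific classifier, and (ii) an active-learning step that queries the most informative unlabeled observations (entropy sampling, per the author tags). The PyTorch sketch below is a minimal illustration of both ideas, not the authors' implementation; all layer sizes, module names, and the (index, batch) pool-loader convention are assumptions made for the example.

```python
# Minimal sketch (assumptions, not the authors' code): hierarchical attention
# with BiLSTMs, sentence-level transfer (word-level layers frozen), and an
# entropy-based active-learning query over an unlabeled pool.
import copy

import torch
import torch.nn as nn


class AttentionPool(nn.Module):
    """Additive attention pooling over a sequence of hidden states."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, h):                                  # h: (batch, seq, hidden)
        scores = self.context(torch.tanh(self.proj(h)))    # (batch, seq, 1)
        weights = torch.softmax(scores, dim=1)
        return (weights * h).sum(dim=1)                    # (batch, hidden)


class HierAttnClassifier(nn.Module):
    """Word-level BiLSTM + attention feeding a sentence-level BiLSTM + attention."""

    def __init__(self, vocab_size, emb_dim=300, hidden=64, num_classes=2):
        super().__init__()
        # Word level: in the paper's setting the embedding would be initialized
        # from pre-trained FastText vectors; here it is randomly initialized.
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        self.word_lstm = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)
        self.word_attn = AttentionPool(2 * hidden)
        # Sentence level: these layers stay trainable in the domain-specific models.
        self.sent_lstm = nn.LSTM(2 * hidden, hidden, bidirectional=True, batch_first=True)
        self.sent_attn = AttentionPool(2 * hidden)
        self.out = nn.Linear(2 * hidden, num_classes)

    def forward(self, docs):                               # docs: (batch, n_sents, n_words) ids
        b, s, w = docs.shape
        h, _ = self.word_lstm(self.embedding(docs.reshape(b * s, w)))
        sent_vecs = self.word_attn(h).reshape(b, s, -1)    # one vector per sentence
        h, _ = self.sent_lstm(sent_vecs)
        return self.out(self.sent_attn(h))                 # document-level logits


def make_domain_model(general_model):
    """Copy the general (all-domain) model and freeze its word-level layers,
    so only the sentence-level layers are updated during domain-specific training."""
    model = copy.deepcopy(general_model)
    for module in (model.embedding, model.word_lstm, model.word_attn):
        for p in module.parameters():
            p.requires_grad_(False)
    return model


def entropy_query(model, pool_loader, k):
    """Rank unlabeled documents by predictive entropy and return the indices of the
    k most uncertain ones (the pool_loader is assumed to yield (index, batch) pairs)."""
    model.eval()
    ents, idxs = [], []
    with torch.no_grad():
        for idx, docs in pool_loader:
            probs = torch.softmax(model(docs), dim=-1)
            ents.append(-(probs * probs.clamp_min(1e-12).log()).sum(dim=-1))
            idxs.append(idx)
    ents, idxs = torch.cat(ents), torch.cat(idxs)
    return idxs[ents.topk(k).indices]
```

In the sentence-level transfer setting the abstract describes, only the sentence-level LSTM, sentence attention, and output layer of the domain-specific copy receive gradient updates, while the word-level layers retain the general model's weights.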


Cited By

  • (2024) Data-Driven Analysis for Monitoring Software Evolution. New Trends in Database and Information Systems, 383-391. https://doi.org/10.1007/978-3-031-70421-5_36. Online publication date: 14 November 2024.

Published In

SAC '23: Proceedings of the 38th ACM/SIGAPP Symposium on Applied Computing
March 2023
1932 pages
ISBN:9781450395175
DOI:10.1145/3555776
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 07 June 2023


Author Tags

  1. multi-domain sentiment classification
  2. transfer learning
  3. active learning
  4. entropy sampling
  5. hierarchical attention
  6. BiLSTM

Qualifiers

  • Research-article

Conference

SAC '23

Acceptance Rates

Overall Acceptance Rate 1,650 of 6,669 submissions, 25%


Article Metrics

  • Downloads (last 12 months): 13
  • Downloads (last 6 weeks): 2
Reflects downloads up to 06 Jan 2025

