ABSTRACT
In multi-domain sentiment classification, a classifier is trained on a source set comprising multiple domains and tested on a target domain that is one of those source domains. The central assumption is that no single domain has sufficient labeled data, a realistic scenario, and that knowledge can be transferred among the domains. In real applications, domains are unbalanced: some have far less labeled data than others, and labeling them manually requires domain experts and considerable time, which can induce tremendous costs. This work proposes the REFORMIST approach, which uses transfer learning and is based on Hierarchical Attention networks with BiLSTMs, incorporating FastText word embeddings. The transfer learning approach in this work assumes that much of the available data is unlabeled, selecting only a portion of each domain-specific training set. Two approaches were followed for data sampling: in the first, data is randomly sampled from the data pool, while the second applies Active Learning to query the most informative observations. Training proceeds in two stages. First, a general classifier is trained on all domains. Second, the general model transfers knowledge to the domain-specific classifiers, which use the general model's trained weights as a starting point. Three transfer strategies were evaluated, and the experiments showed that the sentence-level transfer learning approach yields the best results. In this approach, the transferred weights of the word-level layers are frozen throughout training, whereas the weights of the sentence-level layers are updated.
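The sentence-level transfer strategy described above can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: the function and variable names (`make_general_model`, `transfer_and_finetune`, the flat weight lists) are hypothetical stand-ins, and the actual model is a Hierarchical Attention Network with BiLSTM word- and sentence-level encoders.

```python
import random

def make_general_model(seed=0):
    """Stand-in for the general classifier trained on all domains:
    two parameter groups, one per level of the hierarchy."""
    rng = random.Random(seed)
    return {
        "word_level": [rng.uniform(-1, 1) for _ in range(4)],
        "sentence_level": [rng.uniform(-1, 1) for _ in range(4)],
    }

def transfer_and_finetune(general, sentence_grads, lr=0.1):
    """Initialize a domain-specific model from the general model's
    weights, then update only the sentence-level layers; the
    transferred word-level weights stay frozen."""
    model = {name: list(weights) for name, weights in general.items()}
    for i, g in enumerate(sentence_grads):
        model["sentence_level"][i] -= lr * g  # one toy gradient step
    # "word_level" is intentionally left untouched (frozen).
    return model

general = make_general_model()
fake_grads = [0.5, -0.5, 0.25, -0.25]  # placeholder gradients
domain_model = transfer_and_finetune(general, fake_grads)
```

In a deep-learning framework the same effect is typically achieved by disabling gradients on the word-level parameters (e.g., setting `requires_grad = False` in PyTorch) before fine-tuning on the domain-specific subset.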
Index Terms
- REFORMIST: Hierarchical Attention Networks for Multi-Domain Sentiment Classification with Active Learning