Abstract
Text classification is one of the fundamental tasks in natural language processing. CNN-based and RNN-based approaches have shown different capabilities in representing a piece of text. In this paper, we propose a hybrid CNN-RNN attention-based neural network, named CRAN, which effectively combines a convolutional neural network and a recurrent neural network with the help of an attention mechanism. We validate the proposed model on several large-scale datasets (i.e., eight multi-class and five multi-label text classification tasks) and compare it with state-of-the-art models. Experimental results show that CRAN achieves state-of-the-art performance on most of the datasets. In particular, CRAN yields better performance with far fewer parameters than a very deep convolutional network with 29 layers, which demonstrates its effectiveness and efficiency.
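To make the high-level description concrete, the following is a minimal illustrative sketch, assuming PyTorch, of how a CNN branch and an RNN branch can be fused per token and pooled with a learned attention layer for text classification. The class name HybridCnnRnnAttention, the single convolution, the GRU, and all layer sizes are placeholders chosen for illustration; this is not the authors' exact CRAN architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HybridCnnRnnAttention(nn.Module):
    """Illustrative hybrid CNN-RNN model with attention (not the authors' CRAN)."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128,
                 num_filters=128, kernel_size=3, num_classes=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # CNN branch: extracts local n-gram features at every position.
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size,
                              padding=kernel_size // 2)
        # RNN branch: captures sequential context in both directions.
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                          bidirectional=True)
        # Attention: scores each position from the fused representation.
        self.att = nn.Linear(num_filters + 2 * hidden_dim, 1)
        self.fc = nn.Linear(num_filters + 2 * hidden_dim, num_classes)

    def forward(self, tokens):                     # tokens: (batch, seq_len)
        x = self.embed(tokens)                     # (batch, seq_len, embed_dim)
        c = F.relu(self.conv(x.transpose(1, 2)))   # (batch, num_filters, seq_len)
        c = c.transpose(1, 2)                      # (batch, seq_len, num_filters)
        r, _ = self.rnn(x)                         # (batch, seq_len, 2*hidden_dim)
        h = torch.cat([c, r], dim=-1)              # fuse both branches per token
        weights = torch.softmax(self.att(h).squeeze(-1), dim=1)   # (batch, seq_len)
        doc = torch.bmm(weights.unsqueeze(1), h).squeeze(1)       # attention-weighted sum
        return self.fc(doc)                        # class logits

# Example usage with random token ids:
# model = HybridCnnRnnAttention(vocab_size=20000)
# logits = model(torch.randint(1, 20000, (8, 50)))

The design choice the sketch illustrates is that attention weights are computed from the concatenated CNN and RNN features, so both local n-gram evidence and sequential context decide which positions contribute to the document representation.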
Notes
- 1.
We omit some relevant models because their code is not available.
References
Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly learning to align and translate. CoRR abs/1409.0473 (2014)
Blanzieri, E., Bryl, A.: A survey of learning-based techniques of email spam filtering. Artif. Intell. Rev. 29(1), 63–92 (2008)
Cai, R., Zhang, X., Wang, H.: Bidirectional recurrent convolutional neural network for relation classification. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016, August 7–12, 2016, Berlin, Germany, Volume 1: Long Papers (2016), http://aclweb.org/anthology/P/P16/P16-1072.pdf
Carrier, P.L., Cho, K.: LSTM networks for sentiment analysis. Deep Learning Tutorials (2014)
Cho, K., van Merrienboer, B., Gülçehre, Ç., Bougares, F., Schwenk, H., Bengio, Y.: Learning phrase representations using RNN encoder-decoder for statistical machine translation. CoRR abs/1406.1078 (2014), http://arxiv.org/abs/1406.1078
Chorowski, J., Bahdanau, D., Serdyuk, D., Cho, K., Bengio, Y.: Attention-based models for speech recognition. In: Advances in Neural Information Processing Systems. pp. 577–585 (2015)
Conneau, A., Schwenk, H., Barrault, L., LeCun, Y.: Very deep convolutional networks for natural language processing. CoRR abs/1606.01781 (2016)
Hermann, K.M., Kociský, T., Grefenstette, E., Espeholt, L., Kay, W., Suleyman, M., Blunsom, P.: Teaching machines to read and comprehend. CoRR abs/1506.03340 (2015)
Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Computation 9(8), 1735–1780 (1997)
Joachims, T.: Text categorization with support vector machines: Learning with many relevant features. In: Nédellec, C., Rouveirol, C. (eds.) ECML 1998. LNCS, vol. 1398, pp. 137–142. Springer, Heidelberg (1998). https://doi.org/10.1007/BFb0026683
Kalchbrenner, N., Grefenstette, E., Blunsom, P.: A convolutional neural network for modelling sentences. In: ACL (1). pp. 655–665. The Association for Computer Linguistics (2014)
Kim, Y.: Convolutional neural networks for sentence classification. In: EMNLP. pp. 1746–1751 (2014)
Kingma, D.P., Ba, J.: Adam: A method for stochastic optimization. CoRR abs/1412.6980 (2014), http://dblp.uni-trier.de/db/journals/corr/corr1412.html#KingmaB14
Lai, S., Xu, L., Liu, K., Zhao, J.: Recurrent convolutional neural networks for text classification. In: AAAI. pp. 2267–2273. AAAI Press (2015)
Maas, A.L., Daly, R.E., Pham, P.T., Huang, D., Ng, A.Y., Potts, C.: Learning word vectors for sentiment analysis. In: HLT-NAACL. pp. 142–150 (2011)
Mikolov, T., Sutskever, I., Chen, K., Corrado, G., Dean, J.: Distributed representations of words and phrases and their compositionality. In: NIPS. pp. 3111–3119 (2013)
Pang, B., Lee, L., Vaithyanathan, S.: Thumbs up? Sentiment classification using machine learning techniques. In: EMNLP. pp. 79–86 (July 2002)
Song, J., Guo, Z., Gao, L., Liu, W., Zhang, D., Shen, H.T.: Hierarchical LSTM with adjusted temporal attention for video captioning. CoRR abs/1706.01231 (2017), http://arxiv.org/abs/1706.01231
Tang, D., Qin, B., Liu, T.: Document modeling with gated recurrent neural network for sentiment classification. In: EMNLP. pp. 1422–1432 (2015)
Wang, S.I., Manning, C.D.: Baselines and bigrams: Simple, good sentiment and topic classification. In: ACL (2). pp. 90–94. The Association for Computer Linguistics (2012)
Xiao, Y., Cho, K.: Efficient character-level document classification by combining convolution and recurrent layers. CoRR abs/1602.00367 (2016)
Xu, K., Ba, J., Kiros, R., Cho, K., Courville, A., Salakhutdinov, R., Zemel, R., Bengio, Y.: Show, attend and tell: Neural image caption generation with visual attention. In: ICML. pp. 2048–2057 (2015)
Yang, Z., Yang, D., Dyer, C., He, X., Smola, A.J., Hovy, E.H.: Hierarchical attention networks for document classification. In: HLT-NAACL. pp. 1480–1489 (2016)
Zhang, D., Wang, D.: Relation classification via recurrent neural network. CoRR abs/1508.01006 (2015)
Zhang, X., Zhao, J.J., LeCun, Y.: Character-level convolutional networks for text classification. In: NIPS. pp. 649–657 (2015)
Zhou, P., Qi, Z., Zheng, S., Xu, J., Bao, H., Xu, B.: Text classification improved by integrating bidirectional LSTM with two-dimensional max pooling. In: Calzolari, N., Matsumoto, Y., Prasad, R. (eds.) COLING. pp. 3485–3495. ACL (2016), http://dblp.uni-trier.de/db/conf/coling/coling2016.html#ZhouQZXBX16
Zhou, P., Shi, W., Tian, J., Qi, Z., Li, B., Hao, H., Xu, B.: Attention-based bidirectional long short-term memory networks for relation classification. In: ACL (2016)
Acknowledgments
This work is supported in part by the National Natural Science Foundation of China under Grants No. 61702016, 61602087, 61632007, and 61572039, the China Postdoctoral Science Foundation under Grant No. 2017M610019, the 973 Program under Grant No. 2014CB340405, the Shenzhen Government Research Project JCYJ20151014093505032, and the Fundamental Research Funds for the Central Universities under Grants No. ZYGX2016J080 and ZYGX2014Z007.
Copyright information
© 2018 Springer Nature Switzerland AG
About this paper
Cite this paper
Guo, L., Zhang, D., Wang, L., Wang, H., Cui, B. (2018). CRAN: A Hybrid CNN-RNN Attention-Based Model for Text Classification. In: Trujillo, J., et al. Conceptual Modeling. ER 2018. Lecture Notes in Computer Science(), vol 11157. Springer, Cham. https://doi.org/10.1007/978-3-030-00847-5_42
DOI: https://doi.org/10.1007/978-3-030-00847-5_42
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-00846-8
Online ISBN: 978-3-030-00847-5
eBook Packages: Computer Science, Computer Science (R0)