Abstract
In recent years, machine reading comprehension has become an increasingly popular research topic. Promising results have been obtained when the machine reading comprehension task had only two inputs: context and query. In this paper, we propose a capsule-network-based model for the Chinese opinion machine reading comprehension task, which has three inputs: context, query, and alternatives. First, we encode the three inputs with a bi-directional LSTM. Second, we model the complex interactions between context and query with a multiway attention layer: in addition to the attention mechanism used in BiDAF, two further attention functions are designed to capture the relationships between the inputs. Finally, we present a capsule network layer that routes to the correct alternative. Specifically, we use two strategies to improve the dynamic routing process by filtering out noisy capsules, which may carry useless information such as stop words. Our single model achieves competitive results compared to the baseline methods on a Chinese dataset, with a significant improvement of 2.45% in accuracy.
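Since the abstract describes routing alternatives through a capsule layer but does not spell out the two noise-filtering strategies, the following is a minimal NumPy sketch of the vanilla routing-by-agreement procedure from Sabour et al. (2017) that the model builds on. The function names, tensor shapes, and iteration count are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Squashing non-linearity (Sabour et al., 2017): preserves the
    vector's orientation and maps its length into [0, 1)."""
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, num_iters=3):
    """Vanilla routing-by-agreement (baseline, without the paper's
    noise-filtering strategies).

    u_hat: prediction vectors of shape (num_in, num_out, dim); in this
    task num_out would be the number of answer alternatives.
    Returns the output capsules v of shape (num_out, dim).
    """
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))  # routing logits
    for _ in range(num_iters):
        # Coupling coefficients: softmax over output capsules.
        c = np.exp(b - b.max(axis=1, keepdims=True))
        c /= c.sum(axis=1, keepdims=True)
        s = (c[:, :, None] * u_hat).sum(axis=0)   # weighted sum per output capsule
        v = squash(s)                              # squashed output capsules
        b += (u_hat * v[None, :, :]).sum(axis=-1)  # raise logits by agreement
    return v

# Toy usage: route 6 input capsules to 3 alternatives with 8-dim capsules.
v = dynamic_routing(np.random.randn(6, 3, 8))
print(v.shape)  # (3, 8)
```

In the paper's setting, each output capsule would correspond to one alternative, and the filtering strategies would presumably down-weight input capsules derived from uninformative tokens such as stop words before or during this loop.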
Notes
- 1.
This dataset was published by AI-Challenger2018 and is available at: challenger.ai/competition/oqmrc2018.
- 2.
The GitHub URL is: https://github.com/NLPLearn/QANet.
- 3.
This release can be found at: https://github.com/baidu/DuReader/tree/master/tensorflow.
- 4.
The GitHub URL is: https://github.com/NLPLearn/R-net.
References
Devlin, J., et al.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv preprint arXiv:1810.04805 (2018)
Seo, M., et al.: Bidirectional attention flow for machine comprehension. arXiv preprint arXiv:1611.01603 (2016)
Wang, W., et al.: R-NET: machine reading comprehension with self-matching networks. Natural Language Computing Group, Microsoft Research Asia, Beijing, China, Technical Report 5 (2017)
Yu, A.W., et al.: QANet: combining local convolution with global self-attention for reading comprehension. arXiv preprint arXiv:1804.09541 (2018)
Hochreiter, S., Schmidhuber, J.: Long short-term memory. Neural Comput. 9(8), 1735–1780 (1997)
Chen, D., Bolton, J., Manning, C.D.: A thorough examination of the CNN/Daily Mail reading comprehension task. arXiv preprint arXiv:1606.02858 (2016)
Wang, S., Jiang, J.: A compare-aggregate model for matching text sequences. arXiv preprint arXiv:1611.01747 (2016)
Sabour, S., Frosst, N., Hinton, G.E.: Dynamic routing between capsules. In: Advances in Neural Information Processing Systems (2017)
Zhao, W., et al.: Investigating capsule networks with dynamic routing for text classification. arXiv preprint arXiv:1804.00538 (2018)
Rajpurkar, P., et al.: SQuAD: 100,000+ questions for machine comprehension of text. arXiv preprint arXiv:1606.05250 (2016)
Hermann, K.M., et al.: Teaching machines to read and comprehend. In: Advances in Neural Information Processing Systems (2015)
Hewlett, D., et al.: WikiReading: a novel large-scale language understanding task over Wikipedia. arXiv preprint arXiv:1608.03542 (2016)
He, W., et al.: DuReader: a Chinese machine reading comprehension dataset from real-world applications. arXiv preprint arXiv:1711.05073 (2017)
Xiong, C., Zhong, V., Socher, R.: Dynamic coattention networks for question answering. arXiv preprint arXiv:1611.01604 (2016)
AI-Challenger2018 Homepage. https://challenger.ai/competition/oqmrc2018. Accessed 17 May 2019
Chen, D., et al.: Reading Wikipedia to answer open-domain questions. arXiv preprint arXiv:1704.00051 (2017)
Cui, Y., et al.: Attention-over-attention neural networks for reading comprehension. arXiv preprint arXiv:1607.04423 (2016)
Bowman, S.R., et al.: A large annotated corpus for learning natural language inference. arXiv preprint arXiv:1508.05326 (2015)
Yang, Y., Yih, W.T., Meek, C.: WikiQA: a challenge dataset for open-domain question answering. In: Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (2015)
Tan, M., et al.: Improved representation learning for question answer matching. In: Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, (Long Papers), vol. 1 (2016)
Rocktäschel, T., et al.: Reasoning about entailment with neural attention. arXiv preprint arXiv:1509.06664 (2015)
Bahdanau, D., Cho, K., Bengio, Y.: Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473 (2014)
Kadlec, R., et al.: Text understanding with the attention sum reader network. arXiv preprint arXiv:1603.01547 (2016)
Joachims, T.: Text categorization with support vector machines: learning with many relevant features. In: Nédellec, C., Rouveirol, C. (eds.) ECML 1998. LNCS, vol. 1398, pp. 137–142. Springer, Heidelberg (1998). https://doi.org/10.1007/BFb0026683
Liaw, A., Wiener, M.: Classification and regression by randomForest. R News 2(3), 18–22 (2002)
Chen, T., Guestrin, C.: XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. ACM (2016)
Kim, Y.: Convolutional neural networks for sentence classification. arXiv preprint arXiv:1408.5882 (2014)
Joulin, A., et al.: Bag of tricks for efficient text classification. arXiv preprint arXiv:1607.01759 (2016)
Hinton, G.E., Krizhevsky, A., Wang, S.D.: Transforming auto-encoders. In: Honkela, T., Duch, W., Girolami, M., Kaski, S. (eds.) ICANN 2011. LNCS, vol. 6791, pp. 44–51. Springer, Heidelberg (2011). https://doi.org/10.1007/978-3-642-21735-7_6
Mikolov, T., et al.: Efficient estimation of word representations in vector space. arXiv preprint arXiv:1301.3781 (2013)
Srivastava, R.K., Greff, K., Schmidhuber, J.: Highway networks. arXiv preprint arXiv:1505.00387 (2015)
Sara GitHub repository. https://github.com/Sarasra/models/tree/master/research/capsules. Accessed 20 May 2019
Loshchilov, I., Hutter, F.: SGDR: stochastic gradient descent with warm restarts. arXiv preprint arXiv:1608.03983 (2016)
Peters, M.E., et al.: Deep contextualized word representations. arXiv preprint arXiv:1802.05365 (2018)
Acknowledgments
This work is supported in part by the National Natural Science Foundation of China (Grant Nos. U1636211, 61672081, 61370126) and the National Key R&D Program of China (No. 2016QY04W0802).
We would like to thank lixinsu, sarasra, freefuiiismyname, and andyweizhao. Their open-source projects on GitHub reduced our coding effort, allowing us to spend more time on research.
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Ding, L., Li, Z., Wang, B., He, Y. (2019). Capsule Networks for Chinese Opinion Questions Machine Reading Comprehension. In: Sun, M., Huang, X., Ji, H., Liu, Z., Liu, Y. (eds.) Chinese Computational Linguistics. CCL 2019. Lecture Notes in Computer Science, vol. 11856. Springer, Cham. https://doi.org/10.1007/978-3-030-32381-3_42
DOI: https://doi.org/10.1007/978-3-030-32381-3_42
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-32380-6
Online ISBN: 978-3-030-32381-3
eBook Packages: Computer Science, Computer Science (R0)