Abstract
This paper describes our solution to the NLPCC 2017 shared task on Chinese word semantic relation classification. Our proposed method won second place in this task. On the test set, our method achieves a macro F1 of 76.8% over the four semantic relation types: synonym, antonym, hyponym, and meronym. In our experiments, we compare basic word embeddings, linear regression, and convolutional neural networks (CNNs) built on pre-trained word embeddings. The experimental results show that CNNs outperform the other methods. We also find that the proposed method achieves competitive results with a small training corpus.
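To make the CNN-based setup concrete, below is a minimal sketch of a relation classifier that takes the pre-trained embeddings of a word pair and predicts one of the four relation types. The layer sizes, the way the two word vectors are combined, and all names are assumptions for illustration; this is not the exact architecture reported in the paper.

```python
import torch
import torch.nn as nn

class PairCNN(nn.Module):
    """Hypothetical CNN over a (word1, word2) pair of pre-trained embeddings."""

    def __init__(self, emb_dim=300, n_filters=64, n_classes=4):
        super().__init__()
        # Treat the word pair as a length-2 "sequence" of embedding vectors and
        # convolve over it with a window that spans both words (assumed design).
        self.conv = nn.Conv1d(in_channels=emb_dim, out_channels=n_filters, kernel_size=2)
        self.fc = nn.Linear(n_filters, n_classes)

    def forward(self, w1_vec, w2_vec):
        # w1_vec, w2_vec: (batch, emb_dim) pre-trained word vectors
        x = torch.stack([w1_vec, w2_vec], dim=2)    # (batch, emb_dim, 2)
        h = torch.relu(self.conv(x)).squeeze(2)     # (batch, n_filters)
        return self.fc(h)                           # logits over the 4 relations

# Usage example with random vectors standing in for pre-trained embeddings.
model = PairCNN()
logits = model(torch.randn(8, 300), torch.randn(8, 300))
print(logits.shape)  # torch.Size([8, 4])
```

The logits would be trained with a standard cross-entropy loss against the gold relation labels; the actual feature construction and hyperparameters used in the shared task submission are described in the full paper.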
Acknowledgments
We would like to thank the members of our lab and the anonymous reviewers for their helpful feedback. This work was supported by the National Basic Research Program of China (2014CB340404), the National Natural Science Foundation of China (71571136), and the Project of Science and Technology Commission of Shanghai Municipality (16JC1403000, 14511108002).
Copyright information
© 2018 Springer International Publishing AG
About this paper
Cite this paper
Shijia, E., Jia, S., Xiang, Y. (2018). Study on the Chinese Word Semantic Relation Classification with Word Embedding. In: Huang, X., Jiang, J., Zhao, D., Feng, Y., Hong, Y. (eds) Natural Language Processing and Chinese Computing. NLPCC 2017. Lecture Notes in Computer Science, vol 10619. Springer, Cham. https://doi.org/10.1007/978-3-319-73618-1_74
DOI: https://doi.org/10.1007/978-3-319-73618-1_74
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-73617-4
Online ISBN: 978-3-319-73618-1