Study on the Chinese Word Semantic Relation Classification with Word Embedding

  • Conference paper
Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10619)

Abstract

This paper describes our solution to the NLPCC 2017 shared task on Chinese word semantic relation classification. Our proposed method won second place in this task, reaching 76.8% macro F1 on the test set over the four semantic relation types, i.e., synonym, antonym, hyponym, and meronym. In our experiments, we try basic word-embedding features, linear regression, and convolutional neural networks (CNNs) on top of pre-trained word embeddings. The experimental results show that the CNNs outperform the other methods. We also find that the proposed method achieves competitive results with a small training corpus.
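
The abstract does not detail the network architecture, so the following is only a minimal sketch of the general setup it describes: each word of a pair is mapped to its pre-trained embedding, the two vectors are stacked into a short sequence, and a small 1D CNN predicts one of the four relation types. The input representation, layer sizes, and training settings below are illustrative assumptions (built with Keras, see Note 2), not the authors' configuration.

```python
# A minimal sketch of a CNN relation classifier over pre-trained word embeddings.
# Everything here (input format, layer sizes, hyper-parameters) is an assumption
# for illustration; the paper's actual architecture is not given in the abstract.
import numpy as np
from tensorflow.keras import layers, models

EMB_DIM = 300      # assumed dimensionality of the pre-trained word vectors
NUM_CLASSES = 4    # synonym, antonym, hyponym, meronym

def build_model():
    # Input: a word pair, each word mapped (outside the model) to its
    # pre-trained embedding and stacked into a length-2 "sequence".
    inputs = layers.Input(shape=(2, EMB_DIM))
    x = layers.Conv1D(filters=128, kernel_size=2, activation="relu")(inputs)
    x = layers.GlobalMaxPooling1D()(x)
    x = layers.Dense(64, activation="relu")(x)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_model()
    # Toy arrays standing in for word-pair embeddings and relation labels;
    # with real data these come from pre-trained word2vec vectors (Note 1).
    X = np.random.randn(32, 2, EMB_DIM).astype("float32")
    y = np.random.randint(0, NUM_CLASSES, size=32)
    model.fit(X, y, epochs=1, batch_size=8)
```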


Notes

  1. https://code.google.com/archive/p/word2vec (a loading sketch follows these notes).

  2. https://keras.io.
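
The pre-trained word2vec-format vectors referenced in Note 1 can be read, for example, with gensim's KeyedVectors loader. Neither gensim nor the file name below appears in the paper; this is only one common way to obtain the word-pair embeddings assumed by the CNN sketch above.

```python
# Illustrative only: gensim is not mentioned in the paper; it is simply one
# common tool for reading vectors saved in word2vec's binary/text format.
from gensim.models import KeyedVectors

# Hypothetical file name for pre-trained Chinese word vectors in word2vec format.
vectors = KeyedVectors.load_word2vec_format("zh_word2vec.bin", binary=True)

# Look up the embeddings of a word pair before feeding them to a classifier.
pair = ("汽车", "轮子")                     # e.g. a meronym pair: (car, wheel)
pair_matrix = [vectors[w] for w in pair]  # two vectors, one per word
```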


Acknowledgments

We would like to thank the members of our lab and the anonymous reviewers for their helpful feedback. This work was supported by the National Basic Research Program of China (2014CB340404), the National Natural Science Foundation of China (71571136), and the Project of Science and Technology Commission of Shanghai Municipality (16JC1403000, 14511108002).

Author information

Corresponding author: E. Shijia.


Copyright information

© 2018 Springer International Publishing AG

About this paper

Cite this paper

Shijia, E., Jia, S., Xiang, Y. (2018). Study on the Chinese Word Semantic Relation Classification with Word Embedding. In: Huang, X., Jiang, J., Zhao, D., Feng, Y., Hong, Y. (eds.) Natural Language Processing and Chinese Computing. NLPCC 2017. Lecture Notes in Computer Science, vol. 10619. Springer, Cham. https://doi.org/10.1007/978-3-319-73618-1_74

  • DOI: https://doi.org/10.1007/978-3-319-73618-1_74

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-73617-4

  • Online ISBN: 978-3-319-73618-1
