
Relation Classification: How Well Do Neural Network Approaches Work?

  • Conference paper

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1232)

Abstract

Relation classification is a well-known task in NLP: it classifies the relation that holds between two entities in a sentence by assigning a label from a pre-defined set of abstract relation labels. A benchmark data set for this task is the SemEval-2010 Task 8 data set. Neural network approaches currently give state-of-the-art results on a wide range of NLP problems, and it is often claimed that models trained on one task carry over to other tasks with only a small amount of fine-tuning. Our experience suggests that for the relation classification problem, while a wide variety of neural network methods work reasonably well, it is very hard to improve performance significantly by including different kinds of syntactic and semantic information that intuitively should be important in signalling the relation label. We think that improved performance will be hard to achieve without injecting controlled, class-specific semantic information into the classification process.
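To make the task concrete: a SemEval-2010 Task 8 instance is a sentence with two marked entities and one of nine abstract relation types, each directed, plus Other, giving 19 labels in total. A minimal sketch of the instance format (the parsing helper is illustrative, not part of the paper):

```python
import re

# The nine abstract relation types of SemEval-2010 Task 8; each applies in
# both directions, (e1,e2) and (e2,e1), and "Other" brings the total to 19.
RELATIONS = [
    "Cause-Effect", "Instrument-Agency", "Product-Producer",
    "Content-Container", "Entity-Origin", "Entity-Destination",
    "Component-Whole", "Member-Collection", "Message-Topic",
]
LABELS = [f"{r}({a},{b})" for r in RELATIONS
          for a, b in [("e1", "e2"), ("e2", "e1")]] + ["Other"]

def parse_instance(sentence):
    """Extract the two marked entities from a tagged training sentence."""
    e1 = re.search(r"<e1>(.*?)</e1>", sentence).group(1)
    e2 = re.search(r"<e2>(.*?)</e2>", sentence).group(1)
    return e1, e2

sent = "The <e1>pollution</e1> was caused by the <e2>shipwreck</e2>."
print(parse_instance(sent))  # ('pollution', 'shipwreck')
print(len(LABELS))           # 19
```

A classifier for this task maps the sentence (with its entity markers) to one of the 19 labels; here the gold label would be Cause-Effect(e2,e1).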

In our experiments we have supplied many different kinds of syntactic and semantic information by tagging suitable words with relevant semantic/syntactic tags. We have also tried various embedding methods, including Google embeddings, FastText, Word2Vec and BERT. None of these makes a substantial difference to performance, which hovers between 82% and 85%.

Surprisingly, when we looked at the top-three classification performance it was above 96%, that is, 11 to 14% above the top-one performance. This implies that it should be possible to boost the correct label from the second or third position to the first by suitable semantic inputs and architectural innovations. We have experimented with an architecture that supplies supplementary information about the words in the sentence, as well as about the sentence itself, in parallel with the main stream of information, namely the sentence itself. In one such case we are able to boost performance to state-of-the-art levels. A systematic investigation is ongoing.
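The gap between top-one and top-three accuracy the paper exploits can be measured as below; the score matrix here uses toy numbers for illustration, not the paper's outputs:

```python
def topk_accuracy(scores, gold, k):
    """Fraction of examples whose gold label ranks among the model's k highest-scoring classes."""
    hits = 0
    for row, g in zip(scores, gold):
        # indices of the k highest-scoring classes for this example
        topk = sorted(range(len(row)), key=lambda i: row[i], reverse=True)[:k]
        hits += g in topk
    return hits / len(gold)

# Toy scores for 4 examples over 5 classes.
scores = [
    [0.10, 0.60, 0.20, 0.05, 0.05],  # gold 1: top-1 hit
    [0.50, 0.30, 0.10, 0.05, 0.05],  # gold 1: top-1 miss, top-3 hit
    [0.20, 0.20, 0.50, 0.05, 0.05],  # gold 0: top-1 miss, top-3 hit
    [0.05, 0.05, 0.10, 0.30, 0.50],  # gold 0: miss even at top-3
]
gold = [1, 1, 0, 0]
print(topk_accuracy(scores, gold, 1))  # 0.25
print(topk_accuracy(scores, gold, 3))  # 0.75
```

When top-3 accuracy far exceeds top-1, as the paper reports (96% vs. 82-85%), the correct label is usually already ranked second or third, which motivates injecting extra signals to re-rank it to first.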



Acknowledgement

I would like to express my gratitude to Sahitya Patel, M.Tech., IIT Kanpur, and Pawan Kumar, Ph.D. student at IIT Kanpur. They were very helpful and provided me with the technical support required for the experiments.

Author information


Correspondence to Sri Nath Dwivedi, Harish Karnick or Renu Jain.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Dwivedi, S.N., Karnick, H., Jain, R. (2020). Relation Classification: How Well Do Neural Network Approaches Work? In: Villazón-Terrazas, B., Ortiz-Rodríguez, F., Tiwari, S.M., Shandilya, S.K. (eds.) Knowledge Graphs and Semantic Web. KGSWC 2020. Communications in Computer and Information Science, vol. 1232. Springer, Cham. https://doi.org/10.1007/978-3-030-65384-2_8


  • DOI: https://doi.org/10.1007/978-3-030-65384-2_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-65383-5

  • Online ISBN: 978-3-030-65384-2

  • eBook Packages: Computer Science (R0)
