Abstract
Previous work on neural relation extraction has mainly focused on point-based methods and ignored the uncertainty within a bag, leading to poor predictions when a bag contains too few instances. To address this problem, we propose two density-based methods. Specifically, we assume that each bag follows a Gaussian distribution and that the sentences in the bag are drawn from it. We use the predicted variance, which captures the bag's uncertainty, together with the predicted mean to draw additional samples that enrich one-instance bags. We also use the predicted variance to vote for a good bag representation and to temper the loss. To the best of our knowledge, this is the first paper to model uncertainty in neural relation extraction. Experimental results on NYT-10 show significant improvements over the baselines.
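The abstract does not spell out implementation details, so the following is only a minimal PyTorch-style sketch of the ideas it names: modeling a bag as a Gaussian over its sentence encodings, drawing extra samples to enrich one-instance bags, and using the bag's variance to temper the loss. All function names, the floor-variance fallback, and the particular tempering formula are assumptions for illustration, not the authors' method.

```python
import torch
import torch.nn.functional as F

def bag_gaussian(sent_reprs, min_var=1e-2):
    """Estimate a diagonal Gaussian over one bag's sentence encodings
    (shape [n_sents, dim]). One-instance bags fall back to a floor
    variance here; the paper instead *predicts* the variance."""
    mu = sent_reprs.mean(dim=0)
    if sent_reprs.size(0) > 1:
        var = sent_reprs.var(dim=0, unbiased=False).clamp_min(min_var)
    else:
        var = torch.full_like(mu, min_var)
    return mu, var

def enrich_bag(mu, var, n_extra=4):
    """Draw extra pseudo-instances from N(mu, diag(var)) to enrich
    small bags (reparameterisation-style sampling)."""
    eps = torch.randn(n_extra, mu.numel())
    return mu.unsqueeze(0) + eps * var.sqrt().unsqueeze(0)

def tempered_loss(logits, label, var):
    """Down-weight the cross-entropy of a highly uncertain bag; this is
    one plausible form of 'tempering the loss' with the variance."""
    ce = F.cross_entropy(logits.unsqueeze(0), label.unsqueeze(0))
    u = var.mean()                     # scalar uncertainty for the bag
    return ce / (1.0 + u) + 0.5 * torch.log1p(u)
```

In this reading, one-instance bags are expanded with `enrich_bag` before bag-level aggregation, and uncertain bags contribute less to training through `tempered_loss`; how the variance additionally "votes" for a good representation is left to the full paper.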
Acknowledgements
This work is supported by the National Key Research and Development Project (No. 2020AAA0109302), the Shanghai Science and Technology Innovation Action Plan (No. 19511120400), and the Shanghai Municipal Science and Technology Major Project (No. 2021SHZDZX0103).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Hong, Y., Xiao, Y., Wang, W., Chen, Y. (2022). Modeling Uncertainty in Neural Relation Extraction. In: Bhattacharya, A., et al. Database Systems for Advanced Applications. DASFAA 2022. Lecture Notes in Computer Science, vol 13247. Springer, Cham. https://doi.org/10.1007/978-3-031-00129-1_29
DOI: https://doi.org/10.1007/978-3-031-00129-1_29
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-00128-4
Online ISBN: 978-3-031-00129-1