
The Research on Chinese Coreference Resolution Based on Maximum Entropy Model and Rules

  • Conference paper
Web Information Systems and Mining (WISM 2009)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 5854)

Abstract

Coreference resolution is an important research topic in natural language processing, covering the resolution of proper nouns, common nouns, and pronouns. This paper proposes a coreference resolution algorithm for Chinese noun phrases and pronouns based on a maximum entropy model and rules. The maximum entropy model effectively integrates a variety of separate features; on this basis, a rule-based method is applied to improve the recall of resolution, and filtering rules are then used to remove "noise" and further improve precision. Experiments show that the F-measure of the algorithm reaches 85.2% in a closed test and 76.2% in an open test, improvements of about 12.9 and 7.8 percentage points respectively over the purely rule-based method.
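The pipeline the abstract describes can be sketched as follows. This is an illustrative outline, not the authors' implementation: the feature names, hand-set weights, and the single gender-agreement filter are assumptions standing in for the paper's learned parameters and rule set. A maximum entropy (log-linear) model scores each candidate antecedent/anaphor pair, and a filtering rule then discards incompatible pairs.

```python
import math

def features(antecedent, anaphor):
    """Binary features over a candidate antecedent/anaphor pair.
    The feature set here is an illustrative assumption."""
    return {
        "string_match": antecedent["text"] == anaphor["text"],
        "gender_agree": antecedent["gender"] == anaphor["gender"],
        "number_agree": antecedent["number"] == anaphor["number"],
        "is_pronoun":   anaphor["pos"] == "PN",
    }

# Hand-set weights stand in for parameters that a maximum entropy
# model would normally learn from data (e.g. via GIS/IIS training).
WEIGHTS = {"string_match": 3.0, "gender_agree": 1.2,
           "number_agree": 1.0, "is_pronoun": 0.5}
BIAS = -2.0

def coref_probability(antecedent, anaphor):
    """p(coreferent | pair) under a log-linear (maximum entropy) model:
    a weighted sum of active features pushed through a sigmoid."""
    score = BIAS + sum(WEIGHTS[name]
                       for name, active in features(antecedent, anaphor).items()
                       if active)
    return 1.0 / (1.0 + math.exp(-score))

def rule_filter(antecedent, anaphor):
    """Example filtering rule: a gender mismatch blocks coreference,
    removing "noise" pairs the statistical model might still accept."""
    return antecedent["gender"] == anaphor["gender"]

def resolve(antecedent, anaphor, threshold=0.5):
    """A pair is coreferent if it survives the rule filter and the
    model's probability exceeds the threshold."""
    return (rule_filter(antecedent, anaphor)
            and coref_probability(antecedent, anaphor) > threshold)
```

For example, a masculine name mention and the pronoun 他 (he) agree in gender and number, so the model score clears the threshold and the pair resolves; swapping in 她 (she) trips the gender filter and the pair is rejected regardless of the model score.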




Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhang, Y., Guo, J., Yu, Z., Zhang, Z., Yao, X. (2009). The Research on Chinese Coreference Resolution Based on Maximum Entropy Model and Rules. In: Liu, W., Luo, X., Wang, F.L., Lei, J. (eds) Web Information Systems and Mining. WISM 2009. Lecture Notes in Computer Science, vol 5854. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-05250-7_1

  • Print ISBN: 978-3-642-05249-1

  • Online ISBN: 978-3-642-05250-7
