A Study on Novelty Evaluation in Biomedical Information Retrieval

  • Conference paper
String Processing and Information Retrieval (SPIRE 2012)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 7608)

Abstract

In novelty information retrieval, we expect novel passages to be ranked higher than redundant ones, and relevant passages to be ranked higher than irrelevant ones. Accordingly, we desire an evaluation measure that respects these expectations. In TREC 2006 and 2007, a novelty performance measure, the aspect-based mean average precision (MAP), was introduced in the Genomics Track to rank the novelty of medical passages. In this paper, we demonstrate that this measure does not necessarily yield a higher score for rankings that better honor the above expectations. We propose an improved measure that reflects these expectations more precisely, and present supporting evidence.
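
For intuition, here is a minimal Python sketch of an aspect-level average precision. It is not the official TREC Genomics formula and not the improved measure proposed in the paper; the function aspect_ap, its arguments, and the toy rankings are hypothetical illustrations of the kind of novelty-aware scoring the abstract refers to, in which a passage earns credit only for aspects that have not appeared at earlier ranks.

    # Illustrative sketch only; the scoring rule below is an assumption,
    # not the measure defined in TREC 2006/2007 or in this paper.
    def aspect_ap(ranked_aspects, total_aspects):
        """ranked_aspects: one set of aspect labels per ranked passage
        (an empty set marks an irrelevant passage).
        total_aspects: number of distinct aspects in the gold standard."""
        seen = set()
        precisions = []
        for rank, aspects in enumerate(ranked_aspects, start=1):
            new = aspects - seen
            if new:  # novel passage: introduces aspects not seen at earlier ranks
                seen |= new
                precisions.append(len(seen) / rank)
        return sum(precisions) / total_aspects if total_aspects else 0.0

    # Toy comparison: pushing the redundant passage down should score higher.
    novel_first = [{"a"}, {"b"}, {"a"}, set()]
    redundant_first = [{"a"}, {"a"}, {"b"}, set()]
    print(aspect_ap(novel_first, total_aspects=2))      # 1.00
    print(aspect_ap(redundant_first, total_aspects=2))  # about 0.83

On this toy data the novelty-respecting ranking scores higher, which is the behavior one would want; the paper's argument is that the official aspect-based MAP does not always reward such rankings, motivating the improved measure.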

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

An, X., Cercone, N., Wang, H., Ye, Z. (2012). A Study on Novelty Evaluation in Biomedical Information Retrieval. In: Calderón-Benavides, L., González-Caro, C., Chávez, E., Ziviani, N. (eds) String Processing and Information Retrieval. SPIRE 2012. Lecture Notes in Computer Science, vol 7608. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-34109-0_7

  • DOI: https://doi.org/10.1007/978-3-642-34109-0_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-34108-3

  • Online ISBN: 978-3-642-34109-0

  • eBook Packages: Computer Science (R0)
