Scientific Data of an Evaluation Campaign: Do We Properly Deal with Them?

Conference paper in: Evaluation of Multilingual and Multi-modal Information Retrieval (CLEF 2006)

Part of the book series: Lecture Notes in Computer Science, volume 4730

Abstract

This paper examines the current way of keeping the data produced during evaluation campaigns and highlights some of its shortcomings. As a consequence, we propose a new approach for improving the management of evaluation campaigns’ data. In this approach, the data are treated as scientific data to be curated and enriched in order to fully support longitudinal statistical studies and long-term preservation.
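
The following is a minimal, purely illustrative sketch (in Python, not taken from the paper) of what recording a campaign result as a curated, self-describing scientific record could look like, so that results remain comparable across editions and durable over time; every field name and value below is an assumption made for illustration only.

# Hypothetical sketch: an evaluation campaign run modelled as a curated,
# self-describing record, serialised to a plain durable format (JSON) for
# long-term preservation and later longitudinal analysis.
from dataclasses import dataclass, field, asdict
import json


@dataclass
class RunRecord:
    campaign: str                  # e.g. "CLEF"
    edition: int                   # e.g. 2006
    track: str                     # e.g. "Ad Hoc"
    run_id: str                    # identifier of the submitted experiment
    group: str                     # participating group
    metrics: dict[str, float]      # evaluation measures, e.g. {"MAP": 0.31}
    provenance: dict[str, str] = field(default_factory=dict)  # curation notes


def preserve(records: list[RunRecord], path: str) -> None:
    """Write the curated records to disk in a plain, durable format."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump([asdict(r) for r in records], f, indent=2, ensure_ascii=False)


if __name__ == "__main__":
    run = RunRecord(
        campaign="CLEF", edition=2006, track="Ad Hoc",
        run_id="example-run-01", group="example-group",
        metrics={"MAP": 0.31},
        provenance={"pooled": "yes", "note": "illustrative values only"},
    )
    preserve([run], "runs.json")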


References

  1. Agosti, M., Di Nunzio, G.M., Ferro, N.: The Importance of Scientific Data Curation for Evaluation Campaigns. In: DELOS Conference 2007 Working Notes, ISTI-CNR, Gruppo ALI, Pisa, Italy, pp. 185–193 (2007)

    Google Scholar 

  2. Agosti, M., Di Nunzio, G.M., Ferro, N.: A Proposal to Extend and Enrich the Scientific Data Curation of Evaluation Campaigns. In: Proc. 1st International Workshop on Evaluating Information Access (EVIA 2007), National Institute of Informatics, Tokyo, Japan, pp. 62–73 (2007)

    Google Scholar 

  3. Abiteboul, S., Agrawal, R., Bernstein, P., Carey, M., Ceri, S., Croft, B., DeWitt, D., Franklin, M., Garcia-Molina, H., Gawlick, D., Gray, J., Haas, L., Halevy, A., Hellerstein, J., Ioannidis, Y., Kersten, M., Pazzani, M., Lesk, M., Maier, D., Naughton, J., Schek, H.J., Sellis, T., Silberschatz, A., Stonebraker, M., Snodgrass, R., Ullman, J.D., Weikum, G., Widom, J., Zdonik, S.: The Lowell Database Research Self-Assessment. Communications of the ACM (CACM) 48, 111–118 (2005)

    Article  Google Scholar 

  4. Ioannidis, Y., Maier, D., Abiteboul, S., Buneman, P., Davidson, S., Fox, E.A., Halevy, A., Knoblock, C., Rabitti, F., Schek, H.J., Weikum, G.: Digital library information-technology infrastructures. International Journal on Digital Libraries 5, 266–274 (2005)

    Article  Google Scholar 

  5. Agosti, M., Di Nunzio, G.M., Ferro, N.: A Data Curation Approach to Support In-depth Evaluation Studies. In: Proc. International Workshop on New Directions in Multilingual Information Access (MLIA 2006), pp. 65–68 (2006) (last visited, March 23, 2007), http://ucdata.berkeley.edu/sigir2006-mlia.htm

  6. Agosti, M., Di Nunzio, G.M., Ferro, N., Peters, C.: CLEF: Ongoing Activities and Plans for the Future. In: Proc. 6th NTCIR Workshop Meeting on Evaluation of Information Access Technologies: Information Retrieval, Question Answering and Cross-Lingual Information Access, National Institute of Informatics, Tokyo, Japan, pp. 493–504 (2007)

    Google Scholar 

  7. Cleverdon, C.W.: The Cranfield Tests on Index Languages Devices. In: Readings in Information Retrieval, pp. 47–60. Morgan Kaufmann Publisher, Inc., San Francisco, California, USA (1997)

    Google Scholar 

  8. Agosti, M., Di Nunzio, G.M., Ferro, N.: Evaluation of a Digital Library System. In: Notes of the DELOS WP7 Workshop on the Evaluation of Digital Libraries, pp. 73–78 (2004) (last visited, March 23, 2007), http://dlib.ionio.gr/wp7/workshop2004_program.html

  9. W3C: XML Schema Part 1: Structures - W3C Recommendation 28 October 2004. (2004) (last visited, March 23, 2007), http://www.w3.org/TR/xmlschema-1/

  10. W3C: XML Schema Part 2: Datatypes - W3C Recommendation 28 October 2004. (2004) (last visited, March 23, 2007), http://www.w3.org/TR/xmlschema-2/

  11. Hull, D.: Using Statistical Testing in the Evaluation of Retrieval Experiments. In: Proc. 16th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 1993), pp. 329–338. ACM Press, New York, USA (1993)

    Chapter  Google Scholar 

  12. Di Nunzio, G.M., Ferro, N., Jones, G.J.F., Peters, C.: CLEF 2005: Ad Hoc Track Overview. In: Peters, C., Gey, F.C., Gonzalo, J., Müller, H., Jones, G.J.F., Kluck, M., Magnini, B., de Rijke, M., Giampiccolo, D. (eds.) CLEF 2005. LNCS, vol. 4022, pp. 11–36. Springer, Heidelberg (2006)

    Chapter  Google Scholar 

  13. Voorhees, E.M.: Overview of the TREC 2005 Robust Retrieval Track. In: The Fourteenth Text REtrieval Conference Proceedings (TREC 2005), National Institute of Standards and Technology (NIST), Special Pubblication 500-266, Washington, USA (2005) (last visited, March 23, 2007), http://trec.nist.gov/pubs/trec14/t14_proceedings.html

  14. Lord, P., Macdonald, A.: e-Science Curation Report. Data curation for e-Science in the UK: an audit to establish requirements for future curation and provision. The JISC Committee for the Support of Research (JCSR) (2003) (last visited, March 23, 2007), http://www.jisc.ac.uk/uploaded_documents/e-ScienceReportFinal.pdf

  15. Croft, W.B.: Combining Approaches to Information Retrieval. In: Advances in Information Retrieval: Recent Research from the Center for Intelligent Information Retrieval, pp. 1–36. Kluwer Academic Publishers, Norwell (MA), USA (2000)

    Google Scholar 

  16. Harman, D.K.: Overview of the First Text REtrieval Conference (TREC-1). In The First Text REtrieval Conference (TREC-1), National Institute of Standards and Technology (NIST), Special Pubblication 500-207, Washington, USA (1992) (last visited, March 23, 2007), http://trec.nist.gov/pubs/trec1/papers/01.txt

  17. Di Nunzio, G.M., Ferro, N.: DIRECT: a Distributed Tool for Information Retrieval Evaluation Campaigns. In: Proc. 8th DELOS Thematic Workshop on Future Digital Library Management Systems: System Architecture and Information Access, pp. 58–63 (2005) (last visited, March 23, 2007), http://dbis.cs.unibas.ch/delos_website/delos-dagstuhl-handout-all.pdf

  18. Di Nunzio, G.M., Ferro, N.: DIRECT: a System for Evaluating Information Access Components of Digital Libraries. In: Rauber, A., Christodoulakis, S., Tjoa, A.M. (eds.) ECDL 2005. LNCS, vol. 3652, pp. 483–484. Springer, Heidelberg (2005)

    Chapter  Google Scholar 

  19. Di Nunzio, G.M., Ferro, N.: Scientific Evaluation of a DLMS: a service for evaluating information access components. In: Gonzalo, J., Thanos, C., Verdejo, M.F., Carrasco, R.C. (eds.) ECDL 2006. LNCS, vol. 4172, pp. 536–539. Springer, Heidelberg (2006)

    Chapter  Google Scholar 

  20. Di Nunzio, G.M., Ferro, N.: Appendix A. Results of the Core Tracks and Domain-Specific Tracks. In: Peters, C., Quochi, V. (eds.) Working Notes for the CLEF 2005 Workshop (2005) (last visited, March 23, 2007), http://www.clef-campaign.org/2005/working_notes/workingnotes2005/appendix_a.pdf

  21. Di Nunzio, G.M., Ferro, N.: Appendix A: Results of the Ad-hoc Bilingual and Monolingual Tasks. In: Nardi, A., Peters, C., Vicedo, J.L. (eds.) Working Notes for the CLEF 2006 Workshop (2006) (last visited, March 23, 2007), http://www.clefcampaign.org/2006/working_notes/workingnotes2006/Appendix_Ad-Hoc.pdf

Download references

Editor information

Carol Peters, Paul Clough, Fredric C. Gey, Jussi Karlgren, Bernardo Magnini, Douglas W. Oard, Maarten de Rijke, Maximilian Stempfhuber

Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Agosti, M., Di Nunzio, G.M., Ferro, N. (2007). Scientific Data of an Evaluation Campaign: Do We Properly Deal with Them?. In: Peters, C., et al. Evaluation of Multilingual and Multi-modal Information Retrieval. CLEF 2006. Lecture Notes in Computer Science, vol 4730. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74999-8_2

  • DOI: https://doi.org/10.1007/978-3-540-74999-8_2

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74998-1

  • Online ISBN: 978-3-540-74999-8

  • eBook Packages: Computer Science, Computer Science (R0)
