DOI: 10.1145/2484028.2484117

Report from the NTCIR-10 1CLICK-2 Japanese subtask: baselines, upperbounds and evaluation robustness

Published: 28 July 2013

Abstract

The One Click Access Task (1CLICK) of NTCIR requires systems to return a concise multi-document summary of web pages in response to a query which is assumed to have been submitted in a mobile context. Systems are evaluated based on information units (or iUnits), and are required to present important pieces of information first and to minimise the amount of text the user has to read. Using the official Japanese results of the second round of the 1CLICK task from NTCIR-10, we discuss our task setting and evaluation framework. Our analyses show that: (1) Simple baseline methods that leverage search engine snippets or Wikipedia are effective for 'lookup' type queries but not necessarily for other query types; (2) There is still a substantial gap between manual and automatic runs; and (3) Our evaluation metrics are relatively robust to the incompleteness of iUnits.
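
The evaluation described above rewards systems for presenting important iUnits early and penalises text the user must read through before reaching them, in the spirit of the S-measure used in the earlier 1CLICK rounds. The sketch below illustrates such a position-discounted iUnit metric; the weights, the patience parameter L, and the offset bookkeeping are illustrative assumptions, not the official NTCIR-10 evaluation code.

    # Illustrative sketch of a position-discounted iUnit metric in the spirit
    # of the 1CLICK S-measure. Weights, the patience parameter L, and the
    # matching of iUnits to offsets are assumptions, not the official code.

    def s_measure_sketch(matched_iunits, ideal_iunits, L=500):
        """matched_iunits: (weight, offset) pairs for iUnits found in the
        system output, where offset is the character position at which the
        iUnit is read. ideal_iunits: (weight, offset) pairs for a pseudo-
        minimal ideal output, used for normalisation. L: patience parameter,
        i.e. how many characters the user is assumed willing to read."""
        def gain(units):
            # Each iUnit contributes its weight, discounted linearly by how
            # much text precedes it; iUnits beyond L characters contribute 0.
            return sum(w * max(0.0, (L - off) / L) for w, off in units)

        denom = gain(ideal_iunits)
        return gain(matched_iunits) / denom if denom > 0 else 0.0

    # Toy example: the system surfaces two of three iUnits fairly early;
    # the ideal output packs all three at the start of the text.
    system = [(3, 20), (2, 150)]
    ideal = [(3, 10), (2, 30), (1, 50)]
    print(round(s_measure_sketch(system, ideal), 3))  # about 0.748

Under this weighting, two concise sentences covering the heaviest iUnits can outscore a longer summary that buries them, which is exactly the behaviour the task is designed to encourage.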

    Published In

    SIGIR '13: Proceedings of the 36th international ACM SIGIR conference on Research and development in information retrieval
    July 2013
    1188 pages
    ISBN:9781450320344
    DOI:10.1145/2484028

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. evaluation
    2. information units
    3. mobile environment
    4. ntcir
    5. nuggets
    6. summaries
    7. test collections

    Qualifiers

    • Short-paper

    Conference

    SIGIR '13

    Acceptance Rates

    SIGIR '13 Paper Acceptance Rate: 73 of 366 submissions, 20%
    Overall Acceptance Rate: 792 of 3,983 submissions, 20%


    Cited By

    • (2020) Evaluation of Information Access with Smartphones. In: Evaluating Information Retrieval and Access Tasks, pp. 151-167. https://doi.org/10.1007/978-981-15-5554-1_11. Online publication date: 2 Sep 2020.
    • (2014) Evaluating answer passages using summarization measures. In: Proceedings of the 37th International ACM SIGIR Conference on Research & Development in Information Retrieval, pp. 963-966. https://doi.org/10.1145/2600428.2609485. Online publication date: 3 Jul 2014.
    • (2014) Metrics, Statistics, Tests. In: Bridging Between Information Retrieval and Databases, pp. 116-163. https://doi.org/10.1007/978-3-642-54798-0_6. Online publication date: 2014.
    • (2013) How Intuitive Are Diversified Search Metrics? Concordance Test Results for the Diversity U-Measures. In: Information Retrieval Technology, pp. 13-24. https://doi.org/10.1007/978-3-642-45068-6_2. Online publication date: 2013.
