ABSTRACT
The purpose of this half-day workshop is to bring together academic and industry interactive information retrieval (IIR) researchers with an interest in evaluation methodologies. The workshop articulates contemporary challenges in the investigation of IIR and invites user- and system-oriented researchers to work collaboratively to address these challenges by combining user- and system-centered methodologies in meaningful ways. We anticipate that this workshop will initiate productive knowledge exchange and partnerships that can respond to the increasing user, task, system, and contextual complexity of the IIR field.
Index Terms
- System and User Centered Evaluation Approaches in Interactive Information Retrieval (SAUCE 2016)