Human-Computer Interaction View on Information Retrieval Evaluation

  • Chapter

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 7757)

Abstract

The field of information retrieval (IR) has grown tremendously over the years. Researchers have, however, identified Human-Computer Interaction (HCI) aspects as important concerns in IR research. Incorporating HCI techniques into IR helps ensure that IR systems intended for human users are developed and evaluated in a way that is consistent with, and reflects, the needs of those users. Traditional methods of evaluating IR systems have long been concerned largely with system-oriented measurements such as precision and recall, rather than with the usability of the IR system. There are also no well-established evaluation approaches for studying users and their interactions with IR systems. This chapter describes the role and place of HCI in supporting and informing the evaluation of IR systems.
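For reference, the system-oriented measures the abstract contrasts with usability-centred evaluation can be stated in a few lines. The following minimal Python sketch is an illustration added here, not material from the chapter; the document identifiers and result sets in it are hypothetical.

    # Illustrative sketch only (not from the chapter): precision and recall
    # for a single query, given hypothetical retrieved and relevant sets.
    def precision_recall(retrieved, relevant):
        """Return (precision, recall) for one query.
        precision = |retrieved ∩ relevant| / |retrieved|
        recall    = |retrieved ∩ relevant| / |relevant|
        """
        hits = len(set(retrieved) & set(relevant))
        precision = hits / len(retrieved) if retrieved else 0.0
        recall = hits / len(relevant) if relevant else 0.0
        return precision, recall

    # Example: the system returns d1, d2, d3; assessors judged d2, d3, d5 relevant.
    p, r = precision_recall({"d1", "d2", "d3"}, {"d2", "d3", "d5"})
    print(f"precision = {p:.2f}, recall = {r:.2f}")  # precision = 0.67, recall = 0.67

Measures of this kind say nothing about how users experience the search interface, which is precisely the gap the chapter addresses.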





Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Catarci, T., Kimani, S. (2013). Human-Computer Interaction View on Information Retrieval Evaluation. In: Agosti, M., Ferro, N., Forner, P., Müller, H., Santucci, G. (eds) Information Retrieval Meets Information Visualization. PROMISE 2012. Lecture Notes in Computer Science, vol 7757. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-36415-0_3

  • DOI: https://doi.org/10.1007/978-3-642-36415-0_3

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-36414-3

  • Online ISBN: 978-3-642-36415-0

  • eBook Packages: Computer Science (R0)
