DOI: 10.1145/2854946.2855000
Short paper

How does Interest in a Work Task Impact Search Behavior and Engagement?

Published: 13 March 2016

ABSTRACT

One goal of using simulated work tasks in interactive information retrieval (IIR) experiments is to create a more relevant and interesting search experience for study participants. However, there is little guidance on how to identify interesting tasks or on how interest in a task affects search behaviors and experiences; addressing this gap is the purpose of this study. We created eight work tasks and asked forty participants to rank them from most interesting to least interesting before coming into the lab for an IIR experiment. During the experiment, participants conducted searches for the two tasks they ranked as most interesting and the two they ranked as least interesting. Participants completed pre- and post-search questionnaires to characterize their interest in the tasks and their search experiences, including engagement. Participants rated their interest, prior knowledge and search experience, and the relevance of interesting tasks significantly higher than uninteresting tasks, and predicted these tasks would be significantly less difficult to complete. Participants also reported significantly greater engagement with interesting tasks and spent longer completing them. However, there were no significant differences in search behaviors, including the number of queries issued, number of SERPs viewed, or number of documents bookmarked. These results provide evidence that our method of assigning tasks that would interest and engage participants, at least cognitively if not behaviorally, was somewhat successful. Others conducting laboratory IIR studies can use this method.


Published in:

CHIIR '16: Proceedings of the 2016 ACM on Conference on Human Information Interaction and Retrieval
March 2016, 400 pages
ISBN: 9781450337519
DOI: 10.1145/2854946

      Copyright © 2016 ACM


Publisher: Association for Computing Machinery, New York, NY, United States



Acceptance Rates

CHIIR '16 paper acceptance rate: 23 of 58 submissions (40%). Overall acceptance rate: 55 of 163 submissions (34%).
