Research Article
DOI: 10.1145/2207676.2207710

Direct answers for search queries in the long tail

Published: 05 May 2012

ABSTRACT

Web search engines now offer more than ranked results. Queries on topics like weather, definitions, and movies may return inline results called answers that can resolve a searcher's information need without any additional interaction. Despite the usefulness of answers, they are limited to popular needs because each answer type is manually authored. To extend the reach of answers to thousands of new information needs, we introduce Tail Answers: a large collection of direct answers that are unpopular individually, but together address a large proportion of search traffic. These answers cover long-tail needs such as the average body temperature for a dog, substitutes for molasses, and the keyboard shortcut for a right-click. We introduce a combination of search log mining and paid crowdsourcing techniques to create Tail Answers. A user study with 361 participants suggests that Tail Answers significantly improved users' subjective ratings of search quality and their ability to solve needs without clicking through to a result. Our findings suggest that search engines can be extended to directly respond to a large new class of queries.
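The abstract describes combining search-log mining with paid crowdsourcing to build Tail Answers, but does not spell out the pipeline. As a rough, hypothetical illustration of the log-mining step (not the authors' implementation), the Python sketch below flags queries that recur in a session log yet are rarely clicked through, a pattern the abstract associates with needs that a direct answer could resolve in place. The function name, the log format, and the thresholds are all invented for this example.

# Hypothetical sketch only -- not the paper's pipeline. It identifies queries that
# are individually unpopular but rarely clicked through, so a concise "Tail
# Answer" might satisfy them directly on the result page.
from collections import defaultdict

def find_tail_answer_candidates(log, min_sessions=20, max_click_rate=0.3):
    """log: iterable of (query, clicked) pairs, one per search session."""
    stats = defaultdict(lambda: [0, 0])  # query -> [sessions, clicked sessions]
    for query, clicked in log:
        entry = stats[query.strip().lower()]
        entry[0] += 1
        entry[1] += int(clicked)
    candidates = []
    for query, (sessions, clicks) in stats.items():
        if sessions >= min_sessions and clicks / sessions <= max_click_rate:
            candidates.append((query, sessions, clicks / sessions))
    # Most frequent candidates first; each would then go to paid crowd workers
    # for answer authoring and verification.
    return sorted(candidates, key=lambda c: -c[1])

if __name__ == "__main__":
    toy_log = ([("average body temperature for a dog", False)] * 25
               + [("substitute for molasses", False)] * 30
               + [("substitute for molasses", True)] * 5
               + [("facebook", True)] * 100)
    for query, sessions, rate in find_tail_answer_candidates(toy_log):
        print(f"{query!r}: {sessions} sessions, click rate {rate:.2f}")

In the toy log above, the two long-tail queries are returned as candidates while the frequently clicked navigational query is not; the actual system presumably uses richer signals from real query logs before handing candidates to crowd workers.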


Published in

CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
May 2012, 3276 pages
ISBN: 9781450310154
DOI: 10.1145/2207676
Copyright © 2012 ACM

Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%

      Upcoming Conference

      CHI '24
      CHI Conference on Human Factors in Computing Systems
      May 11 - 16, 2024
      Honolulu , HI , USA
