Abstract
With the proliferation of the social web, questions about information quality and optimization have attracted the attention of IS scholars. Question-answering (QA) sites, such as Yahoo! Answers, have the potential to produce good answers, but not all answers are good and not all QA sites are alike. When organizations design and plan the integration of question-answering services on their sites, identifying good answers and optimizing the process become critical. Arguing that ‘given enough answers, all questions are answered successfully,’ this paper identifies the optimal number of posts needed to generate a high-quality answer. Based on a content analysis of informational questions on Yahoo! Answers (n=174) and their answers (n=1,023), the study found that seven answers per question are ‘enough’ to provide a good answer.
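The seven-answer figure comes from the study's content analysis, but it can be illustrated with a simple probabilistic sketch. The model below is an assumption of this illustration, not part of the paper's method: suppose each answer is independently ‘good’ with some probability `p_good` (a hypothetical parameter, not a figure reported in the paper). Then the smallest number of answers k that yields at least one good answer with probability `target` satisfies 1 − (1 − p_good)^k ≥ target.

```python
import math

def answers_needed(p_good: float, target: float) -> int:
    """Smallest k such that P(at least one good answer among k) >= target,
    assuming each answer is independently good with probability p_good.

    Derivation: 1 - (1 - p_good)**k >= target
             => k >= log(1 - target) / log(1 - p_good)
    """
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_good))

# Purely illustrative: if roughly 30% of answers were good and we wanted a
# 90% chance of at least one good answer, seven answers would suffice.
print(answers_needed(0.30, 0.90))  # → 7
```

Under these hypothetical parameters the independence model happens to agree with the empirical finding; the paper itself makes no such independence claim, so this is a consistency check, not a derivation.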
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Fichman, P. (2012). How Many Answers Are Enough? Optimal Number of Answers for Q&A Sites. In: Aberer, K., Flache, A., Jager, W., Liu, L., Tang, J., Guéret, C. (eds) Social Informatics. SocInfo 2012. Lecture Notes in Computer Science, vol 7710. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-35386-4_20
DOI: https://doi.org/10.1007/978-3-642-35386-4_20
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-35385-7
Online ISBN: 978-3-642-35386-4
eBook Packages: Computer Science (R0)