DOI: 10.1145/3019612.3019923

Predicting the quality of contests on crowdsourcing-based software development platforms: student research abstract

Published: 03 April 2017

ABSTRACT

Crowdsourcing-based software development is an emerging and promising approach that has become popular in many domains: contests attract a talented pool of developers, and requesters (or customers) can choose the 'winning' solution that best matches their desired quality level. However, the lack of a central mechanism for team formation, the lack of continuity in a developer's work across consecutive tasks, and the risk of noise in contest submissions give requesters quality concerns when adopting a crowdsourcing-based software development platform. To address this concern, we propose a measure, Quality of Contest (QoC), to analyze and predict the quality of a crowdsourcing-based platform from historical information on its completed tasks. We evaluate the capacity of QoC as an assessor of quality. We then implement a crawler to mine information on completed development tasks from the TopCoder platform of Tech Platform Inc (TPI) and empirically investigate the proposed measure. The promising results for QoC suggest the applicability of the proposed measure to other crowdsourcing-based platforms.
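The abstract does not define how QoC is computed; that detail is in the full paper. Purely as an illustration of the idea (predicting platform quality from historical completed-task records), the sketch below aggregates per-contest statistics into a platform-level score. The record fields (`num_submissions`, `num_passed_review`, `winner_score`) and the weighting are hypothetical assumptions, not the paper's actual formula.

```python
# Hypothetical sketch only: the real QoC definition is in the paper, not
# this abstract. Field names and weighting below are illustrative.

from dataclasses import dataclass

@dataclass
class CompletedTask:
    num_submissions: int      # solutions submitted to the contest
    num_passed_review: int    # submissions that passed screening/review
    winner_score: float       # review score of the winning solution (0-100)

def contest_quality(task: CompletedTask) -> float:
    """Illustrative per-contest quality: the fraction of submissions that
    passed review, scaled by the winner's normalized review score."""
    if task.num_submissions == 0:
        return 0.0
    pass_rate = task.num_passed_review / task.num_submissions
    return pass_rate * (task.winner_score / 100.0)

def platform_qoc(tasks: list[CompletedTask]) -> float:
    """Illustrative platform-level QoC: the mean per-contest quality over
    the crawled history of completed tasks."""
    if not tasks:
        return 0.0
    return sum(contest_quality(t) for t in tasks) / len(tasks)

# Toy history, standing in for data a crawler might mine from a platform.
history = [
    CompletedTask(num_submissions=10, num_passed_review=6, winner_score=90.0),
    CompletedTask(num_submissions=4, num_passed_review=4, winner_score=80.0),
]
print(round(platform_qoc(history), 3))  # 0.67
```

A score near 1.0 would indicate contests whose submissions reliably pass review with high winning scores; the paper's measure may weight these or other factors differently.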

Published in

SAC '17: Proceedings of the Symposium on Applied Computing
April 2017, 2004 pages
ISBN: 9781450344869
DOI: 10.1145/3019612
Copyright © 2017 Owner/Author
Publisher: Association for Computing Machinery, New York, NY, United States

Qualifiers: abstract

Overall Acceptance Rate: 1,650 of 6,669 submissions, 25%