ABSTRACT
As an emerging and promising approach, crowdsourcing-based software development has become popular in many domains because contests attract a talented pool of developers and enable requesters (or customers) to choose the 'winning' solution that meets their desired quality level. However, due to the lack of a central mechanism for team formation, the lack of continuity in a developer's work across consecutive tasks, and the risk of noise in contest submissions, requesters have quality concerns about adopting a crowdsourcing-based software development platform. To address this concern, we propose a measure, Quality of Contest (QoC), to analyze and predict the quality of a crowdsourcing-based platform from historical information on its completed tasks. We evaluate the capacity of QoC as an assessor to predict quality. Subsequently, we implement a crawler to mine information on completed development tasks from the TopCoder platform of Tech Platform Inc (TPI) to empirically investigate the proposed measure. The promising results of the QoC measure suggest its applicability to other crowdsourcing-based platforms.
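The abstract does not define how QoC is computed from a platform's history of completed tasks. As a minimal illustrative sketch only, assuming (hypothetically) that a platform-level score aggregates simple per-contest signals such as registrant participation and review pass rate, the idea could look like this; the field names `registrants`, `submissions`, and `passed_review` are assumptions, not the paper's actual model:

```python
# Hypothetical sketch: the paper does not give QoC's formula. We assume a
# platform-level quality score aggregated from historical per-contest records.

from dataclasses import dataclass

@dataclass
class Contest:
    registrants: int    # developers who registered for the contest
    submissions: int    # solutions actually submitted
    passed_review: int  # submissions that passed the requester's review

def contest_quality(c: Contest) -> float:
    """Illustrative per-contest signal: participation rate times review pass rate."""
    if c.registrants == 0 or c.submissions == 0:
        return 0.0
    participation = c.submissions / c.registrants
    pass_rate = c.passed_review / c.submissions
    return participation * pass_rate

def platform_qoc(history: list[Contest]) -> float:
    """Mean per-contest quality over a platform's completed tasks."""
    if not history:
        return 0.0
    return sum(contest_quality(c) for c in history) / len(history)

# Toy history: two completed contests mined from the platform.
history = [Contest(10, 6, 3), Contest(8, 4, 4)]
print(round(platform_qoc(history), 3))  # -> 0.4
```

A real crawler would populate `history` from the platform's completed-task pages; the aggregation here stands in for whatever historical signals the proposed QoC measure actually uses.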
Index Terms
- Predicting the quality of contests on crowdsourcing-based software development platforms: student research abstract