DOI: 10.1145/2601248.2601300

Crowdsourcing software evaluation

Published: 13 May 2014

Abstract

Crowdsourcing is an emerging online paradigm for problem solving that involves a large number of people, often recruited on a voluntary basis and rewarded with tangible or intangible incentives. It harnesses the power of the crowd both to minimize costs and to solve problems that inherently require a large, decentralized, and diverse population. In this paper, we advocate the potential of crowdsourcing for software evaluation, especially for complex and highly variable software systems that operate in diverse, even unpredictable, contexts. Through their iterative feedback, the crowd can enrich developers' knowledge of software evaluation and keep it current. Although this seems promising, crowdsourcing evaluation introduces a new range of challenges, mainly concerning how to organize the crowd and how to provide the right platforms for obtaining and processing their input. We focus on the activity of obtaining evaluation feedback from the crowd and conduct two focus groups to understand the various aspects of such an activity. Finally, we report on a set of challenges that must be addressed to realize correct and efficient crowdsourcing mechanisms for software evaluation.
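To make the central activity concrete, the sketch below shows one way structured evaluation feedback from the crowd might be captured along with its usage context and aggregated per feature. It is purely illustrative, assuming hypothetical names (FeedbackItem, aggregate_by_feature) and context fields (feature, locale, device) that are not taken from the paper; the paper's point that feedback arrives from diverse, even unpredictable, contexts is what motivates carrying context alongside each rating.

```python
from dataclasses import dataclass
from collections import defaultdict
from statistics import mean

@dataclass
class FeedbackItem:
    """One piece of crowd feedback, captured with its usage context.

    The context fields (locale, device) are illustrative assumptions;
    the idea is that the same feature may be rated very differently
    across the diverse settings in which the crowd uses the software.
    """
    user_id: str
    feature: str
    rating: int          # e.g., 1 (poor) .. 5 (excellent)
    comment: str = ""
    locale: str = "en"
    device: str = "unknown"

def aggregate_by_feature(items):
    """Group iterative crowd feedback by feature and summarise it."""
    buckets = defaultdict(list)
    for item in items:
        buckets[item.feature].append(item)
    return {
        feature: {
            "n": len(group),
            "mean_rating": round(mean(i.rating for i in group), 2),
            "contexts": sorted({(i.locale, i.device) for i in group}),
        }
        for feature, group in buckets.items()
    }

if __name__ == "__main__":
    feedback = [
        FeedbackItem("u1", "search", 2, "slow on mobile", "de", "android"),
        FeedbackItem("u2", "search", 4, locale="en", device="desktop"),
        FeedbackItem("u3", "export", 5, "works well", "fr", "desktop"),
    ]
    for feature, summary in aggregate_by_feature(feedback).items():
        print(feature, summary)
```

A real platform would add the organizational pieces the paper flags as open challenges, such as recruiting and incentivizing volunteers and filtering low-quality or malicious input, which this sketch deliberately leaves out.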




Published In

EASE '14: Proceedings of the 18th International Conference on Evaluation and Assessment in Software Engineering
May 2014
486 pages
ISBN: 9781450324762
DOI: 10.1145/2601248

Sponsors

  • Brunel University

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. crowdsourcing
  2. software evaluation
  3. user feedback

Qualifiers

  • Research-article

Funding Sources

  • Seventh Framework Programme
  • Bournemouth University - Fusion Investment Fund (the BBB, BUUU and VolaComp projects)
  • Graduate School PGR Development Fund

Conference

EASE '14
Sponsor:
  • Brunel University

Acceptance Rates

Overall Acceptance Rate 71 of 232 submissions, 31%


