
Aligning Crowdworker Perspectives and Feedback Outcomes in Crowd-Feedback System Design

Published: 16 April 2023

Abstract

Leveraging crowdsourcing in software development has received growing attention in research and practice. Crowd feedback offers a scalable and flexible way to evaluate software design solutions, and the potential of crowd-feedback systems has been demonstrated in different contexts by existing research. However, previous research lacks a deep understanding of how the individual design features of crowd-feedback systems affect feedback quality and quantity. Additionally, existing studies have primarily focused on understanding the requirements of feedback requesters and have not fully explored the qualitative perspectives of crowd-based feedback providers. In this paper, we address these research gaps with two studies. In the first study, we conducted a feature analysis (N=10) and concluded that, from a user perspective, a crowd-feedback system should have five core features (scenario, speech-to-text, markers, categories, and star rating). In the second study, we analyzed the effects of these design features on crowdworkers' perceptions and feedback outcomes (N=210). We learned that offering feedback providers scenarios as the context of use is perceived as most important. Regarding the resulting feedback quality, we discovered that more features are not always better, as overwhelming feedback providers might decrease feedback quality. Offering feedback providers categories as inspiration can increase feedback quantity. With our work, we contribute to research on crowd-feedback systems by aligning crowdworker perspectives and feedback outcomes, thereby making software evaluation not only more scalable but also more human-centered.

Supplementary Material

MP4 File (v7cscw023.mp4)
Supplemental video




    Published In

    Proceedings of the ACM on Human-Computer Interaction, Volume 7, Issue CSCW1
    CSCW
    April 2023
    3836 pages
    EISSN:2573-0142
    DOI:10.1145/3593053
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 16 April 2023
    Published in PACMHCI Volume 7, Issue CSCW1


    Author Tags

    1. crowd-feedback system
    2. crowdsourcing
    3. design
    4. experimental study
    5. feedback
    6. qualitative interviews

    Qualifiers

    • Research-article


    Cited By

    • (2025) Enhancing peer feedback provision through user interface scaffolding: A comparative examination of scripting and self-monitoring techniques. Computers & Education, 105260. https://doi.org/10.1016/j.compedu.2025.105260. Online publication date: Feb-2025.
    • (2024) Paintings, Not Noise—The Role of Presentation Sequence in Labeling. Interacting with Computers. https://doi.org/10.1093/iwc/iwae008. Online publication date: 15-Mar-2024.
    • (2023) Scalable Design Evaluation for Everyone! Designing Configuration Systems for Crowd-Feedback Request Generation. Proceedings of Mensch und Computer 2023, 91--100. https://doi.org/10.1145/3603555.3603566. Online publication date: 3-Sep-2023.
    • (2023) CrowdSurfer: Seamlessly Integrating Crowd-Feedback Tasks into Everyday Internet Surfing. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1--16. https://doi.org/10.1145/3544548.3580994. Online publication date: 19-Apr-2023.
