research-article

Aligning Crowdworker Perspectives and Feedback Outcomes in Crowd-Feedback System Design

Published: 16 April 2023

Abstract

Leveraging crowdsourcing in software development has received growing attention in research and practice. Crowd feedback offers a scalable and flexible way to evaluate software design solutions, and the potential of crowd-feedback systems has been demonstrated in different contexts in existing research. However, previous research lacks a deep understanding of how individual design features of crowd-feedback systems affect feedback quality and quantity. Additionally, existing studies have primarily focused on understanding the requirements of feedback requesters but have not fully explored the qualitative perspectives of crowd-based feedback providers. In this paper, we address these gaps with two studies. In Study 1, we conducted a feature analysis (N=10) and concluded that, from a user perspective, a crowd-feedback system should offer five core features (scenario, speech-to-text, markers, categories, and star rating). In Study 2, we analyzed the effects of these design features on crowdworkers' perceptions and feedback outcomes (N=210). We learned that offering feedback providers scenarios as the context of use is perceived as most important. Regarding the resulting feedback quality, we found that more features are not always better, as overwhelming feedback providers might decrease feedback quality. Offering feedback providers categories as inspiration can increase feedback quantity. With our work, we contribute to research on crowd-feedback systems by aligning crowdworker perspectives and feedback outcomes, thereby making software evaluation not only more scalable but also more human-centered.
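To make the five core features from Study 1 concrete, the sketch below models a single crowd-feedback submission that combines them. This is a minimal, hypothetical TypeScript data model for illustration only; the interfaces, field names, and category values are assumptions and are not taken from the authors' system.

```typescript
// Hypothetical data model for one crowd-feedback submission.
// The feature names (scenario, speech-to-text, markers, categories,
// star rating) follow the five core features identified in Study 1;
// all types and field names are illustrative assumptions.

/** A comment anchored to a region of the evaluated design (the "marker" feature). */
interface Marker {
  x: number;       // horizontal position, relative to screenshot width (0..1)
  y: number;       // vertical position, relative to screenshot height (0..1)
  comment: string; // feedback text attached to this region
}

/** Predefined categories offered to feedback providers as inspiration (assumed values). */
type FeedbackCategory = "usability" | "visual-design" | "content" | "functionality";

interface FeedbackSubmission {
  /** Usage scenario shown to the provider as the context of use. */
  scenarioId: string;
  /** Free-text feedback; may have been dictated and transcribed via speech-to-text. */
  text: string;
  /** Whether the text was entered through the speech-to-text feature. */
  enteredViaSpeech: boolean;
  /** Comments anchored to specific regions of the design. */
  markers: Marker[];
  /** Categories the provider assigned to the feedback. */
  categories: FeedbackCategory[];
  /** Overall star rating, e.g. 1 (poor) to 5 (excellent). */
  starRating: 1 | 2 | 3 | 4 | 5;
}

// Example of a submission a crowdworker might produce:
const example: FeedbackSubmission = {
  scenarioId: "book-a-flight",
  text: "The confirmation button is hard to find on small screens.",
  enteredViaSpeech: true,
  markers: [{ x: 0.82, y: 0.91, comment: "Button blends into the footer." }],
  categories: ["usability", "visual-design"],
  starRating: 3,
};

console.log(`Received feedback with ${example.markers.length} marker(s).`);
```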


Supplemental Material

v7cscw023.mp4 (MP4, 3.8 MB)



Published in

Proceedings of the ACM on Human-Computer Interaction, Volume 7, Issue CSCW1 (CSCW), April 2023, 3836 pages
EISSN: 2573-0142
DOI: 10.1145/3593053

      Copyright © 2023 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery

      New York, NY, United States


