DOI: 10.1145/3544549.3585836
Work in Progress

Prototypes, Platforms and Protocols: Identifying Common Issues with Remote, Unmoderated Studies and their Impact on Research Participants

Published: 19 April 2023

Abstract

Remote, unmoderated research platforms have increased the efficiency of traditional design research approaches such as usability testing, while also allowing practitioners to collect more diverse user perspectives than afforded by lab-based methods. The self-service nature of these platforms has also increased the number of studies created by requestors without formal research training. Past research has explored the quality and validity of research findings on these platforms, but little is known about the everyday issues participants face while completing these studies. We conducted an interview-based study with 22 experienced research participants to understand what issues are most commonly encountered and how participants mitigate issues as they arise. We found that a majority of the issues surface across research platforms, requestor protocols and prototypes, and participant responses range from filing support tickets to simply quitting studies. We discuss the consequences of these issues and provide recommendations for researchers and platforms.

Supplementary Material

MP4 File (3544549.3585836-talk-video.mp4)
Pre-recorded Video Presentation

      Published In

      CHI EA '23: Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems
      April 2023
      3914 pages
      ISBN:9781450394222
      DOI:10.1145/3544549

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. participant experience
      2. remote research
      3. unmoderated testing
      4. usability testing

      Qualifiers

      • Work in progress
      • Research
      • Refereed limited

      Conference

      CHI '23

      Acceptance Rates

      Overall Acceptance Rate 6,164 of 23,696 submissions, 26%

