ABSTRACT
In the product design process, it is often desirable to quickly obtain information about current user behaviors for topics that existing data or instrumentation cannot answer. Perhaps we would like to understand the use of products we do not have access to, or perhaps the action we would like to know about (such as using a coupon) takes place outside of any system that can be instrumented. Traditionally, large market research surveys would be conducted to answer these questions, but designers often need answers much faster. We present a study investigating the reliability of fast survey platforms such as Amazon Mechanical Turk and Survey Monkey, compared to larger market research studies, for technology behavior research, and show that results can be obtained in hours at much lower cost, with accuracy within 10% of traditional larger surveys. This demonstrates that these platforms can be relied on more heavily in the product design process, enabling much faster planning iterations informed by actual usage data.
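The abstract's "within 10%" comparison is between proportions reported by a fast platform and by a traditional market research panel. As a minimal sketch (not the paper's method, and with wholly hypothetical numbers), one could quantify such a gap with the absolute difference between two sample proportions and a normal-approximation confidence interval on that difference:

```python
import math

def proportion_diff_ci(p1, n1, p2, n2, z=1.96):
    """Absolute difference between two independent survey proportions,
    with an approximate 95% confidence interval on that difference
    (normal approximation to the binomial)."""
    diff = abs(p1 - p2)
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Hypothetical numbers: 42% of 400 MTurk respondents vs. 48% of 2000
# market-research panel respondents report having used a coupon.
diff, lo, hi = proportion_diff_ci(0.42, 400, 0.48, 2000)
print(round(diff, 3))  # a 6-point gap, inside a 10-point band
```

The function and figures above are illustrative assumptions only; the paper's actual analysis and sample sizes are in the full text.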
Comparing the Reliability of Amazon Mechanical Turk and Survey Monkey to Traditional Market Research Surveys