ABSTRACT
In this paper we discuss a screening process used in conjunction with a survey administered via Amazon.com's Mechanical Turk. We sought an easily implementable method to disqualify participants who do not take study tasks seriously. Using two previously pilot-tested screening questions, we identified 764 of 1,962 people who did not answer conscientiously. Young men appear most likely to fail the qualification task. Professionals, students, and non-workers appear more likely to take the task seriously than financial workers, hourly workers, and other workers. Men over 30 and women were more likely to answer seriously.
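The screening approach described above can be sketched as a simple qualification filter: a response counts as conscientious only if both screening questions are answered correctly. This is a minimal illustrative sketch, not the authors' implementation; the question keys, expected answers, and sample data are all hypothetical.

```python
# Hypothetical sketch of screening-question filtering for MTurk survey data.
# The two screeners stand in for the paper's pilot-tested qualification
# questions; the actual questions and answers are not given here.

SCREENERS = {
    "q_color": "blue",  # e.g. "What color is a clear daytime sky?"
    "q_math": "4",      # e.g. "What is 2 + 2?"
}

def passes_screening(response: dict) -> bool:
    """A worker qualifies only if every screening answer is correct."""
    return all(
        response.get(question, "").strip().lower() == expected
        for question, expected in SCREENERS.items()
    )

# Illustrative responses: one conscientious, one failing a screener.
responses = [
    {"worker": "A1", "q_color": "blue", "q_math": "4"},
    {"worker": "B2", "q_color": "green", "q_math": "4"},
]

qualified = [r for r in responses if passes_screening(r)]
```

In practice such a filter would run over the full response set before analysis, and disqualified workers would be excluded from the demographic comparisons.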
Are your participants gaming the system?: screening mechanical turk workers