Abstract
The driving force behind digital crowdsourcing is its workers: hidden behind the scenes, they churn out data in experiments, participate in research studies, and complete small tasks (HITs) online. Understanding workers and crowdwork better is therefore key to developing a more effective and fair use of crowdsourcing for research. This chapter helps develop an understanding of the various aspects of the crowd by drawing parallels between workers of different platforms (AMT, Microworkers and Crowdee) through quantitative and qualitative analysis of existing and newly collected data. A picture of the crowd is drawn by uncovering workers’ motivations, workplaces, skills and infrastructure, and their issues with and perspectives on the design of microtasks, the employers and the microtask-based platforms. Legal and ethical perspectives on crowdwork are also discussed, and online resources are reviewed that researchers can use as a primer for employing crowdworkers in an ethical and fair way. The chapter provides information, a review of internationally recognised ethical principles, and practical advice for those who would like to use crowdsourcing for experiments and research studies as informed researchers and crowd employers.
Notes
- 1.
http://mturk.com last accessed 14 Jun 2017.
- 2.
E.g. http://www.bbc.co.uk/nature/22694347 last accessed 14 Jun 2017.
- 3.
http://microworkers.com last accessed 14 Jun 2017.
- 4.
http://crowdee.de last accessed 14 Jun 2017.
- 5.
Crowdworkers in MTurk.
- 6.
The terms ‘task’ and ‘microtask’ are used interchangeably here, because the platforms discussed use different terminology for microtasks.
- 7.
http://wiki.wearedynamo.org/index.php?title=Fair_payment last accessed 14 Jun 2017.
- 8.
http://www.wes.org/educators/pdf/IndiaPolicyPacket.pdf last accessed 14 Jun 2017.
- 9.
http://www.rediff.com/getahead/report/career-your-skills-not-degree-will-get-you-a-job/20150408.htm last accessed 14 Jun 2017.
- 10.
http://www.wsj.com/articles/SB10001424052748703515504576142092863219826 last accessed 14 Jun 2017.
- 11.
Note that ‘going to school’ may have been misunderstood by workers with limited English, since for them education takes place in ‘colleges’ and ‘universities’, not in ‘schools’. As our survey replicates previous studies, we did not change the terminology in this case.
- 12.
An alternative explanation is that workers spent time ‘searching’ for work in the hope of finding something before the end of the day. Our surveys provide no data to confirm this, but the ethnographic studies do: in one example, an Indian worker searched through HITs on MTurk for as long as 20 min at a stretch, hoping to find his or her preferred type of work.
- 13.
http://www.prb.org/Publications/Articles/2012/india-2011-census.aspx last accessed 14 Jun 2017.
- 14.
http://www.pewinternet.org/2015/10/29/technology-device-ownership-2015/ last accessed 14 Jun 2017, http://www.pewinternet.org/2015/06/26/americans-internet-access-2000-2015/ last accessed 14 Jun 2017.
- 15.
http://www.citylab.com/work/2015/09/mapping-the-difference-between-minimum-wage-and-cost-of-living/404644/ last accessed 14 Jun 2017.
- 16.
According to the OECD, the net national income in India was $3,718 per capita in 2009. https://data.oecd.org/natincome/net-national-income.htm last accessed 14 Jun 2017.
- 17.
- 18.
- 19.
All of the following last accessed 14 Jun 2017: https://www.reddit.com/r/HITsWorthTurkingFor/wiki/index, http://www.cloudmebaby.com/forums/portal.php, http://www.mturkforum.com/, http://turkernation.com/, http://www.mturkgrind.com/.
- 20.
https://turkopticon.ucsd.edu/ last accessed 14 Jun 2017.
- 21.
http://www.turkalert.com/ last accessed 14 Jun 2017.
- 22.
http://turkernation.com/forumdisplay.php?167-mTurk-Scripts-Programs-amp-Tools last accessed 14 Jun 2017.
- 23.
For information on ‘precarious work’: http://www.laborrights.org/issues/precarious-work last accessed 14 Jun 2017.
- 24.
https://www.mturk.com/mturk/help?helpPage=policies last accessed 14 Jun 2017.
- 25.
http://wiki.wearedynamo.org/index.php/Guidelines_for_Academic_Requesters last accessed 14 Jun 2017.
- 26.
http://www.overtimepaylaws.org/federal-court-approves-settlement-in-crowdsourcing-labor-company-wage-suit/ last accessed 14 Jun 2017.
- 27.
http://www.un.org/en/universal-declaration-human-rights/ last accessed 14 Jun 2017.
- 28.
http://www.cirp.org/library/ethics/nuremberg/ last accessed 14 Jun 2017.
- 29.
- 30.
http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html last accessed 14 Jun 2017.
- 31.
One interesting, though relatively hidden, feature is that there are a number of situations where a stable workforce of Turkers works for a given requester over an extended period of time.
- 32.
http://crowdsourcing-code.com/documents/5/Code_of_Conduct_Crowdworking_English_072015 last accessed 14 Jun 2017.
- 33.
It should be noted that Germany has a strong trade-union tradition and a culture of cooperation between companies and workers that persists to this day, and that it has been progressive in its approach to crowdsourced labour rights.
References
Antin, J., Shaw, A.: Social desirability bias and self-reports of motivation: a study of Amazon mechanical turk in the US and India. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI 2012), pp. 2925–2934 (2012)
Bederson, B.B., Quinn, A.J.: Web workers unite! Addressing challenges of online laborers. In: CHI 2011 Extended Abstracts on Human Factors in Computing Systems, pp. 97–106. ACM (2011)
Berg, J.: Income security in the on-demand economy: findings and policy lessons from a survey of crowdworkers. Comp. Labor Law Policy J. 37(3) (2016)
Bourne, K.A., Forman, P.J.: Living in a culture of overwork an ethnographic study of flexibility. J. Manag. Inquiry 23(1), 68–79 (2014)
Brawley, A.M., Pury, C.L.: Work experiences on MTurk: job satisfaction, turnover, and information sharing. Comput. Hum. Behav. 54, 531–546 (2016)
Callison-Burch, C.: Crowd-workers: aggregating information across turkers to help them find higher paying work. In: Second AAAI Conference on Human Computation and Crowdsourcing (2014)
Chandler, J., Mueller, P., Paolacci, G.: Nonnaïveté among Amazon mechanical turk workers: consequences and solutions for behavioral researchers. Behav. Res. Methods 46(1), 112–130 (2014)
Crary, J.: 24/7: Late Capitalism and the Ends of Sleep. Verso Books, New York (2013)
Deci, E., Ryan, R.: Intrinsic Motivation and Self-determination in Human Behavior. Plenum Press, New York (1985)
Difallah, D.E., Catasta, M., Demartini, G., Ipeirotis, P.G., Cudré-Mauroux, P.: The dynamics of micro-task crowdsourcing: the case of Amazon MTurk. In: Proceedings of the 24th International Conference on World Wide Web (WWW 2015), pp. 238–247. ACM (2015)
Felstiner, A.: Working the crowd: employment and labor law in the crowdsourcing industry. Berkeley J. Employ. Labor Law 32, 143–203 (2011)
Finkin, M.: Beclouded work in historical perspective. Comp. Labor Law Policy J. 37(3) (2016)
Fort, K., Adda, G., Cohen, K.B.: Amazon mechanical turk: gold mine or coal mine? Comput. Linguist. 37(2), 413–420 (2011)
Gagné, M., Deci, E.L.: Self-determination theory and work motivation. J. Organ. Behav. 26(4), 331–362 (2005)
Garfinkel, H.: A conception of and experiments with “trust” as a condition of concerted stable actions. In: The Production of Reality: Essays and Readings on Social Interaction, pp. 381–392 (1963)
Gupta, N.: An ethnographic study of crowdwork via Amazon mechanical turk in India. Unpublished manuscript (2017). http://eprints.nottingham.ac.uk/41062/
Gupta, N., Martin, D., Hanrahan, B.V., O’Neill, J.: Turk-life in India. In: Proceedings of the 18th International Conference on Supporting Group Work, pp. 1–11. ACM (2014)
Hanhart, P., Korshunov, P., Ebrahimi, T.: Crowdsourcing evaluation of high dynamic range image compression. In: SPIE Optical Engineering + Applications, p. 92170D. International Society for Optics and Photonics (2014)
Hanrahan, B.V., Willamowski, J.K., Swaminathan, S., Martin, D.B.: TurkBench: rendering the market for turkers. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, pp. 1613–1616. ACM (2015)
U.S. Department of Health and Human Services: The Belmont report. Office for Human Research Protections (OHRP) (1979). Accessed 19 Nov 2008
Hirth, M., Hoßfeld, T., Tran-Gia, P.: Anatomy of a crowdsourcing platform - using the example of microworkers.com. In: 2011 Fifth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing (IMIS), pp. 322–329. IEEE (2011)
Ipeirotis, P.G.: Demographics of mechanical turk (2010)
Ipeirotis, P.G.: Analyzing the Amazon mechanical turk marketplace. XRDS 17(2), 16–21 (2010)
Ipeirotis, P.: Demographics of mechanical turk: now live! April 2015. http://www.behind-the-enemy-lines.com/2015/04/demographics-of-mechanical-turk-now.html
Irani, L.C., Silberman, M.S.: Turkopticon: interrupting worker invisibility in Amazon mechanical turk. In: Proceedings of CHI 2013 (CHI 2013), pp. 611–620. ACM (2013)
Kaufmann, N., Schulze, T., Veit, D.: More than fun and money. Worker motivation in crowdsourcing: a study on mechanical turk. In: Proceedings of the Seventeenth Americas Conference on Information Systems, pp. 1–11 (2011)
Kazai, G., Kamps, J., Milic-Frayling, N.: The face of quality in crowdsourcing relevance labels: demographics, personality and labeling accuracy. In: Proceedings of the 21st ACM International Conference on Information and Knowledge Management, pp. 2583–2586. ACM (2012)
Kittur, A., Nickerson, J.V., Bernstein, M., Gerber, E., Shaw, A., Zimmerman, J., Lease, M., Horton, J.: The future of crowd work. In: Proceedings of the 2013 Conference on Computer Supported Cooperative Work, p. 1301. ACM Press (2013)
Kuek, S.C., Paradi-Guilford, C., Fayomi, T., Imaizumi, S., Ipeirotis, P., Pina, P., Singh, M.: The global opportunity in online outsourcing. Technical report, The World Bank (2015)
Marshall, C.C., Shipman, F.M.: Experiences surveying the crowd: reflections on methods, participation, and reliability. In: Proceedings of the 5th Annual ACM Web Science Conference, pp. 234–243. ACM (2013)
Martin, D., Hanrahan, B.V., O’Neill, J., Gupta, N.: Being a turker. In: Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing, pp. 224–235. ACM (2014)
Martin, D., O’Neill, J., Gupta, N., Hanrahan, B.V.: Turking in a global labour market. Comput. Support. Coop. Work (CSCW) 25(1), 39–77 (2016)
Mason, W., Suri, S.: Conducting behavioral research on Amazon’s mechanical turk. Behav. Res. Methods 44(1), 1–23 (2012)
Morozov, E.: To Save Everything, Click Here: The Folly of Technological Solutionism. PublicAffairs (2013)
Naderi, B., Polzehl, T., Beyer, A., Pilz, T., Möller, S.: Crowdee: mobile crowdsourcing micro-task platform - for celebrating the diversity of languages. In: Proceedings of the 15th Annual Conference of the International Speech Communication Association (Interspeech 2014), September 2014
Naderi, B., Wechsung, I., Möller, S.: Effect of being observed on the reliability of responses in crowdsourcing micro-task platforms. In: 2015 Seventh International Workshop on Quality of Multimedia Experience (QoMEX), pp. 1–2. IEEE (2015)
Naderi, B., Wechsung, I., Möller, S.: Crowdsourcing work motivation scale: development and validation for crowdsourcing micro-task platforms. In preparation (2016)
O’Neill, J., Martin, D.: Relationship-based business process crowdsourcing? In: Kotzé, P., Marsden, G., Lindgaard, G., Wesson, J., Winckler, M. (eds.) INTERACT 2013. LNCS, vol. 8120, pp. 429–446. Springer, Heidelberg (2013). doi:10.1007/978-3-642-40498-6_33
Paolacci, G., Chandler, J.: Inside the Turk: understanding mechanical turk as a participant pool. Curr. Dir. Psychol. Sci. 23(3), 184–188 (2014)
Peer, E., Samat, S., Brandimarte, L., Acquisti, A.: Beyond the turk: an empirical comparison of alternative platforms for crowdsourcing online behavioral research (2015). http://dx.doi.org/10.2139/ssrn.2594183
Ross, J., Irani, L., Silberman, M., Zaldivar, A., Tomlinson, B.: Who are the crowdworkers? Shifting demographics in mechanical turk. In: CHI 2010 Extended Abstracts on Human Factors in Computing Systems, pp. 2863–2872. ACM (2010)
Ryan, R.M., Deci, E.L.: Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being. Am. Psychol. 55(1), 68 (2000)
Salehi, N., Irani, L.C., Bernstein, M.S., Alkhatib, A., Ogbe, E., Milland, K., et al.: We are dynamo: overcoming stalling and friction in collective action for crowd workers. In: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, pp. 1621–1630. ACM (2015)
Silberman, M., Irani, L., Ross, J.: Ethics and tactics of professional crowdwork. XRDS Crossroads ACM Mag. Stud. 17(2), 39–43 (2010)
Silberman, M.S.: What’s fair? Rational action and its residuals in an electronic market. Unpublished manuscript (2010). http://www.scribd.com/doc/86592724/Whats-Fair
Silberman, S., Milland, K., LaPlante, R., Ross, J., Irani, L.: Stop citing Ross et al. 2010, Who are the crowdworkers? (2015). https://medium.com/@silberman/stop-citing-ross-et-al-2010-who-are-the-crowdworkers-b3b9b1e8d300
Acknowledgment
This book chapter is dedicated to David Martin, a fantastic, motivating and inspiring researcher, who unexpectedly passed away in the summer of 2016. This book chapter was one of his final projects, on a subject that he cared about deeply: the people behind the scenes, the lifeblood of online platforms like AMT, the crowdworkers. Through his ethnomethodological work, he brought to light the working conditions faced by the workers, advocating for fairness and humanness in crowdsourcing through technology design and the conscious application of professional ethics. The authors are glad to have met him at the Dagstuhl Seminar and to have worked with him on this book chapter. We have lost a valuable member of the academic community, and a good friend.
Appendix: Survey Data
See Tables 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12 and 13.
Copyright information
© 2017 Springer International Publishing AG
Cite this paper
Martin, D. et al. (2017). Understanding the Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing. In: Archambault, D., Purchase, H., Hoßfeld, T. (eds) Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments. Lecture Notes in Computer Science(), vol 10264. Springer, Cham. https://doi.org/10.1007/978-3-319-66435-4_3
DOI: https://doi.org/10.1007/978-3-319-66435-4_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-66434-7
Online ISBN: 978-3-319-66435-4
eBook Packages: Computer Science (R0)