Understanding the Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing

Conference paper in: Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments

Abstract

The driving force behind digital crowdsourcing is its workers: hidden behind the scenes, churning out data in experiments, participating in research studies, and completing the small tasks that make up HITs online. Understanding workers and crowdwork better is therefore key to developing a more effective and fair use of crowdsourcing for research. This chapter attempts to build such an understanding of the various aspects of the crowd by drawing parallels between workers on different platforms (AMT, Microworkers and Crowdee) through quantitative and qualitative analysis of existing and newly collected data. A picture of the crowd is drawn by uncovering workers’ motivations, workplaces, skills and infrastructure, and their issues with and perspectives on the design of microtasks, the employers and the microtask-based platforms. Legal and ethical perspectives on crowdwork are also discussed, and online resources are reviewed that researchers can use as a primer for employing crowdworkers in an ethical and fair way. The chapter provides information, a review of internationally recognised ethical principles, and practical advice to those who would like to use crowdsourcing for experiments and research studies as informed researchers and crowd employers.


Notes

  1. http://mturk.com, last accessed 14 Jun 2017.

  2. E.g. http://www.bbc.co.uk/nature/22694347, last accessed 14 Jun 2017.

  3. http://microworkers.com, last accessed 14 Jun 2017.

  4. http://crowdee.de, last accessed 14 Jun 2017.

  5. Crowdworkers on MTurk.

  6. The terms ‘task’ and ‘microtask’ are used interchangeably here, because the platforms studied use different terminology for microtasks.

  7. http://wiki.wearedynamo.org/index.php?title=Fair_payment, last accessed 14 Jun 2017. A short illustrative payment calculation follows these notes.

  8. http://www.wes.org/educators/pdf/IndiaPolicyPacket.pdf, last accessed 14 Jun 2017.

  9. http://www.rediff.com/getahead/report/career-your-skills-not-degree-will-get-you-a-job/20150408.htm, last accessed 14 Jun 2017.

  10. http://www.wsj.com/articles/SB10001424052748703515504576142092863219826, last accessed 14 Jun 2017.

  11. Note that ‘going to school’ may have been misunderstood by workers with limited English, since for them education is delivered in ‘colleges’ and ‘universities’, not in ‘schools’. Because our survey replicates previous studies, we did not change the terminology in this case.

  12. The other explanation is that workers spent time ‘searching’ for work in the ‘hope’ of finding something before the end of the day. Our survey data cannot confirm this, but the ethnographic studies do: in one example, an Indian worker searched through HITs on MTurk for as long as 20 min at a stretch, ‘hoping’ to find his or her preferred type of work.

  13. http://www.prb.org/Publications/Articles/2012/india-2011-census.aspx, last accessed 14 Jun 2017.

  14. http://www.pewinternet.org/2015/10/29/technology-device-ownership-2015/ and http://www.pewinternet.org/2015/06/26/americans-internet-access-2000-2015/, both last accessed 14 Jun 2017.

  15. http://www.citylab.com/work/2015/09/mapping-the-difference-between-minimum-wage-and-cost-of-living/404644/, last accessed 14 Jun 2017.

  16. According to the OECD, the net national income in India was $3,718 per capita per year in 2009. https://data.oecd.org/natincome/net-national-income.htm, last accessed 14 Jun 2017.

  17. http://www.forbes.com/sites/saritharai/2016/01/06/india-just-crossed-1-billion-mobile-subscribers-milestone-and-the-excitements-just-beginning/#786ee6915ac2, last accessed 14 Jun 2017.

  18. http://www.thehindu.com/news/cities/mumbai/business/with-220mn-users-india-is-now-worlds-secondbiggest-smartphone-market/article8186543.ece, last accessed 14 Jun 2017.

  19. The following, all last accessed 14 Jun 2017: https://www.reddit.com/r/HITsWorthTurkingFor/wiki/index, http://www.cloudmebaby.com/forums/portal.php, http://www.mturkforum.com/, http://turkernation.com/, http://www.mturkgrind.com/.

  20. https://turkopticon.ucsd.edu/, last accessed 14 Jun 2017.

  21. http://www.turkalert.com/, last accessed 14 Jun 2017.

  22. http://turkernation.com/forumdisplay.php?167-mTurk-Scripts-Programs-amp-Tools, last accessed 14 Jun 2017.

  23. For information on ‘precarious work’: http://www.laborrights.org/issues/precarious-work, last accessed 14 Jun 2017.

  24. https://www.mturk.com/mturk/help?helpPage=policies, last accessed 14 Jun 2017.

  25. http://wiki.wearedynamo.org/index.php/Guidelines_for_Academic_Requesters, last accessed 14 Jun 2017.

  26. http://www.overtimepaylaws.org/federal-court-approves-settlement-in-crowdsourcing-labor-company-wage-suit/, last accessed 14 Jun 2017.

  27. http://www.un.org/en/universal-declaration-human-rights/, last accessed 14 Jun 2017.

  28. http://www.cirp.org/library/ethics/nuremberg/, last accessed 14 Jun 2017.

  29. https://www.wma.net/policies-post/wma-declaration-of-helsinki-ethical-principles-for-medical-research-involving-human-subjects/, last accessed 14 Jun 2017.

  30. http://www.hhs.gov/ohrp/humansubjects/guidance/belmont.html, last accessed 14 Jun 2017.

  31. One interesting feature is that there are a number of relatively hidden situations in which a fairly stable workforce of Turkers works for a given requester over an extended period of time.

  32. http://crowdsourcing-code.com/documents/5/Code_of_Conduct_Crowdworking_English_072015, last accessed 14 Jun 2017.

  33. It should be noted that Germany has a strong trade-union tradition and a culture of cooperation between companies and workers that persists to this day, and that it has been progressive in its approach to crowdsourcing labour rights.
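
Note 7 points to the Dynamo fair-payment guidelines, which are stated in prose. Purely as an illustration of the arithmetic involved (not part of the guidelines themselves; the function name, the pilot timing and the wage figure are our own assumptions), a requester might derive a per-task reward from a target hourly wage as follows:

```python
import math

def fair_reward_per_task(median_task_seconds: float,
                         target_hourly_wage_usd: float) -> float:
    """Estimate a per-task reward (in USD) that pays workers a target
    hourly wage, given the median completion time from a pilot run."""
    if median_task_seconds <= 0:
        raise ValueError("median_task_seconds must be positive")
    tasks_per_hour = 3600.0 / median_task_seconds
    reward = target_hourly_wage_usd / tasks_per_hour
    # Round up to the nearest cent so that rounding never underpays.
    return math.ceil(reward * 100) / 100

# Example: a pilot shows a median completion time of ~90 s per task;
# targeting the US federal minimum wage of $7.25/h yields $0.19/task.
print(fair_reward_per_task(90, 7.25))  # 0.19
```

Timing a small pilot batch before pricing the full run, and rounding in the workers’ favour, keeps the reward consistent with the hourly-wage framing used in the fair-payment guidelines and in the minimum-wage comparison of note 15.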


Acknowledgment

This book chapter is dedicated to David Martin, a fantastic, motivating and inspiring researcher, who unexpectedly passed away in the summer of 2016. This book chapter was one of his final projects, on a subject that he cared about deeply: the people behind the scenes, the lifeblood of online platforms like AMT, the crowdworkers. Through his ethnomethodological work, he brought to light the working conditions faced by these workers, advocating for fairness and humanness in crowdsourcing through technology design and the conscious application of professional ethics. The authors are glad to have met him at the Dagstuhl Seminar and to have worked with him on this book chapter. We have lost a valuable member of the academic community, and a good friend.

Author information

Correspondence to Neha Gupta.


Appendix: Survey Data

See Tables 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12 and 13.

Table 2. Distribution of Gender observed for AMT workers (survey studies and on MTurk Tracker).
Table 3. Distribution of Gender observed for Microworkers and Crowdee workers using survey studies.
Table 4. Distribution of Age observed for AMT workers (survey studies and on MTurk Tracker).
Table 5. Distribution of Age observed for Microworkers and Crowdee workers using survey studies.
Table 6. Distribution of Household Income observed for AMT workers (survey studies and on MTurk Tracker). Note that the MTurk Tracker data is not available for all income classes (rows) and is therefore aggregated over two classes.
Table 7. Distribution of Household Income observed for Microworkers and Crowdee workers using survey studies. 22.18% of Crowdee participants did not report their household income.
Table 8. Distribution of Household Size (including the worker) observed for AMT workers (survey studies and on MTurk Tracker).
Table 9. Distribution of Household Size (including the worker) observed for Microworkers and Crowdee workers using survey studies.
Table 10. Distribution of the highest Education Level achieved, observed for all platforms using survey studies.
Table 11. Distribution of Employment Status of crowd workers from all platforms using survey studies.
Table 12. Distribution of Time Crowd Workers Spent on All Platforms (per week).
Table 13. Distribution of Stated Task Approval Rate of crowd workers on all platforms. For the Crowdee platform, no data is available for the stated task approval rate.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Martin, D. et al. (2017). Understanding the Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing. In: Archambault, D., Purchase, H., Hoßfeld, T. (eds) Evaluation in the Crowd. Crowdsourcing and Human-Centered Experiments. Lecture Notes in Computer Science, vol. 10264. Springer, Cham. https://doi.org/10.1007/978-3-319-66435-4_3

  • DOI: https://doi.org/10.1007/978-3-319-66435-4_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-66434-7

  • Online ISBN: 978-3-319-66435-4

  • eBook Packages: Computer Science, Computer Science (R0)
