DOI: 10.1145/2736277.2741685

The Dynamics of Micro-Task Crowdsourcing: The Case of Amazon MTurk

Published: 18 May 2015

Abstract

Micro-task crowdsourcing is rapidly gaining popularity among research communities and businesses as a means of leveraging human computation in their daily operations. Unlike other services, a crowdsourcing platform is in fact a marketplace subject to human factors that affect its performance, in terms of both speed and quality; such factors shape the dynamics of the crowdsourcing market. For example, a known behavior of these markets is that increasing the reward of a set of tasks leads to faster results. However, it is still unclear how the different dimensions interact with one another: reward, task type, market competition, requester reputation, etc. In this paper, we adopt a data-driven approach to (A) perform a long-term analysis of a popular micro-task crowdsourcing platform and understand the evolution of its main actors (workers, requesters, and the platform itself); (B) leverage the main findings of our five-year log analysis to propose features for a predictive model that estimates the expected performance of any batch at a specific point in time, showing that the number of tasks left in a batch and how recent the batch is are two key features of the prediction; and (C) analyze demand (new tasks posted by requesters) and supply (tasks completed by the workforce) and show how they affect task prices on the marketplace.
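As a sketch of what the batch-performance predictor in (B) could look like, the snippet below trains a regressor on two per-batch features highlighted in the abstract (tasks left in the batch and batch age) plus the task reward. Everything here is illustrative: the synthetic data, the feature encoding, and the use of scikit-learn's RandomForestRegressor are assumptions, not the authors' actual model or dataset.

```python
# Hypothetical sketch of a batch-throughput predictor in the spirit of the
# paper's model (B). Data, features, and model choice are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000  # number of synthetic batch snapshots

# Features observed for a batch at prediction time (assumed, not the paper's).
tasks_left = rng.integers(1, 500, size=n)      # HITs still available in the batch
batch_age_h = rng.uniform(0, 72, size=n)       # hours since the batch was posted
reward_usd = rng.uniform(0.01, 1.00, size=n)   # reward per task, in USD
X = np.column_stack([tasks_left, batch_age_h, reward_usd])

# Synthetic target: tasks completed over the next hour. The decay term encodes
# the (assumed) observation that newer batches attract workers faster.
y = 0.1 * tasks_left * np.exp(-batch_age_h / 24) + rng.normal(0, 2, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_tr, y_tr)
print(f"Held-out R^2: {model.score(X_te, y_te):.3f}")
```

On real logs, the target would be the observed completion rate of a batch and the features would come from a platform crawl; the point of the sketch is only the shape of the pipeline: per-batch snapshot features in, expected short-term throughput out.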




Published In

WWW '15: Proceedings of the 24th International Conference on World Wide Web
May 2015
1460 pages
ISBN: 9781450334693

    Sponsors

    • IW3C2: International World Wide Web Conference Committee


    Publisher

    International World Wide Web Conferences Steering Committee

    Republic and Canton of Geneva, Switzerland



    Author Tags

    1. crowdsourcing
    2. design
    3. experimentation
    4. forecasting
    5. human factors
    6. tracking
    7. trend identification

    Qualifiers

    • Research-article

    Funding Sources

    • Swiss National Science Foundation

    Conference

    WWW '15
Sponsor: IW3C2

    Acceptance Rates

WWW '15 paper acceptance rate: 131 of 929 submissions, 14%.
Overall acceptance rate: 1,899 of 8,196 submissions, 23%.



Citations

Cited By

• (2024) Platform-mediated work in Poland: Worker characteristics and prevalence in society. International Journal of Management and Economics. DOI: 10.2478/ijme-2024-0007. Online publication date: 22-Feb-2024.
• (2024) The State of Pilot Study Reporting in Crowdsourcing: A Reflection on Best Practices and Guidelines. Proceedings of the ACM on Human-Computer Interaction, 8(CSCW1), 1-45. DOI: 10.1145/3641023. Online publication date: 26-Apr-2024.
• (2024) "Are we all in the same boat?" Customizable and Evolving Avatars to Improve Worker Engagement and Foster a Sense of Community in Online Crowd Work. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-26. DOI: 10.1145/3613904.3642429. Online publication date: 11-May-2024.
• (2024) Solution Probing Attack Against Coin Mixing Based Privacy-Preserving Crowdsourcing Platforms. IEEE Transactions on Dependable and Secure Computing, 1-15. DOI: 10.1109/TDSC.2024.3355453. Online publication date: 2024.
• (2024) An Empirical Study of QoE Estimation for Video Streaming Services Using Crowdsourcing. 2024 IEEE 35th International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC), 1-6. DOI: 10.1109/PIMRC59610.2024.10817196. Online publication date: 2-Sep-2024.
• (2024) Explaining crowdworker behaviour through computational rationality. Behaviour & Information Technology, 44(3), 552-573. DOI: 10.1080/0144929X.2024.2329616. Online publication date: 24-Apr-2024.
• (2023) Efficacy of an Unguided, Digital Single-Session Intervention for Internalizing Symptoms in Web-Based Workers: Randomized Controlled Trial. Journal of Medical Internet Research, 25, e45411. DOI: 10.2196/45411. Online publication date: 7-Jul-2023.
• (2023) The Inefficient Technological Revolution. SSRN Electronic Journal. DOI: 10.2139/ssrn.4584212. Online publication date: 2023.
• (2023) "Sometimes It's Like Putting the Track in Front of the Rushing Train": Having to Be 'On Call' for Work Limits the Temporal Flexibility of Crowdworkers. ACM Transactions on Computer-Human Interaction, 31(2), 1-45. DOI: 10.1145/3635145. Online publication date: 4-Dec-2023.
• (2023) How Many Crowd Workers Do I Need? On Statistical Power when Crowdsourcing Relevance Judgments. ACM Transactions on Information Systems, 42(1), 1-26. DOI: 10.1145/3597201. Online publication date: 22-May-2023.
• Show More Cited By
