DOI: 10.1145/2441776.2441921 · CSCW conference proceedings · Research article

Co-worker transparency in a microtask marketplace

Published: 23 February 2013

ABSTRACT

Workers in microtask work environments such as Mechanical Turk typically do not know if or how they fit into a workflow. The research question we posed here was whether displaying information about the number of other workers doing the same task would motivate better or poorer work quality. In experiment 1, we varied the information about co-workers presented to the worker and the number of his or her co-workers: "you" or "you alone" are doing a task, or "you" plus 5, 15, or 50 co-workers. We compared these conditions with a control that presented no social information. In experiment 2, we crossed the number of co-workers (5 vs. 50) with the type of incentive (individual or group). Results show that visual presentations of co-workers changed workers' perceptions of co-workers, and that the more co-workers participants perceived, the lower their work quality. We suggest future work to determine the kinds of co-worker information that will reduce or increase work quality in microtask settings.


Published in

CSCW '13: Proceedings of the 2013 conference on Computer supported cooperative work
February 2013, 1594 pages
ISBN: 9781450313315
DOI: 10.1145/2441776

      Copyright © 2013 ACM


Publisher

Association for Computing Machinery, New York, NY, United States
