Hybrid Machine-Crowd Interaction for Handling Complexity: Steps Toward a Scaffolding Design Framework

Chapter in Macrotask Crowdsourcing (Human–Computer Interaction Series)

Abstract

Much of the research attention on crowd work has been devoted to developing solutions that enhance microtask crowdsourcing settings. Although decomposing difficult problems into microtasks is appropriate in many situations, some problems are non-decomposable and require high levels of coordination among crowd workers. In this chapter, we aim to gain a better understanding of the macrotask crowdsourcing problem and of how crowd-AI mechanisms can be integrated to solve complex tasks distributed across expert crowds and machines. We also explore design implications for macrotask crowdsourcing systems, taking into account their ability to scale in support of complex work in science.
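
To make the idea of distributing complex tasks across expert crowds and machines more concrete, the sketch below illustrates one common hybrid crowd-AI pattern: a machine classifier handles the items it can label confidently and escalates ambiguous items to expert crowd workers, whose answers are aggregated by majority vote. This is a minimal, hypothetical sketch rather than the chapter's framework; the model and ask_crowd callables and the confidence threshold are illustrative assumptions.

    # Hypothetical sketch of a hybrid crowd-AI labeling pipeline (not the
    # chapter's framework): the machine keeps the cases it is confident
    # about and routes ambiguous items to an expert crowd for a vote.
    from collections import Counter
    from typing import Callable, List, Tuple

    CONFIDENCE_THRESHOLD = 0.9  # assumed cutoff for trusting the machine alone

    def hybrid_label(
        items: List[str],
        model: Callable[[str], Tuple[str, float]],  # returns (label, confidence)
        ask_crowd: Callable[[str], List[str]],      # returns one label per worker
    ) -> List[Tuple[str, str]]:
        results = []
        for item in items:
            label, confidence = model(item)
            if confidence < CONFIDENCE_THRESHOLD:
                votes = ask_crowd(item)                      # crowd path: slower, costlier
                label = Counter(votes).most_common(1)[0][0]  # majority vote wins
            results.append((item, label))                    # machine path otherwise
        return results

Routing by model confidence keeps crowd effort, and therefore cost, focused on the genuinely ambiguous cases, which is the core economy behind most hybrid crowd-machine classifiers.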


Author information

Correspondence to António Correia.

Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Correia, A., Jameel, S., Paredes, H., Fonseca, B., Schneider, D. (2019). Hybrid Machine-Crowd Interaction for Handling Complexity: Steps Toward a Scaffolding Design Framework. In: Khan, VJ., Papangelis, K., Lykourentzou, I., Markopoulos, P. (eds) Macrotask Crowdsourcing. Human–Computer Interaction Series. Springer, Cham. https://doi.org/10.1007/978-3-030-12334-5_5

  • DOI: https://doi.org/10.1007/978-3-030-12334-5_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-12333-8

  • Online ISBN: 978-3-030-12334-5

  • eBook Packages: Computer Science (R0)
