
A Game-Theoretic Approach to Quality Improvement in Crowdsourcing Tasks

  • Conference paper

Service Research and Innovation (ASSRI 2015, ASSRI 2017)

Abstract

With the rise of social media and mobile computing, crowdsourcing is increasingly relied on as a popular source of information. Crowdsourcing techniques can be employed to solve a wide range of problems, chiefly Human Intelligence Tasks (HITs), which are easy for humans but difficult or even impossible for computers. However, the quality of crowdsourced information has always been an issue. Several methods have been proposed to increase the chance of receiving high-quality contributions from the crowd. In this paper, we propose a novel approach to improving the quality of contributions in crowdsourcing tasks. We employ game theory to motivate people to provide information of higher quality. We also take into account players’ quality factors, such as reputation score, expertise, and the level of agreement between players of the game, to ensure that the problem owner receives an outcome of an acceptable quality level. Simulation results demonstrate the efficacy of our proposed approach in improving the quality of contributions as well as the chance of successful completion of the games, in comparison with similar state-of-the-art methods.
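The abstract does not spell out the underlying payoff model, but the role of players’ quality factors in a game-theoretic incentive scheme can be illustrated with a small toy game. The Python sketch below is a hypothetical illustration only, not the paper’s formulation: the reward value, effort cost, the multiplicative quality_factor combination, and the proportional-share payoff are all assumptions made for this example.

```python
# Hypothetical toy model of a two-player quality-choice game.
# NOT the paper's model: reward, cost, and the way reputation,
# expertise, and agreement combine are illustrative assumptions.
from itertools import product

QUALITY_LEVELS = [0.25, 0.5, 0.75, 1.0]  # discrete quality choices
REWARD = 10.0       # reward offered by the task owner (assumed)
EFFORT_COST = 4.0   # marginal cost of producing quality (assumed)

def quality_factor(reputation, expertise, agreement):
    """Combine a player's quality signals into a single weight (assumed form)."""
    return reputation * expertise * agreement

def payoff(own_q, other_q, own_f, other_f):
    """Reward share proportional to factor-weighted quality, minus effort cost."""
    own_w, other_w = own_f * own_q, other_f * other_q
    share = own_w / (own_w + other_w + 1e-9)
    return REWARD * share - EFFORT_COST * own_q

def pure_nash(f_a, f_b):
    """Brute-force all strategy pairs for pure-strategy Nash equilibria."""
    equilibria = []
    for qa, qb in product(QUALITY_LEVELS, repeat=2):
        a_best = all(payoff(qa, qb, f_a, f_b) >= payoff(q, qb, f_a, f_b)
                     for q in QUALITY_LEVELS)
        b_best = all(payoff(qb, qa, f_b, f_a) >= payoff(q, qa, f_b, f_a)
                     for q in QUALITY_LEVELS)
        if a_best and b_best:
            equilibria.append((qa, qb))
    return equilibria

# Example: a high-reputation expert versus a mid-tier contributor.
f_a = quality_factor(reputation=0.9, expertise=0.8, agreement=0.95)
f_b = quality_factor(reputation=0.6, expertise=0.5, agreement=0.70)
print(pure_nash(f_a, f_b))
```

Enumerating a small discrete strategy space is enough to find pure-strategy equilibria in this toy setting; the paper’s actual model and simulations may use continuous quality levels and a richer payoff structure.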


Notes

  1. https://archive.org/details/stackexchange.

  2. http://gigaom.com/2010/11/19/18-tasks-you-can-crowdsource/.

  3. http://www.tomnod.com/.

  4. http://genesinspace.org/.

  5. http://smorballgame.org/.


Acknowledgements

The authors would like to thank Professor Boualem Benatallah of The University of New South Wales, Australia, for his invaluable support and guidance.

Author information


Corresponding author

Correspondence to Mohammad Allahbakhsh.


Copyright information

© 2018 Springer International Publishing AG

About this paper


Cite this paper

Allahbakhsh, M., Amintoosi, H., Kanhere, S.S. (2018). A Game-Theoretic Approach to Quality Improvement in Crowdsourcing Tasks. In: Beheshti, A., Hashmi, M., Dong, H., Zhang, W. (eds) Service Research and Innovation (ASSRI 2015, ASSRI 2017). Lecture Notes in Business Information Processing, vol 234. Springer, Cham. https://doi.org/10.1007/978-3-319-76587-7_8


  • DOI: https://doi.org/10.1007/978-3-319-76587-7_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-76586-0

  • Online ISBN: 978-3-319-76587-7

  • eBook Packages: Computer Science (R0)
