Abstract
With the rise of social media and mobile computing, crowdsourcing is increasingly relied on as a popular source of information. Crowdsourcing techniques can be employed to solve a wide range of problems, notably Human Intelligence Tasks (HITs), which are easy for humans but difficult or even impossible for computers. However, the quality of crowdsourced information has always been an issue. Several methods have been proposed to increase the chance of receiving high-quality contributions from the crowd. In this paper, we propose a novel approach to improving the quality of contributions in crowdsourcing tasks. We employ game theory to motivate people to provide information of higher quality. We also take into account players’ quality factors, such as reputation score, expertise, and the level of agreement between players of the game, to ensure that the problem owner receives an outcome of an acceptable quality level. Simulation results demonstrate the efficacy of our proposed approach in improving the quality of contributions as well as the chance of successful completion of the games, in comparison with similar state-of-the-art methods.
Acknowledgements
The authors would like to acknowledge Professor Boualem Benatallah at The University of New South Wales, Australia, for his invaluable support and guidance.
Copyright information
© 2018 Springer International Publishing AG
Cite this paper
Allahbakhsh, M., Amintoosi, H., Kanhere, S.S. (2018). A Game-Theoretic Approach to Quality Improvement in Crowdsourcing Tasks. In: Beheshti, A., Hashmi, M., Dong, H., Zhang, W. (eds) Service Research and Innovation. ASSRI 2015, ASSRI 2017. Lecture Notes in Business Information Processing, vol 234. Springer, Cham. https://doi.org/10.1007/978-3-319-76587-7_8
Print ISBN: 978-3-319-76586-0
Online ISBN: 978-3-319-76587-7