Abstract
Recommender mechanisms can support the assignment of jobs on crowdsourcing platforms, improving the quality of the outcome for both workers and requesters. A preceding study showed that workers expect to be recommended tasks similar to those they have previously completed. To build task recommendation systems that meet these expectations, the similarities between tasks must be identified and analyzed. However, how workers characterize task similarity was left open in the previous study. This work therefore provides an empirical study of how workers perceive similarities between tasks. Different similarity aspects (e.g., the complexity, the required action, or the requester of a task) are evaluated for their usefulness, and the results are discussed. Worker characteristics such as age, experience, and country of origin are taken into account to determine how different worker groups judge the similarity aspects of tasks.
Acknowledgements
This work is supported by the Deutsche Forschungsgemeinschaft (DFG) under grants STE 866/9-2 and RE 2593/3-2 within the project “Design und Bewertung neuer Mechanismen für Crowdsourcing” (Design and Evaluation of New Mechanisms for Crowdsourcing).
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Schnitzer, S., Neitzel, S., Schmidt, S., Rensing, C. (2019). Results of a Survey About the Perceived Task Similarities in Micro Task Crowdsourcing Systems. In: Atzmueller, M., Chin, A., Lemmerich, F., Trattner, C. (eds) Behavioral Analytics in Social and Ubiquitous Environments. MUSE 2015, MSM 2015, MSM 2016. Lecture Notes in Computer Science, vol 11406. Springer, Cham. https://doi.org/10.1007/978-3-030-34407-8_6
DOI: https://doi.org/10.1007/978-3-030-34407-8_6
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-33906-7
Online ISBN: 978-3-030-34407-8
eBook Packages: Computer Science, Computer Science (R0)