ABSTRACT
The results of our exploratory study provide new insights into crowdsourcing knowledge-intensive tasks. We designed and performed an annotation task on a print collection of the Rijksmuseum Amsterdam, involving experts and crowd workers in the domain-specific description of depicted flowers. We created a testbed to collect annotations from flower experts and crowd workers and analyzed them with regard to inter-user agreement. The findings are promising, demonstrating how, for certain categories, nichesourcing can yield useful annotations by connecting crowdsourcing to domain expertise.
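The agreement analysis mentioned above can be illustrated with a standard inter-annotator agreement metric. The abstract does not state which measure was used, so the following is a minimal sketch assuming Cohen's kappa, with hypothetical flower labels standing in for the study's annotations:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators labelling the same items."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of items given identical labels.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement from each annotator's label frequencies.
    freq_a = Counter(labels_a)
    freq_b = Counter(labels_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical labels: an expert vs. a crowd worker on six prints.
expert = ["tulip", "rose", "tulip", "iris", "rose", "tulip"]
worker = ["tulip", "rose", "iris", "iris", "rose", "rose"]
print(round(cohens_kappa(expert, worker), 3))  # → 0.52
```

Kappa corrects raw agreement for the agreement expected by chance, which matters here because crowd workers may over-use frequent labels; a value of 1 means perfect agreement and 0 means chance-level agreement.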