ABSTRACT
In this paper, we present a software tool that helps emergency planners at Hampshire County Council in the UK create maps for high-fidelity crowd simulations, which require evacuation routes from buildings to roads. The main feature of the system is a crowdsourcing mechanism that breaks the problem of creating evacuation routes into microtasks that a contributor to the platform can execute in less than a minute. As part of this mechanism, we developed a consensus-based trust mechanism that filters out incorrect contributions and ensures that the individual tasks are complete and correct. To drive people to contribute to the platform, we experimented with different incentive mechanisms applied over different time scales, with the aim of evaluating which incentives work with different types of crowds, including anonymous contributors from Amazon Mechanical Turk. The results of the 'in the wild' deployment show that the system is effective at engaging contributors to perform tasks correctly and that users respond to incentives in different ways. More specifically, we show that purely social motives are not enough to attract a large number of contributors and that contributors are averse to uncertainty in winning rewards. Taken altogether, our results suggest that a combination of incentives may be the best approach to harnessing the maximum number of resources to get socially valuable tasks (such as planning applications) performed on a large scale.
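The abstract does not spell out how the consensus-based trust mechanism decides which contributions to keep. A minimal sketch, assuming each microtask is duplicated across several contributors and accepted by simple majority vote (the function name and thresholds below are illustrative, not from the paper):

```python
from collections import Counter

def consensus_filter(responses, min_votes=3, agreement=0.6):
    """Accept a microtask answer only when enough contributors agree.

    responses: list of hashable answers submitted for one microtask.
    min_votes: minimum number of contributions before deciding.
    agreement: fraction of the votes the top answer must reach.
    Returns the accepted answer, or None if there is no consensus yet.
    """
    if len(responses) < min_votes:
        return None  # task stays open for more contributors
    answer, votes = Counter(responses).most_common(1)[0]
    if votes / len(responses) >= agreement:
        return answer  # consensus reached; dissenting answers are filtered out
    return None  # conflicting answers; route the task for further verification

# Example: three of four contributors drew the same evacuation route.
print(consensus_filter(["route-A", "route-A", "route-B", "route-A"]))  # route-A
print(consensus_filter(["route-A", "route-B"]))  # None (too few votes yet)
```

A verification step like this is what lets the platform tolerate occasional incorrect or malicious contributions without manual review of every submission.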
Index Terms
- CollabMap: crowdsourcing maps for emergency planning