
Crowdstore: A Crowdsourcing Graph Database

  • Conference paper
  • In: Collaborative Computing: Networking, Applications, and Worksharing (CollaborateCom 2015)

Abstract

Existing crowdsourcing database systems fail to support complex, collaborative, or responsive crowd work. These systems implement human computation as independent tasks that are published online and subsequently chosen by individual workers. Such a pull model does not support worker collaboration, and its expertise matching relies on workers' subjective self-assessment. An extension to graph query languages, combined with enhanced database system components, can express and facilitate social collaboration, sophisticated expert discovery, and low-latency crowd work. In this paper we present such an extension, CRowdPQ, backed by the database management system Crowdstore.
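
Since CRowdPQ's concrete syntax is not reproduced on this page, the following is only an illustrative Python sketch of the kind of expert discovery a graph-based approach like this could express: selecting workers who declare a required skill and are socially close to a seed worker, so that the resulting crowd can actually collaborate. The social graph, the skill labels, and the find_collaborating_experts helper are hypothetical stand-ins, not part of Crowdstore's API.

```python
from collections import deque

# Hypothetical worker social graph: worker -> set of workers they know.
SOCIAL_GRAPH = {
    "alice": {"bob", "carol"},
    "bob": {"alice", "dave"},
    "carol": {"alice"},
    "dave": {"bob"},
}

# Hypothetical self-declared skills per worker (graph properties in a real system).
SKILLS = {
    "alice": {"translation", "proofreading"},
    "bob": {"translation"},
    "carol": {"image-labeling"},
    "dave": {"translation"},
}

def find_collaborating_experts(seed, skill, max_hops):
    """Return workers within `max_hops` of `seed` who list `skill`.

    Approximates the 'experts socially close to each other' selection that a
    crowd-aware graph query language could state declaratively.
    """
    seen = {seed}
    frontier = deque([(seed, 0)])
    experts = []
    while frontier:
        worker, hops = frontier.popleft()
        if skill in SKILLS.get(worker, set()):
            experts.append(worker)
        if hops < max_hops:
            for neighbor in SOCIAL_GRAPH.get(worker, set()):
                if neighbor not in seen:
                    seen.add(neighbor)
                    frontier.append((neighbor, hops + 1))
    return experts

# Example: translators within two hops of alice -> ['alice', 'bob', 'dave']
print(find_collaborating_experts("alice", "translation", max_hops=2))
```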



Author information

Corresponding author

Correspondence to Vitaliy Liptchinsky.


Copyright information

© 2016 Institute for Computer Sciences, Social Informatics and Telecommunications Engineering

About this paper

Cite this paper

Liptchinsky, V., Satzger, B., Schulte, S., Dustdar, S. (2016). Crowdstore: A Crowdsourcing Graph Database. In: Guo, S., Liao, X., Liu, F., Zhu, Y. (eds) Collaborative Computing: Networking, Applications, and Worksharing. CollaborateCom 2015. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 163. Springer, Cham. https://doi.org/10.1007/978-3-319-28910-6_7


  • DOI: https://doi.org/10.1007/978-3-319-28910-6_7


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-28909-0

  • Online ISBN: 978-3-319-28910-6

  • eBook Packages: Computer Science, Computer Science (R0)
