Rsourcer: Scaling Feedback on Research Drafts

  • Conference paper
  • Intelligent Information Systems (CAiSE 2023)

Part of the book series: Lecture Notes in Business Information Processing (LNBIP, volume 477)

Abstract

Effective feedback is crucial for early-stage researchers (ESRs) to develop their research skills. While feedback from supervisors and colleagues is important, additional feedback from external helpers can also be valuable. However, obtaining diverse, high-quality feedback from outside one's research group can be challenging. In this work, we designed and prototyped Rsourcer, a crowdsourcing-based pipeline that simplifies the process of requesting, offering, evaluating, and adopting feedback. We evaluated Rsourcer through a concept validation study and a pilot study, both of which showed its potential. This work contributes insights into crowdsourcing support with social technologies and extends research on scaling support for research skills development.
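
The abstract frames Rsourcer as a pipeline with four stages: requesting, offering, evaluating, and adopting feedback. As a rough illustration only, the Python sketch below models that lifecycle as a small state machine. All class names, fields, and the rating scale are assumptions made for illustration; this page does not describe Rsourcer's actual implementation.

    # Hypothetical sketch of the feedback lifecycle named in the abstract
    # (request -> offer -> evaluate -> adopt). All names here are
    # illustrative assumptions, not Rsourcer's actual design.
    from dataclasses import dataclass
    from enum import Enum, auto
    from typing import Optional


    class Stage(Enum):
        REQUESTED = auto()  # an ESR has posted a draft and asked for feedback
        OFFERED = auto()    # an external helper has submitted a comment
        EVALUATED = auto()  # the comment has been rated for helpfulness
        ADOPTED = auto()    # the ESR has incorporated the comment


    @dataclass
    class FeedbackItem:
        draft_id: str
        helper: str
        comment: str
        stage: Stage = Stage.OFFERED
        rating: Optional[int] = None  # assumed 1-5 helpfulness score

        def evaluate(self, rating: int) -> None:
            """Record an evaluation and advance the lifecycle."""
            if not 1 <= rating <= 5:
                raise ValueError("rating must be between 1 and 5")
            self.rating = rating
            self.stage = Stage.EVALUATED

        def adopt(self) -> None:
            """Adopt the feedback; only evaluated feedback qualifies."""
            if self.stage is not Stage.EVALUATED:
                raise ValueError("evaluate feedback before adopting it")
            self.stage = Stage.ADOPTED


    if __name__ == "__main__":
        item = FeedbackItem("draft-42", "helper@example.org",
                            "Clarify the sampling strategy in Section 3.")
        item.evaluate(rating=4)
        item.adopt()
        print(item.stage)  # Stage.ADOPTED

Running the sketch walks one feedback item from offer through evaluation to adoption; a real pipeline would also cover the requesting stage, helper recruitment, and persistence, none of which are specified on this page.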

Author information

Correspondence to Yuchao Jiang.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Jiang, Y., Benatallah, B., Báez, M. (2023). Rsourcer: Scaling Feedback on Research Drafts. In: Cabanillas, C., Pérez, F. (eds) Intelligent Information Systems. CAiSE 2023. Lecture Notes in Business Information Processing, vol 477. Springer, Cham. https://doi.org/10.1007/978-3-031-34674-3_8

  • DOI: https://doi.org/10.1007/978-3-031-34674-3_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-34673-6

  • Online ISBN: 978-3-031-34674-3

  • eBook Packages: Computer Science, Computer Science (R0)
