Abstract
Effective feedback is crucial for early-stage researchers (ESRs) to develop their research skills. While feedback from supervisors and colleagues is important, additional feedback from external helpers can be beneficial. However, obtaining diverse, high-quality feedback from outside one's research group can be challenging. In this work, we designed and prototyped Rsourcer, a crowdsourcing-based pipeline that simplifies the process of requesting, offering, evaluating, and adopting feedback. We evaluated Rsourcer with a concept validation study and a pilot study, which showed its potential. This work contributes insights into crowdsourcing support with social technologies and extends research on scaling support for skills development.
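The four pipeline stages named above (request, offer, evaluate, adopt) can be pictured as a simple state machine over a feedback item. The sketch below is purely illustrative and is not the paper's implementation; all names (`FeedbackItem`, `Status`, `min_rating`) are assumptions introduced here.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class Status(Enum):
    REQUESTED = auto()   # ESR has requested feedback on a draft
    OFFERED = auto()     # an external helper has offered a comment
    EVALUATED = auto()   # the comment has been rated for quality
    ADOPTED = auto()     # the ESR has adopted the feedback

@dataclass
class FeedbackItem:
    draft_id: str
    helper: Optional[str] = None
    comment: Optional[str] = None
    rating: Optional[int] = None
    status: Status = Status.REQUESTED

    def offer(self, helper: str, comment: str) -> None:
        self.helper, self.comment = helper, comment
        self.status = Status.OFFERED

    def evaluate(self, rating: int) -> None:
        if self.status is not Status.OFFERED:
            raise ValueError("feedback must be offered before evaluation")
        self.rating = rating
        self.status = Status.EVALUATED

    def adopt(self, min_rating: int = 3) -> bool:
        # Adopt only feedback that passed the quality evaluation.
        if self.status is Status.EVALUATED and self.rating >= min_rating:
            self.status = Status.ADOPTED
        return self.status is Status.ADOPTED

item = FeedbackItem(draft_id="draft-42")
item.offer("external-helper", "Clarify the methodology section.")
item.evaluate(rating=4)
print(item.adopt())  # → True
```

The explicit status transitions make it easy to reject out-of-order actions (e.g. evaluating feedback that was never offered), which is one way a crowdsourcing pipeline of this kind could enforce its workflow.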
Notes
- 1.
- 2.
- 3. The video prototype is available at https://bit.ly/3ZZfC9s.
- 4. The full survey is available at http://bitly.ws/BDD9.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Jiang, Y., Benatallah, B., Báez, M. (2023). Rsourcer: Scaling Feedback on Research Drafts. In: Cabanillas, C., Pérez, F. (eds) Intelligent Information Systems. CAiSE 2023. Lecture Notes in Business Information Processing, vol 477. Springer, Cham. https://doi.org/10.1007/978-3-031-34674-3_8