Abstract
The data-intensive research paradigm calls for using educational and learning data to generate actionable insights and improve the quality of instruction and learning. Although previous research has designed and employed teaching analytics or learning analytics tools, few studies have incorporated multiple data sources to comprehensively assess the overall teaching and learning process. To address this gap, we propose a teaching and learning analytics (TLA) tool that integrates multiple data sources from the instructor and students during the educational process and leverages multiple analytic methods to visualize results and provide supportive feedforward, with the goal of providing data-driven evidence for educational improvement. A mixed-methods approach, combining quantitative and qualitative analyses, was used to examine the tool’s effects on actual instruction and learning processes. Our results showed that the designed TLA tool with feedforward suggestions had positive effects on instruction, learning, and instructor-student interactions. Based on these results, we propose implications for TLA tool design and pedagogical strategies.
Data availability
The datasets are available from the corresponding author on reasonable request.
Acknowledgements
We appreciate the students who participated in this research.
Funding
The authors acknowledge the financial support from the National Natural Science Foundation of China (62177041).
Appendix
1.1 Interview questions
1. Did you check the tool and the suggestions provided by the research team? Did you encounter any difficulties? How did you resolve those difficulties?

2. During Experiment I, the research team designed the TLA tool to show visualizations, including the teaching analytics part (namely the frequency chart, time-series diagram, and word cloud diagram), the learning analytics part (namely the time-series diagram and word cloud diagram), and the instructor-student interaction part (namely the social network diagram).

   1) Did you check the visualized information provided in the tool? If so, what aspects did you like about the tool? If not, why not?

   2) Did you think the tool was effective in improving teaching or learning? If so, how did it positively influence your teaching and learning? Please share some details about how and why you think the tool was useful. If not, why not?

3. During Experiment II, the research team integrated the TLA tool and supportive suggestions to give you feedforward.

   1) Did you check the supportive suggestions provided with the tool? If so, in what aspects did these suggestions affect your teaching or learning processes? If not, why not?

   2) Did you think the supportive suggestions were effective in improving teaching or learning, compared with the prior phase? If so, how did they positively influence your teaching and learning? Please share some details about how and why you think the suggestions were useful. If not, why not?

4. Did you encounter any difficulties when you used the TLA tool and the supportive suggestions? What were the difficulties? Do you have any suggestions for future revision of this TLA tool? Are you willing to use similar tools in future teaching or learning?

5. Did you have any other comments, suggestions, or concerns?
Cite this article
Zhang, L., Wu, M. & Ouyang, F. The design and implementation of a teaching and learning analytics tool in a face-to-face, small-sized course in China’s higher education. Educ Inf Technol 29, 2697–2720 (2024). https://doi.org/10.1007/s10639-023-11940-0