Abstract
The demand for open learning analytics (OLA) has grown in recent years due to increasing interest in self-organized, networked, and lifelong learning environments. However, platforms that can deliver effective and efficient OLA are still lacking. Most currently available OLA platforms do not continuously involve end-users in the indicator definition process, and they follow design patterns that make it difficult to extend the platform to meet new user requirements. These limitations restrict the scope of such platforms in settings where users regulate their own learning according to their needs. In this paper, we discuss the Open Learning Analytics Platform (OpenLAP) as a step toward an ecosystem that addresses the indicator personalization and platform extensibility challenges of OLA. OpenLAP follows a user-centered learning analytics approach that involves end-users in defining custom indicators that meet their needs. Moreover, it provides a modular and extensible architecture that allows new analytics methods and visualization techniques to be integrated easily.
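The extensibility goal described above — integrating new analytics methods without modifying the platform core — can be illustrated with a plugin-style extension point. The sketch below is hypothetical: class and method names (`AnalyticsMethod`, `MethodRegistry`, `analyze`) are illustrative assumptions, not the actual OpenLAP API.

```python
from abc import ABC, abstractmethod

class AnalyticsMethod(ABC):
    """Common contract every pluggable analytics method must satisfy."""

    @abstractmethod
    def analyze(self, events: list[dict]) -> dict:
        """Transform raw learning events into an analysis result."""

class CountPerItem(AnalyticsMethod):
    """Example method: count learning events per item id."""

    def analyze(self, events: list[dict]) -> dict:
        counts: dict[str, int] = {}
        for e in events:
            counts[e["item"]] = counts.get(e["item"], 0) + 1
        return counts

class MethodRegistry:
    """New methods are integrated by registration, not by changing core code."""

    def __init__(self) -> None:
        self._methods: dict[str, AnalyticsMethod] = {}

    def register(self, name: str, method: AnalyticsMethod) -> None:
        self._methods[name] = method

    def run(self, name: str, events: list[dict]) -> dict:
        return self._methods[name].analyze(events)

registry = MethodRegistry()
registry.register("count_per_item", CountPerItem())
result = registry.run(
    "count_per_item",
    [{"item": "quiz1"}, {"item": "quiz1"}, {"item": "video2"}],
)
print(result)  # {'quiz1': 2, 'video2': 1}
```

In this pattern a visualization technique could be a second, parallel extension point with its own interface; the platform core only depends on the abstract contracts.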
Abbreviations
- Apereo LAI: Apereo Learning Analytics Initiative
- ATAM: Architecture Tradeoff Analysis Method
- GQI: Goal-Question-Indicator
- LA: Learning Analytics
- LAP: Learning Analytics Processor
- LCDM: Learning Context Data Model
- MOOC: Massive Open Online Course
- OLA: Open Learning Analytics
- OLAA: Open Learning Analytics Architecture
- OpenLAP: Open Learning Analytics Platform
- RIDT: Rule-based Indicator Definition Tool
- SoLAR: Society for Learning Analytics Research
- SUS: System Usability Scale
- TEL: Technology Enhanced Learning
- UI: User Interface
Author information
Contributions
The evaluation was designed and conducted by AM. The literature review was performed and its results documented by AM together with MAC. Editorial review and formatting of the paper were done by AM and MAC. US is the head of the department where the evaluation was performed. All authors read and approved the final manuscript.
Ethics declarations
Conflicts of interest
The authors declare that they have no conflicts of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Muslim, A., Chatti, M.A. & Schroeder, U. Supporting Indicator Personalization and Platform Extensibility in Open Learning Analytics. Tech Know Learn 27, 429–448 (2022). https://doi.org/10.1007/s10758-021-09543-0