Computers & Education

Volume 58, Issue 1, January 2012, Pages 470-489

A qualitative evaluation of evolution of a learning analytics tool

https://doi.org/10.1016/j.compedu.2011.08.030

Abstract

LOCO-Analyst is a learning analytics tool we developed to provide educators with feedback on students’ learning activities and performance. Evaluation of the first version of the tool led to the enhancement of the tool’s data visualization, user interface, and supported feedback types. The second evaluation of the improved tool allowed us to see how the improvements affected the users’ perceived value of the tool. Here, we present the qualitative results of our two evaluations and discuss important lessons learned from the comparison of the two studies. The results show that educators find the kinds of feedback implemented in the tool informative and that they value the mix of textual and graphical representations of different kinds of feedback provided by the tool.

Highlights

► A learning analytics tool for educator-directed feedback is presented.
► Results of two qualitative evaluation studies of the tool are presented.
► Educators found the feedback implemented in the tool informative.
► Educators valued the mix of textual and graphical representations of the feedback.
► Important lessons learned from the comparison of the two studies are discussed.

Introduction

Today’s web-based learning systems are built on the promise of making the ‘anywhere, anytime’ learning vision possible by transcending the time and space boundaries inherent in traditional classroom-based teaching and learning. A typical form of web-based learning is through Learning Content Management Systems (LCMSs), such as WebCT1 or Moodle.2 These LCMSs require teachers to constantly adapt their courses (both structure and content) to assure the comprehensibility, high performance, and learning efficiency of their students (Gašević, Jovanović, & Devedžić, 2007). Educators’ awareness of how students engage in the learning process, how they perform on the assigned learning and assessment tasks, and where they experience difficulties is imperative for this adaptation. For this reason, educators need comprehensive and informative feedback about the use of their online courses. Comprehensive feedback is based on semantically interlinked data about all the major elements of a learning process, including learning activities (e.g., reading and discussing), learning content, learning outcomes, and students (Jovanovic et al., 2007). Informative feedback provides an educator with a quick and easy-to-understand insight into a particular aspect of the learning process. Contemporary LCMSs, however, provide rather basic analytics, such as simple statistics on technology usage or low-level data on a student’s interaction with learning content (e.g., page views).

Recognizing the importance of analyzing learner activities in learning environments, a new research area, learning analytics, has emerged. Learning analytics is defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs”.3 The community around the newly established International Conference on Learning Analytics and Knowledge aims to improve the present state of learning analytics in order to allow educators to make better-informed decisions about their instructional strategies. This goal is to be accomplished through a holistic approach that combines principles of different computing areas (data and text mining, visual analytics, and data visualization) with those of the social sciences, pedagogy, and psychology. Traditionally, log data analysis and visualization applied to the analysis of students’ behavior and activities have been among the main research topics in the communities around venues such as AIED (Artificial Intelligence in Education)4 and EDM (Educational Data Mining).5 The research outcomes of these communities are significant, especially in the domain of generating student-centered feedback by leveraging user tracking data from learning systems such as ITSs (Intelligent Tutoring Systems) (Dominguez et al., 2010; Kari et al., 2010; Roll et al., 2010). To the best of our knowledge, much less research has been dedicated to educator-centered feedback provisioning and analytics. This is surprising, given the strong need for, and frequent calls for, such tools among learning technology practitioners.6 Moreover, there have been very limited attempts to evaluate such systems with educators, especially by using qualitative evaluation methods, as we present in this paper.

In our research, we have been specifically interested in investigating the use of semantic technologies for learning analytics. Semantic technologies can enable meaningful linking of students’ learning activities and their interactions with the learning content, as well as with the other participants in their learning processes. To develop such a learning analytics tool, we made use of the ontology framework named LOCO (Learning Object Context Ontology), which we developed in our previous work to allow for the modeling of learning contexts (Jovanović, Gašević, Knight, & Richards, 2007). The key notion of LOCO is learning context, defined as an interplay of a learning activity, learning content, and participants (e.g., learners and/or educators); LOCO thus enables us to represent and meaningfully interlink learning context data from different learning environments, or from different services (e.g., chat rooms and discussion forums) within the same learning environment. On top of the LOCO framework, and by leveraging semantic annotation (Popov et al., 2003) of diverse kinds of learning resources, we developed the learning analytics tool LOCO-Analyst.7
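To make the notion of learning context more concrete, the following minimal sketch (ours, for illustration only) shows how one learning context could be captured as RDF triples and queried across services using Python’s rdflib. The namespace and property names here are hypothetical placeholders, not the actual LOCO vocabulary defined in (Jovanović, Gašević, Knight, & Richards, 2007).

    # Illustrative sketch only: the namespace and property names below are
    # invented placeholders, not the actual LOCO ontology vocabulary.
    from rdflib import Graph, Namespace, Literal, RDF

    LOCO = Namespace("http://example.org/loco#")  # hypothetical namespace
    g = Graph()

    ctx = LOCO.context42
    g.add((ctx, RDF.type, LOCO.LearningContext))
    g.add((ctx, LOCO.activity, LOCO.readingSession7))   # learning activity
    g.add((ctx, LOCO.contentUnit, LOCO.lesson3page2))   # learning content
    g.add((ctx, LOCO.participant, LOCO.studentJane))    # participant
    g.add((ctx, LOCO.durationSeconds, Literal(340)))    # tracking datum

    # Because contexts from different services (forum, chat, quiz) would share
    # one vocabulary, they can be queried together, e.g., all activities of a
    # given student across services:
    q = """SELECT ?ctx ?act WHERE {
             ?ctx a loco:LearningContext ;
                  loco:participant loco:studentJane ;
                  loco:activity ?act . }"""
    for row in g.query(q, initNs={"loco": LOCO}):
        print(row.ctx, row.act)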

Our research in learning analytics followed the standard design-based educational research method (Reeves, Herrington, & Oliver, 2005), in which we adopted an iterative approach. At the end of our first iteration, our research prototype (LOCO-Analyst) was capable of providing educators with a basic set of feedback interlinking learning contexts about students’ interactions with learning content, their discussions in forums and chat rooms, and their performance on quizzes (Section 2). At the end of this iteration (November 2006), we conducted an empirical study with a group of educators aiming to identify the perceived value of the individual elements of learning analytics implemented in the tool and to evaluate the perceived ease of learning the tool. As reported in (Jovanović et al., 2008), the participants’ responses were highly positive. However, the participants also indicated a need to enhance the way the feedback was presented; they explicitly suggested greater use of visual representations of feedback. We found this demand for other forms of feedback presentation consistent with studies in cognitive and educational psychology (Cassidy & Eachus, 2000; Dunn, 1983; Harrison et al., 2003; Mayer & Massa, 2003). Responding to the outcomes of our first study, we improved LOCO-Analyst by introducing advanced visualizations (Section 4). In 2009, we conducted an evaluation of this new version of LOCO-Analyst to find out how educators valued the enhancements.

Given the abovementioned research activities, the objective of this paper is to report on the results of the evaluation of LOCO-Analyst, with the specific goal of systematically analyzing the qualitative data collected in both studies through open-ended questionnaires, with a focus on the effect of feedback visualization. While the results of the first (2006) evaluation are to some extent reported in (Jovanović et al., 2008), that analysis focused on the quantitative (Likert-scale) data, and the qualitative data were not systematically coded. To better understand the emerging trends and to compare the results of the two studies (2006 and 2009), we needed to study the evaluation results more systematically. To analyze the qualitative data collected in both evaluations systematically, we performed a content analysis; in this paper, we report on its results.
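As a small illustration of the tallying step in such a content analysis: once open-ended answers have been assigned codes by the researchers, code frequencies can be counted mechanically. The codes below are invented for illustration and are not our actual coding scheme.

    # Toy illustration of tallying coded open-ended responses; the codes are
    # invented examples, not the coding scheme used in our studies.
    from collections import Counter

    coded_responses = [
        "more-visualization", "positive-utility", "more-visualization",
        "ui-improvement", "positive-utility", "more-visualization",
    ]
    print(Counter(coded_responses).most_common())
    # [('more-visualization', 3), ('positive-utility', 2), ('ui-improvement', 1)]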

The remainder of the paper is organized as follows. In the next section, we provide an overview of the first version of our LOCO-Analyst tool, along with background that contextualizes its initial development. Section 3 focuses on the qualitative evaluation of the first version of the LOCO-Analyst tool (i.e., the 2006 study); the section covers the study design and results. In Section 4, we describe the improvements made in the second version of LOCO-Analyst based on the findings of the first study. Section 5 focuses on the qualitative evaluation of the second version of LOCO-Analyst (i.e., the 2009 study). Section 6 presents a comparison of the two evaluations; in this section, we also show how the results of the two qualitative evaluations align with the results of our analysis of quantitative data, which were also collected in our evaluations and reported in (Asadi, Jovanović, Gasevic, & Hatala, 2011). Section 7 discusses the evaluation results as well as some threats to the validity of our findings. Section 8 covers related and future work, and Section 9 concludes the paper.

Section snippets

First version of LOCO-Analyst

We provide an overview of the first version of our LOCO-Analyst tool in this section. The background subsection provides a review of the literature to contextualize the initial development of LOCO-Analyst.

Research questions

Our first evaluation, as a formative evaluation, aimed to provide an understanding of the perceived utility of the tool. While previous research on learning analytics provided some initial understanding of the main qualities required of learning analytics tools for educators, in our formative evaluation we aimed to address the following research questions:

RQ1. To what extent do educators perceive a learning analytics tool useful for improving their course content and instruction in their

Improvements in the second version of LOCO-Analyst

Improvements in the second version of LOCO-Analyst were largely made in light of suggestions received from the 2006 study participants, as summarized in Table 4. The great majority of the participants suggested improvements to the way the data are presented and communicated back to educators. The participants wanted graphical data representation techniques (i.e., data visualization) that are capable of boosting understanding and facilitating insight. Card et al. define
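The flavor of enhancement the participants asked for can be illustrated with a minimal sketch (not LOCO-Analyst’s actual code): a student’s quiz scores charted against the class average, rather than listed in a plain table. All data here are invented.

    # Illustrative sketch of graphical feedback; the data are invented and
    # this is not LOCO-Analyst's actual implementation.
    import matplotlib.pyplot as plt
    import numpy as np

    quizzes = ["Quiz 1", "Quiz 2", "Quiz 3", "Quiz 4"]
    student = [62, 70, 55, 81]      # one student's scores (%)
    class_avg = [68, 64, 71, 75]    # class averages (%)

    x = np.arange(len(quizzes))
    width = 0.35
    fig, ax = plt.subplots()
    ax.bar(x - width / 2, student, width, label="Student")
    ax.bar(x + width / 2, class_avg, width, label="Class average")
    ax.set_xticks(x)
    ax.set_xticklabels(quizzes)
    ax.set_ylabel("Score (%)")
    ax.set_title("Quiz performance relative to the class")
    ax.legend()
    plt.show()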

The second evaluation of LOCO-Analyst (2009)

We conducted an evaluation study of the improved version of LOCO-Analyst in 2009 to reassess the perceived usefulness of the enhanced features of the tool. As a summative evaluation, this study aimed to address the following research questions:

  • RQ1. To what extent do the implemented interventions affect the perceived value of a learning analytics tool?

  • RQ2. To what extent are the variables characterizing the perceived utility of a learning analytics tool associated?

While the discussion

Comparison of the two evaluations

In this section, we discuss the results of our content analysis reported in the previous sections, with the primary aim of further examining the effects of the visual interventions introduced in the second version of the LOCO-Analyst tool. As reported in Section 3, the results of our 2006 study of LOCO-Analyst showed that the participants wanted us to supplement the textual (tabular) feedback with visual representations. In the new version of LOCO-Analyst, we made major enhancements by using

Discussion

In this section, we discuss the two main types of threats to validity commonly analyzed in empirical research: internal and external validity. With respect to the internal validity of our experiment, we are interested in checking whether any confounding factors significantly influenced the analysis of the collected data (Chin, 2001). In our studies, the two main confounding factors are differences in experience with using similar tools and in motivation. In both of our studies very few participants were familiar

Related and future work

This section is dedicated to recent research that can be valuable for improving the existing types, and introducing new types, of feedback provided by LOCO-Analyst. For example, Macfadyen and Dawson (2010) conducted an analysis of usage tracking data collected by an LCMS in order to determine the best predictors of students’ academic performance based on their activities in Web-based learning systems. Total number of posted discussion messages, sent email messages, and completed
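The general approach of Macfadyen and Dawson (2010), regressing students’ final grades on counts of their LCMS activities, can be sketched roughly as follows; the numbers below are invented, whereas the original study used real usage-tracking logs.

    # Sketch of the general approach in Macfadyen and Dawson (2010): regress
    # final grades on LCMS activity counts. All data below are invented.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # columns: discussion messages posted, emails sent, assessments completed
    X = np.array([[12, 5, 8],
                  [3, 1, 4],
                  [20, 9, 10],
                  [7, 2, 6],
                  [15, 4, 9]])
    y = np.array([78, 52, 91, 64, 83])  # final grades (%)

    model = LinearRegression().fit(X, y)
    print("R^2:", model.score(X, y))
    print("Coefficients:", model.coef_)  # relative weight of each predictor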

Conclusion

In this paper, we have analyzed the results of two qualitative studies conducted in 2006 and 2009 to evaluate two versions of LOCO-Analyst, a learning analytics tool. Following the results of the 2006 study, we improved the tool by using data visualization techniques to represent the feedback the tool generates and by enhancing the tool’s graphical user interface. Our results showed that multiple ways of visualizing data increase the perceived value of different feedback types. It is also very

References (50)

  • S. Cassidy et al. (2000). Learning style, academic belief systems, self-report student proficiency and academic achievement in higher education. Educational Psychology.
  • D.N. Chin (2001). Empirical evaluation of user models and user-adapted systems. User Modeling and User-Adapted Interaction.
  • Dominguez, A. K., Yacef, K., & Curran, J. (2010). Data mining to generate individualized feedback. In Proceedings of...
  • R. Dunn (1983). Learning style and its relationship to exceptionality at both ends of the continuum. Exceptional Children.
  • D.R. Garrison et al. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education.
  • D. Gašević et al. (2007). Ontology-based annotation of learning object content. Interactive Learning Environments.
  • G. Harrison et al. (2003). Current perspectives on cognitive learning styles. Education Canada.
  • Hatala, M., Gašević, D., Siadaty, M., Jovanović, J., & Torniai, C. (2009). Can educators develop ontologies using...
  • L. Johnson et al. (2010). The 2010 horizon report.
  • J. Jovanovic et al. (2007). Using semantic web technologies for the analysis of learning content. IEEE Internet Computing.
  • J. Jovanović et al. (2008). LOCO-Analyst: Semantic web technologies in learning content usage analysis. International Journal of Continuing Engineering Education and Lifelong Learning.
  • J. Jovanović et al. (2007). Ontologies for effective use of context in e-learning settings. Educational Technology & Society.
  • Jovanovic, J., Gasevic, D., Stankovic, M., Jeremic, Z., & Siadaty, M. (2009). Online presence in adaptive learning on...
  • J. Jovanovic et al. (2009). The social semantic web in intelligent learning environments: State of the art and future challenges. Interactive Learning Environments.
  • Kari, H., Haddawy, P., & Suebnukarn, S. (2010). Leveraging a domain ontology to increase the quality of feedback in an...