
1 Introduction

In education, digital interactive systems have been assisting the transition from physical classrooms to virtual ones. This shift has been accompanied by a plethora of concepts and technologies such as e-learning, digital education, online learning, virtual classrooms, Virtual Learning Environments (VLE) and e-learning systems. Digital content and instruction delivery [1, 2], interactive task delivery [3] and e-assessment [4] have all been made possible through these innovative digital systems. Since any given design yields results bounded by its limitations, successfully engaging users in digital interactive systems becomes a design endeavor in its own right, in terms of both interaction design and learning design. On the web, the move from Web 1.0 to Web 2.0 was a critical step forward in user interaction beyond mere information presentation [5], and user interaction continues to evolve across devices and platforms. In the same way, engagement and user experience in interactive systems are regarded as a much-needed step beyond usability [6]. Engagement in systems is defined as a quality that draws users in and holds their attention [7, 8], and as a quality of the user experience that emphasizes the positive aspects of interaction [9].

Engagement is also a challenging topic in education, where there is a constant struggle to get students engaged in learning. This has been a topic of interest from traditional physical classrooms to the latest virtual ones. Most importantly, research has linked student engagement with academic success [10]. The transition from teacher-centered to student-centered learning has been a major step in giving students the freedom and empowerment to control their own learning and engagement. However, research has indicated [11] that students are mostly motivated by grades rather than by the learning itself. Student disengagement is now leading to high dropout rates [12] in many forms of education, particularly online. Since e-learning is a digital experience and a digital interaction, engagement in e-learning systems cannot be attributed solely to the human element of the relationship. The context or environment where the engagement occurs is as important as the user, and the two cannot be separated [13]. Attention should therefore also be given to the digital interactive systems where online engagement occurs, in order to understand the existing problems and how they can be addressed. Measuring engagement in e-learning systems is a challenge, however, just as it is in human-computer interaction based systems in general. From a user-centered design perspective, engagement measurement should be based on both quantitative and qualitative methods. However, existing engagement frameworks are mostly focused on quantitative approaches such as data analytics (Content Analytics, Social Analytics). With digital systems that constantly produce data, an unprecedented explosion of data has occurred, and it has enabled and facilitated such quantitative research approaches.

Amidst this digital data revolution, we should also direct our attention towards what constitutes engagement in digital interactive systems as well as engagement in education. In digital interactive systems, engagement has been defined through concepts such as immersion, behavior, presence or flow [14]. For student engagement, prior research has established a three-dimensional view consisting of behavioral, cognitive and emotional engagement. These are not separate dimensions, but rather facets of the same construct that interact with one another and create additive effects [15].

Our research approach is based on the measurement of Microlevel Student Engagement (MSE) as a timely and actionable informant in digital learning systems [16]. In this paper we present a way to integrate the Human-Computer Interaction (HCI) perspective with the MSE approach in order to better capture and measure user engagement in digital education. This paper is organized as follows: the state of the art of engagement in digital education is presented in Sect. 2. Section 3 presents our approach to capturing and measuring student engagement. The pilot study we carried out and the obtained results are presented in Sect. 4. Finally, in Sect. 5, the results, limitations and future work are discussed.

2 State of the Art

The design of digital interactive systems is based on a user-centered design process that includes methods and techniques to involve users and synthesize information. It is also based on a number of guidelines such as functionality, usability, learnability, efficiency, reliability, maintainability, and utility or usefulness [17]. In terms of digital user experiences, engagement can be defined as “a desirable, even essential, human response to computer-mediated activities” [6]. Different research works build on this definition and develop specific indicators and measures such as immersion, behavior, presence or flow. The main goal is always to understand people interacting with digital interactive services and to be able to design based on this understanding. Online and distance learning can sometimes be perceived as a discouraging and disengaging experience because of the isolation [18] of the student compared to a traditional face-to-face classroom setting. In education, engagement is not only about fostering interactions among participants but also about building a framework of connected elements such as behavior, emotion, cognition, motivation, curiosity and experience, among others. Therefore, in addition to the interactive features of a VLE, educators should act as designers and provide the right combination of interactive tools, organization and pedagogy, following a student-centered design approach. A variety of interactive e-learning systems and tools have been produced to this end, such as VLEs, intelligent tutoring systems, and submission systems, creating an explosion of learner data as never before. This data explosion has produced solutions as well as chaos in some environments, since no distinction has been made as to which data lead to better information.

In online education, Learning Analytics (LA) leads the charge of data-driven decision making [19]. Since the inception of LA, e-learning systems have been designed to collect and process student information in order to understand and predict what students will do next or whether they will be successful. The sources of data have also grown in the meantime, with access to more data from learning platforms, learning tools, websites, forums, social media and so forth, forever adding to what should ideally be the definitive set of data illustrating the objective nature of learning. It is often the case that data capture in digital learning environments is limited, either for ethical [20] or technical reasons [21]. Cutting-edge techniques such as facial detection, eye movements and heat maps [22, 23, 24] contribute to other datasets such as navigation patterns, numbers of clicks and login durations in order to paint a better picture of engagement. However, student engagement is still a research area where data is typically collected through long surveys at the end of a course. Such data is macrolevel, has very little to do with students’ actual learning tasks, and is better suited to understanding general trends.

Student engagement has been discussed in the literature since the early 1990s, when the first frameworks separating its dimensions appeared [25]. Since then, researchers have developed numerous instruments to measure student engagement and understand student learning. Some of these instruments focus on precursors to engagement such as motivation [26, 27], whereas others focus on a single dimension of engagement such as cognition [28] or emotion [29]. Some instruments measure all the dimensions of engagement; however, they are administered after the course has ended, therefore capture data at a macrolevel, and their findings are generic and not actionable in the immediate scope. The conventional and current approach has been through extensive surveys administered at the end of an academic year, such as the National Survey of Student Engagement (NSSE) [30], the Student Engagement Questionnaire (SEQ) [31] and many others [32]. While online student engagement has also been measured with surveys such as the Online Student Engagement Scale [33], there is a lack of tools embedded in VLEs to measure student engagement in online environments. In contrast, LA is based on the development of digital systems, tools and services that analyze and visualize the data from the underlying learning environment. Systems and tools such as success predictors, recommender systems, and visualization/dashboard systems [34] have been developed as bespoke systems. Well-known examples include the Purdue University Course Signals system [35] and OUAnalyse [36]. LA is, however, largely biased towards behavioral indicators of student engagement; even the latest developments, such as the Moodle Engagement Analytics module, rely on behavioral indicators such as student login activity, assessment submission activity, and forum viewing and posting activity [37]. The development of student engagement measurement tools in the form of digital systems is therefore a novel area where proper designs can be innovated through experimentation. Student-facing LA systems need more emphasis on understanding how students use them, and such reporting systems require assessment, information selection, visual design, and usability testing [34].

Engagement can therefore be seen as a key component in interactive systems design and user experience design, and by extension in e-learning. In user experience theory, a framework on interactions between individuals and products [38] outlines three ways of interacting: fluent (actions that are mastered by frequent use), cognitive (actions that require cognition from previous experiences and effort) and expressive (forming a relationship with the product in an expressive way). Each of these can be aligned with one of the three dimensions of engagement in learning: behavioral, cognitive and emotional. The dimensions of engagement can thus become the common ground between user interaction (in digital systems such as e-learning systems), user experience and student learning.

3 Research Design

Our research approach is focused on a microlevel measurement of student engagement [16] in e-learning systems in order to address the problem of the timeliness and actionable nature of the data. However, measuring a multi-dimensional construct such as student engagement, which includes subjective facets such as cognitive and emotional values, requires a workaround, since the actual cognitive and emotional cues are produced inside the students themselves. Our approach took the form of measuring the “engagement of the engagement”, meaning that the student reproduces their cognitive or emotional engagement in a measurable form. To this end, we opted to embed micro-surveys in the VLE in order to capture relevant engagement data in context. This mainly involved emphasizing the cognitive and emotional dimensions of the engagement data, as the behavioral data were captured at the system level using logs. Unlike traditional engagement surveys or LA-based analytics systems, our data capture required a user-facing module for the self-reported engagement data, so user interaction design requirements had to be taken into account. One of the challenges we faced when developing this microlevel data capture module was its design, in terms of both the user interface and the user interaction.

This research followed the action research methodology [39], which emphasizes the investigation and improvement of real-world situations through iterative plan-act-reflect cycles, where each iteration is modified based on the experience of the previous one. The participatory and cyclical nature of the action research methodology strongly resembles the user-centered design process [40], and therefore matched our goals of collaborative design and implementation. This is an ongoing research project for which we planned a process with several iterations. In this paper we present the design and results of the first two iterations.

A VLE contains different types of learning resources and services such as text, images, videos, exercises, examples and learning tools. We first defined engagement-related questions for each type of learning resource and combined them to form micro-survey templates. These questions were aimed at measuring a student’s engagement with the resources. For example, an image resource was associated with a cognition-based question on whether it was useful for understanding a concept; similar questions asked whether a lesson was easy to follow or an assessment task easy to complete. In the case of emotional engagement, questions asked whether the student found a table of lesson contents interesting, or whether the student was happy with their performance in an assessment task. Such self-reported data enable us to capture subjective engagement data, which constitute an important part of the learning process [41].
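For illustration, a micro-survey template can be represented as a small data structure that maps a resource type to a set of questions, each tagged with the engagement dimension it measures. The following JavaScript sketch is a hypothetical representation under the design constraints described here, not the actual template format of our module:

// Hypothetical micro-survey template for an image resource: each
// question is tagged with the engagement dimension it measures.
const imageResourceTemplate = {
  resourceType: "image",
  questions: [
    { dimension: "cognitive",
      text: "Was this image useful for understanding the concept?",
      scale: { min: 1, max: 5 } },   // 1 = not at all, 5 = very useful
    { dimension: "emotional",
      text: "Did you find this resource interesting?",
      scale: { min: 1, max: 5 } }
  ]
};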

We carried out interviews with faculty, instructional designers and experts in order to better understand how to design the micro-surveys and to avoid the intrusiveness that a self-reporting mechanism can introduce. We designed the micro-surveys to be as short, as non-disruptive and as relevant as possible. Each micro-survey contained a maximum of three questions, which enabled us to represent the three dimensions of engagement when necessary; however, the number of questions was kept to a minimum so as to capture only the most relevant dimension(s). Our design therefore pivoted towards the common approach of complementing the system-level behavioral engagement data with subjective self-reported data on cognition and emotion. The micro-surveys were embedded by placing a code segment at each learning resource, and each survey visually appears as a popup box right next to the resource.

The data collection and user interaction designs were based on a user-centered iterative process that involved faculty and experts. We learnt that involving students to obtain qualitative feedback was not feasible due to their workloads at the time, but interface design suggestions and data requirements were decided collaboratively with the faculty members. The design process also followed the previously mentioned design principles: functionality, usability, learnability, efficiency, reliability, maintainability, and utility or usefulness. For functionality, the module was tested to do what it was required to do: capture the self-reported data from the students. In terms of usability, qualities such as visual clarity, consistency and informative feedback were taken into account through clean interfaces, consistent design across all the module frames and forms, and confirmation messages after data submission. In terms of efficiency and maintainability, the HTML/JavaScript based module was lightweight to load and fast in operation.
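As a minimal sketch of the data submission and confirmation feedback described above, the module could post a student’s answers and then confirm the save; the endpoint URL, payload fields and element identifier below are illustrative assumptions, not our actual implementation:

// Minimal sketch of a response submission with confirmation feedback;
// the endpoint URL and payload fields are illustrative assumptions.
async function submitResponse(resourceId, answers) {
  const response = await fetch("/mse/api/responses", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ resourceId, answers, timestamp: Date.now() })
  });
  // Informative feedback: confirm to the student that the data was saved.
  document.querySelector("#mse-feedback").textContent = response.ok
    ? "Thank you, your feedback has been saved."
    : "Sorry, something went wrong. Please try again.";
}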

For example, after the first pilot test the interface design was improved by removing a slide-in window, opened by a button click, that contained the data submission form (Fig. 1). In the later iteration the form was placed directly alongside the learning resource, as shown in Fig. 2, in order to reduce the actions required of the student and to improve visual clarity, learnability and efficiency. The module code was also modified to take only the essential manual parameters (type of resource, required survey template) from the embed location, and the code was further simplified and automated.
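To illustrate the simplified embedding, a second-iteration embed could reduce to a short code segment taking only the two essential parameters; the script path and function name below are hypothetical:

<!-- Hypothetical second-iteration embed placed next to a learning
     resource; only the two essential parameters are passed in. -->
<div id="mse-survey"></div>
<script src="/mse/module.js"></script>
<script>
  renderMseSurvey({
    resourceType: "lesson",        // type of the surrounding resource
    template: "lesson-cognitive"   // required survey template
  });
</script>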

Fig. 1. MSE data capture module design in the pilot test

Fig. 2. MSE data capture module design in the second iteration

4 Results

Our study was carried out at the Universitat Oberta de Catalunya in Barcelona, Spain, a fully online university, where we integrated the MSE data capture module into the virtual learning spaces of different courses. We carried out two pilot tests: the first in a single course, aimed at testing our approach and our data capture module’s functionality and design, and the second with a larger number of courses, aimed at improving participation rates and the user experience design. In the first pilot we used the web-based learning resources platform of the Fundamentals of Programming course. The course was taken by around 267 first-year students, and we obtained data from 50 students with no enforcement and no incentive. Through the analysis of the behavioral engagement data (student access to the learning resources) we learnt that only around 150 of the enrolled students were active. With this lack of participation in mind, our second iteration involved the design modifications stated above as well as course-design modifications, such as offering an incentive, a small contribution to the final mark of a task, to students who submitted engagement data as part of that task’s assessment. The second pilot involved six courses, which allowed us to capture data at multiple levels. For example, one course used a web platform to host the learning resources, so the data capture module was embedded in the learning resources and, in addition, each assessment task was accompanied by a micro-survey. Two courses offered incentives to students who submitted data through the micro-surveys, whereas the four other courses only sent the assessment-task-related surveys and offered no incentive.
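Based on the counts reported above, the first pilot’s participation can be put into perspective with a simple calculation (the figures are approximate, as the enrollment and activity counts are themselves approximate):

50 / 267 ≈ 19% of all enrolled students submitted engagement data
50 / 150 ≈ 33% of the students who were actually active did so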

The initial data from both the first and the second pilots have yielded positive signs for MSE data capture. Table 1 shows the response rates from the two iterations of the study, and also whether an incentive was awarded to the students for their engagement data submissions.

Table 1. Response rates of students in the two iterations of the study

The comparison of the first and second pilots in the Fundamentals of Programming course shows an improvement, and the other courses in the second pilot also show considerably high engagement data submission rates. However, these courses cannot be compared across the board on these results alone, since their course designs, assessment task designs and workloads differ. Nevertheless, taking these differences in characteristics into account, the MSE data capture performed well in each course. For example, the Web Standards course involved a markedly higher number of engagement data submissions from the online learning resources platform, whereas the other courses only required assessment-task-related engagement data submissions. Further courses are required to test how well the incentives work for courses with lower and higher workloads.

In addition to the system design results, we also obtained promising results on student engagement, as shown below. The results presented here showcase our student engagement related findings; for example, the analysis of the data from the Databases course reveals valuable insights, as illustrated in Fig. 3. Students reported their microlevel engagement with continuous assessment task 1 on a scale of 1–5, from least happy to very happy. The graph relates the students’ expressed level of happiness about their performance in the task, reported before they had any knowledge of their grades, to the grades they received later on. It shows that the students who expressed happiness about their performance in the task would later receive the highest grades. Figure 4 shows the same emotional engagement (again with happiness as the indicator) for assessment task 2. This type of emotional engagement data can therefore be a valuable indicator to be added to the current set of LA data.

Fig. 3. Level of happiness of students regarding their performance (emotional engagement) corresponded to the grades they later received in assessment task 1

Fig. 4. Level of happiness of students regarding their performance (emotional engagement) corresponded to the grades they later received in assessment task 2

In a similar fashion, we also captured data from students on how easy the assessment task was to complete, on a scale of 1–5, 1 being very difficult and 5 being very easy. Figures 5 and 6 illustrate the findings on the relationship between this cognitive data and the eventual grades for assessment tasks 1 and 2 respectively. As with the emotional engagement findings, at a general level the cognitive engagement corresponded to the grades received: the majority of students who reported the assignment to be easy also scored high grades.

Fig. 5. Level of ease of the assessment task (cognitive engagement) corresponded to the grades they later received in assessment task 1

Fig. 6. Level of ease of the assessment task (cognitive engagement) corresponded to the grades they later received in assessment task 2

Although more learning resources, such as the VLE’s forums and notice boards, still need to be integrated into the data collection process, we have gained early insight into student engagement with learning resources and assessment tasks, and an initial validation of our approach. Our analysis of the data, namely Engagement Analytics (or more specifically MSE Analytics), can be used as an extension to LA.

5 Discussion, Limitations and Future Work

Understanding user engagement in digital education or e-learning is two-fold: one aspect is user engagement in digital interactive systems, and the other is student engagement in education in its three dimensions of behavioral, cognitive and emotional engagement. E-learning blends these two broad concepts together, yet few experiments have attempted to bring their theoretical backgrounds together in a practical environment. In an online classroom, student engagement is a critical component in knowing whether the students perform well, or whether they perform at all. With low online student engagement levels in mind, using an interactive tool to get students to report their engagement seems a tricky approach. However, our results from the six courses in the second pilot have shown that incentives and an iterative approach to designing interfaces and interactions can play a vital role in capturing a timely set of data. In the two courses where we offered an incentive for engaging with the engagement module, the response rates reached around 70%. In the Human-Computer Interaction, Interaction Design and Human-Centered Design courses, where no incentive was offered, the engagement survey was instead presented as if it were part of the assessment task, and this design decision resulted in a higher level of engagement with the data capture module. A VLE is ideally a well-engineered system that serves students and instructors in carrying out their teaching and learning seamlessly and that adds positively to the student experience. However, current datasets, particularly those used in LA, often focus on behavioral data, and there is a gap where online engagement data is concerned. Our approach, capturing self-reported engagement data that conform to student engagement theory through micro-surveys that interact with students throughout the VLE in accordance with user experience and system design theory, is a step towards filling that gap.

While this portion of the data is self-reported and therefore subjective in nature, it can be merged seamlessly with the objective, system-level data in multiple combinations. As illustrated in Figs. 3, 4, 5 and 6, we have been able to relate cognitive and emotional engagement, in the form of the reported ease of the task and happiness with one’s own performance, to the eventual grades, which show a positive relationship at a glance. While this is in accordance with previous research [42], we are currently pursuing a quantitative validation of this initial set of results, as well as a similar line of inquiry into other factors such as student demographics, enrolled degree program, previous subject experience where relevant, and whether the student is repeating the course.
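As an illustration of the kind of quantitative validation we are pursuing, a rank correlation between the self-reported engagement values and the eventual grades could be computed. The following JavaScript implementation of Spearman’s rho is a generic sketch, not part of our module:

// Average ranks (tied values share the mean of their positions).
function ranks(values) {
  const sorted = values.map((v, i) => [v, i]).sort((a, b) => a[0] - b[0]);
  const r = new Array(values.length);
  for (let i = 0; i < sorted.length; ) {
    let j = i;
    while (j + 1 < sorted.length && sorted[j + 1][0] === sorted[i][0]) j++;
    const avg = (i + j) / 2 + 1;
    for (let k = i; k <= j; k++) r[sorted[k][1]] = avg;
    i = j + 1;
  }
  return r;
}

// Spearman's rho: Pearson correlation computed on the ranks.
function spearman(x, y) {
  const rx = ranks(x), ry = ranks(y);
  const mean = a => a.reduce((s, v) => s + v, 0) / a.length;
  const mx = mean(rx), my = mean(ry);
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < x.length; i++) {
    num += (rx[i] - mx) * (ry[i] - my);
    dx += (rx[i] - mx) ** 2;
    dy += (ry[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}

// Example with made-up values: happiness (1-5) versus grade (0-10).
console.log(spearman([5, 4, 2, 3, 5], [9, 7, 4, 6, 10]));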

One of the limitations of our module from an interaction design point of view was the level of feedback given. While the module captures the data and a feedback message announces that the data has been successfully saved, contextually richer feedback, such as the classroom-level engagement score for a particular learning resource, is not given. Such social and collaborative aspects of interaction and user experience are critical in designing systems [38], and in this case they would be particularly useful as a self-assessment of engagement relative to the classroom. This is also relevant to the co-experience aspect of user experience in using an interactive system. Another immediate step in our design process is to involve the students directly in order to better understand some of the design principles we need to take into account, such as learnability, ease of use, intrusiveness and usefulness. With these user tests we also hope to address further aspects such as reliability.
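Such contextual feedback could, for instance, be derived by aggregating the stored responses per learning resource. The sketch below is a hypothetical illustration, assuming the responses are available as plain objects:

// Hypothetical classroom-level engagement score for one resource,
// computed as the mean of the stored self-reported values (1-5).
function classroomScore(responses, resourceId) {
  const values = responses
    .filter(r => r.resourceId === resourceId)
    .map(r => r.value);
  if (values.length === 0) return null;   // no data for this resource
  return values.reduce((s, v) => s + v, 0) / values.length;
}

// Example: show a student how the class rated a resource.
const demo = [
  { resourceId: "lesson-3", value: 4 },
  { resourceId: "lesson-3", value: 5 },
  { resourceId: "lesson-7", value: 2 }
];
console.log(classroomScore(demo, "lesson-3"));   // 4.5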

Beyond involving students, gamification is another strong candidate for enhancing our approach, as it can constitute a powerful boost to engagement [43]. By translating the engagement frequency and average engagement values into a rating system, it would be possible to implement a reputation or badge system that further motivates the students. This could also involve a rating system for learning resources based on the data of student engagement with them. Alternative directions include technologies such as chatbots, which have been seen to engage students more in online learning [44]; an integration between our approach and a chatbot interface could provide a seamless experience for reporting engagement.
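One possible translation of engagement frequency and average engagement values into a rating is sketched below; the equal weighting and badge thresholds are arbitrary assumptions for illustration only:

// Hypothetical rating: combine how often a student answers the
// micro-surveys (frequency) with their average reported value (1-5).
function engagementRating(submitted, offered, averageValue) {
  const frequency = offered > 0 ? submitted / offered : 0;  // 0-1
  const normalized = (averageValue - 1) / 4;                // map 1-5 to 0-1
  const score = 0.5 * frequency + 0.5 * normalized;         // equal weights
  if (score >= 0.8) return "gold";
  if (score >= 0.5) return "silver";
  return "bronze";
}

console.log(engagementRating(9, 10, 4.2));   // "gold" (score 0.85)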

In conclusion, engagement is a challenging topic both in user experiences of digital interactive systems and in education. E-learning technologies have brought the concepts of HCI together with those of student engagement, bridging the behavioral, cognitive and emotional dimensions of engagement with the fluent, cognitive and expressive modes of user interaction. Using these theoretical foundations, we have developed an approach to MSE measurement in online education from a user-centered design perspective. As our results reflect, the importance of MSE lies in its capability to visualize ongoing student engagement in context, and thereby to be more timely, valid and actionable. Faculty can use this information to understand the lessons where students have trouble, the learning resources which are not helpful, or the general emotional or cognitive state of the classroom. Combined with the behavioral data, MSE has the potential to provide a richer characterization of student engagement than behavioral indicators alone, and to help further our understanding of user engagement in e-learning systems and digital education.