Abstract
Online education has broadened the avenues of research on students’ behavior and performance. In this paper, we shed light on how to design visualizations that better support instructors in analyzing student interactions on Virtual Learning Environments (VLEs). We conducted a study to identify which data instructors take into account and what their visualization preferences are. As a result, even though instructors were presented with different types of visualizations, the ones they selected the most and evaluated best are in line with the ones they already used.
Keywords
- Learning analytics
- Educational data mining
- Visualization
- Virtual Learning Environments
- VLE
- Logs
- Interviews
- eLearning
- Student
- Teacher
- Behavior
- Interaction
1 Introduction
Distance Learning (DL) is no longer a novelty. From the records of correspondence courses in Brazil in the 19th century [1], to the rise of computers and evolution of the Internet, instructors and students have experienced novel ways of teaching and learning. In particular, improvements to multimedia technologies have contributed to the creation of online courses in Virtual Learning Environments (VLEs). One of the reasons for the popularity of online courses is that students can decide their own study pace and participate in courses regardless of geographic distance limitations [2]. Moreover, VLEs are not exclusive to DL. There is evidence of their usage together with face-to-face learning (aka blended learning) [3].
Students’ interactions with VLEs are often stored in logs. The analysis of these logs can predict the students’ performance, evaluate their learning achievement in a course, and even identify behavior patterns [4, 5]. Instructors usually get information from observing what students say and do on a VLE. Some challenges regarding this analytical process are the following: the instructors often do not have time to analyze these logs in depth; they are not statistics experts; or they have not received training to extract key information from VLE logs, as those logs are not always very readable. For instance, Damasceno et al. (2018) conducted interviews with instructors and found evidence that several instructors do not have any analytic tool or information besides the student access logs [3]. According to the instructors interviewed, they are very often already overwhelmed by the time and effort needed to prepare their classes, to prepare and assess coursework, and to evaluate students.
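To make the kind of log analysis described above concrete, the sketch below tallies accesses per student from access-log lines. The comma-separated `timestamp,student_id,resource` format is an assumption for illustration only; real VLEs such as Moodle store richer, differently structured records.

```python
from collections import Counter

def count_accesses(log_lines):
    """Tally VLE accesses per student from hypothetical
    'timestamp,student_id,resource' log lines."""
    counts = Counter()
    for line in log_lines:
        _, student_id, _ = line.strip().split(",")
        counts[student_id] += 1
    return counts

# Fictional log entries, mirroring the kind of raw data instructors receive.
logs = [
    "2019-01-07T10:02,alice,forum",
    "2019-01-07T10:05,bob,video",
    "2019-01-08T09:40,alice,assignment",
]
print(count_accesses(logs))  # Counter({'alice': 2, 'bob': 1})
```

Even this minimal aggregation already goes beyond the raw access lists that, per the interviews, are often all instructors have at hand.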
Furthermore, Damasceno et al. uncovered requirements and proposed guidelines for what tools should help analyze in VLEs [3]. For instance, these requirements are based on papers showing that (i) student engagement in the online environment can be measured by the materials accessed in the environment [6,7,8]; and (ii) students can be clustered into different groups based on their access or interaction patterns, such as which resources they interact with first, how long they stay, or at what times they usually access the VLE [9,10,11]. In general, the instructors interviewed reported that they intend to identify patterns related to (a) student access (e.g., login, materials, forum) and (b) student performance. They also claimed that a dashboard providing such student information could support them in spending their time more efficiently, adopting new teaching methods, and preparing better course materials.
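As a minimal illustration of grouping students by access patterns, the sketch below assigns each student to a coarse engagement band based on total access count. The thresholds and band names are hypothetical; the clustering approaches in the cited works [9,10,11] use richer features such as which resources are accessed first and at what times.

```python
def engagement_band(access_count, low=5, high=20):
    """Assign a student to a coarse engagement band by total
    VLE accesses; thresholds are illustrative, not from the paper."""
    if access_count >= high:
        return "frequent"
    if access_count >= low:
        return "occasional"
    return "rare"

# Fictional per-student access totals.
accesses = {"alice": 42, "bob": 7, "carol": 2}
bands = {s: engagement_band(n) for s, n in accesses.items()}
print(bands)  # {'alice': 'frequent', 'bob': 'occasional', 'carol': 'rare'}
```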
The purpose of this paper is to identify instructors’ preferences regarding the visualization of either student access or performance in courses using VLEs. To achieve our goal, we elaborated a survey about which visualizations efficiently answered a set of questions that are relevant to instructors, according to Damasceno et al. [3]. The visualizations in the survey were developed to support the analysis of logs of students’ interactions with VLEs and to provide insights to instructors. Then, we asked instructors from Brazilian education institutions to respond to the survey and analyzed the collected data. The main outcomes of the study are the types of visualizations chosen to show certain VLE data, the instructors’ visualization preferences, and their evaluations of each type of visualization.
This paper is structured as follows. Section 2 shows the design process of VLE data visualization. Section 3 describes the exploratory study performed with instructors about their visualization preferences. In Sect. 4 we discuss the results of the study. Lastly, Sect. 5 presents some final considerations.
2 Proposed Visualizations
Damasceno et al. described 38 requirements and 18 guidelines for tool support for analyzing VLE logs [3]. The guidelines were compiled from the literature and the requirements resulted from the instructors’ interviews. However, 15 requirements were not related to student access or performance, and were therefore removed from the study reported in this paper.
After selecting the relevant requirements and guidelines, we arranged them in 11 visualization groups (VGs) with common VLE data logs and elaborated questions to be answered for each VG (Note 1). For instance, VG-05 is related to correlations between students’ grades and activity logs (e.g., access, assignments completed, forum posts). Table 1 presents the visualization groups, their corresponding questions, and their visualization tasks. The latter are based on the taxonomy of visualization tasks proposed by Amar et al. [12].
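The correlation underlying VG-05 can be sketched with a plain Pearson coefficient between students’ grades and their activity counts. The implementation below uses only the standard library, and the data are fictional, in the same spirit as the figures in the Appendix.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation between two equal-length sequences,
    e.g. students' grades vs. their access counts (VG-05)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

grades   = [6.0, 8.5, 4.0, 9.0]  # fictional final grades
accesses = [12, 30, 5, 35]       # fictional VLE access counts
r = pearson(grades, accesses)    # strong positive correlation here
```

A value of `r` near 1 would suggest that, in this fictional cohort, more active students also earn higher grades; a real analysis would of course need more data and care about confounders.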
Next, based on the information visualization literature (e.g., [13,14,15]) and online data visualization catalogues (Note 2), we designed visualizations to handle the different kinds of data (e.g., numerical, categorical, time series), their cardinality (e.g., 1, 2, ... N), and the task supported by the chart (e.g., compare, correlate) in each VG. To develop the visualizations, we used Dash (Note 3), a Python framework for building data visualization applications. In the Appendix, Figs. 3, 4, 5, 6, 7, 8, 9, 10, 11, 12 and 13 show a sample of all the types of visualizations designed for each VG. These visualizations present different ways to view the same data. Although the data depicted in the figures are fictional, they are in line with actual study results presented by Damasceno et al. [3] (e.g., the number of clusters presented in Fig. 7, the student access patterns in Fig. 10). In addition, each figure is assigned to one VG question (i.e., the questions shown in blue in Table 1). Even though different VG questions may use the same type of visualization, they have distinct parameters or axis values. For instance, Fig. 10 presents visualizations related to VG-08’s second question (i.e., student access per week), whereas VG-08’s first question uses the same visualization type but presents access data per day instead of per week.
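The mapping from data kind and task to chart type can be sketched as a simple lookup. The function and rule table below are hypothetical illustrations of this design step; the actual chart choices in the paper were made manually against the literature and the catalogues, not generated by code.

```python
def suggest_chart(data_kind, task):
    """Hypothetical (data kind, task) -> chart type lookup, loosely
    mirroring how candidate visualizations were matched to VGs.
    Falls back to a plain Table when no rule applies."""
    rules = {
        ("numerical", "correlate"): "Scatterplot",
        ("categorical", "compare"): "Multi-set Bar Chart",
        ("time series", "compare"): "Line Graph",
        ("numerical", "distribution"): "Box & Whisker Plot",
    }
    return rules.get((data_kind, task), "Table")

# e.g. VG-05 correlates two numerical variables (grades vs. activity):
chart = suggest_chart("numerical", "correlate")  # 'Scatterplot'
```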
3 Study Procedure
Although instructors are domain experts, they do not necessarily know how to use information visualizations to answer their questions [16]. It is necessary to identify which questions they take into account and how those questions can be answered through graphical means. To achieve this goal, we developed an online survey using all the types of charts designed for each VG. The survey was designed to assess just what is relevant to our study. It included fields for the instructors to identify themselves (e.g., name, age, institution) and to answer questions about their experience with VLEs, meaningful student information, VLE data logs, data visualization, and their visualization preferences. We also presented all the questions from Table 1 and asked which ones they found relevant. Next, the survey displayed all the VG charts related to the questions they selected. As illustrated in Fig. 1, for each chart, a Likert scale ranging from 1 (strongly disagree) to 7 (strongly agree) was presented for the instructors to evaluate how efficiently the chart answers the corresponding question. To break possible ties, the instructors were then asked to select the chart that, in their opinion, best answers the question. The charts presented in the survey were related to the questions shown in blue in Table 1.
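The aggregation implied by this procedure, namely ranking charts by mean Likert score and breaking ties with the forced-choice selections, can be sketched as follows. The response data are illustrative, not the actual survey results.

```python
from statistics import mean

def rank_charts(likert, selections):
    """Rank charts by mean Likert score (1-7); ties are broken by
    how often each chart was picked as the best answer.
    Inputs are illustrative stand-ins for the survey responses."""
    return sorted(
        likert,
        key=lambda c: (mean(likert[c]), selections.get(c, 0)),
        reverse=True,
    )

likert = {"Table": [7, 6, 6], "Heatmap": [3, 4, 2], "Bar Chart": [6, 7, 6]}
selections = {"Table": 5, "Bar Chart": 9}
# Table and Bar Chart tie on mean score; the selection count decides.
print(rank_charts(likert, selections))  # ['Bar Chart', 'Table', 'Heatmap']
```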
4 Results
The survey was online between December 2018 and January 2019. We sent emails inviting the participation of 88 instructors from 15 universities in different Brazilian regions. The survey was answered by 21 instructors (15 men and 6 women) from institutions located in eight Brazilian states (Amazonas, Ceará, Maranhão, Paraíba, Paraná, Piauí, Rondônia, and São Paulo).
Table 2 shows an overview of the instructors’ profiles (Note 4). They all have experience with Virtual Learning Environments: 10 had taught courses that were entirely at a distance, 1 a blended learning course, and 10 both categories. In total, the instructors mentioned having worked with three different VLEs (i.e., Blackboard, Moodle, SIGAA), Moodle being the most often cited one (by all the instructors). Regarding educational roles, most had experience as teachers or tutors (except I-06), and four of them reported experience as coordinators (I-04, I-06, I-09, I-17). Most of the instructors have a computing degree, and two other instructors have programming experience (I-03, I-18).
Table 3 shows an overview of the students’ ages and educational backgrounds as provided by each instructor. Most of them had taught students ranging from 18 to over 50 years old. Some instructors reported that their students had various educational backgrounds, without specifying them. However, like the instructors, most of the students had a computing background.
All of the instructors claimed to use some VLE communication resource (i.e., chat, forum); most of them reported the use of videos (except I-06, I-09, I-10, I-11, I-18) and 7 mentioned they used e-books (I-02, I-04, I-11, I-12, I-13, I-16, I-20). They also mentioned other resources, such as text (I-05, I-15, I-19), audio recordings (I-15), questionnaires (I-14, I-17), polls (I-19), badges (I-18), and wikis (I-19). Only one mentioned overseeing student access (I-07). Nevertheless, when we asked how they monitor the students and what student data they analyze, most of the instructors reported student access (except I-01, I-06, I-11, I-14, I-16, I-17, I-18, I-20). In addition, they also oversaw students’ assignment completion (except I-02, I-06, I-07, I-14, I-16, I-18) and forum usage (except I-04, I-06, I-13, I-14, I-15, I-16, I-17, I-18, I-21).
We asked how the VLE log data are presented to the instructors. Most of them said through tables (I-02, I-04, I-07, I-08, I-11, I-13, I-19, I-21) and graphical means (I-02, I-03, I-04, I-07, I-08, I-10, I-11, I-13, I-15, I-20), such as Bar Chart (I-11, I-15, I-20), Histogram (I-10), Line Graph (I-07) and Pie Chart (I-02). Some instructors mentioned that the VLE shows reports (I-09, I-14, I-21) without specifying their presentation. In particular, I-06 said that the VLE does not show these data and I-17 claimed to use a resource called course progress (also without specifying the data presentation).
Regarding how often they read and interpret charts, 6 instructors reported doing so more than once per week (I-01, I-05, I-06, I-11, I-15, I-16), 7 once per week (I-03, I-04, I-08, I-09, I-10, I-14, I-20), 3 once per month (I-13, I-17, I-21), and 5 seldom (I-02, I-07, I-12, I-18, I-19). As for how often they create charts, 4 reported once per week (I-03, I-08, I-10, I-11), 8 once per month (I-01, I-04, I-05, I-06, I-09, I-13, I-16, I-20), 7 seldom (I-02, I-14, I-15, I-17, I-18, I-19, I-21), and 2 never (I-07, I-12).
Table 4 presents, grouped by VG, the questions that instructors take into account or would like to take into account. The questions related to VG-01 and VG-03 were the ones most often chosen. Except for VG-06, at least one question of each VG was selected by most instructors. In other words, most instructors do not (or would not) take into account the correlation between students’ age and their VLE interactions, even though they had said they taught students from 18 to over 50 years old. We also asked what other questions they take into account and how they prefer that such questions be answered. For the first question, only three were mentioned: (i) why the students did not access the VLE; (ii) why the students did not meet the assignment deadline; and (iii) what the relation is between students’ grades and their interaction in face-to-face classes. Regarding the second question, almost all the instructors (except I-12, I-16, I-18, I-19) said through tables (I-01, I-02, I-08, I-09, I-11, I-13, I-17, I-21) and graphical means (I-01, I-03, I-04, I-05, I-06, I-07, I-08, I-09, I-10, I-11, I-13, I-14, I-15, I-17, I-20, I-21), such as Bar Charts (I-01, I-07, I-11, I-13, I-14), Line Graphs (I-07), and Pie Charts (I-15, I-21). I-16 and I-19 did not answer, I-12 said what he wants to see without specifying how, and I-18 mentioned that he wants to receive the information by e-mail. Moreover, Table 4 shows the type of visualization most often selected by the instructors. Most chose Table (VG-02, VG-05, VG-06, VG-08, VG-11) or Multi-set Bar Chart (VG-01, VG-03, VG-04, VG-10) as the way to present the data. Only for VG-07 (Scatterplot) and VG-09 (Stacked Area Graph) did most of them select another form of data presentation.
Figure 2 shows the instructors’ evaluation of each VG chart. Only in 5 VGs (VG-02, VG-05, VG-08, VG-09, VG-10) were the best-evaluated charts the same as the most selected ones. However, the other selected charts are among the 3 best evaluated. In line with the instructors’ visualization preferences, data presented in Tables, Bar Charts, Multi-set Bar Charts, Stacked Bar Charts, and Lollipop charts received good evaluations, whereas Bubble Charts, Flow Charts, Heatmaps, and Violin Plots received poor evaluations. Among the charts used to show data distribution and variation, the Box & Whisker Plot received the best evaluations. Another interesting point is that all the Heatmaps with good evaluations presented the corresponding data values in each cell.
5 Final Considerations
This paper presented some ways of visualizing VLE data and reported an analysis of the visualizations as evaluated by instructors through an online survey. In addition, the instructors provided an overview of their experience with VLEs and students, of how they analyze student interactions, and of their visualization preferences. The instructors evaluated each visualization with respect to the requirements uncovered by Damasceno et al. [3] and reported which chart(s) better answer each question. Our aim was to understand which visualizations better support the instructors. Understanding the learning process through visualizations can help instructors design better courses and improve learning effectiveness.
The instructors reported that they mostly take into account data related to forum usage, assignment completion, and student access, whereas the correlation between students’ age and their VLE interactions was less important to them. Moreover, data presented in Tables, Bar Charts, Multi-set Bar Charts, Stacked Bar Charts, and Lollipop charts received good evaluations, whereas Bubble Charts, Flow Charts, Heatmaps, and Violin Plots received poor evaluations. We note that, even though the instructors were presented with different types of visualizations, the ones they selected the most and evaluated best are in line with both their previously mentioned preferences and the charts they already used.
As future work, we plan to develop a dashboard using those visualizations. To evaluate the dashboard, we intend to assess whether there are changes in students’ performance when instructors are able to see information about their behavior and performance, and act accordingly. Furthermore, the results presented in this paper can direct and ground future research, not only on dashboard design for VLEs, but also on recommender systems and intelligent user interfaces that better support instructors who make use of VLEs.
Notes
- 1.
In this paper, visualization groups are identified in the format VG-99.
- 2.
For instance: https://datavizcatalogue.com and https://www.data-to-viz.com.
- 3.
- 4.
In this paper, instructors are identified in the format I-99.
References
Saraiva, T.: Educação a distância no Brasil: lições da história. Em Aberto 16(70), 17–27 (1996). http://emaberto.inep.gov.br/index.php/emaberto/article/view/2076
Seaton, D.T., Bergner, Y., Chuang, I., Mitros, P., Pritchard, D.E.: Who does what in a massive open online course? Commun. ACM 57(4), 58–65 (2014). https://doi.org/10.1145/2500876
de Brandão Damasceno, A.L., dos Santos Ribeiro, D., Colcher, S., Barbosa, S.D.J.: Entrevistas e mapeamento sistemático sobre análise de logs de interação de alunos em ambientes virtuais de aprendizagem. In: Monografias em Ciência da Computação, MCC 08/2018, Departamento de Informática, PUC-Rio (2018). ftp://ftp.inf.puc-rio.br/pub/docs/techreports/18_08_damasceno.pdf
Romero, C., Ventura, S.: Educational data mining: a review of the state of the art. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 40(6), 601–618 (2010). https://doi.org/10.1109/TSMCC.2010.2053532
Dutt, A., Ismail, M.A., Herawan, T.: A systematic review on educational data mining. IEEE Access 5, 15991–16005 (2017). https://doi.org/10.1109/ACCESS.2017.2654247
Cruz-Benito, J., Therón, R., García-Peñalvo, F.J., Pizarro Lucas, E.: Discovering usage behaviors and engagement in an Educational Virtual World. Comput. Hum. Behav. 47, 18–25 (2015). https://doi.org/10.1016/j.chb.2014.11.028
Samson, P.: Can student engagement be measured? And, if so, does it matter? In: IEEE Frontiers in Education Conference (FIE), pp. 1–4. IEEE (2015). https://doi.org/10.1109/FIE.2015.7344077
Bote-Lorenzo, M., Gómez-Sánchez, E.: Predicting the decrease of engagement indicators in a MOOC. In: Proceedings of the Seventh International Learning Analytics and Knowledge Conference on - LAK 2017, pp. 143–147. ACM, New York (2017). https://doi.org/10.1145/3027385.3027387
Guo, P., Reinecke, K.: Demographic differences in how students navigate through MOOCs. In: Proceedings of the First ACM Conference on Learning @ Scale Conference - L@S 2014, pp. 21–30. ACM Press, New York (2014). https://doi.org/10.1145/2556325.2566247
Park, J., Denaro, K., Rodriguez, F., Smyth, P., Warschauer, M.: Detecting changes in student behavior from clickstream data. In: Proceedings of the Seventh International Learning Analytics and Knowledge Conference on - LAK 2017, pp. 21–30. ACM, New York (2017). https://doi.org/10.1145/3027385.3027430
Shi, Y., Peng, Z., Wang, H.: Modeling student learning styles in MOOCs. In: Proceedings of the 2017 ACM on Conference on Information and Knowledge Management - CIKM 2017, pp. 979–988. ACM, New York (2017). https://doi.org/10.1145/3132847.3132965
Amar, R., Eagan, J., Stasko, J.: Low-level components of analytic activity in information visualization. In: IEEE Symposium on Information Visualization (2005). https://doi.org/10.1109/INFOVIS.2005.24
Abela, A.: Advanced Presentations By Design: Creating Communication That Drives Action. Wiley, Hoboken (2008)
Kirk, A.: Data Visualization: A Successful Design Process. Packt Publishing Ltd., Birmingham (2012)
Munzner, T.: Visualization Analysis and Design. CRC Press, Boca Raton (2014)
Cox, K., Grinter, R.E., Hibino, S.L., Jagadeesan, L.J., Mantilla, D.: A multi-modal natural language interface to an information visualization environment. Int. J. Speech Technol. 4(3), 297–314 (2001). https://doi.org/10.1023/A:1011368926479
Acknowledgments
We thank CAPES, Coordenação de Aperfeiçoamento de Pessoal de Nível Superior, and CNPq (processes #309828/2015-5 and #311316/2018-2) for the partial financial support to this work.
Appendix: Visualizations used in each VG
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
de Brandão Damasceno, A.L., dos Santos Ribeiro, D., Barbosa, S.D.J. (2019). Visualizing Student Interactions to Support Instructors in Virtual Learning Environments. In: Antona, M., Stephanidis, C. (eds) Universal Access in Human-Computer Interaction. Theory, Methods and Tools. HCII 2019. Lecture Notes in Computer Science(), vol 11572. Springer, Cham. https://doi.org/10.1007/978-3-030-23560-4_33
Print ISBN: 978-3-030-23559-8
Online ISBN: 978-3-030-23560-4