Abstract
Massive Open Online Courses (MOOCs) are delivered through dedicated platforms that provide educational opportunities to large numbers of learners around the globe. The discussion forum is a key part of a MOOC platform: structured communication between learners, or between learners and instructors, takes place through the forum, and this communication has been shown to have a strong impact on learning. Teaching Assistants (TAs) play a crucial role in coordinating and supporting learners within a MOOC forum. In this paper, we investigate the impact that a forum’s design can have on the TAs’ effectiveness in supporting the learners of a MOOC. Towards this objective, a mixed-methods study was performed on two MOOCs delivered through the OpenEdX platform. The goal was to reveal design issues, initially through a participatory ethnographic study and, complementarily, through a formal usability evaluation. Moreover, interviews with the TAs confirmed the problems they faced while supporting learners. The results of this study indicate that the OpenEdX forum design has a variety of issues that course designers need to consider. Such issues cause problems for teaching assistants, hindering effective support to learners and thereby degrading the learners’ experience. We further expect that the findings of this study may contribute to an effective re-design of MOOC platform forums, to more effective and efficient TA interventions, and ultimately to improved learning.
1 Introduction
Massive Open Online Courses (MOOCs) open up learning opportunities to large numbers of people [1]. Most MOOCs are offered through dedicated online platforms, such as Coursera, OpenEdX or Udemy, and often attract thousands of learners. However, the effectiveness of learning in many MOOCs is questionable, as shown by low student retention rates [2]. It has been found that in most cases completion rates of MOOCs do not exceed 20% and usually fall between 7% and 11% [3]. There are several reasons for this low performance; an important one is recognized to be the lack of support and interactivity in MOOCs [4, 5]. Individual learners can receive support through the discussion forum, in the form of asynchronous communication and instructional interaction [6, 7]. It is therefore very important to study the way MOOC discussion forums are designed and used.
Significant users of a MOOC forum are the instructor and his/her assistants [8, 9]. Their role is to guide learners, pose interesting questions and provide insightful answers to learners’ inquiries. Gamage et al. [10] suggest that the usability of a MOOC platform and the help provided for any platform problems are key factors for effective learning.
In this paper, we present a mixed-methods study that aims to capture and explore problems encountered in the use of two MOOCs delivered through OpenEdX, one of the major MOOC platforms [11]. The goal is to unveil and investigate the main problems that prevented TAs from supporting learners effectively within the discussion forum of this particular platform, and to associate these problems with possible usability issues of the MOOC platform and with the way the courses were delivered. Moreover, this study aims to provide insights for a future alternative design of the MOOC forum platform that would improve the quality of the support it provides. It should also contribute to the future design and development of tools that could assist TAs in supporting learners more efficiently.
2 Literature Overview
Several past studies have shown that usability affects participants’ overall learning experience, while the design quality of a learning platform and the ease of using a learning management system contribute to participants’ satisfaction and performance [12]. In their survey on the quality of MOOC designs, Yousef et al. [13] stated that usability, content, collaboration and instructional design play a major role in delivering successful MOOCs, while Soloway et al. [14] had earlier argued that learners should be placed at the center of the design process. Other studies have reported that usability problems are related to the poor design of e-learning courses, resulting in unmotivated and frustrated learners [15, 16]. Tsironis et al. [17] investigated the usability of three popular MOOC platforms: OpenEdX, Coursera and Udacity. The study revealed that all three platforms had usability problems in terms of navigation, finding resources and performing regular tasks.
On the other hand, the support provided by the course staff within a MOOC forum is an important part of the learning process [18]; it is a significant factor affecting learner attrition in a MOOC [19]. In our case, the main actors providing support within the forum are the teaching assistants. Their role in online discussions is essential for maintaining the interest and motivation of learners [20]. They keep track of the discussions and intervene whenever there is a need for support. Their presence in the forum is crucial for keeping learners engaged and may have a positive impact on their performance [21]. Baxter and Haycock [22] stated that forums fail to handle high volumes of posts because topics are fragmented over many threads and search facilities are lacking. The difficulties of effective interaction and support in very large MOOC forums have been highlighted in many other studies, e.g. [23, 24]. Such studies indicate that a major increase in forum participation may have a negative effect on the support provided by TAs. This could be related to usability issues of the forum platform itself that scale up as forum participation increases. It is therefore understood that, in order to support learners more efficiently, new tools are needed that could provide TAs with a “bird’s eye” view of the forum discussions [25].
Motivated by such studies, we formulate the main research question of our work: to identify the key usability issues of the OpenEdX discussion forum and to find out whether they affect the support that TAs provide to learners. To answer this question, we followed a mixed-methods approach in order to capture the different aspects of the problems TAs faced during their experience in the forum.
3 Methodology
The study was performed on two MOOCs offered on mathesis.cup.gr, a major Greek MOOC platform based on OpenEdX technology. The first course, ‘Differential Equations 1’ (DE), aimed to introduce learners to the mathematical theory of differential equations and their practical use. The second, ‘Introduction to Python’ (PY), aimed to introduce learners to computer programming through Python. The duration of both courses was 6 weeks, with 2,153 enrolled learners for DE and 5,569 for PY. Within each course discussion forum, support was provided by Teaching Assistants (TAs). The TAs were mostly learners who had attended former MOOCs of the same instructor and had demonstrated high engagement and performance. They were subsequently contacted by the instructor, assigned the role of TA, and asked to contribute to subsequent editions of the course. For these particular courses, there were two active TAs for DE and five for PY.
To organize the study, we used an ‘anticipatory data reduction’ method [26], whereby our research goals were addressed through three exploratory topics (see Fig. 1):

1. Organization of the forum discussions: the way discussions were organized and how this affected learners and TAs during their experience.

2. TAs’ experience and interactions within the forum: these provide insights into the TAs’ decisions over time.

3. Usability issues of the forum platform: the main usability issues of the platform and how they affected the TAs’ support to learners.
Triangulating different methods, data collection techniques (Table 1) and informants allowed us to improve the quality, reliability and rigor of the research and its results [27]. Table 1 presents the data collection methods employed: participatory ethnography, interviews, usability evaluation through cognitive walkthrough and heuristics, and forum log analysis. Note that during the participatory ethnography we took a ‘lurker’ perspective, without intervening, so as not to affect forum interactions. For the cognitive walkthrough, evaluators were asked to perform typical user tasks in the forum, such as navigating the forum, posting new questions and modifying existing ones. Below, we present extracts from our data collection using the format [method/course/#TA].
4 Results
4.1 Exploratory Topic 1: Organization of the Forum Discussions
The discussion forum offered by the OpenEdX platform is organized according to a three-level hierarchy, shown in Fig. 2. The terminology used is that of ‘discussions-responses-comments’. Most MOOC forums use a similar architecture, albeit with different terminology; in Coursera, for example, the three levels are called ‘threads-posts-comments’. No further levels are allowed, unlike discussion forums beyond MOOCs, in which a comment can receive further comments, ad infinitum.
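To make this fixed hierarchy concrete, the following minimal Python sketch models the three levels. The type and field names are our own, chosen for illustration; they are not the platform’s internal schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Comment:
    # Third (deepest) level: no child list, encoding the fact
    # that a comment cannot receive further replies.
    author: str
    body: str

@dataclass
class Response:
    # Second level: a reply posted within a discussion.
    author: str
    body: str
    comments: List[Comment] = field(default_factory=list)

@dataclass
class Discussion:
    # Top level: created by a learner, a TA or the instructor.
    title: str
    created_by: str
    responses: List[Response] = field(default_factory=list)
```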
During our participatory ethnographic study, we observed that the organization of each week’s discussions was time-oriented: each new discussion was added to the top of the left sidebar (see Fig. 2), which contained the week’s discussions. We also observed that there were two kinds of discussions each week: learner discussions, i.e. discussions created by learners, and course discussions, i.e. discussions created by TAs or the course instructor.
As shown in Table 2, the average number of discussions created by learners was much higher than the number created by the course staff. Evidently, learners preferred to create their own discussions instead of participating in the course discussions set up by the TAs. This behavior increased the complexity of the forum information space. During the interviews, the TAs stated that the reason they created topic-specific course discussions was to discourage the proliferation of learner discussions: as the number of learner discussions kept increasing, they had problems following new discussions and providing prompt support to learners. They also stated that such discussions usually related to a single question posed by the discussion creator and received only a few replies. Course discussions, on the other hand, were more popular; they received a much higher number of replies than learner discussions, as seen in Table 2. In general, we observed that there was no strict organization in the forum. The TAs reported that in former MOOCs they had participated in, the discussion forum contained only learner discussions; this resulted in a large number of learner discussions and many questions remaining unanswered. To resolve the issue, the TAs adopted the approach of creating course discussions at the start of every week.
“In former MOOCs of Mr.[INSTRUCTOR], learners were creating so many discussions that it was impossible to keep track of them, so in this forum we decided to create specific discussions every week so as to avoid the chaotic situation we were facing previously.” [INT/PY/TA1]
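As an illustration of how per-week counts such as those in Table 2 could be derived from exported forum data, the sketch below aggregates discussions by week and by creator role. The record fields (week, creator_role, n_replies) are assumptions made for illustration, not the actual OpenEdX export schema.

```python
from statistics import mean

def weekly_stats(discussions):
    # Aggregate discussion records into per-week counts and average
    # reply numbers, split into learner vs. course discussions.
    # Each record is a dict with keys 'week', 'creator_role', 'n_replies'.
    buckets = {}
    for d in discussions:
        week = buckets.setdefault(d["week"], {"learner": [], "course": []})
        kind = "learner" if d["creator_role"] == "learner" else "course"
        week[kind].append(d["n_replies"])
    return {
        w: {kind: {"count": len(r), "avg_replies": mean(r) if r else 0.0}
            for kind, r in kinds.items()}
        for w, kinds in buckets.items()
    }

# Example with invented records:
logs = [
    {"week": 1, "creator_role": "learner", "n_replies": 2},
    {"week": 1, "creator_role": "TA", "n_replies": 40},
    {"week": 1, "creator_role": "learner", "n_replies": 1},
]
print(weekly_stats(logs))
# {1: {'learner': {'count': 2, 'avg_replies': 1.5},
#      'course': {'count': 1, 'avg_replies': 40}}}
```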
The usability evaluation performed in the two forums verified the existence of many difficulties in searching for specific items. The evaluators stated that navigation within the forum was very problematic and the platform’s search function was not helpful, so questions of interest could only be found by scrolling through and reading every discussion’s subject.
4.2 Exploratory Topic 2: TA Interactions with Learners
To collect information on this exploratory topic, we went through the transcripts of all the discussions in which TAs participated, focusing on TA interactions that relate to issues they faced with learners. In some replies, the TAs prompted learners to use the course discussions for posting their content-related questions and to avoid creating their own. There were also occasions where TAs did not answer learners’ questions at all. An extract of such a TA reply:
“- There is a discussion that was created for questions on this material. Please use that to help your peers with their questions and us to provide you more effective support.” [PE/DE/TA1]
During the interviews, the TAs reported that such learner behavior did not comply with the forum policies, which stated: “… learners should use course discussions to post their content-related questions … questions posted in other discussions should not be answered by TAs”. The TAs, however, were not consistent in following this guideline. We witnessed occasions where they were strict towards learners who did not follow the rules, but on other occasions they kept answering questions within learner discussions. This implied a change in their attitude, as if they eventually accepted the situation and surrendered to policy violations; in fact, the TAs confirmed this behavior during the interviews. A related problem had to do with duplicate questions, which inevitably appeared within learner discussions due to their large number. The TAs’ frustration about this was conspicuous:
“Your question has already been answered in this discussion: [URL]. Please avoid posting questions that have already been answered elsewhere.” [PE/PY/TA2]
Moreover, the TAs reported that many learners who followed this approach avoided engaging in a conversation; they just sought a solution to their problem. For example:
“There was a group of learners that were posting duplicate questions very frequently. It was obvious that they were unaware of other related discussions. They were using the forum just to get a direct answer.” [INT/PY/TA2]
4.3 Exploratory Topic 3: Usability Issues of the Forum Platform
The usability evaluation revealed a number of usability violations in the forum platform. Specifically, the evaluators reported that navigation was the main problem: the task of searching for discussions of interest in which to post new questions was difficult to perform. They also stated that during the evaluation of the PY course (attended by a much higher number of participants) this issue seemed to scale up, so that navigation within a discussion of the PY course was much more time-consuming than in the DE course, due to the larger number of replies. Another issue related to the platform’s search function: the evaluators reported that the search results were vague and did not improve the navigation process. During the interviews it was stated:
“I usually used the browser’s search function (Ctrl-F) and searched with keywords like ‘minutes’ or question marks, which I hoped would lead me to recent questions.” [INT/PY/TA1]
This validated the evaluators’ conclusions and implied that TAs often had to improvise in order to navigate. The evaluators also reported that the ‘Create new response’ action was less prominent than ‘Create new discussion’: the buttons for these actions were positioned in such a way that users created a new discussion by mistake when they wanted to post a new question in an existing discussion. They also noted that ‘Help and documentation’ (e.g. the forum’s policies) was not included in the forum platform but was instead part of the course material. This may partially explain the learners’ limited compliance with the forum policies.
5 Discussion
The results of our study revealed some significant limitations of the OpenEdX forum platform. The main issue is the difficulty of navigating the forum discussions. First, the organization of the discussions was not strict, so TAs could not easily find new questions calling for their intervention. As the forum evolved over time, the number of new discussions increased and the TAs had to contrive new ways of navigating. The forum itself provided email notifications for new messages, but this mechanism was treated as spam and abandoned. The navigation issue was also verified during our participatory ethnography: towards the end of each week, when the number of discussions had increased, it was very difficult to find conversations of interest. In the case of duplicate questions, we tried many times to find where the original question had been answered, but we could hardly locate it. This may explain why learners were often unaware that their questions were duplicates: the navigation problems evidently had a negative impact on them too. The usability evaluators likewise verified that navigation was problematic, with specific tasks highlighting the corresponding heuristic violations.
During the participatory ethnography, we attempted to interpret the fact that learners often created their own discussions instead of using the course discussions. This could be attributed to unawareness of the forum policies. “Users that attend many MOOCs take the MOOC policies for granted” [INT/DE/TA1], stated one of the TAs. The usability evaluation also revealed that the policies should have been more visible, as the evaluators struggled to find them. There is a need for new ways to inform learners about specific policies before they enter the discussion forum, and the study showed that the forum platform lacks such a feature. Clearly stated and enforced policies are very important for keeping a forum ‘healthy’ as the course evolves over time.
It was quite clear from the interviews that such issues had a negative effect on the support provided to learners by the TAs of both courses. The TAs of the PY course stated:
“I spent more time in searching than answering to new questions.” [INT/PY/TA1,2]
Despite these issues, the TAs stated that they were pleased with their contribution. This is because they are highly motivated: they participate on a voluntary basis and yet choose to spend a lot of time in the forum. From the TA interviews, we concluded that the situation in the PY course was worse: the TAs stated that navigation within the course discussions was difficult due to the large number of replies they received, and that they had less time to answer questions due to the course size.
6 Conclusions
Our study highlighted several important usability issues of the OpenEdX platform and their negative impact on the TAs’ role. The negative impact was exacerbated in the PY course, which had more participants, showing that the identified issues scale up with participation. We found that as participation in the forum increased, the TAs adopted more complex strategies in order to navigate and provide effective support to learners. This increases their workload and detracts from their main goal. Our study identified limitations of the platform in providing effective tools to assist the TAs with this issue. As a consequence, MOOC organizers resort to increasing the number of TAs as learner participation grows; however, in MOOCs with thousands of learners this is not a viable solution.
Given the importance of the TAs’ role in supporting activities in a MOOC environment, it is now well understood that new policies and tools are needed to ease and guide their interventions. Such tools could help TAs keep track of the discussions and provide support to learners efficiently and without interruption as the forum evolves over time. The development of tools that assist TAs and automate their interventions [31] is a promising field of research. Classification algorithms that identify content-related discussions [32] may assist their development, while advanced visualization techniques may produce a topic-related overview of the forum state, highlighting recurring topics. Taking these recommendations into consideration, in our ongoing research we plan to follow a more design-based approach, experimentally implementing and validating such new approaches in future MOOCs delivered through the OpenEdX platform.
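As a hedged illustration of the classification idea mentioned above, the sketch below trains a standard TF-IDF plus logistic regression pipeline (scikit-learn) to flag content-related posts. The example texts and labels are invented for illustration; a real tool would require forum posts manually labeled by course staff.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training examples: 1 = content-related, 0 = other.
texts = [
    "How do I solve the differential equation in the week 2 exercise?",
    "My Python loop raises an IndexError, what am I doing wrong?",
    "The video in unit 3 does not load in my browser",
    "When will the certificates be issued?",
]
labels = [1, 1, 0, 0]

# TF-IDF features over unigrams and bigrams, fed to a linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Flag an incoming post so a TA can prioritize it.
print(model.predict(["I get a syntax error in the week 4 assignment"]))
```

A classifier of this kind could also feed the visualization layer, grouping flagged posts by topic so that TAs see recurring questions at a glance instead of scrolling through every discussion.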
References
Hill, P.: Online educational delivery models: a descriptive view. EDUCAUSE Rev. 47, 84–86 (2012)
Yuan, L., Powell, S.: MOOCs and open education: implications for higher education. Cetis White Paper (2013)
Liyanagunawardena, T.R., Parslow, P., Williams, S.A.: Dropout: MOOC participants’ perspective. In: European MOOC Stakeholder Summit, pp. 95–100 (2014)
Daniel, J.: Making sense of MOOCs: musings in a maze of myth, paradox and possibility. J. Interact. Media Educ. 2012(3), Art. 18 (2012). https://doi.org/10.5334/2012-18
Kizilcec, R.F., Halawa, S.: Attrition and achievement gaps in online learning. In: ACM Conference on Learning @ Scale, pp. 57–66 (2015)
Mak, S., Williams, R., Mackness, J.: Blogs and forums as communication and learning tools in a MOOC. In: International Conference on Networked Learning, pp. 275–285 (2010)
Kumar, M., Kan, M.-Y., Tan, B.C., Ragupathi, K.: Learning instructor intervention from MOOC forums: early results and issues. In: International Educational Data Mining Society, pp. 218–225 (2015)
Berge, Z.L.: The role of the online instructor/facilitator. Educ. Technol. 35, 22–30 (1995)
Brouns, F., Mota, J., Morgado, L.: A networked learning framework for effective MOOC design: the ECO project approach. In: Challenges for Research into Open & Distance Learning: Doing Things Better: Doing Better Things, pp. 161–171 (2014)
Gamage, D., Fernando, S., Perera, I.: Factors leading to an effective MOOC from participants’ perspective. In: International Conference on Ubi-Media Computing, pp. 230–235 (2015)
Sandeen, C.: Integrating MOOCs into traditional higher education: the emerging “MOOC 3.0” era. Change Mag. High. Learn. 45, 34–39 (2013)
Chang, S., Tung, F.: An empirical investigation of students’ behavioural intentions to use the online learning course websites. Br. J. Edu. Technol. 39, 71–83 (2008)
Yousef, A.M.F., Chatti, M.A., Schroeder, U., Wosnitza, M.: What drives a successful MOOC? An empirical examination of criteria to assure design quality of MOOCs. In: International Conference on Advanced Learning Technologies, pp. 44–48 (2014)
Soloway, E., Guzdial, M., Hay, K.E.: Learner-centered design: the challenge for HCI in the 21st century. Interactions 1, 36–47 (1994)
Clark, R.C., Mayer, R.E.: e-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning. Wiley, Hoboken (2016)
O’Regan, K.: Emotion and e-learning. J. Asynchronous Learn. Netw. 7, 78–92 (2003)
Tsironis, A., Katsanos, C., Xenos, M.: Comparative usability evaluation of three popular MOOC platforms. In: Global Engineering Education Conference (EDUCON), pp. 608–612 (2016)
Onah, D.F., Sinclair, J.E., Boyatt, R.: Exploring the use of MOOC discussion forums. In: London International Conference on Education, pp. 1–4 (2014)
Yang, D., Wen, M., Kumar, A., Xing, E.P., Rosé, C.P.: Towards an integration of text and graph clustering methods as a lens for studying social interaction in MOOCs. Int. Rev. Res. Open Distrib. Learn. 15, 215–234 (2014)
Yang, D., Adamson, D., Rosé, C.P.: Question recommendation with constraints for massive open online courses. In: ACM Conference on Recommender Systems, pp. 49–56 (2014)
Mazzolini, M., Maddison, S.: Sage, guide or ghost? The effect of instructor intervention on student participation in online discussion forums. Comput. Educ. 40, 237–253 (2003)
Baxter, J.A., Haycock, J.: Roles and student identities in online large course forums: implications for practice. Int. Rev. Res. Open Distrib. Learn. 15(1), 20–40 (2014)
Onah, D.F., Sinclair, J., Boyatt, R.: Dropout rates of massive open online courses: behavioural patterns. In: EDULEARN14 Proceedings, pp. 5825–5834 (2014)
Kizilcec, R.F., Piech, C., Schneider, E.: Deconstructing disengagement: analyzing learner subpopulations in massive open online courses. In: International Conference on Learning Analytics and Knowledge, pp. 170–179 (2013)
Sharif, A., Magrill, B.: Discussion forums in MOOCs. Int. J. Learn. Teach. Educ. Res. 12(1), 119–132 (2015)
Miles, M.B., Huberman, A.M.: Qualitative Data Analysis: An Expanded Sourcebook. Sage Publications, Thousand Oaks (1994)
Greene, J.C., Caracelli, V.J., Graham, W.F.: Toward a conceptual framework for mixed-method evaluation designs. Educ. Eval. Policy Anal. 11, 255–274 (1989)
Stuckey, H.L.: Three types of interviews: qualitative research methods in social health. J. Soc. Health Diab. 1, 56–59 (2013)
Nielsen, J.: Finding usability problems through heuristic evaluation. In: SIGCHI Conference on Human Factors in Computing Systems, pp. 373–380 (1992)
Lewis, C., Polson, P.G., Wharton, C., Rieman, J.: Testing a walkthrough methodology for theory-based design of walk-up-and-use interfaces. In: SIGCHI Conference on Human Factors in Computing Systems, pp. 397–404 (1990)
Chandrasekaran, M., Ragupathi, K., Kan, M.-Y., Tan, B.: Towards feasible instructor intervention in MOOC discussion forums. In: International Conference on Information Systems, pp. 2483–2491 (2015)
Wise, A.F., Cui, Y., Jin, W., Vytasek, J.: Mining for gold: identifying content-related MOOC discussion threads across domains through linguistic modeling. Internet High. Educ. 32, 11–28 (2017)
Acknowledgements
This research was performed within the framework of a collaboration between the University of Patras and the online platform mathesis.cup.gr. The supply of MOOC data by Mathesis is gratefully acknowledged. The doctoral scholarship “Strengthening Human Resources Research Potential via Doctorate Research – 2nd Cycle” (MIS-5000432), implemented by the State Scholarships Foundation (IKY), is also gratefully acknowledged. This research has also been partially funded by the Spanish State Research Agency (AEI) under project grants TIN2014-53199-C3-2-R and TIN2017-85179-C3-2-R, the Regional Government of Castilla y León under grant VA082U16, and the EC under grant 588438-EPP-1-2017-1-EL-EPPKA2-KA.
© 2019 IFIP International Federation for Information Processing
Cite this paper as: Ntourmas, A., Avouris, N., Daskalaki, S., Dimitriadis, Y.: Evaluation of a Massive Online Course Forum: Design Issues and Their Impact on Learners’ Support. In: Lamas, D., Loizides, F., Nacke, L., Petrie, H., Winckler, M., Zaphiris, P. (eds.) Human-Computer Interaction – INTERACT 2019. LNCS, vol. 11747. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-29384-0_12