
Part of the book series: Lecture Notes in Computer Science (LNISA, volume 12205)

Included in the following conference series: HCII: International Conference on Human-Computer Interaction

Abstract

To attain a good user experience in e-assessment systems, learners should be aware of how they are progressing in their courses, and they should feel motivated and engaged. The goal of this paper is to propose an e-assessment system that aims to increase learners' self-awareness of their progress in the courses they are taking and to improve their motivation and engagement. While designing the system, two major design challenges were identified and addressed: informing learners about their progress, and motivating them to work harder and enhance their learning of the course contents and learning activities. The proposed system informs learners about their progress and aims to keep them motivated by providing minimum grade predictions for the next learning activity they will perform during the semester of an online course. Learners are also informed, throughout the semester, of their risk of failing the course. In addition, to keep learners motivated and engaged, teachers provide personalized suggestions to enhance their learning of the course. The user experience evaluation of LIS (Learning Intelligent System) highlights that it has helped learners to enhance their learning experiences.
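
The abstract describes two learner-facing signals, a minimum grade prediction for the next learning activity and an at-risk-of-failing indicator, without detailing how they are computed. The sketch below is only an illustration of that idea, not the authors' LIS implementation: the LearnerProgress class, the predict_minimum_grade and at_risk_of_failing functions, the 0-10 grade scale, and the pass mark of 5.0 are all hypothetical assumptions introduced for this example.

```python
# Illustrative sketch only -- NOT the LIS system described in the paper.
# It shows the kind of signals the abstract mentions: a conservative minimum
# grade prediction for the next learning activity and a derived at-risk flag.

from dataclasses import dataclass
from typing import List


@dataclass
class LearnerProgress:
    """Grades obtained so far in the course activities (0-10 scale, assumed)."""
    activity_grades: List[float]


def predict_minimum_grade(progress: LearnerProgress) -> float:
    """Hypothetical conservative estimate: running mean minus one standard deviation."""
    grades = progress.activity_grades
    if not grades:
        return 0.0
    mean = sum(grades) / len(grades)
    variance = sum((g - mean) ** 2 for g in grades) / len(grades)
    return max(0.0, mean - variance ** 0.5)


def at_risk_of_failing(progress: LearnerProgress, pass_mark: float = 5.0) -> bool:
    """Flag the learner as at risk if the conservative prediction falls below the pass mark."""
    return predict_minimum_grade(progress) < pass_mark


if __name__ == "__main__":
    learner = LearnerProgress(activity_grades=[6.5, 5.0, 4.0])
    print(f"Minimum grade prediction: {predict_minimum_grade(learner):.1f}")
    print(f"At risk of failing: {at_risk_of_failing(learner)}")
```

Run as a script, the example flags a learner whose conservative estimate drops below the pass mark, mirroring the kind of early warning the abstract attributes to LIS.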



Acknowledgments

This work was funded by the eLearn Center at Universitat Oberta de Catalunya through the project: New Goals 2018NG001 “LIS: Learning Intelligent System”.

Author information


Corresponding authors

Correspondence to Sidra Iftikhar, Ana-Elena Guerrero-Roldán, Enric Mor or David Bañeres.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Iftikhar, S., Guerrero-Roldán, A.E., Mor, E., Bañeres, D. (2020). User Experience Evaluation of an e-Assessment System. In: Zaphiris, P., Ioannou, A. (eds) Learning and Collaboration Technologies. Designing, Developing and Deploying Learning Experiences. HCII 2020. Lecture Notes in Computer Science, vol 12205. Springer, Cham. https://doi.org/10.1007/978-3-030-50513-4_6


  • DOI: https://doi.org/10.1007/978-3-030-50513-4_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-50512-7

  • Online ISBN: 978-3-030-50513-4

  • eBook Packages: Computer Science; Computer Science (R0)
