Problem-Based Multiple Response Exams for Students with and Without Learning Difficulties

  • Conference paper
  • First Online:
Computer Supported Education (CSEDU 2021)

Abstract

Objective computer-assisted assessment (CAA) is considered preferable to constructed-response (CR) examinations because marking is performed automatically, without examiner intervention. This publication compares the attitudes and perceptions of a sample of engineering students towards a specific objective examination format designed to assess students' proficiency in solving electronics problems. Data were collected using a 15-item questionnaire that included a free-text question. Overall, the students expressed a preference for the objective-type examination format. Students who self-reported learning difficulties (LD) were equally divided between the two examination formats; their preference depended on the specifics of their learning difficulties, indicating that neither assessment format effectively addresses the assessment needs of these students. For the remaining respondents, examination format preference was accompanied by opposing views on answering by guessing, having the opportunity to express their views, selecting rather than constructing an answer, having the opportunity to demonstrate their knowledge, and having control over the exam answers.

Author information

Correspondence to Panos Photopoulos.

Copyright information

© 2022 Springer Nature Switzerland AG

About this paper

Cite this paper

Photopoulos, P., Tsonos, C., Stavrakas, I., Triantis, D. (2022). Problem-Based Multiple Response Exams for Students with and Without Learning Difficulties. In: Csapó, B., Uhomoibhi, J. (eds) Computer Supported Education. CSEDU 2021. Communications in Computer and Information Science, vol 1624. Springer, Cham. https://doi.org/10.1007/978-3-031-14756-2_18

  • DOI: https://doi.org/10.1007/978-3-031-14756-2_18

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-14755-5

  • Online ISBN: 978-3-031-14756-2

  • eBook Packages: Computer Science, Computer Science (R0)
