ABSTRACT
Curriculum assessment is a critical but arduous task. It yields insight into the strengths and weaknesses of our programs and outlines paths for curriculum improvement. However, current methods of assessment are often viewed as burdensome, disorganized, or even unfruitful. Without consistent assessment, determining the impact of program modifications is extremely difficult, if not impossible. Even when issues are identified, lengthy and backloaded assessments leave no time to address those issues for the students who were assessed. To overcome these obstacles, we have constructed the Continuous Assessment Database System (CADS). This system conducts direct assessment of student prerequisite knowledge at each semester's outset. By tracking prerequisite knowledge, we determine whether students recall the crucial elements of our curriculum. Furthermore, educators receive data on students' preparedness early in the semester, allowing them to adapt their courses to best meet student needs. To generate informative reports for faculty each term, CADS maps its assessment questions to student information, CS knowledge topics, departmental student learning outcomes, and taxonomic mastery levels. These reports visualize current data against historical data and present deeper examinations of significant data shifts for causal analysis. Some diagrams give insight into specific courses, while others examine student learning across the entire CS program via these diverse mappings. Overall, CADS provides a model for using technology to craft an expedient form of program assessment that keeps a curriculum dynamic. This paper details our implementation of CADS.
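The mapping idea described above can be sketched in plain Ruby (the system itself is built on Ruby on Rails). This is a minimal illustration under assumed names, not the paper's actual schema: each question is tagged with a topic, an outcome, and a mastery level, and responses are rolled up along one of those mappings to produce the per-term aggregates a report would compare against historical data.

```ruby
# Hypothetical sketch of the CADS question mapping. Question and Response,
# the field names, and topic_scores are illustrative assumptions.

# Each assessment question maps to a CS topic, a departmental student
# learning outcome, and a Bloom-style mastery level.
Question = Struct.new(:id, :topic, :outcome, :bloom_level)
Response = Struct.new(:question_id, :student_id, :term, :correct)

# Average correctness per topic for one term -- the kind of aggregate a
# start-of-semester report would plot against prior terms.
def topic_scores(questions, responses, term)
  by_id = questions.to_h { |q| [q.id, q] }
  responses
    .select { |r| r.term == term }
    .group_by { |r| by_id[r.question_id].topic }
    .transform_values { |rs| rs.count(&:correct).fdiv(rs.size) }
end

questions = [
  Question.new(1, "recursion",       "SLO-2", :apply),
  Question.new(2, "data structures", "SLO-1", :understand),
]
responses = [
  Response.new(1, "s1", "2019F", true),
  Response.new(1, "s2", "2019F", false),
  Response.new(2, "s1", "2019F", true),
  Response.new(1, "s1", "2018F", true),
]

current  = topic_scores(questions, responses, "2019F")
baseline = topic_scores(questions, responses, "2018F")
```

Grouping by `outcome` or `bloom_level` instead of `topic` would yield the outcome-level and mastery-level views of the same response data.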
Index Terms
- The Return Statement: Establishing a Continuous Assessment Database System for Consistent Program Feedback