DOI: https://doi.org/10.1145/3328778.3366916
Research Article

The Return Statement: Establishing a Continuous Assessment Database System for Consistent Program Feedback

Published: 26 February 2020

ABSTRACT

Curriculum assessment is a critical but arduous task. It yields insight into the strengths and weaknesses of our programs, outlining paths for curriculum improvement. However, current methods of assessment are often viewed as burdensome, disorganized, or even unfruitful. A lack of consistent assessment makes determining the impact of program modifications extremely difficult, if not impossible. Even when issues are identified, lengthy and backloaded assessments leave no time to address those issues for the assessed students. To overcome these obstacles, we have constructed the Continuous Assessment Database System (CADS). The system directly assesses students' prerequisite knowledge at the outset of each semester. By tracking prerequisite knowledge, we determine whether students recall the crucial elements of our curriculum. Furthermore, educators receive data on students' preparedness early in the semester, allowing them to adapt their courses to best meet student needs. To generate informative reports for faculty each term, CADS maps its assessment questions to student information, CS knowledge topics, departmental student learning outcomes, and taxonomic mastery levels. These reports visualize current data against historical data and present deeper examinations of significant data shifts for causal analysis. Some diagrams give insight into specific courses, while others examine student learning across the entire CS program via diverse mappings. Overall, CADS provides a model for using technology to craft an expedient form of program assessment that dynamizes a curriculum. This paper details our CADS implementation.
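The mapping the abstract describes — assessment questions tied to knowledge topics, learning outcomes, and taxonomic mastery levels, with current results compared against historical data — can be sketched in a few lines. This is an illustrative sketch only; the class, field, and function names below are hypothetical and do not reflect the actual CADS schema or implementation.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass(frozen=True)
class Question:
    """One assessment question and its mappings (names are illustrative)."""
    qid: str
    topic: str        # CS knowledge topic, e.g. "Recursion"
    outcome: str      # departmental student learning outcome, e.g. "SLO-1"
    bloom_level: str  # taxonomic mastery level, e.g. "Apply"

def outcome_report(questions, responses, history):
    """Average score per learning outcome, with delta vs. the historical mean.

    responses maps question id -> mean student score in [0, 1];
    history maps outcome -> historical mean (missing outcomes get delta 0).
    """
    by_outcome = {}
    for q in questions:
        by_outcome.setdefault(q.outcome, []).append(responses[q.qid])
    return {
        outcome: {
            "current": mean(scores),
            "delta": mean(scores) - history.get(outcome, mean(scores)),
        }
        for outcome, scores in by_outcome.items()
    }

# Example: two questions feed SLO-1, one feeds SLO-2.
questions = [
    Question("q1", "Recursion", "SLO-1", "Apply"),
    Question("q2", "Complexity", "SLO-1", "Analyze"),
    Question("q3", "Databases", "SLO-2", "Remember"),
]
responses = {"q1": 0.8, "q2": 0.6, "q3": 0.9}
history = {"SLO-1": 0.75}
report = outcome_report(questions, responses, history)
```

A report shaped this way lets a significant negative delta on one outcome be traced back, through the question mappings, to the specific topics and mastery levels where recall dropped.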


Published in
        SIGCSE '20: Proceedings of the 51st ACM Technical Symposium on Computer Science Education
        February 2020
        1502 pages
        ISBN:9781450367936
        DOI:10.1145/3328778

        Copyright © 2020 ACM

        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

        Publisher

        Association for Computing Machinery

        New York, NY, United States
