DOI: 10.1145/2330601.2330664

Course correction: using analytics to predict course success

Published: 29 April 2012 Publication History

Abstract

Predictive analytics techniques applied to a broad swath of student data can aid in timely intervention strategies to help prevent students from failing a course. This paper discusses a predictive analytic model that was created for the University of Phoenix. The purpose of the model is to identify students who are in danger of failing the course in which they are currently enrolled. Within the model's architecture, data from the learning management system (LMS), financial aid system, and student system are combined to calculate a likelihood of any given student failing the current course. The output can be used to prioritize students for intervention and referral to additional resources. The paper includes a discussion of the predictor and statistical tests used, validation procedures, and plans for implementation.
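The abstract does not disclose the model's functional form or its predictors. As a rough illustration of the approach it describes, a cross-system failure-risk score can be sketched as a logistic combination of features drawn from the LMS, financial aid, and student systems. All feature names and weights below are invented for illustration; they are not taken from the paper:

```python
import math

# Hypothetical feature weights -- illustrative only; the paper's actual
# predictors and coefficients are not given in the abstract.
WEIGHTS = {
    "lms_logins_per_week": -0.35,   # more LMS activity -> lower failure risk
    "assignments_missed":   0.80,   # missed work -> higher failure risk
    "prior_courses_failed": 0.60,   # academic history signal
    "financial_aid_flag":   0.25,   # financial-system signal
}
BIAS = -1.5

def failure_probability(student: dict) -> float:
    """Combine cross-system features into a failure likelihood via a logistic link."""
    z = BIAS + sum(WEIGHTS[k] * student.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

def prioritize(students: dict) -> list:
    """Rank student IDs by predicted failure risk for intervention outreach."""
    return sorted(students, key=lambda sid: failure_probability(students[sid]),
                  reverse=True)

# Two hypothetical students: one disengaged, one engaged.
at_risk = {"lms_logins_per_week": 1, "assignments_missed": 3,
           "prior_courses_failed": 1, "financial_aid_flag": 1}
engaged = {"lms_logins_per_week": 8, "assignments_missed": 0,
           "prior_courses_failed": 0, "financial_aid_flag": 0}
```

The ranked output of `prioritize` corresponds to the paper's stated use of the model: a worklist ordering students for intervention and referral, rather than a binary pass/fail verdict.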

References

[1] Cohen, A. and Brawer, F. 2009. The American Community College. John Wiley and Sons, NY. ISBN 9780470605486.
[2] University of Phoenix. 2010 Academic Annual Report. http://www.phoenix.edu/about_us/publications/academic-annual-report/2010.html
[3] Garman, G. 2010. A Logistic Approach to Predicting Student Success in Online Database Courses. American Journal of Business Education 3(12), 1--5.
[4] Moore, R. 2007. Do Students' Performances and Behaviors in Supporting Courses Predict Their Performances and Behaviors in Primary Courses? Research and Teaching in Developmental Education 23(2), 38--48.
[5] Wang, A. Y. and Newlin, M. H. 2000. Characteristics of students who enroll and succeed in Psychology web-based classes. Journal of Educational Psychology 92(1), 137--143.
[6] Reisetter, M. and Boris, G. 2004. What works: student perceptions of effective elements in online learning. Quarterly Review of Distance Education 5(4), 277--291.
[7] Sadik, A. and Reisman, S. 2004. Design and implementation of a web-based learning environment: lessons learned. Quarterly Review of Distance Education 5(3), 157--171.
[8] Ramos, C. and Yudko, E. 2008. "Hits" (Not "Discussion Posts") Predict Student Success in Online Courses: A Double Cross-Validation Study. Computers & Education 50(4), 1174--1182.
[9] Martinez, D. 2001. Predicting student outcomes using discriminant function analysis. Paper presented at the 39th Annual Meeting of the Research and Planning Group, Lake Arrowhead, CA.
[10] Morris, L., Wu, S. and Finnegan, C. 2005. Predicting retention in online general education courses. American Journal of Distance Education 19(1), 23--26.



Reviews

Symeon D. Retalis

Learning analytics (LA) is a rapidly growing field that can be used to assess academic progress and predict future performance. The latest Horizon Report suggests that the time-to-adoption of LA by educational institutions will be around two to three years [1], and many research and development initiatives are currently under way. This interesting and well-written paper presents a predictive analytic model, built for the University of Phoenix, that identifies students who are in danger of failing a course. Barber and Sharkey begin by discussing known reasons students fail to graduate, as well as indicators that help identify a student who may be at risk of failing. The authors show how they fine-tuned their model through three versions over several runs by adding variables; as a result, Model 2 accurately predicted 85 percent of outcomes for all students at week 0, compared to 50 percent for Model 1.

The authors consider how LA can help teachers, educational managers, and students to predict course failure, but LA can also be viewed from other perspectives. For example, its output can help instructional designers better measure the quality of a course design and understand what works and what does not [2]. In addition, LA can improve the assessment of student performance by analyzing indicators such as student postings and grades on assignments [3]. Adopting LA techniques is not easy, however: stakeholders need usable analysis tools such as SNAPP [4] and NodeXL [5]. The development of new models like this one, along with new techniques and tools, will undoubtedly help stakeholders analyze and interpret data about student progress and learning behavior.

Online Computing Reviews Service
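The 50-to-85-percent improvement the review cites is a simple week-0 accuracy comparison between model versions. A minimal sketch of that kind of evaluation, on an invented held-out cohort (the data and both "models" below are fabricated for illustration, not the paper's), might look like:

```python
def accuracy(predictions, outcomes):
    """Fraction of students whose pass/fail outcome was predicted correctly."""
    correct = sum(p == o for p, o in zip(predictions, outcomes))
    return correct / len(outcomes)

# 1 = failed the course, 0 = passed (hypothetical held-out cohort)
outcomes = [1, 0, 0, 1, 0, 0, 1, 0, 1, 0]

# Predictions at week 0 from two hypothetical model versions:
model_1 = [1, 1, 0, 0, 1, 0, 0, 0, 1, 1]  # few variables: near chance
model_2 = [1, 0, 0, 1, 0, 0, 1, 1, 1, 0]  # added variables: much better
```

Comparing `accuracy(model_1, outcomes)` against `accuracy(model_2, outcomes)` on the same held-out cohort is the kind of check that would justify the review's claim that adding variables improved the model.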



Published In

LAK '12: Proceedings of the 2nd International Conference on Learning Analytics and Knowledge
April 2012
282 pages
ISBN:9781450311113
DOI:10.1145/2330601
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. higher education
  2. learning analytics
  3. predictive analytics
  4. predictive modeling
  5. retention

Qualifiers

  • Research-article

Conference

LAK 2012
Sponsor:
  • SIGWEB
  • TEKRI
  • Desire2Learn
  • EDUCAUSE
  • University of British Columbia
LAK 2012: Second International Conference on Learning Analytics and Knowledge
April 29 - May 2, 2012
Vancouver, British Columbia, Canada

Acceptance Rates

Overall Acceptance Rate 236 of 782 submissions, 30%


Bibliometrics & Citations

Bibliometrics

Article Metrics

  • Downloads (Last 12 months)21
  • Downloads (Last 6 weeks)3
Reflects downloads up to 10 Feb 2025


Cited By

  • (2024) Predictive Modelling with the Open University Learning Analytics Dataset (OULAD): A Systematic Literature Review. In Artificial Intelligence in Education. Posters and Late Breaking Results, Workshops and Tutorials, Industry and Innovation Tracks, Practitioners, Doctoral Consortium and Blue Sky, 477--484. DOI: 10.1007/978-3-031-64315-6_46. Online publication date: 2-Jul-2024.
  • (2023) A Human-Centered Review of Algorithms in Decision-Making in Higher Education. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1--15. DOI: 10.1145/3544548.3580658. Online publication date: 19-Apr-2023.
  • (2023) Machine Learning Models for Performance of Students in Foundation and Post Foundation in a University in Oman. In Proceedings of the First International Conference on Aeronautical Sciences, Engineering and Technology, 327--339. DOI: 10.1007/978-981-99-7775-8_35. Online publication date: 26-Dec-2023.
  • (2023) Using Institutional Data to Drive Quality, Improvement, and Innovation. In Technology-Enhanced Learning and the Virtual University, 571--594. DOI: 10.1007/978-981-99-4170-4_29. Online publication date: 21-Sep-2023.
  • (2023) Using Institutional Data to Drive Quality, Improvement, and Innovation. In Technology-Enhanced Learning and the Virtual University, 1--24. DOI: 10.1007/978-981-19-9438-8_29-1. Online publication date: 8-Apr-2023.
  • (2022) Predicting student outcomes using digital logs of learning behaviors: Review, current standards, and suggestions for future work. Behavior Research Methods. DOI: 10.3758/s13428-022-01939-9. Online publication date: 26-Aug-2022.
  • (2022) Performance Prediction for Undergraduate Degree Programs Using Machine Learning Techniques - A Preliminary Review. VAWKUM Transactions on Computer Sciences 10(2), 45--60. DOI: 10.21015/vtcs.v10i2.1278. Online publication date: 31-Dec-2022.
  • (2022) Connecting the dots - A literature review on learning analytics indicators from a learning design perspective. Journal of Computer Assisted Learning 40(6), 2432--2470. DOI: 10.1111/jcal.12716. Online publication date: 26-Jul-2022.
  • (2022) Identifying Students' Progress and Mobility Patterns in Higher Education Through Open-Source Visualization. In 2022 IEEE Integrated STEM Education Conference (ISEC), 154--161. DOI: 10.1109/ISEC54952.2022.10025315. Online publication date: 26-Mar-2022.
  • (2022) How can learning analytics techniques improve the learning process? An overview. In 2022 2nd International Conference on Innovative Research in Applied Science, Engineering and Technology (IRASET), 1--5. DOI: 10.1109/IRASET52964.2022.9738003. Online publication date: 3-Mar-2022.
