Monday, 4 July 2016

Predicting student success, and failure

Carrie Wells writes in the Baltimore Sun:
Officials at the University System of Maryland have begun to analyze student data — grades, financial aid information, demographics, even how often they swipe their ID cards at the library or the dining hall — to find undergraduates who are at risk of dropping out...
University system officials say the practice, called predictive analysis, will boost graduation rates by enabling educators to intervene with struggling students before failure becomes inevitable.
The whole story is well worth reading, covering how big data analytics can help identify at-risk students, but also the valid privacy concerns that this sort of data mining raises. I was interested in this because of two research projects I've recently been involved in. The first was one I blogged about last April, and was somewhat similar to the work Wells describes (though not nearly as sophisticated, mainly because we had less data available):
In the final (multivariate) specification of the logistic regression model (which only included data we would have known before the students commenced study, and data that are available for all students):
  • Students aged 25 years and over (at first enrolment) had significantly lower odds of degree completion than those aged 19 years and under;
  • Male students had significantly lower odds of degree completion than female students;
  • Asian students had significantly higher odds of degree completion than all other ethnic groups, and Maori and Pacific Island students had the lowest odds of degree completion;
  • Domestic students had significantly lower odds of degree completion than international students;
  • Special admission (or provisional entrance) students had significantly lower odds of degree completion than other students; 
  • Students who initially completed the Certificate of University Preparation (CUP) had significantly lower odds of degree completion than other students; and
  • Students initially enrolled in conjoint degrees had significantly lower odds of degree completion than students enrolled in single degrees.
...at the least there is one take-away from Jacinda's work, which is that maybe we need to target more pastoral care or mentoring and role models for conjoint degree students.
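For readers who want to see the mechanics, here is a minimal sketch of the kind of multivariate logistic regression described in that quote, using Python's statsmodels. The column names and the simulated data are entirely hypothetical (the real student records obviously aren't public); only the general approach of regressing completion on categorical predictors and reading exponentiated coefficients as odds ratios mirrors the analysis.

```python
# Hypothetical sketch only -- not the actual data or code from the project.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000

# Simulate made-up student records with a few categorical predictors.
df = pd.DataFrame({
    "age_group":   rng.choice(["under20", "20to24", "25plus"], n),
    "gender":      rng.choice(["F", "M"], n),
    "degree_type": rng.choice(["single", "conjoint"], n, p=[0.85, 0.15]),
})

# Invented completion probabilities, just so the model has something to find.
linpred = (0.8
           - 0.5 * (df["age_group"] == "25plus")
           - 0.3 * (df["gender"] == "M")
           - 0.6 * (df["degree_type"] == "conjoint"))
df["completed"] = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

# Multivariate logit; C() expands each categorical variable into dummies,
# so each coefficient compares a group against its reference category.
result = smf.logit("completed ~ C(age_group) + C(gender) + C(degree_type)",
                   data=df).fit(disp=False)

# Exponentiated coefficients are odds ratios (e.g. conjoint vs single degree).
print(np.exp(result.params).round(2))
```

The key point is the interpretation: an exponentiated coefficient below one means that group has lower odds of completion than the reference group, which is exactly how the bullet points above should be read.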
Which brings me to this recent working paper by Papu Siameja and me. In the paper we first identified students at risk of failing ECON100, and then used a simple randomised experiment to trial two very simple interventions. The first intervention group (Treatment A) was sent an email providing information about academic support. The second intervention group (Treatment B) received the email plus a follow-up personal phone call. We ran the experiment in 2013 and 2014, but did not persist with it because my initial analyses showed little effect on test results. However, when we look at pass rates, it appears there was an effect:
Both treatments appear to increase the odds of students passing the course, and the effect of Treatment B is statistically significant. Specifically, students in the Treatment B group in 2014 had more than seven times the odds of passing of students in the control group.
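To make the odds-ratio language concrete, here's a toy calculation. The pass/fail counts below are made up for illustration (the actual estimates are in the working paper); the point is simply that an odds ratio compares the odds of passing in the treatment group with the odds of passing in the control group.

```python
# Hypothetical counts, chosen only to illustrate what "seven times the odds" means.
passed_b, failed_b = 28, 4    # Treatment B group (email + phone call)
passed_c, failed_c = 18, 18   # control group

odds_b = passed_b / failed_b  # odds of passing under Treatment B
odds_c = passed_c / failed_c  # odds of passing in the control group
odds_ratio = odds_b / odds_c

print(f"Odds ratio (Treatment B vs control): {odds_ratio:.1f}")  # 7.0 with these counts
```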
What is interesting is that the effect of Treatment B was only significant in 2014, and not in 2013. In 2013 we had a staff member make the phone calls, but in 2014 we had a student (one of the tutors in the paper) make them. Maybe the (younger) tutor was simply better at connecting with the at-risk students and impressing on them the importance of remaining engaged in the course? Either way, this intervention turned out to be highly cost-effective: I estimate the cost per failure averted at about NZ$69. It's something we'll probably bring back next semester.
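For what it's worth, the cost-per-failure-averted figure is just the total cost of delivering the intervention divided by the estimated number of additional passes it produced. The sketch below uses entirely hypothetical inputs to show the arithmetic; only the NZ$69 figure quoted above is the actual estimate from the paper.

```python
# Back-of-the-envelope sketch; all inputs here are hypothetical placeholders.
tutor_hours = 20          # assumed time spent making the phone calls
hourly_rate = 25.0        # assumed tutor pay rate (NZ$)
total_cost = tutor_hours * hourly_rate

failures_averted = 7      # assumed number of extra passes attributed to the calls

cost_per_failure_averted = total_cost / failures_averted
print(f"NZ${cost_per_failure_averted:.0f} per failure averted")  # ~NZ$71 with these inputs
```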

[HT for the Wells article: Marginal Revolution]
