Wednesday 27 April 2022

Grade inflation and college completion in the U.S.

Grade inflation is one of those rare problems where the incentives are all wrong. Teachers have the incentive to raise grades, because they get better teaching evaluations (for example, see this post). Academic departments and university administrators have an incentive not to discourage inflated grades, because higher grades attract students. Students have no incentive to argue against grade inflation, because higher grades make them seem like they are doing better (even though, as noted in this post, the signalling value of grades is reduced by grade inflation, and grade inflation may actively harm students' learning).

What is grade inflation and why is it harmful? In this 2020 article published in the journal Economics of Education Review (ungated earlier version here), Adam Tyner (Thomas B. Fordham Institute) and Seth Gershenson (American University) outline several ways of thinking about grade inflation. However, to me their paper is a solution looking for a problem. I think most people would recognise grade inflation as what Tyner and Gershenson term dynamic grade inflation, that is the:

...relationship between grades & achievement changes over time...

In other words, if students with a given level of understanding this year receive a higher grade than students with the same level of understanding ten years ago would have, then that represents grade inflation. Simple.

Or not quite so simple. As I noted in this post, the evidence on the existence of grade inflation is somewhat patchy, and it isn't clear how much of it can be explained by changes over time in student quality, teaching quality, or course selection by students.

That's where this new NBER working paper by Jeffrey Denning (Brigham Young University) and co-authors comes in. They consider the puzzle of increasing college completion rates from the 1990s to today. This is a puzzle because:

Trends in the college wage premium, student enrollment, student preparation, student studying, labor supply in college, time spent studying, and the price of college would all predict decreasing college graduation rates. The patterns for enrollment by institution type yields an ambiguous prediction. Despite the bulk of the trends predicting decreasing graduation, we document that the college graduation rate is increasing.

A lower college wage premium should decrease the incentives for students to complete college. Greater enrolments should reduce the 'quality' of the marginal student, and reduce the proportion of students completing college. Students are now less prepared when leaving high school than previous generations were (as noted in my review of the Goldin and Katz book, The Race between Education and Technology). Students work more and consequently spend less time studying, which is in turn related to the high cost of tertiary education; all of these are impediments to the completion of university study. And yet, in spite of all of those trends, college completion rates in the U.S. have trended upwards since the 1990s.

Denning et al. implicate changing standards, that is, grade inflation, as the driver of these higher completion rates. They use a range of data to support their case, including nationally-representative longitudinal data from the National Education Longitudinal Study of 1988 (NELS:88) and the Education Longitudinal Study of 2002 (ELS:2002), detailed administrative data from nine large public universities (Clemson, Colorado, Colorado State, Florida, Florida State, Georgia Tech, North Carolina State, Purdue, and Virginia Tech) for cohorts entering between 1990 and 2000, and detailed microdata from an unnamed 'Public Liberal Arts College'.

There is a lot of detail in the paper, so I'll just quickly describe some of the highlights. Denning et al. use a decomposition method based on the change over time from the two longitudinal studies, and find that:

...there is a 3.77 percentage point increase in the probability of graduation from the NELS:88 cohort to the ELS:2002 cohort. The total explained by observable characteristics is -1.92. This suggests that covariates would predict that graduation rates would decrease by 1.92 percentage points. Hence, the residual or unexplained change is 5.69 percentage points or 151 percent of the change is unexplained by covariates. Student preparedness would predict a decline in graduation rates of 1.26 percentage points. Student-faculty ratios explain a 0.28 percentage point decline and initial school type explains no change in graduation rates.

In other words, the probability of graduation has increased over time, but observable changes in student preparedness, student-faculty ratio, and school type all point in the wrong direction (as per the trends noted earlier).
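The arithmetic of the decomposition is worth making explicit: the residual (unexplained) change is the observed change minus the change predicted by observables, and because the observables predict a decrease, the residual is larger than the observed change itself. A minimal sketch of that accounting, using only the figures quoted above:

```python
# Decomposition accounting from the NELS:88 -> ELS:2002 comparison,
# using the figures quoted in the paper (in percentage points).
observed_change = 3.77    # increase in the probability of graduation
explained_change = -1.92  # change predicted by observable covariates

# The residual is whatever the observables cannot account for
residual = observed_change - explained_change  # 3.77 - (-1.92) = 5.69

share_unexplained = residual / observed_change  # about 1.51, i.e. 151 percent

print(f"Residual: {residual:.2f} percentage points")
print(f"Unexplained share: {share_unexplained:.0%}")
```

Because the covariates push the prediction the 'wrong' way, the unexplained share exceeds 100 percent of the observed change.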

The longitudinal studies are limited in the variables that are available, so Denning et al. next turn to the administrative data from the nine public universities. Looking at what explains the increase in GPA over time, they find that there is:

...a statistically significant increase of 0.019 per year in first-year GPA between 1990 and 2000. Controlling for demographic characteristics, school attended, and home zip code leave the coefficient unchanged. Including very flexible controls for SAT scores reduces the coefficient on year of entry only slightly to 0.014. We also include fixed effects for major by institution to account for the potential of changing major composition. Last, we include fixed effects for all first-semester courses and the coefficient is unchanged. We include these fixed effects to account for shifts in student course taking that may explain changes in GPA but find that courses taken cannot explain the change in GPA.

This evidence shows that rising grades cannot be meaningfully explained by demographics, preparation, courses, major, or school type. Put another way, equally prepared students in later cohorts from the same zip code, of the same gender and race, with the same initial courses, the same major, and at the same institution have higher first-year GPAs than earlier cohorts.
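The fixed-effects logic here can be illustrated with a small simulation (everything below is illustrative, not the paper's actual code or data): generate first-year GPAs with a genuine upward trend in entry year plus institution-specific grading levels, demean GPA and entry year within each institution (the 'within' transformation, which absorbs the institution fixed effects), and recover the trend:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative simulation: GPA rises by 0.019 per entry year, on top of
# institution-specific grading levels (the fixed effects).
n = 10_000
years = rng.integers(1990, 2001, size=n)       # entry cohorts 1990-2000
schools = rng.integers(0, 9, size=n)           # nine hypothetical institutions
school_effects = rng.normal(0.0, 0.3, size=9)  # institution grading levels

gpa = 2.8 + 0.019 * (years - 1990) + school_effects[schools]

def demean(x, groups):
    """Subtract each group's mean from its members (within transformation)."""
    group_means = np.bincount(groups, weights=x) / np.bincount(groups)
    return x - group_means[groups]

y = demean(gpa, schools)
t = demean(years.astype(float), schools)

# OLS slope of demeaned GPA on demeaned entry year
trend = (t @ y) / (t @ t)
print(f"Estimated trend: {trend:.3f} GPA points per year")
```

The same logic extends to the paper's richer specifications (fixed effects for major-by-institution and for first-semester courses): if the trend coefficient survives those controls, the rise in grades is not a composition effect.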

That certainly smells like grade inflation. Finally, Denning et al. look at the data from the unnamed Public Liberal Arts College, which has the advantage that there is an objective measure of student performance: in two required science courses, students sat exactly the same exam in different years. In that case:

...we control for course fixed effects, demographic characteristics, and final exam scores in these two science courses and find that a year later entry corresponds to a large and statistically significant 0.060-point increase in GPA... Students with the exact same score on the exact same final exam earned better grades in later years.

Denning et al. then go back to the original decomposition analysis, add in first-year GPA, and find that:

...the change due to observables (including first-year GPA) is 2.49 percentage points or 66 percent of the total change. The change explained by GPA alone is 3.57 or 95 percent of the observed change...

Summing up this study: almost all of the observed change in college completion rates is explained by the change in GPA over time, and that change in GPA is unrelated to student performance, as well as being unrelated to changes in student demographics, choice of major, or choice of college. This is grade inflation, writ large.

Does any of this matter? Well, as I noted earlier, it does reduce the signalling value of education. It is harder for the genuinely good students to distinguish themselves from the not-quite-as-good students, when they are all receiving the same high grades. However, Denning et al. point to some further work that might help us to understand the consequences of grade inflation better:

...future work should consider the effects of grade inflation on learning, major choice, the decision to enroll in graduate school, the skill composition of the workforce, and the college wage premium.

All of that would be welcome evidence for why we should be concerned.

[HT: Marginal Revolution, for the Denning et al. paper]
