Saturday, 29 December 2018

The leaning tower that is PISA rankings

Many governments are fixated on measurement and rankings. However, as William Bruce Cameron wrote (and which has wrongly been attributed to Albert Einstein), "Not everything that counts can be counted, and not everything that can be counted counts". And even things that can be measured and are important might not be measured in a way that is meaningful.

Let's take as an example the PISA rankings. Every three years, the OECD tests 15-year-old students around the world in reading, maths, science, and in some countries, financial literacy. They then use the results from those tests to create rankings in each subject. Here are New Zealand's 2015 results. In all three subjects (reading, maths, and science), New Zealand ranks better than the OECD average, but shows a decline since 2006. To be more specific, in 2015 New Zealand ranked 10th in reading (down from 5th in 2006), 21st in maths (down from 10th in 2006), and 12th in science (down from 7th in 2006).

How seriously should we take these rankings? That depends on how seriously the students take the PISA tests. They are low-stakes tests: the students gain nothing from doing well on them, so there may be little reason to believe the results reflect actual student learning. Of course, students in New Zealand are not the only ones who might not take these tests seriously. New Zealand's ranking would only be adversely affected if students here are more likely to not take the test seriously, or if the New Zealand students who don't take it seriously are better students, on average, than the non-serious students in other countries.

So, are New Zealand students less serious about PISA than students in other countries? In a recent NBER Working Paper (ungated version here), Pelin Akyol (Bilkent University, Turkey), Kala Krishna and Jinwen Wang (both Penn State) crunch the numbers for us. They identify non-serious students as those who left several questions unanswered (by skipping them or not finishing the test) despite having time remaining, or who spent too little time on several questions (relative to their peers). They found that:
[t]he math score of the student is negatively correlated with the probability of skipping and the probability of spending too little time. Female students... are less likely to skip or to spend too little time. Ambitious students are less likely to skip and more likely to spend too little time... students from richer countries are more likely to skip and spent too little time, though the shape is that of an inverted U with a turning point at about $43,000 for per capita GDP.
They then adjust for these non-serious students by imputing the number of correct answers those students would have gotten had they taken the test seriously. You can see the more complete results in the paper. Focusing on New Zealand, our ranking would increase from 17th to 13th if all students in all countries took the test seriously, which suggests to me that the low-stakes PISA test is underestimating New Zealand students' results. We need to be more careful about how we interpret these international education rankings based on low-stakes tests.
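To make the intuition concrete, here is a minimal simulation sketch of the flag-and-impute idea. It is not the paper's actual method (which is far more sophisticated): all the numbers (question counts, time limits, ability levels, the imputation rule of crediting skipped questions at a student's accuracy on attempted ones) are made-up assumptions purely for illustration.

```python
import random

# Hypothetical parameters, not PISA's actual test design.
N_QUESTIONS = 30
TIME_LIMIT = 60.0      # total minutes allowed (made-up)
MINUTES_PER_Q = 1.2    # time a serious student spends per question (made-up)

def simulate_student(rng, ability, serious):
    """Return (answers, minutes_used); answers holds 1, 0, or None (skipped)."""
    answers, minutes = [], 0.0
    for q in range(N_QUESTIONS):
        if not serious and q >= N_QUESTIONS // 2:
            answers.append(None)  # abandons the second half despite time remaining
        else:
            answers.append(1 if rng.random() < ability else 0)
            minutes += MINUTES_PER_Q
    return answers, minutes

def is_non_serious(answers, minutes_used, max_skipped=3):
    """Several blanks despite time remaining -> flag as non-serious."""
    skipped = sum(a is None for a in answers)
    return skipped > max_skipped and minutes_used < TIME_LIMIT

def imputed_score(answers):
    """Credit skipped questions at the student's accuracy on attempted ones."""
    attempted = [a for a in answers if a is not None]
    accuracy = sum(attempted) / len(attempted) if attempted else 0.0
    return sum(attempted) + accuracy * (len(answers) - len(attempted))

if __name__ == "__main__":
    rng = random.Random(1)
    # Two hypothetical countries with identical ability but different effort.
    raw, adjusted = {}, {}
    for country, p_serious in [("A", 0.95), ("B", 0.70)]:
        raw_scores, adj_scores = [], []
        for _ in range(1000):
            serious = rng.random() < p_serious
            answers, minutes = simulate_student(rng, ability=0.6, serious=serious)
            raw_scores.append(sum(a for a in answers if a is not None))
            if is_non_serious(answers, minutes):
                adj_scores.append(imputed_score(answers))
            else:
                adj_scores.append(raw_scores[-1])
        raw[country] = sum(raw_scores) / len(raw_scores)
        adjusted[country] = sum(adj_scores) / len(adj_scores)
    # Country B's raw mean understates its students' ability; the adjusted
    # means for A and B are much closer, echoing the paper's point.
    print(raw, adjusted)
```

The point of the toy model: with equal underlying ability, the country with more non-serious test-takers looks worse on the raw scores, and the gap shrinks once skipped questions are imputed.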

[HT: Eric Crampton at Offsetting Behaviour, back in August; also Marginal Revolution]
