(The article below was published by the Mail & Guardian on the 23rd of September 2016)
Every few years South Africa participates in international tests of reading, mathematics and science in an attempt to see what our students know and can do, and how this is changing over time. These tests are independently set and usually comparable over time. Every 4 or 5 years we test our Grade 9 students in maths and science (Timss) and our Grade 4/5 students in reading (Pirls), and every 6 years we test our Grade 6 students in reading and mathematics (Sacmeq). 2016 happens to be the year when all the results of these assessments are released to the public. The 2015 Timss and Pirls results will be released in November/December this year and the 2013 Sacmeq results were presented to Parliament earlier this month, which is what I will focus on here.
In what would have been the biggest news of the post-apartheid period – at least if it were true – Parliament heard that South Africa’s primary education system improved faster than any other education system in the history of international testing, i.e. since 1967. Our alleged improvement was more than twice as large as that of the fastest-improving country in the world, Brazil. To be specific, South African Grade 6 students’ test scores improved by more than 0.9 standard deviations between 2007 and 2013, the equivalent of an extra three full years’ worth of learning. To put this in perspective, this is the same as taking Thailand’s or Mexico’s education system and making it equal to Finland’s or Canada’s in six years. It makes for a great story or a wish from a Fairy Godmother, but not for plausible results from a psychometrically rigorous international test. Note that it is not only South Africa that experienced these colossal ‘gains’, but all Sacmeq countries, which is even more suspicious. A big part of the alleged Sacmeq improvements actually arises from different methodologies employed in 2007 and 2013, making them incomparable until they are properly equated.
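For readers curious how a gain in standard deviations translates into "years of learning", the arithmetic can be sketched as below. The 0.3-standard-deviations-per-school-year conversion factor is an assumption for illustration (a commonly used rule of thumb in the education literature, not a figure stated in this article):

```python
# Illustrative arithmetic only: converting a test-score gain expressed in
# standard deviations (SD) into an equivalent number of school years.
# ASSUMPTION: students gain roughly 0.3 SD per school year on average;
# this rule-of-thumb figure is not from the article itself.

SD_PER_SCHOOL_YEAR = 0.3  # assumed average annual learning gain, in SD

def sd_gain_to_years(gain_in_sd: float) -> float:
    """Translate a score gain in standard deviations into school years."""
    return gain_in_sd / SD_PER_SCHOOL_YEAR

# The claimed South African gain of 0.9 SD over 2007-2013:
years = sd_gain_to_years(0.9)
print(f"0.9 SD is roughly {years:.1f} years of learning")  # roughly 3 years
```

Under that assumption, the claimed 0.9 standard-deviation gain works out to roughly three extra years of learning in a six-year window, which is why the result strains belief.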
The results presented to Parliament compare data from 2007 and 2013, yet the way these results were calculated in each period was not the same, and I should know: I was appointed by Sacmeq itself earlier this year to analyse the results for the international Sacmeq report. After analysing the data I raised a number of serious technical concerns that significantly affect the comparability and validity of the findings, especially the fact that the weaker students had been excluded from the final analysis. I advised the Sacmeq Secretariat to address these concerns before any publication of the results, since publishing them as they stood would be misleading. When the Sacmeq Secretariat’s subsequent response indicated that this would not happen, I explained that I could not in good conscience continue with the analysis and resigned on technical grounds in August this year. The issues I raised have not been addressed, since the results presented to Parliament were the same as those I had identified as problematic. While this was going on I also emailed the Department flagging my concerns and cautioning against publishing the results.
The Department of Basic Education itself was shocked by the unprecedented improvements. In the presentation to Parliament they explain: “Given the significant improvements, the South African national research team requested SACMEQ to double check the results and were subsequently reassured on their accuracy.” This is simply not good enough.
The lack of comparability between 2007 and 2013 is so glaring that one doesn’t need inside knowledge of the data to see how implausible the results are. At the same time that the student reading scores soared (rising by 0.9 standard deviations), the teacher reading scores plummeted (dropping by 0.8 standard deviations), which is extremely peculiar. If we are to believe the results, by 2013 basically all South African students could read, with illiteracy rates dropping from 27% in 2007 to 3% in 2013. This is totally at odds with the other main international test we do, Pirls in 2011, which showed that 29% of Grade 4 students were reading-illiterate and 58% could not read for meaning, confirming a host of smaller studies showing the same thing.
If we dig a little deeper, the Department’s presentation to Parliament apparently showed that the biggest improvers were Limpopo and the Eastern Cape. Go figure. These are the very same provinces that were placed under administration (Section 100) in 2011 because they were so utterly dysfunctional. To use the Minister’s own words, these are the education system’s “pockets of disaster” whose 2015 matric results were a “national catastrophe.” Yet Sacmeq would have us believe that illiteracy in Limpopo has been totally eradicated, dropping from 49% in 2007 to 5% in 2013. In stark contrast, our other major international test (Prepirls) showed that of the more than 2900 Grade 4 students that were tested in Limpopo in 2011, 50% were reading-illiterate and 83% could not read for meaning.
For those unfamiliar with the norms of psychometrics and testing, it is perhaps helpful to explain by analogy. The scale of the ‘improvements’ in test scores shown in SACMEQ 2013 is tantamount to saying that Usain Bolt and all the other athletes in the race ran the 100m in 5 seconds without ever wondering whether there was something wrong with the stopwatch. The sad thing about all of this is that it does seem that South Africa is really improving – other reliable evidence points to this – but not nearly as fast as the SACMEQ IV test scores would have us believe. According to the presentation, the Sacmeq questionnaire data also encouragingly shows that students’ access to their own textbooks increased substantially over the period from 45% to 66% for reading textbooks and from 36% to 66% for maths textbooks. This is good news.
In the latest turn of events the Department explained that apparently the results presented to Parliament were in fact “preliminary”, that an “extensive verification process” is currently underway, and that it is “fully aware of the issues raised in this regard.” Yet why then did it choose to go ahead and present questionable results to Parliament? Apparently researchers – AKA me – have “mislead the public” and my motives are “unclear.” There is nothing unclear about my motives; there is a major technical concern and the public should not be misled into trusting the results presented to Parliament. There is also no uncertainty about whether the Sacmeq IV results should have been presented to Parliament. They should not have been presented while there is still so much uncertainty around the comparability of the results, end of story. The Department has been aware of the serious technical concerns around the results for some time now, since I emailed a number of members of the Department’s own research team many months ago drawing attention to these problems and cautioning against publishing any results until they could be rectified.
What I do not understand is why the Department would undermine its own technical credibility by presenting obviously questionable results to Parliament. Personally I would not be surprised if the Sacmeq data – once made comparable – did show an improvement in line with those of other studies. Soon we will also have the 2015 Pirls results as another data point to verify what is going on. In South African education there is probably already a good story to tell; why muddy the waters by reporting such obviously impossible improvements based on dodgy data? The Department and Sacmeq must make sure the results of Sacmeq 2007 and 2013 are strictly comparable before reporting any further results and causing additional confusion.