(The article below first appeared in the Sunday Times on the 10th of January 2016)
Like so many things in South Africa, this year’s matric results are a paradox of good policies badly implemented. This time it was the Minister’s bold ‘promotion policy’ that led to 21% more learners writing matric (644,536 this year compared to 532,860 last year). The policy limits the number of times learners can repeat a grade to once every three years, meaning fewer students drop out and more are ‘promoted’ instead. While her decisive action has led to increased efficiency and improved access, it has also inadvertently opened a huge crack in the matric standardisation process, one that is only now becoming apparent. The fact that the Department did not properly identify all progressed learners, and that Umalusi did not (and perhaps could not) take account of all progressed learners in their standardisation process, calls into question the massive upward adjustments in marks that took place behind the scenes.
As usual, some commentators have myopically focussed on the drop in the matric pass rate, from 76% (2014) to 71% (2015), as if this, in and of itself, were a meaningful indication of anything. It isn’t. Nor does it signal a decline in quality or harder exams. Yes, the matric pass rate went down, but the number of learners passing went up. In fact, the real question might not be why the matric pass rate dropped, but why it didn’t drop further. Comparing the media statement from Umalusi with the technical report from the Department, the answer is quite clear. The decision was made to raise the raw marks across the board, from Maths and Physical Science to Life Science, Maths Literacy, History, Accounting, Geography and 24 other subjects. Umalusi themselves make a point of emphasising that this was an “unprecedented set of adjustments”. When the Maths Literacy pass rate is adjusted from 38% to the final (and publicly reported) 71%, this is most certainly unprecedented and, I would argue, unwarranted. Was the test really so much more difficult than in previous years? (Greater difficulty is the only reason one is allowed to adjust the marks at all.) Why did the internal and external moderators not pick up this huge increase in difficulty? Is it not more plausible that the massive drop in pre-adjusted performance was actually due to the additional 112,000 weaker learners who would otherwise have dropped out? If so, Umalusi shouldn’t have adjusted.
This is not to say that the Minister was wrong to introduce the promotion policy. Quite the opposite: she was heeding local and international research which shows that excessive repetition is costly, inefficient and of no educational benefit to the learner. Yes, we do need to find ways of preventing and remediating the underlying learning deficits, but rooting out wasteful repetition in the meantime is prudent and wise. One positive effect of this policy and the extra-large class of 2015 was that many more learners took and passed key subjects, with about 52,000 extra matric passes, 9,000 extra maths passes and 15,500 extra bachelor passes.
Both Umalusi and the Department claim that there were only 65,671 progressed learners. Yet there were an extra 111,676 matrics this year. So where did the other 46,005 extra learners come from? The answer is twofold. First, there was a big policy change preventing schools from failing learners multiple times, encouraging them instead to promote weak learners and push them into matric. Second, the way provinces record and report who counts as a progressed learner is highly dubious and varies by province and district. So, although we have approximately 66,000 ‘officially’ progressed learners, we also have 46,000 ‘quasi-progressed’ learners (what Umalusi calls ‘borderline candidates’).
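The arithmetic behind these cohort figures can be checked directly. A minimal sketch, using only the counts reported above:

```python
# Cohort sizes as reported: candidates who wrote matric in 2014 and 2015,
# and the 'officially' progressed learners claimed by Umalusi/the Department.
wrote_2014 = 532_860
wrote_2015 = 644_536
progressed_official = 65_671

extra_candidates = wrote_2015 - wrote_2014                  # 111,676 extra matrics
quasi_progressed = extra_candidates - progressed_official   # 46,005 unaccounted for
growth = extra_candidates / wrote_2014                      # ~21% year-on-year growth

print(f"Extra candidates: {extra_candidates:,}")
print(f"Quasi-progressed: {quasi_progressed:,}")
print(f"Cohort growth:    {growth:.1%}")
```

The 46,005 ‘quasi-progressed’ learners are simply the residual: the growth in the cohort that the official progressed-learner count cannot explain.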
All of this matters because it influences the decision of whether to adjust the matric results and by how much. Umalusi is only ever meant to adjust the marks up or down if they believe the exam was harder or easier than in previous years. The core assumption in this standardisation process is that the different matric cohorts (the 2013, 2014 and 2015 matrics) are of equal ability. Thus, any differences between the years can only be because the paper was easier or harder. And this is where the crack emerges. There is simply no way that the 2015 distribution of 645,000 matrics (including progressed and quasi-progressed learners) is as strong as the distribution of 533,000 learners in 2014. The reason the 2015 cohort did so much worse on the raw scores was the extra 112,000 weaker learners, not harder tests. We know that Umalusi did not take this into account because there is no way of identifying the 46,000 quasi-progressed learners. In Umalusi’s defence, they could not have excluded them even if they had wanted to, because provinces did not record them. But it does not seem that Umalusi excluded these 112,000 (or even the 66,000 officially progressed) learners when standardising the 2014 and 2015 distributions. This is illogical.
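The composition effect described above is easy to illustrate with a toy calculation. In this sketch only the cohort sizes come from the article; the two sub-group pass rates are invented purely for illustration. Even with a paper of identical difficulty, mixing in 112,000 weaker candidates drags the raw pass rate down sharply:

```python
# Hypothetical illustration of the composition (dilution) effect.
# Suppose the 'comparable' 533k-strong part of the 2015 cohort would pass
# a given subject at 60% raw, while the extra 112k progressed and
# quasi-progressed learners pass at only 15% raw (both rates assumed).
comparable = 533_000
extra_weak = 112_000
pass_rate_comparable = 0.60   # assumed, for illustration only
pass_rate_extra = 0.15        # assumed: much weaker learners

passes = comparable * pass_rate_comparable + extra_weak * pass_rate_extra
blended = passes / (comparable + extra_weak)

# The blended raw pass rate falls to roughly 52% even though the paper
# is unchanged -- the drop comes purely from who wrote it.
print(f"Blended raw pass rate: {blended:.1%}")
```

Under the standardisation assumption of equal-ability cohorts, a drop like this would be misread as a harder paper and ‘corrected’ upwards, which is precisely the worry.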
In an unusual change from previous media statements, this year Umalusi included the raw failure rates of subjects (i.e. before any adjustments). These can be compared to the marks in the technical report issued by the Department. The only differences between the two sets of figures are the Umalusi adjustment, a small change due to school-based assessments, and a small language compensation for second-language learners (an extra 4 percentage points). When I refer to ‘raw’ and ‘final-adjusted’ pass rates I mean before and after these are accounted for. The three subjects I will focus on here are Maths Literacy, Geography and Business Studies, since they all saw big increases in enrolment, which suggests these were the subjects taken by the progressed and quasi-progressed learners. The differences between the raw and final-adjusted pass rates are large for Geography (up from 66% to 77%), for Business Studies (up from 54% to 76%) and especially for Maths Literacy (from a shockingly low 38% to 71% after adjustments!). For a national assessment these are incredibly large adjustments.
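Put in percentage-point terms, using only the raw and final-adjusted pass rates reported above, the size of these adjustments is stark:

```python
# Raw vs final-adjusted 2015 pass rates (percent), as reported.
subjects = {
    "Maths Literacy":   (38, 71),
    "Business Studies": (54, 76),
    "Geography":        (66, 77),
}
for name, (raw, final) in subjects.items():
    print(f"{name:16s} raw {raw}% -> final {final}%  (+{final - raw} pp)")
```

An upward shift of 33 percentage points in Maths Literacy, the single largest subject by enrolment, is what makes ‘unprecedented’ an understatement here.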
This could only be justified if the 2015 exams were extraordinarily more difficult than the 2014 ones. I simply do not buy it. The internal and external moderators all agreed that these exams were set at the appropriate level; to warrant adjustments of this magnitude they would have had to be way out in their judgements. Why are we looking for alternative explanations for the big drop in raw marks when one is staring us in the face? The most logical and obvious reason for the drop is the inclusion of an extra 112,000 weaker learners in 2015; paper difficulty is marginal by comparison. In Maths Literacy alone there were 76,791 extra candidates in 2015. Where did these learners come from? It is clear that these are the weaker progressed and borderline candidates, and that they are the main reason why the raw marks dropped so much. If so, then we cannot simply adjust the raw marks upwards, as was done this year.
The Umalusi standardisation process is necessary, and probably the best we can do when different papers are written year on year, but Umalusi needs to clarify what happened here and, in future, be more transparent about its standardisation process. Unfortunately, no amount of standardisation can solve the biggest problem in our education system, which is that most children attending the poorest 75% of schools do not learn to read for meaning by the end of Grade 3 and are forever behind. Indeed, matric starts in Grade 1.
Dr Nic Spaull is an education economist at Stellenbosch University. He can be found on Twitter @NicSpaull and his work can be found at nicspaull.com