OK so I got a little frustrated with explaining the whole matric standardisation vibe to a million people, so here’s the deal once and for all…
Soon after the matric results jamboree ended and people went back to work, there were a few unanswered questions about how the matric exams of 2015 were standardised by Umalusi. Already in September of 2015 I was asking some Departmental officials whether the Minister was going to report the matric results of progressed and non-progressed learners separately (which is what I would’ve done). The logic being that the public (wrongly) view the matric exam as the main indicator of the health of the education system and that if progressed learners were lumped with non-progressed learners there would be excess pressure to ensure that the matric pass rate did not drop too much. But this is not the approach that the DBE took. I also emailed someone in Umalusi to ask how they were planning on doing their standardisation given that the two groups of matrics (2014 and 2015) were so different, with the latter having many more students due to the progressed learner policy.
After comparing the publicly available Umalusi statement and the publicly available DBE NSC Technical Report 2015, it became possible to see how large the Umalusi adjustments were for 2014 and 2015 for nine subjects – see table below.
I won’t rehash my full argument here (if you’re interested read my article Matric Cracks Starting to Show). The yellow highlighted subjects are those that had big jumps in enrolments, for example there were an extra 76,791 learners taking maths-literacy in 2015 compared to 2014. Notice that the pass rates increased substantially between the raw pass rate and the pass rate after (1) Umalusi adjustment, (2) language compensation, (3) school-based assessment. The gist of that article was to say that the progressed learners of 2015 were not properly accounted for in the Umalusi standardisation process and that the most logical reason for a drop in performance of the raw marks was the inclusion of extra (weaker) learners, i.e. progressed learners, rather than a much more difficult exam.
Subsequent to my article, the CEO of Umalusi, Dr Rakometsi, wrote a reply titled “Promoted pupils had no big effect on matric rates” and clarified a number of issues. For the sake of brevity I will summarize the most salient points here:
- Umalusi was told by the DBE that there were only 66,088 progressed learners
- If one excludes these 66,088 progressed learners then the failure rate within subjects increased by between 1% and 4%.
- He confirmed that “the pass rate on raw marks is at 38% for maths-literacy”
- The maximum adjustment that Umalusi can make is 10 percentage points, which was applied for mathematics literacy “because the paper turned out to be more difficult in 2015 compared to previous years. As a result of this maximum adjustment, 27% of learners who scored between 20-29% obtained a pass”
- One paragraph in the article is particularly important and so I quote it verbatim:
“The report indicates that the impact of progressed learners to the individual subjects was minimal. As a result, there was no basis to standardise the results of the progressed learners separately. What we call progressed learners is actually only the KNOWN progressed learners. The argument that there were more is an assumption. Umalusi can only work on the information before it, not on assumptions and extrapolations.”
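Mechanically, the kind of capped upward shift described above can be sketched as follows. This is a hypothetical illustration of a capped mark adjustment, not Umalusi’s actual standardisation procedure; the function name and clamping logic are my own, grounded only in the 10-percentage-point cap and the 30% subject pass mark:

```python
def adjust_mark(raw, shift, cap=10):
    """Apply a standardisation shift to a raw percentage mark,
    clamped to at most `cap` percentage points in either direction
    (illustrative only -- not Umalusi's actual method)."""
    applied = max(-cap, min(cap, shift))
    return min(100, max(0, raw + applied))

# A learner on 28% raw with the maximum +10 adjustment lands on 38%,
# comfortably clearing a 30% subject pass threshold.
print(adjust_mark(28, 10))   # 38

# A proposed shift larger than the cap is clamped to +10.
print(adjust_mark(25, 14))   # 35
```

This is how a learner in the 20-29% raw band can end up with a pass after the maximum adjustment, as Dr Rakometsi’s figures confirm happened for 27% of that band in maths-literacy.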
From the above we can draw two important conclusions:
(1) The 66,088 progressed learners were not excluded when the results were standardised relative to earlier years, despite Umalusi knowing that these were weaker students. This seems totally bizarre. We know that these are weaker learners, so why would we include them in the norm-referencing curve and compare against earlier years where these students did not exist? Even if they only contributed a drop in the pass rate of between 1% and 4%, why were they not excluded?
(2) (the most important conclusion) Umalusi only looked at the 66,088 “officially” progressed learners and ignored all the other information suggesting that there might be additional weaker learners who were effectively progressed but never captured as such – what I called “quasi-progressed” learners in my article. We know that provinces do not record who is a progressed learner with the same accuracy.
Perhaps the most telling evidence is just to ask how many extra matrics there were in 2015 compared to 2014? The answer is 111,676 (644,536 in 2015 compared to 532,860 in 2014). But if there were only 66,088 progressed learners, where did the remaining 45,588 learners come from?
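The arithmetic behind that gap is simple enough to check directly, using only the figures quoted above:

```python
matrics_2015 = 644_536
matrics_2014 = 532_860
official_progressed = 66_088

# Extra matrics in 2015 relative to 2014.
extra = matrics_2015 - matrics_2014
print(extra)        # 111676

# Learners not accounted for by the official progressed count.
unexplained = extra - official_progressed
print(unexplained)  # 45588
```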
Some have suggested that it comes from a change of age policy in 1998, but that led to a small cohort in 2011/2012, not now, as Stephen Taylor has shown using EMIS data. The table below (taken from here) shows the enrolment numbers by year and grade. What we can see is that the matric class of 2011 was always going to be small: there were 534,498 learners in matric in 2011 but only 930,019 learners in grade 8 four years earlier. The matric class of 2015, by contrast (687,230 learners according to this table), is unexplainably big. The grade 8 cohort of 2011 had 1,008,110 learners, only about 7,000 more than the grade 8 class of 2010. So how are we to explain the massive difference we see when we compare the 2014 and 2015 matrics (111,676)?
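One way to make the cohort comparison concrete is to compute what share of each grade 8 cohort reached matric four years later, using the enrolment figures quoted above. The rough “survival rate” framing is my own; the input numbers are from the table:

```python
# Matric enrolment divided by grade 8 enrolment four years earlier.
survival_2011 = 534_498 / 930_019    # matric class of 2011 vs grade 8 of 2007
survival_2015 = 687_230 / 1_008_110  # matric class of 2015 vs grade 8 of 2011

print(f"{survival_2011:.1%}")  # 57.5%
print(f"{survival_2015:.1%}")  # 68.2%
```

Since the grade 8 cohorts feeding the 2014 and 2015 matric classes were only about 7,000 learners apart, a jump of this size in the share reaching matric is hard to explain by cohort size alone, which is consistent with weaker learners being pushed through rather than repeating.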
In my mind the answer is straightforward – the extra learners in matric 2015 are the direct result of trying to decrease grade repetition by “promoting” weaker learners into higher grades rather than failing them. If this is correct then the full 111,676 learners needed to be excluded when standardising relative to earlier years. Umalusi will argue (and has argued) that this was not possible, and that they did not even try to take quasi-progressed learners into account.
So those of you who’ve read this far might be asking “So who cares? Why is this even important?” The answer is that it matters a lot for universities and the labour market if Umalusi gets this wrong. If the standardisation process assumed that a drop in the raw marks was due only to an increase in test difficulty (which is what Umalusi did), when a more plausible explanation is that an extra 21% of weaker learners were included, then the real value (and signal) of a basic matric is actually declining over time.
On page 171 of the 2014 Ministerial Task Team Review of the NSC we read the specifications of when Umalusi can and cannot adjust matric marks:
“Reasons for differences may include: cohort differences, changes in curriculum, changes in setters of the examination papers, disruptions in the schooling processes and security leakages. In the absence of evidence of such possible reasons, it is then generally accepted that the differences are due to deviations in the standards of the examination or marking and therefore marks are adjusted to compensate for the deviations (Umalusi 2007a, 29).” [emphasis added]
Personally I do think that some of these tests increased in difficulty, but it is ludicrous to think that adding 21% more students who are KNOWN to be weaker students would not decrease the marks. Also this is the first year where basically all adjustments were upward. There was not a single downward adjustment. Coincidence much?
Just because Umalusi could not identify the quasi-progressed learners doesn’t mean they can just ignore them. Hence the cartoon above. It would seem that Umalusi has essentially said “Yes we can see that the cohort is much bigger. Yes we can see that there was a clear policy intervention to progress weaker learners. Yes we can see that the official numbers of progressed learners do not match the full increase in the size of the cohort. But we are going to pretend that there were only 66,088 progressed learners. We refuse to accept any other reality because we can’t do anything about it anyway, so what’s the use in knowing?”
The fact that the marks were pushed up primarily at the bottom (probably too much) means that students passed in 2015 who would not have passed in 2012. It means students have qualified to study basic degrees in 2015 who would not have qualified if they wrote in 2012. So, if I’m right, any of the following could result:
- There will be a flood of applications for degrees and diplomas that require the lowest levels of matric points. Thousands more students will have ‘met the criteria’ and the universities will not have anticipated this. In fact I have already heard that UNISA has had an unprecedented increase in applications, Edgewood at UKZN has been swamped, and Damelin has seen a huge spike in applications. If I’m right then these students should never have qualified for university and will fail. They might incur debt, move province, and make life decisions based on an incorrect signal. (Let us not speculate on how the surge in applications for NSFAS will stoke the FeesMustFall fire and mean that there is less to go around and more angry students with hopes and dreams that the State cannot fulfil.)
- Universities will see students that are not even remotely prepared for higher academic study and will have to increase their access programs and expect higher failure rates.
- As a result of the above, universities will increase the matric-point requirements for entry into their programs for 2017 (particularly programs like the BA, BSocSci, BEd, etc.). They will also start to rely more on the National Benchmark Tests in their selection criteria. [Sidebar: researchers should compare NBT results with matric results in 2013, 2014 and 2015 to see if there are any differences that might be attributable to wrongly-boosted matric marks.]
- The gap between the earnings of those with a matric and those with a matric+degree will grow (note it is already large, see graph below). This is largely because the adjustment was primarily at the bottom meaning there are many more students with a low-level-matric who have, in effect, lower levels of knowledge and skill than low-level-matrics of 5 years ago.
(Source: Hendrik van Broekhuizen, 2013, here)
As I said, none of the above precludes the possibility that the tests were more difficult in 2015 (although this is still speculation). I am only saying that there is no way, in my mind, that including an extra hundred thousand weaker learners played no part in the drop in the raw marks. And, in essence, that is what Umalusi is arguing.
Dr Rakometsi and I will be discussing this on the Redi Tlhabi show at 10am tomorrow (25 Jan 2016). It should be an interesting discussion 🙂