Serious technical concerns about SACMEQ IV results presented to parliament

On the 12th of September the Department of Basic Education (DBE) presented a briefing to the Basic Education Portfolio Committee on the SACMEQ IV (2013) results. This was the first time that these results were presented in a public forum, and the first time that the national average SACMEQ IV reading score (558) and the national average SACMEQ IV mathematics score (587) were presented. The presentation is publicly available here.

I have an intimate knowledge of the SACMEQ IV data given that I was appointed by the SACMEQ Secretariat to analyse the SACMEQ IV data for the international report (date of appointment: 19 April 2016, Purchase Order Number UB-P02016001112, University of Botswana). After analysing the data I raised a number of serious technical concerns that significantly affect the comparability and validity of the findings, and advised the SACMEQ Secretariat to address these concerns before any publication of the results (letter dated 26 May 2016). I also emailed the SACMEQ Technical Advisory Committee (9 June 2016) outlining the technical issues. Only one member responded, and indicated that the item-response-theory (IRT) analysis should be redone with two independent verification checks. Based on the subsequent response from the SACMEQ Secretariat indicating that this would not happen, I explained that I could not in good conscience continue with the analysis and chose to resign on technical grounds (resignation letter dated 7 August 2016).

The principal grounds for my technical concerns and subsequent resignation were the non-comparability of the results between SACMEQ III and SACMEQ IV, arising from the different methodologies employed when calculating test scores in the two rounds, and particularly the fact that weaker students had been excluded from the final results in the process. This does not seem to have been addressed, since the results presented to Parliament were the same as those that I identified as problematic.

Unfortunately, the contract that I signed with SACMEQ prevents me from publishing any results based on that data until the international report has been publicly released, at which time I will provide a full account of my technical concerns and reasons for the non-comparability. I have also subsequently deleted the data at SACMEQ’s request.

The Department of Basic Education is already aware of all of my concerns since I emailed a number of members of the Department’s research team drawing attention to these problems and cautioning against publishing any results until they could be rectified. It would seem that the Department has chosen to push ahead and report these problematic results to Parliament in spite of these numerous concerns.

Comments on the SACMEQ IV presentation to parliament:

  • The gains in the SACMEQ reading and mathematics scores between 2007 and 2013 are so unbelievably large that they would make South Africa the fastest improving education system in the world, together with Country 2 and Country 7 (the names of the other countries were excluded in the presentation to parliament). These countries improved by more than 0.9 standard deviations, or 0.13 standard deviations per year. The improvement from 495 to 587 is an 18.5% improvement in test scores (or 2.7% improvement per year). A 2012 study looking at how fast education systems can improve points to Brazil as the fastest improving education system in any testing system. Yet the SACMEQ IV results presented to parliament would have us believe that South Africa (and Country 2 and Country 7) improved at least twice as fast as Brazil, the fastest improving country. This is extremely unlikely. (It is also unlikely that teacher test scores have dropped as drastically as reported.) We know from other data (such as TIMSS 2003 and 2011) that South Africa has improved in Grade 9 mathematics, but this improvement was only half as large as that reported by SACMEQ IV. South Africa’s scores may well have improved between 2007 and 2013, but we cannot say whether they have improved or declined until the results are actually comparable.
  • The fact that teacher test scores plummeted at the same time that student test scores soared should already make us very curious about the technical procedures that might lead to such a situation.
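The magnitudes in the first bullet are simple to verify. Below is a minimal back-of-envelope sketch, assuming the SACMEQ scale (standardised to a mean of 500 and a standard deviation of 100) and the published South African mathematics means of 495 (SACMEQ III, 2007) and 587 (SACMEQ IV, 2013); small differences from the figures in the text are rounding:

```python
# Rough sanity check of the improvement magnitudes discussed above.
# Assumptions (flagged in the lead-in): SACMEQ scale SD = 100;
# South African mathematics means of 495 (2007) and 587 (2013).
SCALE_SD = 100

sacmeq3_maths = 495  # SACMEQ III (2007)
sacmeq4_maths = 587  # SACMEQ IV (2013)

gain_points = sacmeq4_maths - sacmeq3_maths    # gain in scale points
gain_sd = gain_points / SCALE_SD               # gain in standard deviations
gain_pct = 100 * gain_points / sacmeq3_maths   # relative improvement

print(f"{gain_points} points = {gain_sd:.2f} SD = {gain_pct:.1f}% improvement")
# → 92 points = 0.92 SD = 18.6% improvement
```

A gain of roughly 0.9 of a standard deviation over a six-year period is the quantity being compared against Brazil's trajectory in the 2012 study cited above.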

I think the best way forward is, first, for the Department of Basic Education to issue a statement explaining whether it believes the SACMEQ III and IV results are comparable, and why, and whether these results were based on an earlier version of the data, one which has subsequently changed; and second, for us to wait for the SACMEQ IV technical report to be released by SACMEQ so that the procedures, methods and assumptions underlying the SACMEQ IV results can be scrutinised by researchers around the world. This is the reason that the TIMSS, PIRLS and PISA results are only released at the same time as the full international report.

SACMEQ is an extremely important indicator of changes in reading and mathematics achievement over time. Its reputation and technical credibility must be upheld for it to retain its position as the main cross-national testing system in the region. To do so, the methods and analysis must be above reproach and open to scrutiny by independent researchers.


