Shaky data skews literacy results (M&G article on SACMEQ IV)


(The article below was published by the Mail & Guardian on the 23rd of September 2016)

Every few years South Africa participates in international tests of reading, mathematics and science in an attempt to see what our students know and can do, and how this is changing over time. These tests are independently set and usually comparable over time. Every 4 or 5 years we test our Grade 9 students in maths and science (Timss) and our Grade 4/5 students in reading (Pirls), and every 6 years we test our Grade 6 students in reading and mathematics (Sacmeq). 2016 happens to be the year when all the results of these assessments are released to the public. The 2015 Timss and Pirls results will be released in November/December this year and the 2013 Sacmeq results were presented to Parliament earlier this month, which is what I will focus on here.

In what would have been the biggest news of the post-apartheid period – at least if it were true – Parliament heard that South Africa’s primary education system improved faster than any other education system in the history of international testing, i.e. since 1967. Our alleged improvement was more than twice as large as that of the fastest improving country in the world, Brazil. To be specific, South African Grade 6 students’ test scores improved by more than 0.9 standard deviations between 2007 and 2013, the equivalent of an extra three full years’ worth of learning. To put this in perspective, this is the same as taking Thailand’s or Mexico’s education system and making it equal to Finland’s or Canada’s in 6 years. It makes for a great story or a wish from a Fairy Godmother, but not for plausible results from a psychometrically rigorous international test. Note that it is not only South Africa that experienced these colossal ‘gains’, but all Sacmeq countries, which is even more suspicious. A big part of the alleged Sacmeq improvements actually arises from the different methodologies employed in 2007 and 2013, making the two waves incomparable until they are properly equated.
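To make the arithmetic behind these figures concrete, the short sketch below works through the conversion from scale points to standard deviations and to ‘years of learning’. Two assumptions are mine, for illustration only: that the SACMEQ scale has a standard deviation of roughly 100 points, and that one year of learning is worth roughly 0.3 standard deviations (a common rule of thumb, not a SACMEQ figure). The 495 and 587 scores are the Grade 6 mathematics averages discussed further below.

```python
# Back-of-the-envelope sketch of the reported SACMEQ gain.
# Assumptions (mine, for illustration): scale SD ~ 100 points;
# one year of learning ~ 0.3 standard deviations.
score_2007 = 495   # SA Grade 6 mathematics, SACMEQ III
score_2013 = 587   # SA Grade 6 mathematics, SACMEQ IV (as presented to Parliament)

gain_points = score_2013 - score_2007      # 92 points
gain_sd = gain_points / 100                # roughly 0.9 standard deviations
years_of_learning = gain_sd / 0.3          # roughly 3 'years' of learning

print(f"Gain: {gain_points} points = {gain_sd:.2f} SD "
      f"= about {years_of_learning:.1f} years of learning")
```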

The results presented to Parliament compare data from 2007 and 2013, yet the way these results were calculated in each period was not the same, and I should know: I was appointed by Sacmeq itself earlier this year to analyse the results for the international Sacmeq report. After analysing the data I raised a number of serious technical concerns that significantly affect the comparability and validity of the findings, especially the fact that the weaker students had been excluded from the final analysis. I advised the Sacmeq Secretariat to address these concerns before any publication of the results, since publishing them as they stood would be misleading. Based on the subsequent response from the Sacmeq Secretariat indicating that this would not happen, I explained that I could not in good conscience continue with the analysis and chose to resign on technical grounds in August this year. The issues I raised have not been addressed, since the results presented to Parliament were the same as those that I identified as problematic. While this was going on I also emailed the Department flagging my concerns and cautioning against publishing the results.

The Department of Basic Education itself was shocked by the unprecedented improvements. In the presentation to Parliament they explain: “Given the significant improvements, the South African national research team requested SACMEQ to double check the results and were subsequently reassured on their accuracy.” This is simply not good enough.

The lack of comparability between 2007 and 2013 is so glaringly obvious one doesn’t need inside knowledge of the data to see how implausible the results are. At the same time that the student reading scores soared (rising by 0.9 standard deviations), the teacher reading scores plummeted (dropping by 0.8 standard deviations), which is extremely peculiar. If we are to believe the results, by 2013 basically all South African students could read, with illiteracy rates dropping from 27% in 2007 to 3% in 2013. This is totally at odds with the other main international test we do, Pirls in 2011, which showed that 29% of Grade 4 students were reading-illiterate and 58% could not read for meaning, confirming a host of smaller studies showing the same thing.

If we dig a little deeper, the Department’s presentation to Parliament apparently showed that the biggest improvers were Limpopo and the Eastern Cape. Go figure. These are the very same provinces that were placed under administration (Section 100) in 2011 because they were so utterly dysfunctional. To use the Minister’s own words, these are the education system’s “pockets of disaster” whose 2015 matric results were a “national catastrophe.” Yet Sacmeq would have us believe that illiteracy in Limpopo has been totally eradicated, dropping from 49% in 2007 to 5% in 2013. In stark contrast, our other major international test (Prepirls) showed that of the more than 2900 Grade 4 students that were tested in Limpopo in 2011, 50% were reading-illiterate and 83% could not read for meaning.

For those unfamiliar with the norms of psychometrics and testing, it is perhaps helpful to explain by analogy. The scale of the ‘improvements’ in test scores shown in SACMEQ 2013 is tantamount to saying that Usain Bolt and all the other athletes in the race ran the 100m in 5 seconds without ever wondering whether there was something wrong with the stopwatch. The sad thing about all of this is that it does seem that South Africa is really improving – other reliable evidence points to this – but not nearly as fast as the SACMEQ IV test scores would have us believe. According to the presentation, the Sacmeq questionnaire data also encouragingly shows that students’ access to their own textbooks increased substantially over the period from 45% to 66% for reading textbooks and from 36% to 66% for maths textbooks. This is good news.

In the latest turn of events the Department explained that apparently the results presented to Parliament were in fact “preliminary”, that an “extensive verification process” is currently underway, and that it is “fully aware of the issues raised in this regard.” Yet why then did it choose to go ahead and present questionable results to Parliament? Apparently researchers – AKA me – have “mislead the public” and my motives are “unclear.” There is nothing unclear about my motives: there is a major technical concern and the public should not be misled into trusting the results presented to Parliament. There is also no uncertainty about whether the Sacmeq IV results should have been presented to Parliament. They should not have been, while there is still so much uncertainty around the comparability of the results, end of story. The Department has been aware of the serious technical concerns around the results for some time now, since I emailed a number of members of the Department’s own research team many months ago drawing attention to these problems and cautioning against publishing any results until they could be rectified.

What I do not understand is why the Department would undermine its own technical credibility by presenting obviously questionable results to Parliament. Personally I would not be surprised if the Sacmeq data – once comparable – did show an improvement in line with those of other studies. Soon we will also have the 2015 Pirls results as another data point to verify what is going on. In South African education there is probably already a good story to tell; why muddy the waters by reporting such obviously impossible improvements based on dodgy data? The Department and Sacmeq must make sure the results of Sacmeq 2007 and 2013 are strictly comparable before reporting any further results and causing additional confusion.

 

Serious technical concerns about SACMEQ IV results presented to parliament

 On the 12th of September the Department of Basic Education (DBE) presented a briefing to the Basic Education Portfolio Committee on the SACMEQ IV (2013) results. This was the first time that these results were presented in a public forum, and the first time that the national average SACMEQ IV reading score (558) and the national average SACMEQ IV mathematics score (587) were presented. The presentation is publicly available here.

I have an intimate knowledge of the SACMEQ IV data given that I was appointed by the SACMEQ Secretariat to analyse the SACMEQ IV data for the international report (date of appointment: 19 April 2016, Purchase Order Number UB-P02016001112, University of Botswana). After analysing the data I raised a number of serious technical concerns arising from the data that significantly affect the comparability and validity of the findings and advised the SACMEQ Secretariat to address these concerns before any publication of the results (letter dated 26 May 2016). I also emailed the SACMEQ Technical Advisory Committee (9 June 2016) outlining the technical issues. Only one member responded and indicated that the item-response-theory (IRT) analysis should be redone with two independent verification checks. Based on the subsequent response from the SACMEQ Secretariat indicating that this would not happen I explained that I could not in good conscience continue with the analysis and chose to resign on technical grounds (resignation letter dated 7 August 2016).

The principal ground for my technical concerns and subsequent resignation was the non-comparability of the results between SACMEQ III and SACMEQ IV, owing to the different methodologies employed when calculating the test scores in each wave, and particularly the fact that weaker students had been excluded from the final results in the process. This does not seem to have been addressed, since the results presented to Parliament were the same as those that I identified as problematic.
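To see why excluding the weaker students from one wave breaks comparability, the toy simulation below draws two waves from an identical ability distribution and then drops the bottom 20% from the second wave only. The 20% exclusion rate and the normal-distribution assumption are mine, for illustration; this is not SACMEQ’s actual procedure, only a sketch of the mechanism.

```python
# Toy illustration of selection bias: no true improvement, yet the mean rises.
import numpy as np

rng = np.random.default_rng(0)

# Two waves drawn from the SAME ability distribution (scaled like SACMEQ:
# mean 500, SD 100), i.e. zero real change between waves.
wave_2007 = rng.normal(500, 100, 100_000)
wave_2013 = rng.normal(500, 100, 100_000)

# Suppose the weakest 20% of students are excluded from the later wave only.
cutoff = np.percentile(wave_2013, 20)
wave_2013_trimmed = wave_2013[wave_2013 > cutoff]

print(f"2007 mean:                 {wave_2007.mean():.1f}")
print(f"2013 mean after exclusion: {wave_2013_trimmed.mean():.1f}")
# The apparent 'gain' (about a third of a standard deviation in this toy case)
# is produced entirely by the exclusion, not by any improvement in learning.
```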

Unfortunately, the contract I signed with SACMEQ prevents me from publishing any results based on that data until the international report has been publicly released, at which time I will provide a full account of my technical concerns and the reasons for the non-comparability. I have also subsequently deleted the data at SACMEQ’s request.

The Department of Basic Education is already aware of all of my concerns since I emailed a number of members of the Department’s research team drawing attention to these problems and cautioning against publishing any results until they could be rectified. It would seem that the Department has chosen to push ahead and report these problematic results to Parliament in spite of these numerous concerns.

Comments on the SACMEQ IV presentation to parliament:

  • The gains in the SACMEQ reading and mathematics scores between 2007 and 2013 are so unbelievably large that they would make South Africa the fastest improving education system in the world, together with Country 2 and Country 7 (the names of the other countries were excluded in the presentation to parliament). These countries improved by more than 0.9 standard deviations, or 0.13 standard deviations per year. The improvement from 495 to 587 is an 18.5% improvement in test scores (or 2.7% per year). A 2012 study looking at how fast education systems can improve points to Brazil as the fastest improving education system in any testing system. Yet the SACMEQ IV results presented to parliament would have us believe that South Africa (and Country 2 and Country 7) improved at least twice as fast as Brazil, the fastest improving country. This is extremely, extremely unlikely. (It is also unlikely that the teacher test scores have dropped so drastically.) We know from other data (such as TIMSS 2003 and 2011) that South Africa has improved in Grade 9 mathematics, but this improvement was only half as large as that reported by SACMEQ IV. South Africa’s scores may well have improved between 2007 and 2013, but we cannot say whether they improved or declined until the results are actually comparable.
  • The fact that teacher test scores plummeted at the same time that student test scores soared should already make us very curious about the technical procedures that might lead to such a situation.

I think the best way forward is, first, for the Department of Basic Education to issue a statement explaining whether it believes the SACMEQ III and IV results are comparable and why, and whether these results were based on an earlier version of the data, one which has subsequently changed. And second, for us to wait for the SACMEQ IV Technical Report to be released by SACMEQ so that the procedures, methods and assumptions underlying the SACMEQ IV results can be scrutinised by researchers around the world. This is the reason that the TIMSS, PIRLS and PISA results are all only released at the same time as the full international report.
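For readers unfamiliar with what putting two waves on a common scale involves, the sketch below shows one generic approach: mean-sigma linking on anchor items that appear in both waves. The item difficulty values are invented, and this is offered only as an illustration of the kind of procedure a technical report would need to document; it is not necessarily the method SACMEQ used.

```python
# Generic sketch of mean-sigma linking on common (anchor) items.
# All numbers below are hypothetical, purely for illustration.
import numpy as np

# Rasch difficulty estimates for the SAME anchor items, calibrated
# separately in each wave.
b_wave1 = np.array([-1.2, -0.4, 0.1, 0.8, 1.5])   # earlier wave
b_wave2 = np.array([-0.9, -0.1, 0.5, 1.1, 1.9])   # later wave

# Linear transformation mapping the later wave's metric onto the earlier one's.
A = b_wave1.std(ddof=1) / b_wave2.std(ddof=1)
B = b_wave1.mean() - A * b_wave2.mean()

# Any ability estimate from the later wave can then be expressed on the
# earlier wave's scale, which is what makes the two sets of scores comparable.
theta_wave2 = 0.6
theta_on_wave1_scale = A * theta_wave2 + B
print(f"A = {A:.3f}, B = {B:.3f}, linked theta = {theta_on_wave1_scale:.3f}")
```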

SACMEQ is an extremely important indicator of changes in reading and mathematics achievement over time. Its reputation and technical credibility must be upheld for it to retain its position as the main cross-national testing system in the region. To do so, the methods and analysis must be above reproach and open to scrutiny by independent researchers.

 

 

Do you know an outlier township/rural school in KZN, LP or GP??


 

So I have finally taken to crowd-sourcing in my research and I need your help!

Do you know of any ‘outlier’ or exceptional township or rural primary schools in either KZN, Limpopo or Gauteng? Schools that manage to succeed against the odds and achieve great learning outcomes.

In a new ESRC/DFID study we want to understand how these schools manage to get the results they do, and specifically to understand the school leadership and management characteristics in these schools. But first we need to identify these outlier schools. We’re trying a number of different approaches to identifying these schools and then triangulating the results.

So word of mouth or your first-hand experience with an exceptional township/rural school could really help us. If you have any suggestions please send me an email at NicholasSpaull[at]gmail.com with the name of the primary school, why you think it’s an outlier school, and any contact info if you have it!

Looking forward to hearing from you!!

Nic

21st Century Skills: MakerSpace

[Photo: a Stanford FabLearn Lab]

If you’re interested in 21st Century Skills (like Creativity, Collaboration, Critical Thinking, Communication), you should be looking into MakerSpace, which now has a branch in Durban🙂 The aim is a kind of ‘make it yourself’ drive, providing the skills, tools and training to do it. For education this might be about 3D-printing, how to use and program an Arduino, or robot-making (see the pamphlet below). This reminded me of Stanford’s FabLearn Labs, which work on a similar logic (the photo above is of a FabLearn Lab). If your school can afford these types of courses I would strongly recommend moving in this direction…

School programmes

[Pamphlet: MakerSpace school programmes]

You can find out more here – http://themakerspace.co.za/

Exemplary school design: Seven Fountains Primary School (Kokstad)

[Images: pages from the Seven Fountains Primary School case study]

The above was one of the case studies featured in “Designing for Education” – a publication by the OECD (2011). More information about the design of this school can be found here and here.

What makes a school really great? [Guest blog post: Gabi Wills]


“What makes a school really great? Those first impressions that count” – Dr Gabi Wills

Curriculum coverage? Teacher motivation? Print-rich environments? Learning goals and targets? These are a few of the things I have come to see as important while looking through mounds of literature on what makes an effective school. Together with a team of education experts, we are preparing to do research in South African township and rural schools that excel despite the odds. In preparation we are having to think hard and fast about questionnaires that capture what it is that separates these schools from the rest. This is a surprisingly difficult task. In post-apartheid South Africa there have been numerous studies on schools in which data is captured on indicators of school functionality. Using our fanciest modelling, we then try to see which of the many observed indicators explain why certain schools do better than others. But most of the time we simply can’t explain the variation in learner performance that we observe across schools, particularly among the majority of poorer schools in the system. I am, however, starting to wonder whether we simply have not measured effectively the things that really count.

As academics we tend to limit ourselves to our peer-reviewed readings, our computer screens and the occasional conference. But we miss too many opportunities for the ‘aha’ moment when it all comes together. Increased burdens of work limit the time to experiment and explore, at least for me. After feeling unusually dispirited and wearied by just too much information, today I did something different but obvious. Rather than running off to the office and opening my computer, I started my day in the reception of a great preparatory school in Durban. I sat and observed. I started reading the display books on the reception table, observed the honour boards proudly displayed, watched teachers coming in and out, and heard in the background the sound of children vocalising their prose for the next drama production. After 60 minutes of this, and particularly after reading an inspiring 2010 prize-giving speech by the headmaster in one of the coffee table books, things were becoming clearer. Before I even got to the classroom, I realised that great schools do this:

  • They celebrate their history – no matter how small or great. Equally, they dream about the future. As read in one of the headmaster’s prize-giving speeches (also documented), there is “a deep affectionate respect for folk who have gone before”. When their history has not been particularly flattering, they consider what they have learned from it and how obstacles were overcome.
  • They celebrate excellence. Even the smallest achievement of present and past students is meticulously documented and preserved so that all who visit can see it. The annual prize-giving is a revered and celebrated event. Photographs of awards and those awarded take centre stage.

You are probably wondering why these two features (past history, past achievement) matter for the now. The importance of this extends beyond school pride: it legitimises the worth of the institution beyond any one individual. Great people create great institutions with a reason for existence beyond their founders. Moving on, great schools…

  • Are intentional about cultivating school pride. In just this reception area, school pride emanates from every intentionally displayed item on the walls, from the greeting of the security guard to the glow of the weekly polished floors. School logos and prominently displayed school songs and mottos are evident. Children don’t just come here to learn. They find a sense of belonging in an organisation with its own unique character, which parents have strategically worked at crafting with the school staff over decades.
  • They treat discipline and manners among children as non-negotiable inputs and outputs of the schooling process. I was greeted with respect by even the littlest Grade Rs, who politely stood aside and smiled as they did. Where the banter of naughty children is heard, the voice of the disciplining teacher towers louder. It’s clear who is in control.

These are just four observations before I have even spoken to a single person. Moving on to meeting two principals in the school…

  • Respect for teachers, visitors, cleaning staff and the security guard is evident from leaders in this institution. Despite the hassle I present, I am given a tour of classrooms, cupboards, facilities and libraries as two principals enthusiastically express why and how they do things around here. The cleaner is introduced as a fellow colleague.
  • Leaders have intentionally hired the right people (of course, in this case they have the privilege of control over hiring, with many SGB-paid teachers they can afford). The principal talks about each teacher as a “leader”, “striving relentlessly”, “passionate” and “dedicated.”
  • The economist in me can’t help but ask a few money-related questions, and it’s obvious that there are well-proven financial structures in place. This school doesn’t miss a beat when it comes to the financial operations required to keep this ship moving. This is where parents with financial skills come in and are drawn upon for their expertise. The principals I speak to know exactly how much this ship requires, where it requires resources, and whether anything is ever left over.
  • Teachers have a sense of mastery of the curriculum and are acutely aware of where it can be altered or adapted to improve the learning opportunities for their students without stepping beyond CAPS learning requirements. Official workbooks are only used if a more suitable option for their students is not available (and, positively, the workbooks are often considered the best option).

After just 2 hours, I think I am clearer about what my next questionnaire needs to be about, and I have probably saved myself two days of agonising thinking. For all our studies, after just bumbling along as a regular visitor I have come that much closer to realising what matters, what separates the average school from the great one. I suspect I have just observed what every interested parent or teacher has known all along.

//

One of Gabi’s recent Working Papers on principal leadership changes in South Africa is available here.

Recruiting 12 fieldworkers for our ESRC study :)


Fieldworkers required: Research on Socio-Economic Policy (ReSEP), based in the Economics Department at Stellenbosch University, has embarked on a research project focussing on exceptional township and rural primary schools in three provinces in South Africa. ReSEP plans to recruit 9 experienced fieldworkers to assist them with school-based research.

Job description: Fieldworkers

Periods of work: Early October 2016 with an option to renew for forthcoming fieldwork

Overall period of fieldwork: Early October 2016, February/March 2017, September/October 2017

Daily rate for fieldwork: R800 per day with R100 per day subsistence allowance

Daily rate for training: R450 per day

Description of the project: The main aim of the research is to better understand why some schools perform better than others specifically in township and rural areas. We will develop a new survey instrument that captures the actual practices and behaviours of teachers and principals in challenging contexts.

This will be done using a matched-pair design, where each exceptional school is paired with a nearby ‘typical’ school, i.e. one with similar geographical and socioeconomic characteristics.

The study will be conducted in various phases from May 2016 to September 2018 in 60 schools across the Western Cape, KwaZulu-Natal and Limpopo provinces. This advert is for fieldwork in the first phase, piloting the various research instruments in three schools per province; this may, however, be extended to the subsequent data collection phases (February and October 2017) based on fieldworker performance and availability.

Fieldworkers will be required to

  • interview school principals and teachers using quantitative instruments;
  • conduct reading and vocabulary tests with young children;
  • conduct assessments of classrooms and learner workbooks;
  • provide informative insights into the school environments they visit.

Minimum requirements: 1) A bachelor’s degree (although in exceptional cases three-year diplomas will also be considered) 2) Fluency in reading and writing in English as well as one of the three following languages: Sepedi, isiZulu or isiXhosa

Preference will be given to individuals:

  • with previous research experience, particularly from academic disciplines such as linguistics, psychology, teaching and education research, although other background disciplines will also be considered.
  • with a valid driver’s license and regular driving experience.
  • with a research interest in the project.
  • who live in the province in which they are surveying schools (i.e. Limpopo, Western Cape or KwaZulu-Natal).

Qualities:

Ideal candidates should possess the following qualities:

  • Integrity and honesty
  • Organised with the ability to plan effectively
  • Confident and professional
  • Good interpersonal skills
  • Works easily in a team
  • Enjoys working with children
  • Takes initiative to solve problems as they arise
  • The ability to objectively observe an environment without influencing how it functions.
  • The ability to articulate your observations clearly, both verbally and in writing.

Required time commitments:

  • 2016: Early October 2016 (6 days). 2 days compulsory training. 3-4 days pilot fieldwork.
  • Working hours in field: 6.30am/7am start to be at a school by 8am depending on travel distances. 8 hour working day. Each fieldworker will be expected to verify the GPS location/address of the piloting schools prior to the field visit. Late arrivals will not be accepted as this will jeopardise the completion of data collection.
  • Important note: Fieldwork will likely involve being away from home Monday to Friday (given that schools may be located very far from central hubs) and will potentially require driving to the required areas on a Sunday to be able to start fieldwork promptly in schools on a Monday morning. All travel arrangements, logistics and accommodation will be arranged through ReSEP; however, it is the responsibility of each fieldworker to verify the suitability of the arrangements.

Application:

12 shortlisted candidates will be required to attend training at a central location where 9 fieldworkers (3 per province) will be chosen from the most suitable candidates identified. All shortlisted candidates will be compensated the daily rate of R450 for the duration of the training.

If you are interested in applying for the position, please send the following to mschreve@sun.ac.za by Monday, 8 August 2016:

i) Your CV, including two references.

ii) A short written piece (500 word limit) on what characteristics/features in your opinion distinguish a school as being better than others. The piece must be written in English and in one of the three following languages: Sepedi, isiZulu or IsiXhosa.

iii) A covering letter. In your covering letter please explain why you think you would be a good match for this position. In the subject line please include: “Application: ESRC fieldworker.”

The University reserves the right to investigate qualifications and conduct background checks on all candidates. Should no feedback be received from the University within four weeks of the closing date, kindly accept that your application did not succeed.