Category Archives: Uncategorized

The best SA research article I’ve read this year


This article by Elizabeth Pretorius (UNISA) is easily one of the best academic articles I’ve read this year! I would strongly recommend that anyone interested in (1) reading/literacy, (2) the education crisis in SA, or (3) how to get out of the mess we are currently in, should read this article!

Pretorius, E. 2014. Supporting transition or playing catch-up in Grade 4? Implications for standards in education and training. Perspectives in Education, 32(1), pp. 51-76.


This paper describes an intervention programme that was originally intended to support transition to English as language of learning and teaching (LoLT) in Grade 4 in a township school, using a pre- and post-test design. Because the pre-tests revealed very poor literacy levels in both Zulu home language and English, the intervention programme was modified in an attempt to fast-track the learners to literacy levels more appropriate to their grade. This paper outlines the intervention, presents the pre- and post-test results of the English literacy assessments, reflects on the effects of the intervention, and briefly considers some of the reasons for the initial poor literacy performance. Finally, a model for literacy development in high-poverty contexts is proposed to minimise the need to play catch-up in the Intermediate Phase.

JPAL Executive Education Course (on RCT evaluations)

After submitting my PhD a few weeks ago I am slowly getting back into the swing of things. That includes blogging and the Q&A series. But for now I thought I would pass on this advert for JPAL’s latest Executive Education Course. JPAL is a great organization and I would highly recommend this course to professionals who want a better understanding of how randomised controlled trials work.
J-PAL Africa Executive Education Course
Location: School of Economics, University of Cape Town
Course Dates: 19-23 January 2015
Application Deadline: 4 November 2014
We are pleased to invite you to apply to the upcoming J-PAL Executive Education course in Evaluating Social Programmes, which will provide you with a thorough understanding of randomised evaluations. Our Executive Education courses are valued by people from a variety of backgrounds: managers and researchers from international development organizations, governments, as well as trained economists looking to retool. If you have colleagues or friends who may be interested in applying to this course, we encourage you to pass this invitation on to them as well.
The course is a 5-day, full-time course and will run from 19 to 23 January 2015 at the University of Cape Town in South Africa. The purpose of the course is to provide participants with a thorough understanding of why and how to use randomisation in an impact evaluation and to provide pragmatic step-by-step training for conducting their own randomised evaluation. J-PAL affiliates with extensive experience in using randomised impact evaluations to test the effectiveness of social programmes, in Africa and globally, will teach the course.
You may view more information about the course content and fees here. You can also access the course applications through that website, or by following this link here. The deadline for applications is 4 November 2014. We receive far more applications to the course than we can accommodate, and we encourage applicants to apply early. We will notify participants of whether their application has been accepted by 7 November, after which payment of the course fee (see course fees here) will need to be made by 2 December, before a place on the course can be confirmed. Once acceptance is confirmed, participants are responsible for making their own travel and accommodation arrangements.
We hope to receive your application for the course soon. Do let us know if you have questions about the course. 
Warm regards,
Laura Poswell
J-PAL Africa Executive Director

Big Brother is watching (Links I liked)


  • Earth, meet Big Brother; Big Brother, Earth. WIRED’s latest article on Edward Snowden is profoundly shocking. The NSA has the potential to track “everyone” in a city using the MAC addresses from their cellphones, it “accidentally” took down the Internet in Syria while trying to spy on Syrian civilians, and frequently passes unredacted data about Palestinian-American citizens on to Israel. WTF.
  • “The non-monetary benefits of learning may be difficult to measure, but they shape and determine what we recognize to be the quality of our life” – from “Why academic achievement matters” (via Harry Patrinos).
  • After watching the trailer to Boyhood I’ve fallen in love with this song by Family of the Year.
  • Henry Miller on friendship – worth reading (thanks Clint Clark)
  • Next year’s IEA conference is happening in Cape Town (22-23 June 2015). For those who are doing work on IEA data (PIRLS/TIMSS) the deadline for submission is 1 December 2014.
  • The Free State Department of Basic Education is clearly trying to game the ANAs – see this article. I think the ANAs are a positive development in the SA education system, but we really need to pay closer attention to their potential unintended consequences and how to minimize them. For one thing, the ANAs are not yet ready to be used as an accurate indicator of student or school performance across grades or over time.
  • In case you were wondering what your skin looks like with and without sunscreen, see here. *will never not wear sunscreen again, or use double negatives*
  • “The impact of national and international assessment programmes on education policy, particularly policies regarding resource allocation and teaching and learning practices in developing countries” – a systematic review of the evidence by ACER (2014).

There’s STILL madness in WEF rankings


I don’t usually repost articles that I’ve written in the past, but given that the World Economic Forum has recently released its 2014/15 Global Competitiveness Report, I thought it made sense. The article below first appeared in the M&G on 13 June. If you’re part of the media and want to quote me on the 2014/15 rankings, you can use any of the following:

  • “These results are completely and utterly preposterous. Of course South Africa’s education system is in crisis and performing worse than other middle-income (and some low-income) countries, but it definitely isn’t the worst in the world.”
  • “The methods used to calculate these education rankings are subjective, unscientific, unreliable and lack any form of technical credibility or cross-national comparability.”
  • “The mistakes in the WEF’s methodology are so egregious that one needs only look at the list of countries and their respective rankings to appreciate how ridiculous they really are – failed states rank above modernising middle-income countries. How on earth do Japan and Cote d’Ivoire have the same ranking for the quality of their maths and science instruction? This is not science, this is an unscientific opinion survey.”
  • “The WEF has seriously undermined its own technical credibility by reporting these ridiculous education rankings. Until it rectifies its methodology, no one should take the rankings seriously.”


In the past two weeks the South African media has had a field day lamenting the state of maths and science education in the country. This is because the World Economic Forum (WEF) recently ranked South Africa 148th (out of 148 countries) on the quality of its maths and science education.

Let me cut to the chase and say, unequivocally, that the methods used to calculate these education rankings are subjective, unscientific, unreliable and lack any form of technical credibility or cross-national comparability. I am not disputing that South Africa’s schooling system is currently in crisis (it is), or that South Africa performs extremely weakly relative to other low- and middle-income countries (it does). What I am disputing is that these “rankings” should be taken seriously by anyone or used as evidence of deterioration (they shouldn’t).

The mistakes in the WEF’s methodology are so egregious that one needs only look at the list of countries and their respective rankings to appreciate how ridiculous they really are. How is it possible that the quality of maths and science education in failed states such as Chad (ranked 127th on the WEF list), Liberia (125th) and Haiti (120th) is better than modernising middle-income countries such as Brazil (136th) and Mexico (131st)? How do countries such as Madagascar (82nd) and Zambia (76th) outrank countries such as Israel (78th), Spain (88th) and Turkey (101st)?

Although these preposterous rankings sound like an April Fool’s joke gone wrong, they are reported without qualm on page 287 of the WEF Information Technology Report 2014. Even a cursory analysis of the faulty ranking methodology the WEF employed shows how it is possible to arrive at these outlandish “rankings.” The WEF asked between 30 and 100 business executives in each country to answer questions (relating only to their own country), using a scale of one to seven to record their perceptions, with one representing the worst possible situation and seven the best possible situation.

The question relating to maths and science education was phrased as follows: “In your country, how would you assess the quality of maths and science education in schools?” with “one” being “extremely poor – among the worst in the world”, and “seven” being “excellent – among the best in the world”.

In South Africa, 47 business executives were surveyed for these rankings. On the question relating to maths and science, the average score among these 47 executives was 1.9, indicating that the vast majority of these South African business executives believed that the quality of maths and science education in the country was “among the worst in the world.” Yet this is really just a measure of the perceptions of these 47 businessmen, as the department of basic education has correctly pointed out.

By contrast, when the 55 Malawian and 85 Zambian business executives were surveyed, they were more optimistic about the maths and science education provided to students in their countries, yielding average scores of 3.2 and 4.0 respectively.

This explains why Malawi ranks 113th and Zambia ranks 76th whereas South Africa ranks 148th. Yet we know from objective cross-national standardised testing in the region that Zambia and Malawi are two of the few countries that South Africa actually does outperform.

Clearly the ratings given by these business executives are subjective and dependent on their particular mental reference points, which obviously differ by country. These 47 South African executives were not asked to rank South Africa relative to other specific countries – such as Madagascar, Malawi or Mali – only relative to “the world”.
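The mechanics behind these rankings are simple enough to sketch. The average scores below are the ones reported above; that the WEF ranks countries purely by sorting these unadjusted mean survey scores is my assumption about its method:

```python
# Sketch (assumed) of how the WEF derives its education "rankings":
# each country's rank is simply the ordering of its mean 1-7 survey score,
# with no adjustment for the raters' differing mental reference points.
avg_scores = {
    "South Africa": 1.9,  # mean rating of 47 executives (from the article)
    "Malawi": 3.2,        # mean rating of 55 executives
    "Zambia": 4.0,        # mean rating of 85 executives
}

# A higher mean score yields a better (lower) rank number.
ranked = sorted(avg_scores, key=avg_scores.get, reverse=True)
for rank, country in enumerate(ranked, start=1):
    print(rank, country, avg_scores[country])
# Zambia ranks above Malawi, which ranks above South Africa -- even though
# objective SACMEQ testing shows South Africa outperforms both.
```

The sketch makes the flaw obvious: the sorting step treats a "2" from a Johannesburg executive and a "4" from a Lusaka executive as directly comparable, which they are not.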

Although the perceptions of business executives are important in their own right, it is ludicrous to use these within-country perceptions to rank “the quality of maths and science education” between countries, particularly when we have objectively verifiable, cross-nationally comparable scientific evidence on maths and science performance for at least 113 countries.

Looking at South Africa specifically, we participate in two major cross-national standardised testing systems that aim to compare the mathematics and science performance of South African students with that of students in other countries. The Trends in International Mathematics and Science Study (Timss) tests grade eight students from middle- and high-income countries, and the Southern Africa Consortium for Monitoring Educational Quality (Sacmeq) study tests grade six students from 15 countries in sub-Saharan Africa.

Worse than South Africa
Of the countries participating in Sacmeq, South Africa came 8th in maths, behind much poorer countries such as Kenya (2nd), Tanzania (3rd) and Swaziland (5th), but ahead of Mozambique (10th), Namibia (13th), Zambia (14th) and Malawi (15th). Although this situation is no cause for celebration, it does show that these countries – which outrank South Africa in the WEF rankings – are in fact doing worse than South Africa in reality.

If we look beyond Africa to the Timss rankings, South Africa performs abysmally. Of the 42 countries that participated from around the world (including 21 developing countries), South Africa came joint last with Honduras in 2011. This should shock us to the core. But it does not mean that we have the worst education system in the world. Rather, we have the worst education system of those 42 countries that take part in these assessments.

There is a big difference. Only 21 developing countries took part in these assessments, but there are around 115 developing countries in the WEF tables. The fact that Mali, Madagascar, Liberia and Haiti (for example) do not take part in these assessments means that business executives in these countries have very little reliable information on the quality of education in their countries.

In South Africa the basic education department has wisely chosen to take part in these assessments so that we have reliable information on the performance of our education system, however low that performance might be.

Continuing participation
This is one thing that the department should be commended for – that is, for continuing to participate in these assessments, which provide valuable information, despite being lambasted by their findings.

Perhaps the best illustration of how flawed the WEF methodology is comes from comparing Indonesia and Japan on the WEF rankings and on the well-respected Organisation for Economic Co-operation and Development’s Programme for International Student Assessment (Pisa) rankings, which, like Timss, also tests maths and science.

In the WEF rankings, executives in Indonesia and Japan both gave an average score of 4.7 for the quality of maths and science education in their respective countries. This placed Japan 34th and Indonesia 35th of the 148 countries. Yet, of the 65 countries participating in the 2012 round of the Pisa maths and science testing, Japan came 7th (out of 65) and Indonesia came 64th. Go figure.

Although there are some early signs of improvement in the South African education system, we know that things remain dire. South African students perform worse than all middle-income countries that participate in assessments, and even worse than some low-income African countries.

But to claim that South Africa has the worst quality of maths and science education in the world, and to use executives’ perceptions over scientific evidence to do so, is irrational and irresponsible.

The WEF has seriously undermined its own technical credibility by reporting these ridiculous education rankings. Until it rectifies its methodology, no one should take the rankings seriously.

We need more than a stab in the dark (M&G article co-authored with Hamsa Venkat)


“We need more than a stab in the dark” – Hamsa Venkat & Nic Spaull

[This article first appeared in the Mail & Guardian on the 8th of August 2014.]

Almost everything that is associated with mathematics in South Africa is either contentious or depressing or both. One could talk about the flawed World Economic Forum rankings, the confusion around the pass mark in matric, or the fact that only 3% of Grade 9 students reached the “High” or “Advanced” mathematics benchmark in the 2011 round of international student testing in Timss. However, it is not our intention to bang the now familiar drum of low and unequal performance – the refrain that best characterises our schooling system. Of course we need to know how bad things really are, but we also need to know why they are so bad, and perhaps more importantly how we get ourselves out of this quagmire.

In grappling with these issues we believe that the national discourse around schooling needs to turn towards our most critical resource: teachers. No education system can move beyond the quality of its teachers. At its most basic level this is essentially what schooling is: the student and the teacher in the presence of content. Harvard’s Professor Richard Elmore has argued again and again that there are really only three ways to improve student learning at scale: (1) raise the level of content that students are taught, (2) increase the knowledge and skills that teachers bring to the teaching of that content, or (3) increase the level of students’ active learning of the content. In the South African context the evidence points towards huge deficits in the latter two areas: teacher content knowledge and pedagogical skill, as well as low levels of curriculum coverage and cognitive demand.

Without ambiguity or the possibility of misinterpretation, all studies of mathematics teachers in South Africa have shown that teachers do not have the content knowledge of mathematics needed to impart to students even a rudimentary understanding of the subject. Unfortunately, almost all of these studies have been small-scale localised initiatives aimed at testing teachers in only a few schools or at most in one district. One recent exception was the 2013 analysis by Nick Taylor and Stephen Taylor of the SACMEQ 3 (2007) data – the most recent nationally representative data on teacher content knowledge. At the end of their paper they concluded that, “The subject knowledge base of the majority of South African grade 6 mathematics teachers is simply inadequate to provide learners with a principled understanding of the discipline.” In a paper we published this week we extended Taylor and Taylor’s work and analysed the nationally representative SACMEQ data from a curricular perspective. We wanted to know what grade 6 mathematics teachers know relative to the curriculum that their students are expected to master (CAPS).

This is important to determine what level in-service and pre-service teacher training should focus on. Preliminary results from a Joint Education Trust study show that pre-service training courses offered by five South African institutions had large differences in the amount and the nature of mathematics on offer. Furthermore, in-service education is commonly piecemeal, and frequently related to ‘managing’ the curriculum and assessment rather than with promoting understanding and communication of mathematics.

The findings from our analysis were sobering. Based on the 401 Grade 6 teacher responses in the SACMEQ 3 sample, we found that 79% of South African grade 6 mathematics teachers have a content knowledge level below the grade 6/7 level band even though they are currently teaching grade 6 mathematics. It is also worth noting that our definition of grade-level-mastery was a relatively low benchmark – teachers only needed to score 60% of the items in a grade band correct to be classified as competent in that band. Breaking this grade band analysis down further, the following patterns of results were seen:

  • 17% of the teachers had content knowledge below a grade 4 or 5 level
  • 62% of the teachers had a grade 4 or 5 level of content knowledge
  • 5% of the teachers had a grade 6 or 7 level of content knowledge
  • 16% of the teachers had at least a grade 8 or 9 level of content knowledge
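The classification rule described above can be sketched as follows. The 60% threshold per grade band is from our paper; the band labels, data structure and example scores here are illustrative, not the actual analysis code:

```python
# Sketch of the grade-band classification: a teacher is deemed competent
# in a band if they answer at least 60% of that band's items correctly;
# their content-knowledge level is the highest band meeting the threshold.
BANDS = ["grade 3 or below", "grade 4/5", "grade 6/7", "grade 8/9"]
THRESHOLD = 0.6  # fraction of items in a band answered correctly

def content_knowledge_level(scores_by_band):
    """scores_by_band: dict mapping band label -> fraction of items correct."""
    level = BANDS[0]  # default: below the lowest tested band
    for band in BANDS:
        if scores_by_band.get(band, 0.0) >= THRESHOLD:
            level = band
    return level

# Illustrative teacher: masters grade 4/5 items but not grade 6/7 items,
# so is classified below the level of the grade 6 curriculum they teach.
teacher = {"grade 4/5": 0.75, "grade 6/7": 0.40, "grade 8/9": 0.10}
print(content_knowledge_level(teacher))  # grade 4/5
```

Under this rule the 60% bar is deliberately lenient, which is why the headline finding (79% of teachers below the grade 6/7 band) is, if anything, an understatement.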

Our analysis also confirmed particular weaknesses on problems relating to ratio and proportion, and multiplicative reasoning more generally – the kind of thinking that underlies many tasks involving fractions and decimal working as well.

While sobering, this analysis is useful for policy purposes and useful for thinking pragmatically about primary mathematics teacher education and development. The results suggest the need to begin work at the level of concepts at lower levels (Grades 4 and 5) in order to build more solid foundations of key ideas, rather than starting with higher-level mathematics.

We would argue that many of the problems we see in South African schools often have their roots in low levels of teacher content knowledge. When teachers lack confidence in the subject they are teaching this leads to two consequences. Either they do not cover those parts of the curriculum with which they are uncomfortable or they restrict classroom interactions to low-level problems that limit students’ opportunity to learn. Gaps in content knowledge also lead to highly disconnected mathematics teaching. This works against helping students to see connections between mathematical ideas, connections that are important for flexible and efficient problem-solving.

There are some signs of mobilization in the education field. The Association of Mathematics Educators of South Africa established a mathematics teacher education group in 2013 and has begun gathering information on pre-service course offerings. The Joint Education Trust study nearing completion is doing the same for the Intermediate Phase level. The Department of Basic Education has started preliminary work on developing tests which can be used to identify which teachers have critically low levels of content knowledge. All these initiatives are commendable and show promise, but the key obstacle to progress remains a lack of evaluation of in-service teacher training programs.

We know that content knowledge is not the whole story: good mathematics teaching requires a host of practical and interactional skills, but deep and connected content knowledge is a critical base. In researching our paper, we were unable to find evidence of any intervention that has been shown to raise mathematics teacher content knowledge at any scale in South Africa. Not a single one. Programs need to be piloted and evaluated before they are scaled up and only scaled up if they actually work. They should also be evaluated at different scales. Models that work for 10 schools may not work for 100 schools. What works in Gauteng may not work in the Eastern Cape. In the absence of rigorous evaluation we are shooting in the dark on a wing and a prayer. Our teachers deserve better.

There are moves towards more open discussions about problems related to teachers’ mathematical knowledge and greater consensus around the need for longer term interventions and evaluation of our development models and efforts. We believe that our findings and those of others, contentious as they might be, are important to face and acknowledge if we are to develop intervention models and content that build from the ground as it currently stands towards the improved mathematical outcomes that we all so desperately want to see.


Professor Hamsa Venkatakrishnan holds the position of SA Numeracy Chair at Wits University. Nic Spaull is an education researcher in the Economics Department at Stellenbosch University. Their joint research paper can be found at: 

Links to all my presentations 2012-2014

Job Vacancy: Education Innovation Researcher/Programme Manager



In the interest of getting the best people into education I am reposting a job vacancy for a Researcher/Programme Manager for the Education Innovations programme at the Bertha Centre at UCT’s Graduate School of Business (see details below). If you have a job vacancy in education that you’d like to advertise on the blog, please send me an email and I’ll post it.


The Bertha Centre for Social Innovation and Entrepreneurship at the UCT Graduate School of Business is advertising a job vacancy (see documents here and here). The role involves managing (HR, budgets, reporting etc.) the Center for Education Innovations team and leading the research into policy, emerging trends and impact within the education sector in Southern Africa. If you know of any potential candidates, please circulate within your network.

Closing date: 14 July 2014

Please see the attachments: UCT application form and job advertisement. Email any queries and applications to

Many Thanks,