For those following the SA media, you will have noticed that South Africa has once again come out at the bottom of the World Economic Forum education rankings published every year. This year South Africa ranked 148/148 on the “quality of math and science education.” I was sufficiently angered by these preposterous rankings that I have written an article for the Mail & Guardian, which will hopefully come out next week Friday. In the meantime, Martin Gustafsson (of ReSEP and the DBE) has written an excellent piece on the matter, which I have re-posted below (original here). In short, things are bad, but not that bad.
Each year the World Economic Forum releases its Global Competitiveness Report, which aims to “assess the competitiveness landscape” and “provide insight into the drivers of their productivity and prosperity.” The WEF furthermore claims that this report “remains the most comprehensive assessment of national competitiveness worldwide.” Included in the report is an indicator of education quality on which South Africa performs extremely poorly (133 of 148).
Much of the work conducted at ReSEP focuses on education in South Africa, the quality of that education, and the links between the schooling system and the labour market. Martin Gustafsson, one of the researchers at ReSEP, has looked into the WEF rankings on education and discusses four salient features that explain why these rankings are especially problematic.
1. Once again, in 2014 the Global Competitiveness Report of the World Economic Forum (WEF) has caused a stir in South Africa. Despite a relatively good overall competitiveness ranking (53 out of 148 countries in the 2013-2014 report), a few indicators related to government service delivery, in particular education, put the country amongst the worst in the world, and for some indicators at the very bottom of the ranking. Specifically, in terms of the ‘quality of primary education’ we are at position 133 out of 148; position 122 for the net primary enrolment ratio; position 146 for ‘quality of the educational system’; and position 148 (bottom of all countries) for the quality of mathematics and science at the higher education level. The 2013-2014 report does not really present anything new: figures in reports from earlier years are very similar.
2. With regard to the educational quality indicators, it is important to bear in mind that the WEF does not make use of any standardised testing system in producing its report. Instead, it relies on expert opinion. In the case of South Africa, around 50 respondents, all from the ‘business community’, are asked to rate the quality of education on a seven-point scale from very good to very poor. One would expect the South African respondents to rate the quality of South Africa’s schooling poorly, for a number of reasons. One is that, relative to other countries, South Africa has good data on its educational quality. In particular, the internationally comparable TIMSS testing system has consistently placed South Africa last, or almost last, in lower secondary school mathematics and physical science among the roughly 20 developing countries that have participated in TIMSS (the other African countries participating have been Botswana, Egypt, Ghana, Tunisia and Morocco). However, there are around 150 developing countries in the world (around 115 of which are included in the WEF tables), many with very poor information on the quality of their education systems. One suspects that business experts in those countries would not rate their educational quality too poorly, as they simply do not have the required information. In SACMEQ 2007, South Africa came eighth out of 15 countries in Grade 6 mathematics. It is noteworthy that although Lesotho did considerably worse than South Africa in SACMEQ, its WEF ranking on the quality of primary schooling indicator is 120, against 132 for South Africa. This illustrates the problem with subjective data on a matter which is relatively amenable to measurement. I, and others, have tried ‘sewing together’ reliable test-based indicators of education quality, by taking advantage of the fact that some countries participate in more than one international testing system, thus creating overlaps.
I found that South Africa’s quality of school education ranks at position 106 out of 113 countries, which is clearly very poor. However, looking at the 88 countries not included in my list of 113 suggests that many are poor countries with little in the way of credible testing systems, and probably a quality of education below that of South Africa. The bottom line is that test-based data suggest that South Africa’s quality of education does indeed require a lot of fixing, and is well below where it should be, yet the catchy slogan that we are ‘at the bottom of the world’ is not supported by the evidence.
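The ‘sewing together’ idea can be illustrated with a toy calculation. The sketch below is not Gustafsson’s actual method (his 2012 paper uses a nonlinear programming approach, and all country names and scores here are invented); it shows only the simplest version of the logic: two hypothetical countries appearing in both testing systems anchor a linear map from one test’s scale onto the other’s, so that all countries can then be ranked on a single scale.

```python
# Illustrative sketch of linking two assessments via their overlap countries.
# All names and scores are invented; real linking (Gustafsson 2012) uses a
# nonlinear programming approach rather than this simple mean-sigma linking.
from statistics import mean, stdev

# Hypothetical scores on two testing systems (arbitrary, different units).
test_a = {"C1": 520.0, "C2": 480.0, "C3": 450.0, "C4": 400.0}
test_b = {"C3": 61.0, "C4": 50.0, "C5": 44.0, "C6": 38.0}

# Countries appearing in both systems anchor the link.
overlap = sorted(test_a.keys() & test_b.keys())

# Mean-sigma linking: find the affine map from the test_b scale to the
# test_a scale that matches the overlap countries' mean and std deviation.
a_mean, a_sd = mean(test_a[c] for c in overlap), stdev(test_a[c] for c in overlap)
b_mean, b_sd = mean(test_b[c] for c in overlap), stdev(test_b[c] for c in overlap)
slope = a_sd / b_sd
intercept = a_mean - slope * b_mean

# Express every test_b-only country on the test_a scale, then rank everyone.
combined = dict(test_a)
for c, score in test_b.items():
    if c not in combined:
        combined[c] = intercept + slope * score
ranking = sorted(combined, key=combined.get, reverse=True)
```

Chaining more than two assessments in this way is what allows systems such as TIMSS and SACMEQ, which share some participating countries, to be combined into one broader country ranking.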
3. With regard to the primary enrolment ratio, it is important to note that UNESCO’s enrolment ratios (the WEF’s data source) are widely regarded as problematic and often not amenable to useful international comparisons, because UNESCO calculates its ratios using official enrolment totals and official population totals, in other words information from very different data sources. In many developing countries there are strange discrepancies between the two sets of data. The problem for South Africa is that this discrepancy works in the reverse direction compared to most other developing countries: in South Africa, total population figures for children are simply too high compared to the enrolment totals, whereas in most developing countries the problem is that enrolment totals are inflated. South Africa’s enrolment ratios in the UNESCO reports therefore appear relatively poor, but this means nothing and has confused a lot of people. Enrolment ratios derived from household surveys are a lot more reliable, and these indicate that South Africa’s enrolment ratios, at least at the primary and secondary levels, are good by international standards. There is an abundance of literature showing this. The WEF report itself points to the strangeness of the enrolment ratios it uses: according to the report, our enrolment ratio at the primary level is ranked at position 122, but at the secondary level it is ranked 55. This raises an obvious question: how can enrolments at the secondary level be relatively good when at the primary level they are poor, given that the former depends on the latter?
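The direction of the distortion described above can be made concrete with invented numbers. The snippet below assumes nothing about actual South African or UNESCO figures; it simply shows that an enrolment ratio, computed as enrolment divided by school-age population, looks worse when the population denominator is overstated, and better than reality when the enrolment numerator is inflated.

```python
# Toy numbers (all invented) showing how mismatched data sources distort
# an enrolment ratio, computed as enrolment / school-age population.
def enrolment_ratio(enrolled, population):
    """Return the enrolment ratio as a percentage."""
    return 100.0 * enrolled / population

true_population = 1_000_000   # hypothetical true school-age population
enrolled = 980_000            # hypothetical enrolment total

# Accurate denominator: the ratio reflects near-universal enrolment.
accurate = enrolment_ratio(enrolled, true_population)           # 98.0

# Overstated population total (the South African pattern described above):
# the same enrolment now looks like weak coverage.
overstated = enrolment_ratio(enrolled, true_population * 1.15)  # ~85.2

# Inflated enrolment total (the more common developing-country pattern):
# coverage looks better than it really is.
inflated = enrolment_ratio(enrolled * 1.10, true_population)    # ~107.8
```

The same underlying schooling system thus yields very different ratios depending on which data source is wrong, which is why household-survey-based ratios, where numerator and denominator come from the same source, are more trustworthy.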
Part of Martin’s PhD research involved developing a method to compare the performance of countries on different (sometimes non-overlapping) international assessments of educational achievement. His 2012 Working Paper “More countries, similar results. A nonlinear programming approach to normalising test scores needed for growth regressions” can be found here.