Important new education research


    • Ursula Hoadley and Joe Muller have just published their important paper on assessment in South Africa, “Visibility and differentiation: Systemic testing in a developing country context” (Curriculum Journal, 2016). I prefer their earlier title, “Testing testing: Investigating the epistemic potential of systemic testing” (un-gated draft version of that paper here).
    • Why has large-scale standardised testing attracted such a bad press? Why has the pedagogic benefit to be derived from test results been downplayed? The paper investigates these questions by first surveying the pros and cons of testing in the literature, and then goes on to examine educators’ responses to standardised, large-scale tests in a sample of low socio-economic status (SES) schools in the Western Cape, South Africa. The paper shows that teachers and school managers have an ambivalent attitude to tests: they are wary of the reputational costs tests can incur, but also curious about the differentiated picture test results can give them as they learn to ‘read’ the underlying codes embedded in the results. The paper concludes that a focus on what tests make visible, and a recognition of the pedagogic agency of teachers, points to the potential pedagogic benefits of systemic tests.

    • Craig Paxton has finally finished his PhD thesis “Possibilities and constraints for improvement in rural South African schools” (UCT, 2015). This is on my to-skim/read list together with Eric Schollar’s PhD (see below).
    • Part of Craig’s PhD abstract: “Rural South African schools face a complex mix of challenges, which make improvement a daunting task. Not only do schools deal with the time, place and space issues that face rural schools worldwide, but in addition they contend with a legacy of severely deprived schooling under the apartheid system. Using the framework of the Five Essential Supports, developed by the Consortium on Chicago School Research, together with Bourdieu’s notions of habitus and doxa, this thesis examines what improvement might mean in this deeply disadvantaged context. The five supports – leadership, learning climate, school-community ties, ambitious instruction and professional capacity – are contextualised to account for both the rural setting and the peculiarities of education in South Africa’s former homeland communities. Alongside this largely quantitative framework, Bourdieu’s conceptual tools are brought to bear, offering an alternative perspective that makes sense of the complex forces produced by history and rurality.”

    • Lant Pritchett’s new (2015) RISE Working Paper “Creating Education Systems Coherent for Learning Outcomes”. This has been quite an influential paper for me, although in the South African context I would almost always add “Capacitation” to his four criteria: Delegation, Financing, Information and Motivation. Lant also has a great (and scathing) critique of meta-analyses of quantitative studies (see the illustrative sketch after this list):
    • If one were to take this approach of “rigorous evidence” at face value then there is rigorous evidence that nothing in the conventional wisdom actually works. There is rigorous evidence that giving out textbooks doesn’t matter, there is rigorous evidence pay for performance doesn’t matter, there is rigorous evidence that class size doesn’t matter. Of course there is also rigorous evidence that all these elements of the conventional wisdom do matter. The usual approach of doing a “systematic review” of the literature that simply counts studies (in a quality weighted basis) is not at all helpful. Suppose that context A is a system coherent for learning—so that teachers know what students should learn, that learning is measured on a regular and reliable basis and teachers are motivated to achieve high student learning—and class size is reduced. Let’s assume that learning improves (as there is RCT evidence from the USA, for instance, that this is true). Context B is a system coherent for schooling only. Class size is reduced. Let’s assume learning doesn’t improve (as there is RCT evidence from Kenya, for instance, that this is true). Suppose the only two studies in the systematic review were USA and Kenya. Then the conclusion would be that “class size improves student learning in 50 percent of the studies.” Now suppose that 8 more rigorous studies were done in the USA so that a systematic review would conclude “class size improves student learning in 90 percent of studies.” Suppose, in contrast, 8 more studies were done in Kenya. Then a systematic review of the rigorous evidence would conclude “class size improves student learning in 10 percent of the studies.” All three statements are equally worthless. The (assumed) truth is that “class size improves performance in context A but not in context B” and hence unless one knows whether the relevant context is A or B the systematic review finding of impact in 50 percent, 90 percent or 10 percent of the studied cases is irrelevant.

  • In their new (2015) RISE Working Paper “Improving School Education Outcomes in Developing Countries”, Glewwe & Muralidharan find that:
  • Interventions that focus on improved pedagogy (especially supplemental instruction to children lagging behind grade-level competencies) are particularly effective, and so are interventions that improve school governance and teacher accountability.
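
To make the arithmetic in Pritchett’s class-size example concrete, here is a minimal sketch in Python. It uses only the hypothetical numbers from the quote above; the context labels, function names and the 1/0 coding of “effect found” are my own illustrative choices, not anything from the RISE paper.

```python
# Illustrative sketch of Pritchett's hypothetical: the "share of studies finding
# an effect" depends entirely on where the studies happen to be done.

def share_positive(studies):
    """Share of studies reporting a positive class-size effect."""
    return sum(studies.values()) / len(studies)

# Assumed truth in the hypothetical: class-size reductions "work" in a system
# coherent for learning (context A, the USA example) and not in a system
# coherent for schooling only (context B, the Kenya example).
EFFECT_BY_CONTEXT = {"A": 1, "B": 0}

def review(n_context_a, n_context_b):
    """Run a naive study-counting 'systematic review' over a given portfolio."""
    studies = {}
    for i in range(n_context_a):
        studies[f"A_{i}"] = EFFECT_BY_CONTEXT["A"]
    for i in range(n_context_b):
        studies[f"B_{i}"] = EFFECT_BY_CONTEXT["B"]
    return share_positive(studies)

print(review(1, 1))  # 0.5 -> "works in 50 percent of studies"
print(review(9, 1))  # 0.9 -> "works in 90 percent of studies"
print(review(1, 9))  # 0.1 -> "works in 10 percent of studies"
```

All three headline percentages describe the composition of the study portfolio, not the effect of class size in any particular context, which is exactly Pritchett’s point.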

Matric 2015 standardisation matters


OK so I got a little frustrated with explaining the whole matric standardisation vibe to a million people, so here’s the deal once and for all…

Soon after the matric results jamboree ended and people went back to work, there were a few unanswered questions about how the matric exams of 2015 were standardised by Umalusi. Already in September of 2015 I was asking some Departmental officials whether the Minister was going to report the matric results of progressed and non-progressed learners separately (which is what I would’ve done). The logic being that the public (wrongly) view the matric exam as the main indicator of the health of the education system and that if progressed learners were lumped with non-progressed learners there would be excess pressure to ensure that the matric pass rate did not drop too much. But this is not the approach that the DBE took. I also emailed someone in Umalusi to ask how they were planning on doing their standardisation given that the two groups of matrics (2014 and 2015) were so different, with the latter having many more students due to the progressed learner policy.

After comparing the publicly available Umalusi statement and the publicly available DBE NSC Technical Report 2015, it became possible to see how large the Umalusi adjustments were for 2014 and 2015 for nine subjects – see the table below.

[Table: Umalusi adjustments and raw vs final pass rates for nine subjects, 2014 and 2015]

I won’t rehash my full argument here (if you’re interested, read my article Matric Cracks Starting to Show). The yellow-highlighted subjects are those that had big jumps in enrolments; for example, there were an extra 76,791 learners taking maths literacy in 2015 compared to 2014. Notice how much the pass rates increase between the raw pass rate and the final pass rate after (1) the Umalusi adjustment, (2) language compensation and (3) school-based assessment. The gist of that article was that the progressed learners of 2015 were not properly accounted for in the Umalusi standardisation process, and that the most logical reason for a drop in performance on the raw marks was the inclusion of extra (weaker) learners, i.e. progressed learners, rather than a much more difficult exam.


Subsequent to my article, the CEO of Umalusi, Dr Rakometsi, wrote a reply titled “Promoted pupils had no big effect on matric rates” and clarified a number of issues. For the sake of brevity I will summarize the most salient points here:

  • Umalusi was told by the DBE that there were only 66,088 progressed learners
  • Including these 66,088 progressed learners increased the failure rate within individual subjects by between 1 and 4 percentage points (i.e. excluding them, subject failure rates would have been that much lower).
  • He confirmed that “the pass rate on raw marks is at 38% for maths-literacy”.
  • The maximum adjustment that Umalusi can make is 10 percentage points, which was applied for mathematics literacy “because the paper turned out to be more difficult in 2015 compared to previous years. As a result of this maximum adjustment, 27% of learners who scored between 20-29% obtained a pass”.
  • One paragraph in the article is particularly important and so I quote it verbatim:

“The report indicates that the impact of progressed learners to the individual subjects was minimal. As a result, there was no basis to standardise the results of the progressed learners separately. What we call progressed learners is actually only the KNOWN progressed learners. The argument that there were more is an assumption. Umalusi can only work on the information before it, not on assumptions and extrapolations.”

From the above we can draw two important conclusions:

(1) The 66,088 progressed learners were not excluded when the results were standardised relative to earlier years, despite Umalusi knowing that these learners were weaker students. This seems totally bizarre. We know that these are weaker learners, so why would we include them in the norm-referencing curve and compare to earlier years where these students did not exist? Even if they only contributed to a drop in the pass rate of between 1 and 4 percentage points, why weren’t they excluded?

(2) (the most important conclusion) Umalusi only looked at the 66,088 “officially” progressed learners and ignored all the other information suggesting that there might be additional weaker learners who were actually progressed but were not captured as progressed learners, what I called “quasi-progressed” learners in my article. We also know that provinces do not all record who is a progressed learner with the same accuracy.

Perhaps the most telling evidence comes from asking how many extra matrics there were in 2015 compared to 2014. The answer is 111,676 (644,536 in 2015 compared to 532,860 in 2014). But if there were only 66,088 progressed learners, where did the remaining 45,588 learners come from?


Some have suggested that it comes from a change in age policy that happened in 1998, but that led to a small cohort in 2011/2012, not now, as Stephen Taylor has shown using EMIS data. The table below (taken from here) shows the enrolment numbers by year and grade. What we can see is that the matric class of 2011 was always going to be small: there were 534,498 learners in matric in 2011 and only 930,019 learners in grade 8 four years earlier. Basically, we knew that the matric class of 2011 was going to be smaller. By contrast, the matric class of 2015 (with 687,230 learners according to this table) is inexplicably big. If we look at the grade 8 cohort of 2011 we see that there were 1,008,110 learners, which is only about 7,000 more than the grade 8 class of 2010. So how are we to explain the massive difference we see when we compare the 2014 and 2015 matrics (111,676)?

[Table: enrolment by grade and year, EMIS data]
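
A quick back-of-the-envelope way to quantify “inexplicably big” is to compare each matric class to its grade 8 cohort four years earlier. The sketch below uses only the enrolment figures quoted above; nothing else is assumed.

```python
# Back-of-the-envelope check using only the enrolment figures quoted above:
# compare each matric class to its grade 8 cohort four years earlier.

cohorts = {
    # matric year: (grade 8 enrolment four years earlier, matric enrolment)
    2011: (930_019, 534_498),
    2015: (1_008_110, 687_230),
}

for year, (grade8, grade12) in cohorts.items():
    ratio = grade12 / grade8
    print(f"Class of {year}: {grade12:,} matrics from {grade8:,} grade 8s "
          f"-> {ratio:.0%} apparent grade-8-to-matric survival")

# Class of 2011: ~57% apparent survival; class of 2015: ~68%.
# The 2015 feeder cohort (grade 8 of 2011) is only about 8% larger than the
# 2011 feeder cohort, yet the 2015 matric class is about 29% larger: a jump of
# roughly 10 percentage points in apparent survival to matric.
```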

In my mind the answer is straightforward: the extra learners in matric 2015 are the direct result of trying to decrease grade repetition by “promoting” weaker learners into higher grades rather than failing them. If this is correct then we needed to exclude the full 111,676 extra learners when standardising relative to earlier years. Umalusi will argue (and has argued) that this was not possible and that they did not even try to take quasi-progressed learners into account.


So those of you who’ve read this far might be asking “So who cares? Why is this even important?” The answer is that it matters a lot for universities and the labour market if Umalusi gets this wrong. If the standardisation process assumed that the drop in the raw marks was only due to an increase in test difficulty (which is what Umalusi did assume), when a more plausible explanation is that we included an extra 21% of weaker learners, then the real value (and signal) of a basic matric is actually declining over time.
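
To see why composition alone can drag the raw marks down, here is a stylised sketch. The cohort sizes are the ones discussed in this post; the two group-level pass rates are invented purely for illustration and are not estimates of the real figures.

```python
# Stylised composition-effect sketch. The cohort sizes are the ones discussed
# in this post; the group-level pass rates below are invented for illustration
# only and are NOT estimates of the real figures.

original_cohort = 532_860   # size of the 2014 matric class
extra_learners = 111_676    # extra candidates in 2015 (progressed + "quasi-progressed")

pass_rate_original = 0.70   # assumed raw pass rate for 2014-type candidates
pass_rate_extra = 0.25      # assumed (much lower) raw pass rate for the extra, weaker candidates

total = original_cohort + extra_learners
blended = (original_cohort * pass_rate_original
           + extra_learners * pass_rate_extra) / total
print(f"Blended raw pass rate: {blended:.1%}")   # ~62.2%

# Even with an exam of identical difficulty (the original group still passes at
# 70%), the overall raw pass rate falls by almost 8 percentage points purely
# because the composition of the cohort changed. Reading that whole drop as
# "the paper was harder" and adjusting marks upward would over-correct.
```

Whatever the true group-level pass rates are, the mechanics are the same: adding a large block of weaker candidates lowers the blended raw pass rate even if the exam standard is unchanged.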

On page 171 of the 2014 Ministerial Task Team Review of the NSC we read the specifications of when Umalusi can and cannot adjust matric marks:

“Reasons for differences may include: cohort differences, changes in curriculum, changes in setters of the examination papers, disruptions in the schooling processes and security leakages. In the absence of evidence of such possible reasons, it is then generally accepted that the differences are due to deviations in the standards of the examination or marking and therefore marks are adjusted to compensate for the deviations (Umalusi 2007a, 29).” [emphasis added]

Personally I do think that some of these tests increased in difficulty, but it is ludicrous to think that adding 21% more students who are KNOWN to be weaker students would not decrease the marks. Also this is the first year where basically all adjustments were upward. There was not a single downward adjustment. Coincidence much?

[Cartoon: “that’s messed up”]

Just because Umalusi could not identify the quasi-progressed learners doesn’t mean they can simply ignore them. Hence the cartoon above. It would seem that Umalusi has essentially said: “Yes, we can see that the cohort is much bigger. Yes, we can see that there was a clear policy intervention to progress weaker learners. Yes, we can see that the official numbers of progressed learners do not match the full increase in the size of the cohort. But we are going to pretend that there were only 66,088 progressed learners. We refuse to accept any other reality because we can’t do anything about it anyway, so what’s the use in knowing?”

The fact that the marks were pushed up primarily at the bottom (probably too much) means that students passed in 2015 who would not have passed in 2012. It means students have qualified to study basic degrees in 2015 who would not have qualified if they had written in 2012. So, if I’m right, any of the following could result:

  • There will be a flood of applications for degrees and diplomas that require the lowest levels of matric points. Thousands more students will have ‘met the criteria’ and the universities will not have anticipated this. In fact I have already heard that UNISA has had an unprecedented increase in applications, that Edgewood at UKZN has been swamped, and that Damelin has seen a huge spike in applications. If I’m right then these students should never have qualified for university and will fail. They might incur debt, move province, and make life decisions based on an incorrect signal. (Let us not speculate on how the surge in applications for NSFAS will stoke the FeesMustFall fire and mean that there is less to go around and more angry students with hopes and dreams that the State cannot fulfil.)
  • Universities will see students that are not even remotely prepared for higher academic study and will have to increase their access programs and expect higher failure rates.
  • As a result of the above, universities will increase the matric-point requirements for entry into their programs for 2017 (particularly programs like the BA, BSocSci, BEd, etc.). They will also start to rely more on the National Benchmark Tests (NBTs) in their selection criteria. [Sidebar: researchers should compare NBT results with matric results in 2013, 2014 and 2015 to see if there are any differences that might be attributable to wrongly-boosted matric marks.]
  • The gap between the earnings of those with only a matric and those with a matric plus a degree will grow (note it is already large; see the graph below). This is largely because the adjustment was primarily at the bottom, meaning there are many more students with a low-level matric who have, in effect, lower levels of knowledge and skill than low-level matrics of five years ago.

[Graph: earnings of those with a matric only vs a matric plus a degree]

(Source: Hendrik van Broekhuizen, 2013, here)

As I said, none of the above precludes the possibility that the tests were more difficult in 2015 (although that is still speculation). I am only saying that there is no way, in my mind, that including an extra hundred thousand weaker learners didn’t play any part in the drop in the raw marks. And, in essence, that is what Umalusi is arguing.


Dr Rakometsi and I will be discussing this on the Redi Tlhabi show at 10am tomorrow (25 Jan 2016). It should be an interesting discussion :)


Links I liked


 

“Matric Cracks Starting to Show” – my ST article


(The article below first appeared in the Sunday Times on the 10th of January 2016)

Like so many things in South Africa, this year’s matric results are a paradox of good policies badly implemented. This time it was the Minister’s bold ‘promotion policy’ that led to 21% more learners writing matric (644,536 this year compared to 532,860 last year). The policy limits the number of times learners can repeat a grade to once every three years, and means fewer students drop out, being ‘promoted’ instead. While her decisive action has led to increased efficiency and improved access, it has also inadvertently caused a huge crack in the matric standardisation process, one that is only now starting to become apparent. The fact that the Department did not properly identify all progressed learners, and that Umalusi did not (and perhaps could not) take account of all progressed learners in their standardisation process, calls into question the massive upward adjustments in marks that took place behind the scenes.

As usual, some commentators have myopically focussed on the drop in the matric pass rate, from 76% (2014) to 71% (2015), as if this, in and of itself, were a meaningful indication of anything. It isn’t. Or as if it signalled a decline in quality, or harder exams. It doesn’t. Yes, the matric pass rate went down, but the number of learners passing went up. And in fact the real question might not be why the matric pass rate dropped, but why it didn’t drop further. If one compares the media statement from Umalusi with the technical report from the Department, the answer is quite clear. The decision was made to raise the raw marks across the board, from Maths and Physical Science to Life Science, Maths Literacy, History, Accounting, Geography and 24 other subjects. Umalusi themselves make a point of emphasizing that this was an “unprecedented set of adjustments”. When the Maths Literacy pass rate is adjusted from 38% to the final (and publicly reported) 71%, this is most certainly unprecedented, and, I would argue, unwarranted. Was the test really so much more difficult than in previous years? (This is the only reason why one is allowed to adjust the marks at all.) Why did the internal and external moderators not pick up the huge increase in difficulty? Is it not more plausible that the massive drop in pre-adjusted performance was actually due to the additional 112,000 weaker learners who would’ve otherwise dropped out? If so, Umalusi shouldn’t have adjusted.

This is not to say that the Minister was wrong in introducing the promotion policy. Quite the opposite; she was heeding local and international research which shows that excessive repetition is costly, inefficient and has no educational benefit to the learner. Yes, we do need to find ways of preventing and remediating the underlying problem, but rooting out wasteful repetition in the meantime is prudent and wise. One positive effect of this policy and the extra-large class of 2015 was that many more learners took and passed key subjects, with about 52,000 extra matric passes, 9,000 extra maths passes and 15,500 extra bachelor passes.

Both Umalusi and the Department claim that there were only 65,671 progressed learners. Yet there were an extra 111,676 matrics this year. So where did the other 46,005 extra learners come from? The clear answer is that there was a big policy change preventing schools from failing learners multiple times and encouraging them to promote weak learners and push them into matric. Secondly, the way provinces record and report who is a progressed learner is highly dubious and varies by province and district. So, although we have approximately 66,000 ‘officially’ progressed learners, we also have 46,000 ‘quasi-progressed’ learners (what Umalusi calls ‘borderline candidates’).

The reason all of this matters is that it influences the decision of whether to adjust the matric results and by how much. Umalusi is only ever meant to adjust the marks up or down if they believe the exam was harder or easier than in previous years. The core assumption in this standardisation process is that the different matric cohorts (the 2013, 2014 or 2015 matrics) are of equal ability. Thus, any differences between the years can only be because the paper was easier or harder. And this is where the crack emerges. There is simply no way that the 2015 distribution of 645,000 matrics (including progressed and quasi-progressed learners) is as strong as the distribution of 533,000 learners in 2014. Thus the reason the 2015 cohort did so much worse on the raw scores was the extra 112,000 weaker learners, not harder tests. We know that Umalusi did not take this into account because there is no way of identifying the 46,000 quasi-progressed learners. In Umalusi’s defence, they couldn’t have excluded them even if they had wanted to, because provinces didn’t record them. But it doesn’t seem Umalusi excluded these 112,000 (or even the 66,000) learners when they standardised the 2014 and 2015 distributions. This is illogical.

In an unusual change from previous media statements, this year Umalusi included the raw failure rates for subjects (i.e. before any adjustments). These can be compared to the marks in the technical report issued by the Department. The only differences between the two sets of figures are the Umalusi adjustment, a small change due to school-based assessment, and a small language compensation for second-language learners (an extra 4 percentage points). When I refer to ‘raw’ and ‘final-adjusted’ pass rates I mean before and after these are accounted for. The three subjects I will focus on here are Maths Literacy, Geography and Business Studies, since they all had big increases in enrolments, which suggests these were the subjects taken by the progressed and quasi-progressed learners. The differences between the raw pass rate and the final-adjusted pass rate are large for Geography (up from 66% to 77%), for Business Studies (up from 54% to 76%) and especially for Maths Literacy (from a shockingly low 38% to 71% after adjustments!). For a national assessment these are incredibly large adjustments.

This could only be justified if the 2015 exams were extraordinarily more difficult than those of 2014. I simply do not buy it. The internal and external moderators all agreed that these exams were set at the appropriate level. To warrant adjustments of this magnitude they would have had to be way out in their judgements. Why are we looking for alternative explanations for the big drop in raw marks when this one is staring us in the face? The most logical and obvious reason for the drop is the inclusion of an extra 112,000 weaker learners in 2015. Paper difficulty is marginal by comparison. In maths literacy alone there were 76,791 extra candidates in 2015. Where did these learners come from? It is clear that these are the weaker progressed and borderline candidates and that they are the main reason why the raw marks dropped so much. If so, then we cannot simply adjust the raw marks upwards, as was done this year.

The Umalusi standardisation process is necessary and probably the best we can do when different papers are written year-on-year, but Umalusi needs to clarify what happened here and in future be more transparent in their standardisation process. Unfortunately, no amount of standardisation can solve the biggest problem in our education system which is the fact that most children attending the poorest 75% of schools do not learn to read for meaning by the end of grade three and are forever behind. Indeed, matric starts in grade 1.

Dr Nic Spaull is an education economist at Stellenbosch University. He can be found on Twitter @NicSpaull and his work can be found at nicspaull.com

Guest blog: Gabi Wills on “Teacher union membership in SA”


Below is an extract from Gabi Wills’ forthcoming PhD thesis:

Chapter 4: Teachers’ unions and industrial action in South African schooling: Exploring their impacts on learning. In Wills (2015, forthcoming) “An economic perspective on school leadership and teachers’ unions in South Africa”.

Teacher union membership in South Africa

During apartheid, the provision of unequal education to race groups was an instituted policy mechanism to suppress the majority of South Africa’s black population. Most notoriously, black people were intentionally provided inferior education through the then ruling party’s “Bantu education”[1] policies. Separate education departments, divided along racial lines, implemented not only distinctive curricula for students but distinctive forms of authority over teachers. As noted by Chisholm (1999), control over white teachers was largely professional in nature: they were consulted in the formation of curricula and given a degree of autonomy in their work. By contrast, control over black teachers was intentionally bureaucratic and authoritarian, in line with state intentions for social control. Black teachers were closely monitored by inspectors, subject advisors and other representations of white subjugation. In the late eighties, however, large-scale political opposition arose to apartheid in general and its unjust education policies in particular (Govender, 2004). The association of bureaucratic controls over teachers with the apartheid state generated considerable teacher resistance which persists today.

As a rough estimate, two thirds[2] of all persons in education (including administrators, management, support staff and privately employed personnel in schools) are formally identified as members of a teacher union in South Africa. In absolute terms, this represents roughly 380 000 members using 2012 data, although membership rates and the choice of teacher union differ across provinces.


Figure 4.1: Teacher union membership in South Africa, 2012

If one limits the national teacher union membership estimate to teachers only, this estimate is likely to be higher. Armstrong (2014: 4), using the Labour Force Surveys between 2000 and 2007, identified that roughly 76 percent of teachers in South Africa are union members. What these national estimates do not capture is the interesting provincial dimension to union membership in the education sector, which is highest in provinces such as the North West, Limpopo, KwaZulu-Natal, Mpumalanga and the Eastern Cape, but notably lower in Gauteng Province and the Western Cape.

There are various teacher unions in South Africa, but by far the dominant union is the South African Democratic Teachers’ Union, most commonly referred to as SADTU. Audited 2012 figures indicate that their membership comprised roughly 253 000 personnel, which represents two thirds of all registered teacher union members. SADTU membership has also grown substantially over the past twenty years, with membership figures in 2012 that were 2.5 times those of 1996 (Govender, 2004).[3] A clear provincial dimension exists to SADTU affiliation. Their presence is strongest in the Limpopo Province, where figures from the Public Service Co-ordinating Bargaining Council suggest that 82 percent of all unionised education personnel in Limpopo are registered members of SADTU, compared with a figure of 48 percent in the Western Cape. The next largest teachers’ union is the National Professional Teachers’ Association of South Africa (NAPTOSA), with just over 50 000 members as at December 2012. Affiliation to this union is strongest in the Western Cape and the Gauteng Province when expressed as a proportion of unionised teachers in each province. These provincial differences in union membership are worth noting. They may have implications for differences in the balance of negotiating power across provincial chambers of the Education Labour Relations Council (ELRC) and in the functioning of provincial departments of education.

Considering the two largest teachers’ unions in South Africa, SADTU and NAPTOSA: both play a role in negotiating conditions of work for teachers in two sets of combined teachers’ unions[4] in the sector-specific ELRC. Both unions fulfil a primary function as bargaining agents for their members, although on the basis of sheer vote size SADTU’s influence in negotiations is considerably more substantive. However, in balancing their secondary functions as political and professional organisations[5] they diverge in their ideologies (Chisholm, 1999; de Clercq, 2013). The teacher unions represented in what is now NAPTOSA existed in the early days of apartheid, with typically white leadership and an agenda largely concerned with the professionalism of teachers. By contrast SADTU, having emerged in direct opposition to apartheid, is understandably more militant and political, and more concerned with the rights of the ‘worker’ than with promoting professionalism (Chisholm, 1999). Moreover, SADTU is an affiliate of COSATU (one of the three members of the tripartite ruling alliance), which prioritises their role as a political organisation over their function as a professional body. As a political organisation, their presence is extensive, and not only in terms of membership numbers: the organisational structure of the union facilitates an on-site presence across almost all school districts and in the majority of schools.

______________________________________________

[1] The Bantu Education Act of 1953 was designed by former Prime Minister H.F. Verwoerd. In his own words: “There is no place for [the Bantu] in the European community above the level of certain forms of labour. It is of no avail for him to receive a training which has as its aim, absorption in the European community” (Senate, 1954). The Bantu Education system was established to educate black youth only to a level where they could operate as labourers, workers and servants.

[2] See the notes to Figure 4.1 for a description of how this figure was estimated, relying on union membership figures from the Public Services Bargaining Council (PSBC). Calculating teacher unionisation rates with the available data in South Africa is not straightforward, because it is not obvious which groups of education personnel are included in the PSBC figures. On the basis of a priori expectations this estimate of 66 percent seems too low, but it must be noted that both the numerator and the denominator of the calculation include non-educator personnel such as provincial or district staff, school support staff and privately employed SGB or other staff members at the school level. If one were to limit the numerator and denominator to educators only, this figure might be higher if more educators than administrators are unionised. It is also noted that some studies have erroneously treated the teacher union membership figures reported by the PSBC as referring to teachers only, when non-teachers in the education sector are also included in these figures. For example, both SADTU and NAPTOSA attract to their membership base teachers in the public and private sector as well as other workers in the education sector. If this is not recognised, the result is over-inflated estimates of teacher unionisation, as high as 90 percent in some studies.

[3] The majority of the growth in SADTU’s membership took place between 1996 and 1999 when their membership base grew from 106 000 to nearly 200 000 three years later (Govender, 2004).

[4] At the ELRC, negotiations and consultation take place between the Employer (the DBE) and two sets of combined trade unions (CTU). The first is the CTU-SADTU, where SADTU’s membership vote weights are combined with those of the Cape Teachers’ Professional Association (CTPA). NAPTOSA’s bargaining power is established through the combined ‘Autonomous Teachers Union’ (ATU), which includes a number of smaller unions including the Suid-Afrikaanse Onderwysersunie (SAOU), the National Teachers’ Union (NATU), the Professional Educators Union (PEU), the Public Servants Association (PSA) and the Health and Other Services Personnel Trade Union of South Africa (HOSPERSA).

[5] As noted by Cowen and Strunk (2014), there are three main functions of teachers’ unions. The first and most dominant role is that of a bargaining agent for member teachers and the second role is that of a political organisation advocating for teachers. As a political organisation, their function is to act as an interest group, “active not only in promoting or opposing particular pieces of legislation or administrative policy, but also as a force in national, state and local elections” (ibid, 2014: 4). The third role is that of a professional organisation, providing support to individual teachers. In particular, where teacher unions embrace their role as a catalyst for the professionalization of the teaching force, this can yield very positive impacts for educational systems. However, this role is not widely explored in relation to its influence on student achievement and altering district/national resources for education (Cowen and Strunk, 2014: 4).

Q&A with Jonathan Jansen (SAERA SIG)


The Q&A with Professor Jonathan Jansen below first appeared in the SAERA “System-wide Educational Change” Newsletter.

Q&A with Jonathan Jansen

You have recently published two new books on educational change that have received critical attention from the public, How to Fix Schools (2014) and Leading for Change (2015). What is it about your thinking that has attracted so many readers?

First of all, everyone knows that there is something wrong in education: those who work in schools, those who send their children there, and the general public. So I think it helps attract a broad readership because of the focus on the issue in the media. Second, I think people don’t just read books, they read authors, and I have been in the public space a lot more in recent years, largely through my column in The Times, for example. This gives the books the kind of attention that they would not have gotten otherwise. Third, in both of these publications, I try not to deal with these concerns in abstract academic language but to make them as practical as possible and to be as insightful as possible for people who are not specialists. It is for them to enjoy and to spur them to action. And fourth, I try to insert a sense of hopefulness at the end of each book.

Much of your work centres on educational leadership at various levels, particularly the role of leaders in transforming relationships. Can you talk a little about how you conceptualise the relationship between leaders and leadership at institutional level and system-wide policy or programme driven change?

I have more or less given up on the ability of system leaders to be able to make change from the top. I have been around too long to believe that this is even possible.
As you know, in the early days of the transition, during and after the days of NEPI, I sat on committees of government when there was excitement about being able to say something worthwhile and a fair chance that you would be heard by the minister or the DG. I don’t have that sense anymore. My sense is that critics and public intellectuals are being shut out of these kinds of spaces. So I made a conscious decision to engage through visible practices of leadership. Sometimes these gain the attention of those in higher authority. I don’t think we have a government that is attentive enough or serious enough about educational change to allow practice and research to inform policy. For example, the most profound piece of practical research ever done in the history of this country is the National Development Plan. The diagnosis is profound. And yet, the government seems unable to absorb the good thinking and sound data into development actions. In saying this, I have not strayed too far from my work on the symbolic functions of policy. That is not to say that I am cynical or unpatriotic. Quite the opposite. There is limited time and space to make a difference. For me, I am able to do this, for example, by working on projects like How to Fix Schools, providing books and videos to all high schools in the country with training support where required.

This doesn’t require working directly with government at national, provincial or local levels on policy.

What do you consider to be the key research areas to focus on for young and mid-career researchers interested in leadership and system-wide educational change?

The first thing to be said to young and mid-career scholars working on system-wide educational change is that it is time for bold new ideas and approaches to the subject. In other words, instead of replicating the kinds of things senior scholars like myself, Pam Christie and Jo Muller have done, a new generation needs to ask different kinds of questions and experiment with new methodologies. The upcoming generation of scholars needs to provide new insights about the persistence of certain kinds of problems built into the system of education. Second, we need to move away from the fidelity perspective in policy studies. Too many policy studies still focus on how faithfully people implement this or that policy. When there is a big gap, they explain it as the consequence of ‘stubborn teachers’ or think that policies are ‘stupid’. We need to move beyond this fidelity approach to policy studies and introduce new questions about the relationship between policy and practice. This will take us away from stale formulations. Third, I also think we need to move away from a pre-occupation with analyses of exceptional cases. When we do use them, we need to take a fresh look at how they are used. Finally, I think we need to shift our focus from the problem of change to the problem of continuity.

The System-wide Educational Change SIG has had its inaugural meeting this year in Bloemfontein at SAERA. What do we need to do as an emerging research community to build scholarship in our field?

We need to get more people with great potential into the field. That must be the number one priority. We need to connect them with scholars and scholarship around the world. It is not sufficient that they do their PhD studies at UCT, Wits or the Free State. New scholars with great potential need to spend time with people like Alma Harris in the UK or Andy Hargreaves at Boston College. The capacity to speak to local problems in intellectually powerful ways means knowing about and being fluent with educational change discourse and practices in other parts of the world. This would allow us to be theoretically courageous. As you know, I have often said that you cannot get very far at UCT without referencing Bernstein or Bourdieu, at UWC without referencing Marx and Freire, and at Stellenbosch without van der Walt or van der Stoep. Getting out will allow people to emerge from the deeply entrenched tracks of theory and method. Broadening intellectual life requires being international and comparative in focus. Advanced scholars these days need more than a PhD, and for a good post-doc experience they need wider exposure.

We know that you tweet and have a strong presence on YouTube. How should our research community be engaging with the new media space?

This is an important issue, less for myself than for a new generation of scholars. I really cannot see how a new PhD scholar in the post-2015 world can work unless they are connected to a whole lot of sites and platforms for their day-to-day research. Online professional networks provide a platform for your ideas, for your CVs, for the circulation of your research papers. These new media spaces allow wider audiences to actively engage with your ideas and scholarship. The people I spoke about earlier, Alma and Andy, use Twitter to give the latest updates on the papers they are presenting at conferences in Malaysia and Australia. I’m amazed at how a bunch of us get tagged and can engage with their ideas through the simple medium of Twitter. I have just recorded a series of lectures on leadership and research that will go on YouTube. And we will see the responses from around the world. We are certainly not going to get far unless there is a consciousness about social media and the technical ability to navigate these new spaces.

What would you want to see us doing with the SAERA SIG newsletter?

I’m excited about the potential of the newsletter. It can allow us to know more about what everyone is doing around the country. I often get surprised about research that is going on in some institution or another, and I think of myself as being in touch. The newsletter is a forum for sharing new ideas, new questions, and new methodologies. It can also showcase the experts in specific areas. For example, Brigette Smit knows a lot about computerised qualitative data analysis. It also provides a space to highlight what is happening around the world. The newsletter can also stimulate a dialogue between up-and-coming scholars and established academics. I’m amazed at how many emails I get from young scholars asking advice and requesting to be mentored. The newsletter would provide an organised platform for this.

Links I liked (and some personal reflections)


  • Taylor, N. 1989. Falling at the First Hurdle: Initial encounters with the formal system of African education in South Africa. Research Report #1. EPU. (via JET Education). – an old but important report that is not in the public domain yet (as far as I’m aware) – thanks JET for scanning this.

  • “Improving learning in primary schools of developing countries: A meta-analysis of randomized experiments” – Patrick McEwan (2015) (via Servaas van der Berg).
  • The independent Task Team led by Prof John Volmink, which was appointed to look into the ‘jobs-for-cash’ scandal exposed by CityPress last year, has found that SADTU has a ‘stranglehold‘ over the State in provinces such as the Eastern Cape, Limpopo, Mpumalanga and KwaZulu-Natal. These scandals sometimes turn deadly when the ‘right’ candidate is not appointed. On this topic I would highly recommend Gabi Wills’ new article “Informing principal policy reforms in South Africa through data-based evidence.” To give you the highlight: The cohort of principals that are currently in the system are, on average, much older than they were in the past meaning that there is soon to be a wave of principal retirements. Whereas in 2004 only 17% of principals were aged 55yrs+, in 2012 that figure was 33%! If these principals retire at 60 this means that between 2012 and 2017 there will be about 7000 principal replacements! (remember there are only about 24,000 public schools in SA).
  • This latest report shows that the South African Council of Educators (SACE) is a toothless dog, as I have argued before. Earlier this year SACE ran their own investigation into the exact same jobs-for-cash scam and could not find “a single bit of evidence” that there was corruption in the appointment of teachers and principals in SA. Subsequently CityPress has claimed that SADTU ‘told SACE to end their investigation’ after the names of top SADTU officials started cropping up in the investigation. So how is it that SACE ran an investigation on the same issue at the same time and found no evidence, while Volmink’s team found multiple examples of corruption, 13 of which were so strong that they could already be passed on to the police? Go figure. Minister Motshekga needs to put a target on SACE and reform the entire organization. It is rotten through and through.
  • Holstee have come up with a set of 10 questions to ask yourself about the year that was. Reflection. Contemplation. Good stuff.
  • I’m re-reading Henri Nouwen’s “Reaching Out” – the book where he outlines his understanding of spirituality from the Christian perspective. It’s lovely, not too preachy or crispy-clean / three-bags-full-sir Christianity, which I have little tolerance for. One quote:

“When loneliness is haunting me with its possibility of being a threshold instead of a dead end, a new creation instead of a grave, a meeting place instead of an abyss, then time loses its desperate clutch on me. Then I no longer have to live in a frenzy of activity, overwhelmed and afraid of the missed opportunity” – Anonymous in Nouwen’s Reaching Out p35

All models are wrong but some are useful.

— George Box (via Farnam Street Brain Food)

I am really enjoying poetry for the first time in a long time…

“I said to my soul, be still and wait without hope, for hope would be hope for the wrong thing; wait without love, for love would be love of the wrong thing; there is yet faith, but the faith and the love are all in the waiting. Wait without thought, for you are not ready for thought: So the darkness shall be the light, and the stillness the dancing.” – T.S. Eliot

Also Pablo Neruda.

It was also my birthday last month which started in tears and ended in champagne with a view! Ad Astra Per Aspera!

[Photo]

Photo credit: Michael Chandler (@MrChandlerHouse)