What have statistics got to do with the coalition’s education policy?
This isn’t an article about whether the coalition’s education policies are working – it will take longer to get enough statistics for that one. Instead, it is an article about schools minister Nick Gibb getting in a statistical muddle. He’s had a bad six months.
This isn’t just a misplaced typo here and there. This is about using statistics to support a specific policy issue – and getting it wrong.
Spring marks the stressful announcement of secondary school place allocations, and around one in six children did not get their first choice of school this year. The Daily Mail quoted Mr Gibb in April: ‘These figures expose the fact that there simply aren’t enough good schools.’ They do nothing of the sort.
If parents are faced with a choice between an outstanding school and a good school, it would be reasonable to pick the former as first choice. If many parents think the same, a number of children are likely to end up in the nearby ‘good’ school instead of the outstanding one. It’s not their first choice, but that doesn’t mean they’ve ended up in a bad school.
My first choice of holiday might be the Bahamas; it doesn’t mean my second choice is a disappointment. Now, it may well be true that there are not enough ‘good schools’ – but can you ever have enough of a good thing? Either way, this is not the statistic to use to shed light on the matter of school performance.
A second example sees Mr Gibb at loggerheads with his professionals. In August, a data release revealed that around 900 pupils were suspended each day for attacks or verbal abuse. Mr Gibb worried it revealed weak discipline in schools; the Association of School and College Leaders argued it demonstrated strong discipline. Who was right?
They were both wrong. The number of suspensions, on its own, tells you nothing about whether discipline is strong or weak. To judge that, we would need more information, for instance an assessment of how many ‘suspension-worthy’ activities happened in the first place and how tightly they were tackled (whatever that means). This is difficult data to obtain. Nonetheless, even if a phenomenon is hard to analyse, that is no excuse for using inappropriate data to draw conclusions about it.
A final example is the most important. The expansion of Labour’s academies programme is a key brick in building the coalition’s school of the future. Academies are free from local authority control, accountable directly to ministers. As described to parliament, the idea that schools with more autonomy achieve better results is central to this ambition, an idea seemingly borne out by the performance of academies under Labour.
However, Labour’s academies were to be sponsored schools: new partnerships brought in to turn around failing schools when all else had failed. The new academies are not the same – they are not low-performing schools. In fact, it is schools rated outstanding or good that are being actively encouraged to become academies and break off their local chains.
The coalition is hungry for statistics that show its academies are working. Last month it found some.
Provisional 2011 GCSE results show academies with a 5.3 percentage point increase in pupils achieving 5 or more GCSEs including English and Maths, vs a 2.6 percentage point increase when averaged across all state-maintained schools. Mr Gibb was proud: ‘Academy status gives professionals the freedom they need to do their job and today’s figures show that that autonomy works.’
The only problem is that these figures do not show that autonomy works. They certainly demonstrate a difference between academies and other state schools, but they don’t suggest why this is the case. It may be autonomy. It may not. Here are a couple of alternatives.
Could it be the role of external sponsors? The comparison used by the DfE is not between ‘all academies’ and ‘all state schools’ but only those 166 academies for which two years of data exist – i.e. the sponsored academies of the Labour government. Could it be the role of the sponsor, who typically brings new energy, money and expertise to support a struggling school, that makes up some of this difference?
More likely is the fact that these schools have different starting points. These academies were low-performing schools – and their 2010 results show it. Academies started with 40.6 per cent of pupils achieving the target in 2010, whereas all maintained schools started from 55.2 per cent.
Is it possible that it is easier to improve grades when you start from a low base? The statistics blog this article draws on (see below) certainly thinks it’s a possibility. In fact, if the analysis it provides is correct, it suggests the improvement can be entirely explained by this difference in starting point.
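The low-base effect is easy to demonstrate in miniature. The following is a toy simulation of my own – not the blog’s actual analysis, and the numbers (3,000 schools, the noise levels, the 45 per cent cut-off) are invented for illustration. Each school gets a stable underlying pass rate, plus random year-to-year noise; we then select the schools that looked weakest in 2010, much as Labour’s sponsored academies were drawn from the low performers, and watch their average ‘improve’ with no intervention at all.

```python
import random

random.seed(42)

# Illustrative assumptions only: each school has a stable underlying
# pass rate (mean 55, sd 10), and each year's measured figure adds
# independent year-to-year noise (sd 6).
n_schools = 3000
true_rate = [random.gauss(55, 10) for _ in range(n_schools)]
year_2010 = [t + random.gauss(0, 6) for t in true_rate]
year_2011 = [t + random.gauss(0, 6) for t in true_rate]

# Select the schools that looked weakest on the 2010 figures --
# a crude stand-in for how sponsored academies were chosen.
low_base = [i for i in range(n_schools) if year_2010[i] < 45]

def mean(xs):
    xs = list(xs)
    return sum(xs) / len(xs)

gain_low = mean(year_2011[i] for i in low_base) - mean(year_2010[i] for i in low_base)
gain_all = mean(year_2011) - mean(year_2010)

# Nothing changed between the two years, yet the low-base group
# records a healthy gain while the overall average barely moves:
# regression to the mean, not policy.
print(f"low-base schools: {gain_low:+.1f} points")
print(f"all schools:      {gain_all:+.1f} points")
```

In this toy world the ‘academies’ outperform the average by several percentage points purely because they were selected on a bad year. It does not prove that is what happened in the real data – only that a headline gap of this kind is exactly what selection on a low base would produce anyway.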
It may be autonomy that’s all-important in making schools perform better, but these statistics don’t help us decide. It becomes, at this stage, a matter of judgement – of political opinion – as to what is really making the difference.
Why do these statistics matter? After all, the elected government has a right to its political judgement and to make political decisions, regardless of statistics. We all know how the ‘damned lies’ wear many guises.
It comes down to evidence-based policymaking. If statistics are used as an afterthought to support a given policy position, or tortured into supporting conclusions they shouldn’t, they lend a specious strength to policies that are, in truth, based on political judgements. It would be more honest to let those political judgements stand on their own strengths, without a muddle of statistics distracting from them.
—————————————————————————————
This article draws on individual reports from the peer-reviewed statistics website, www.statistics-anonymous.com. For more detail behind the concerns described here, please read the original reports as linked throughout this article.
—————————————————————————————
Actually, in areas like mine if you choose an Outstanding school as first choice then the Good school won’t let you in. You have the option of either risking choosing an Outstanding school and ending up with a Bad school or going for a Good school in the first place.
On a separate note, you can always find stats to back an argument one way or the other… the fact is academies get better statistical results because they (1) push the kids towards less academic subjects, and (2) get their students to take GCSEs in Year 10 and then spend Year 11 doing retakes. I don’t approve of the government taking an active policy of promoting schools that game their statistics in this way.