Has DfE, including its supposedly public-minded official statisticians, been misusing data in its drive to force its favoured policy of academy status on primary schools?
The question arises because I have performed an analysis that seems to cast serious doubt on a key statistic used by a minister to defend the academies scheme.
On February 2nd, education minister Nick Gibb was confronted on BBC Radio 4’s Today programme with the findings of a report by the cross-party House of Commons Education Select Committee. The committee, following an inquiry on academies and free schools, had concluded the previous week: ‘We have sought but not found convincing evidence of the impact of academy status on attainment in primary schools.’
The minister responded that sponsored academies – generally previously struggling schools which are taken over by a ‘sponsor’ entering into a contract with the Secretary of State to run the school – were improving faster than the national average.
He said: ‘We do know sponsored academies do improve standards of education in our schools. If you look at the primary sponsored academies, they’ve seen their reading, writing and maths results improve at double the rate seen across all schools.’ He added: ‘Primary [sponsored academies]…have seen their reading, writing and maths results improve at double the rate of local authority schools.’
This seemed to mark a change of position for the DfE, which less than a year ago concluded, in its publication Academies: research priorities and questions, that ‘The research evidence [is] primarily based on secondary schools and with more and more primary schools becoming academies, further evidence is needed on what drives those schools to become academies and what makes them viable and sustainable.’
So was Mr Gibb’s statement accurate? On investigation, it became clear that the source was the DfE’s Statistical First Release which accompanied the publication of primary league tables on December 12th, 2014. The document is headed with the reassuring logo ‘National Statistics’. It says: ‘Attainment in sponsored academies increased by 7 percentage points between 2013 and 2014, compared to 3 percentage points in converter academies and LA maintained schools.’
This statement seemed factual enough. But doubts began to surface in my mind after I dug a little further into the data.
So, the 420 sponsored academies included in the statistic did indeed improve faster than the national rate for other schools between 2013 and 2014, rising seven percentage points in the proportion of their pupils achieving level 4 in all of reading, writing and maths, from 61 to 68 per cent. By contrast, the 13,396 non-academy (local authority) schools rose three points, from 77 to 80 per cent, while, among 1,006 converter academies – generally previously successful schools choosing to take on the status – the rise was from 80 to 83 per cent.
The immediate question, though, was whether like was really being compared with like. With both other types of school starting from higher average scores in 2013, sponsored academies would appear to have had more room for improvement.
Another way of looking at that is to say that, clearly, the closer a school gets to 100 per cent of its children achieving level 4s in all three subjects, the less scope it has to improve on this measure; at 100 per cent, it has no scope at all.
This would seem to be a basic statistical point. Yet it was not acknowledged anywhere in this statistical release that the higher rate of progress might be at least in part a product of sponsored academies starting from a lower base. The comparison used in the release, then, might be deemed invalid. Without further information it certainly looked potentially misleading.
The fairer comparison, then, would be to look at schools with the same statistical starting points. In other words, among schools averaging 61 per cent in 2013, did sponsored academies or non-sponsored academies improve faster?
Again, there is no mention of this potential statistical comparison in the release. So I have now performed this data analysis myself, based on the DfE’s official underlying school-by-school assessment data.
Staggeringly, this seems to show that, when schools with the same starting points in 2013 are compared, sponsored academies fared worse than a comparison group of primaries in 2014.
I am not a professional statistician, and the analysis below is rudimentary. But I did it in two ways. First, I looked at all schools which, in 2013, had exactly 61 per cent of their pupils achieving at least level 4 in reading, writing and maths. Remember, this was the average figure for sponsored academies in 2013.
This yielded 113 primaries: six sponsored academies, two converter academies and 105 non-academy state schools. Among the 107 that were not sponsored academies, results improved to 70.7 per cent in 2014, a rise of 10 percentage points.
Second, I widened the comparison group to include a much larger number of primaries: those which had results, in 2013, ranging from 56 to 66 per cent. Again, I made sure that this sample, of 1,650 schools, had an average result of 61 per cent in 2013.
What was the outcome? Well, the schools which were not sponsored academies improved on average to 72 per cent. So that’s an 11 point improvement, compared to a seven point gain in sponsored academies. (The 11 point gain included figures for academy converters; with those removed from the sample, non-academy maintained schools – ‘local authority schools’ in Mr Gibb’s phrase – went up 10 percentage points, again higher than the seven points of sponsored academies.)
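For readers who want to try reproducing this, the sketch below shows roughly how the two checks could be run in Python with pandas, assuming the DfE’s underlying school-level results have been saved as a CSV. The file name and column names used here are hypothetical and would need adjusting to match the published data; this is an illustration of the method, not the exact code behind my figures.

```python
import pandas as pd

# A minimal sketch of the two checks described above, assuming the DfE's
# underlying school-level data is saved as a CSV. The column names
# ('school_type', 'pct_l4_rwm_2013', 'pct_l4_rwm_2014') are hypothetical;
# the real DfE files use different field names.
df = pd.read_csv("dfe_primary_results.csv")

def summarise(group, label):
    """Print the group's average 2013 score and its 2013-to-2014 gain in points."""
    start = group["pct_l4_rwm_2013"].mean()
    gain = (group["pct_l4_rwm_2014"] - group["pct_l4_rwm_2013"]).mean()
    print(f"{label}: n={len(group)}, 2013 mean {start:.1f}%, gain {gain:.1f} points")

# First check: schools scoring exactly 61 per cent in 2013.
exact = df[df["pct_l4_rwm_2013"] == 61]
summarise(exact[exact["school_type"] != "sponsored academy"], "exact 61%, not sponsored")

# Second check: a wider band of 56 to 66 per cent in 2013.
band = df[df["pct_l4_rwm_2013"].between(56, 66)]
summarise(band[band["school_type"] != "sponsored academy"], "56-66% band, not sponsored")
summarise(band[band["school_type"] == "sponsored academy"], "56-66% band, sponsored")
```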
So my research seems to point to the opposite conclusion to that of the DfE’s official statistical publication: sponsored academy results rising less quickly than those of a comparison group.
It would have been easy for the DfE’s professional statisticians to have published a similar assessment. But they did not. Nor did they publish any statistical caveats about the sponsored academy-to-national-average improvement comparison they chose to use.
Why does this matter? It seems to me to be very important on the ground, where I hear regularly of communities campaigning against academy status being forced on them by DfE, in the face of claims by ministers that this should be the only option for school improvement.
Indeed, DfE guidance says that, in schools deemed inadequate, ministers’ ‘expectation’ is always that they should become sponsored academies.
Last month, David Cameron went further, proposing that thousands of schools deemed by Ofsted not to be inadequate but merely to ‘require improvement’ should become sponsored academies in the event of a new Conservative government.
But if statistical evidence on an area absolutely central to the current political debate about education is being made to say the opposite of what a reasonable person, comparing like with like, might think the data actually tell us, we have serious problems. Is evidence being made to fit policy, rather than vice-versa?
I’d make one final point. In a recent article, Cambridge Primary Review Trust chair Robin Alexander wrote: ‘Deep and lasting improvements in England’s education system will be secured only when, in their discourse and their handling of evidence, policymakers practise the best that has been thought and said rather than preach it, exemplify the educated mind rather than demean it.’
It is staggering that the DfE’s statistical publication was first released without the basic caveats and checks which would be expected of statistics students completing their assignments, and was then endorsed by a minister of education. And a minister of education with an enthusiasm for mathematics, at that. What kind of an example does this set for pupils? We all deserve better.
I invited DfE to comment on the content of this blog but it did not respond.
Warwick Mansell is a freelance journalist and author of Education by Numbers: the tyranny of testing (Methuen, 2007).
The stance of ministers and DfE towards evidence has been a constant concern of CPRT and CPR, as it must be for any organisation that cares about the probity and efficacy of education policy and seeks to generate the kind of evidence that a well-founded education system requires. CPRT’s concern is shared by the House of Commons Education Select Committee, which in 2014 launched an online enquiry into DfE’s use of evidence. Robin Alexander’s submission to this enquiry was published as the CPRT blog on 19 December.