Are ministers continuing to misuse data in promoting their favoured policies?
Nick Gibb, the schools minister, seems not to have taken on board the implications of a recent letter from the statistics watchdog, stemming from one of my previous CPRT blogs, about how primary schools’ test results should be interpreted and presented.
When the provisional 2015 Key Stage 2 results for England were published last week, Mr Gibb was quoted in the DfE’s press release celebrating big gains overall in average results since Labour left office in 2010. The minister also highlighted, again, the performance of academies, and in particular that of sponsored academies – typically struggling schools whose management is transferred to an outside body which signs a contract with the Secretary of State – as improving faster than the national average.
However, in doing so he ignored a warning from the UK Statistics Authority (UKSA) about over-interpretation of data. The DfE release also seemed to be heavily skewed in favour of a particular narrative, when, as I suggest below, other interpretations are available. And the national data themselves seem to raise questions about what, in reality, has driven the big recent jumps in pupil performance.
That UKSA intervention was prompted after I wrote my CPRT blog in February and followed it up with one for NAHT which argued that seemingly big improvements in sponsored academy KS2 results last year may have had nothing to do with academy status. Rather, I argued, they seemed to follow a national trend, whereby schools of all types with low statistical starting points had improved faster than the national average.
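A brief statistical aside may help here. One mechanism that can produce exactly this pattern with no intervention effect at all is regression to the mean: schools scoring unusually low in one year tend, on average, to score closer to the mean the next. The Python sketch below is purely illustrative, using invented numbers rather than DfE data; it simulates schools whose underlying performance never changes, labels the lowest scorers ‘sponsored’, and still finds that group improving faster than the rest.

```python
import random

random.seed(1)

# A minimal sketch, not DfE data: 1,000 schools with fixed underlying
# performance; observed results each year add random year-to-year noise.
true_scores = [random.gauss(78, 5) for _ in range(1000)]

def observe(scores):
    """One year's observed results: underlying performance plus noise."""
    return [s + random.gauss(0, 4) for s in scores]

year1 = observe(true_scores)
year2 = observe(true_scores)  # nothing about any school has really changed

# 'Sponsor' the bottom 10 per cent on year-1 results, mimicking the fact
# that it is struggling schools which become sponsored academies.
cutoff = sorted(year1)[len(year1) // 10]
sponsored = {i for i, s in enumerate(year1) if s <= cutoff}
others = [i for i in range(len(year1)) if i not in sponsored]

def mean_gain(indices):
    """Average year-on-year change for the given group of schools."""
    return sum(year2[i] - year1[i] for i in indices) / len(indices)

print(f"'Sponsored' schools: {mean_gain(sponsored):+.1f} points")
print(f"All other schools:   {mean_gain(others):+.1f} points")
```

On almost any random seed, the ‘sponsored’ group comes out ahead purely because it was selected for a low starting point – which is why, as UKSA advised, faster improvement among sponsored academies cannot by itself establish a causal link with school type.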
I wrote to the UKSA citing the two blogs and arguing that a DfE statistical release published in December 2014, on which ministers had relied to support their academies policy, should have investigated whether improvements in sponsored academy results reflected not the schools’ governance structures but simply a broader statistical trend affecting schools of all types.
Ed Humpherson, UKSA director general for regulation, wrote to DfE in July to suggest that while ministers were entitled to use the contents of DfE’s December 2014 statistical release when they commented on the academies policy, the paper itself should have made clear that ‘the differences in the rates of improvement [of academies versus other schools] were not necessarily caused by school type.’ He also recommended that future statistical publications should see DfE ‘commenting on limitations’ when interpreting these statistics, in order to ‘make it clearer to Ministers and to other users that the statistics could not be used to infer a causal link between school type and either attainment or rates of improvement.’
Last week came the first test of how DfE and ministers would react to this advice, with the first statistical publication revealing this year’s KS2 results, and the accompanying press release. Mr Humpherson’s warning seems to have been taken on board to some extent in the DfE statistical release, but – perhaps unsurprisingly – not at all by the minister.
The new DfE statistical release has a section on academy performance, as was the case last year. Again, it notes how sponsored academies improved faster than the average for all schools. This time, though, it says that when interpreting differential rates of improvement between types of school ‘it should be noted that the extent to which a school improves is related to a range of factors. Schools with the lowest previous outcomes tend to see the largest improvements…’
For me, this does not go far enough in stating clearly, in line with UKSA’s advice, that differences in improvement rates between schools of different types may have nothing to do with whether the institution is an academy.
Yes, this extra line of interpretation is an improvement on last year, and in that sense should be welcomed. However, it appears not to have been clear enough for Mr Gibb, whose press release claims: ‘The results…show that sponsored primary academies…are improving more quickly than those run by local authorities.’ Most controversially, Mr Gibb is also quoted as saying: ‘These results vindicate our decision to expand the valuable academies programme into primary schools.’
So, Mr Gibb is inferring a causal link between school type and results, seemingly against the advice of the UKSA.
As mentioned in previous blogs, this is not a purely political or statistical debate with only abstract implications. No, this possibly erroneous and misleading interpretation is likely to have profound consequences on the ground, as struggling primary schools are pushed, often controversially, towards sponsored academy status on evidence which still seems dubious.
Of course it may be that this year’s sponsored academy results do not fit the statistical pattern of previous years. It may be that they have improved substantially, while other previously low-performing local authority schools have not. We will not know for sure if that is the case until all school-by-school results are published towards Christmas. But such a phenomenon seems unlikely, based on what has happened in the recent past.
We also already have further data for 2015 which cast Mr Gibb’s pronouncements in the press release in a somewhat different light from that intended. In the DfE release, Mr Gibb talks not only of major improvements since 2010, with 90,000 more pupils achieving the expected levels in maths and literacy, but also of the results in different local authority areas. The narrative with regard to the latter is almost entirely negative. In fact, throughout this release, the only messages to come through are that ministers and their policies are proving successful; that the types of schools favoured by ministers in their reforms are proving successful; and that particular local authorities – yes, that’s government, but not the national government presided over by ministers – are underperforming and so are facing a ‘crackdown’.
Remarkably, there is no mention at all that other actors in this annual statistical drama – children, their schools and teachers, and their parents – may have played a part in improving results.
In relation to local authorities, the release features a table of ‘best performing local authority areas’ and ‘worst performing local authority areas’, but the text focuses only on the latter, with Mr Gibb promising to write to directors of LAs at the bottom of the rankings to get them to ‘explain how they intend to improve the teaching of reading and arithmetic in the primary schools under their control’.
There are several ways to unpick that last phrase, by the way. For example, do local authority directors really have much influence over teaching content? Is ‘arithmetic’ all that mathematics amounts to now? Have local authorities really ‘controlled’ schools since the 1988 Education Reform Act, which the Conservatives introduced supposedly to end such control? But we must move on.
The interesting thing is that, within these latest statistics, DfE did publish LA-by-LA figures which point to some large improvements in recent years. Two authorities have improved their headline percentage of pupils achieving level 4 in reading, writing and mathematics by 12 points since 2012. In Hull, the figure rose from 67 per cent in 2012, well below the national average, to 79 per cent, just below the national figure. In Portsmouth, the gain was also 12 points, from 65 to 77 per cent. Another five authorities – Redcar, Herefordshire, Suffolk, East Sussex and Hounslow – improved by at least nine percentage points across the three years. Overall, five of the seven fastest-rising authorities on this measure had below-average results in 2012, and so have either closed the gap with the national average or surpassed it.
Some of them, including Hull, it is true, do have higher than average numbers of academies. Yet outside one very small authority – Rutland, where performance tends to jump around from year to year – the fastest-rising LA from 2014 to 2015 on this headline measure was South Tyneside, where results surged by seven percentage points. DfE data reveal that South Tyneside has only one sponsored primary academy. Meanwhile, the academy chain widely seen as the most successful in England – Ark Schools – posted average headline results which, at 72 per cent, were a point lower than those of the lowest-performing local authorities nationally. Will Mr Gibb now be writing to Ark?
It is possible, then, to see from the above statistics how an alternative narrative could have been crafted, perhaps based on ministerial praise for local authority areas which have risen on the Government’s chosen measures. As ever, interpretation of statistics can depend on what the interpreter chooses not to highlight.
One final set of questions presents itself from the press release’s statistics. What do the last few years of generally improving national data actually mean?
Of course, the implications of the press release, as voiced by Mr Gibb, are clear. Results have improved strongly since 2010. This shows, said Mr Gibb, that ‘the government is delivering on its one nation vision for education’ and that ministerial policies are paying off. The national data behind this claim show that the proportion of pupils achieving the expected level 4 in all of reading, writing and maths rose from 62 per cent in 2009 to 80 per cent this year.
But to repeat: why has this happened? I’m not convinced that any of the three policies listed in the DfE press release – introducing higher floor targets, banning calculators from maths tests and introducing a spelling, punctuation and grammar test – have been entirely behind it.
And perhaps the most obvious change a government can make to teaching and learning – the introduction of a new national curriculum – cannot have contributed here, as none of the pupils taking the 2015 tests had been taught under the new national curriculum introduced in 2014.
So it is a bit of a mystery. Perhaps readers of this blog can explain why the figures have jumped. I am certainly curious about them, and would like to investigate further. For if anything is to be underlined from recent ministerial interpretations of figures, it is the need continually to ask questions.
Warwick Mansell, one of CPRT’s regular bloggers, is a freelance journalist and author of ‘Education by Numbers: the tyranny of testing’ (Methuen, 2007).
This is not the first time that our bloggers have had cause to challenge the government’s use of evidence. Click here for further comment.