The Cambridge Primary Review Trust


September 4, 2015 by Warwick Mansell

Test of truth

Are ministers continuing to misuse data in promoting their favoured policies?

Nick Gibb, the schools minister, seems not to have taken on board the implications of a recent letter from the statistics watchdog, stemming from one of my previous CPRT blogs, about how primary schools’ test results should be interpreted and presented.

When the provisional 2015 Key Stage 2 results for England were published last week, Mr Gibb was quoted in the DfE’s press release celebrating big gains overall in average results since Labour left office in 2010. The minister also highlighted, again, the performance of academies, and in particular that of sponsored academies – typically struggling schools whose management is transferred to an outside body which signs a contract with the Secretary of State – as improving faster than the national average.

However, in doing so he ignored a warning from the UK Statistics Authority (UKSA) about over-interpretation of data. The DfE release also seemed to be heavily skewed in favour of a particular narrative when, as I suggest below, other interpretations are available. And the national data themselves raise questions about what, in reality, has driven the big recent jumps in pupil performance.

That UKSA intervention came after I wrote my CPRT blog in February and followed it up with one for NAHT arguing that seemingly big improvements in sponsored academy KS2 results last year may have had nothing to do with academy status. Rather, I argued, they seemed to follow a national trend whereby schools of all types with low statistical starting points had improved faster than the national average.

I wrote to the UKSA citing the two blogs and arguing that a DfE statistical release published in December 2014, on which ministers had relied to support their academies policy, should have investigated whether improvements in sponsored academy results came not as a result of the schools’ governance structures, but simply reflected a broader statistical trend for all types of schools.

Ed Humpherson, UKSA director general for regulation, wrote to DfE in July to suggest that while ministers were entitled to use the contents of DfE’s December 2014 statistical release when they commented on the academies policy, the paper itself should have made clear that ‘the differences in the rates of improvement [of academies versus other schools] were not necessarily caused by school type.’ He also recommended that future statistical publications should see DfE ‘commenting on limitations’ when interpreting these statistics, in order to ‘make it clearer to Ministers and to other users that the statistics could not be used to infer a causal link between school type and either attainment or rates of improvement.’

Last week came the first test of how DfE and ministers would react to this advice, with the first statistical publication revealing this year’s KS2 results, and the accompanying press release. Mr Humpherson’s warning seems to have been taken on board to some extent in the DfE statistical release, but – perhaps unsurprisingly – not at all by the minister.

The new DfE statistical release has a section on academy performance, as was the case last year. Again, it notes how sponsored academies improved faster than the average for all schools. This time, though, it says that when interpreting differential rates of improvement between types of school ‘it should be noted that the extent to which a school improves is related to a range of factors. Schools with the lowest previous outcomes tend to see the largest improvements…’

For me, this does not go far enough in stating clearly, in line with UKSA, that differences in improvement rates between schools of different types may be nothing to do with whether the institution is an academy or not.
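The statistical mechanism at work here is regression to the mean, and it is easy to demonstrate. Below is a minimal sketch (illustrative only – the numbers are invented, not DfE data) which simulates schools of identical underlying quality with noisy annual results, labels the lowest year-one scorers ‘sponsored’ to mimic the way sponsorship targets struggling schools, and shows that this group ‘improves’ fastest even though no intervention of any kind has taken place:

```python
import random

random.seed(1)

# Simulate 1,000 schools with identical underlying quality on average,
# plus year-to-year measurement noise in their observed test results.
# All figures are invented for illustration; none are DfE statistics.
schools = []
for _ in range(1000):
    quality = random.gauss(70, 5)           # underlying performance, %
    year1 = quality + random.gauss(0, 5)    # observed result, year 1
    year2 = quality + random.gauss(0, 5)    # observed result, year 2
    schools.append((year1, year2))

# Label the bottom 10% of year-1 scorers 'sponsored', mimicking the way
# sponsored academy status is given mainly to struggling schools.
schools.sort(key=lambda s: s[0])
cutoff = len(schools) // 10
sponsored, others = schools[:cutoff], schools[cutoff:]

def mean_gain(group):
    """Average year-on-year change in observed results."""
    return sum(y2 - y1 for y1, y2 in group) / len(group)

print(f"Low starters ('sponsored'): {mean_gain(sponsored):+.1f} points")
print(f"All other schools:          {mean_gain(others):+.1f} points")
# The low starters gain several points on average while the rest barely
# move - purely because of regression to the mean, not any intervention.
```

On this logic, faster-than-average improvement among schools selected for low prior attainment is exactly what we would expect to see whether or not academy status has any effect at all.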

Yes, this extra line of interpretation is an improvement on last year, and in that sense should be welcomed. However, it appears not to have been clear enough for Mr Gibb, whose press release claims: ‘The results…show that sponsored primary academies…are improving more quickly than those run by local authorities.’ Most controversially, Mr Gibb is also quoted as saying: ‘These results vindicate our decision to expand the valuable academies programme into primary schools.’

So, Mr Gibb is inferring a causal link between school type and results, seemingly against the advice of the UKSA.

As mentioned in previous blogs, this is not a purely political or statistical debate with only abstract implications. No, this possibly erroneous and misleading interpretation is likely to have profound implications on the ground, as struggling primary schools are pushed, often controversially, towards sponsored academy status on evidential grounds which still seem dubious.

Of course it may be that this year’s sponsored academy results do not fit the statistical pattern of previous years. It may be that they have improved substantially, while other previously low-performing local authority schools have not. We will not know for sure if that is the case until all school-by-school results are published towards Christmas. But such a phenomenon seems unlikely, based on what has happened in the recent past.

We also already have further data for 2015 which cast Mr Gibb’s pronouncements in the press release in a somewhat different light from that intended. In the DfE release, Mr Gibb talks not only of major improvements since 2010, with 90,000 more pupils achieving the expected levels in maths and literacy, but also of the results in different local authority areas. The narrative with regard to the latter is almost entirely negative. In fact, throughout this release the only messages to come through are that ministers and their policies are proving successful; that the types of schools favoured by ministers in their reforms are proving successful; and that particular local authorities – yes, that’s government, but not the national government presided over by ministers – are underperforming and so are facing a ‘crackdown’.

Remarkably, there is no mention at all that other actors in this annual statistical drama – children, their schools and teachers, and their parents – may have played a part in improving results.

In relation to local authorities, the release features a table of ‘best performing local authority areas’ and ‘worst performing local authority areas’, but the text focuses only on the latter, with Mr Gibb promising to write to directors of LAs at the bottom of the rankings to get them to ‘explain how they intend to improve the teaching of reading and arithmetic in the primary schools under their control’.

There are several ways to unpick that last phrase, by the way. For example, do local authority directors really have much influence over teaching content? Is ‘arithmetic’ all that mathematics amounts to now? Have local authorities really ‘controlled’ schools since the 1988 Education Reform Act, introduced by the Conservatives supposedly to stop LA control happening? But we must move on.

The interesting thing is that, within these latest statistics, DfE did publish LA-by-LA figures which point to some large improvements in recent years. Two authorities have improved their headline percentage of pupils achieving level 4 in reading, writing and mathematics by 12 points since 2012. In Hull, the figure rose from 67 per cent in 2012, well below the national average, to 79 per cent, just below the national figure. In Portsmouth, the gain was also 12 points, from 65 to 77 per cent. Another five authorities – Redcar, Herefordshire, Suffolk, East Sussex and Hounslow – improved by at least nine percentage points across the three years. Overall, five of the seven fastest-rising authorities on this measure had below-average results in 2012 and have either closed the gap with the national average or surpassed it.

Some of them, including Hull, it is true, do have higher than average numbers of academies. Yet outside one very small authority – Rutland, where performance tends to jump around from year to year – the fastest-rising LA from 2014 to 2015 on this headline measure was South Tyneside, where results surged by seven percentage points. DfE data reveal that South Tyneside has only one sponsored primary academy. Meanwhile, the academy chain widely seen as the most successful in England – Ark Schools – posted average headline results which, at 72 per cent, were a point lower than those of the lowest-performing local authorities nationally. Will Mr Gibb now be writing to Ark?

It is possible, then, to see from the above statistics how an alternative narrative could have been crafted, perhaps based on ministerial praise for local authority areas which have risen on the Government’s chosen measures. As ever, interpretation of statistics can depend on what the interpreter chooses not to highlight.

One final set of questions arises from the press release’s statistics: what do the last few years of generally improving national data actually mean?

Of course, the implications of the press release, as voiced by Mr Gibb, are clear. Results have improved strongly since 2010. This shows, said Mr Gibb, that ‘the government is delivering on its one nation vision for education’ and that ministerial policies are paying off. The national data behind this claim show that the proportion of pupils achieving the expected level 4 in all of reading, writing and maths rose from 62 per cent in 2009 to 80 per cent this year.

But to repeat: why has this happened? I’m not convinced that any of the three policies listed in the DfE press release – introducing higher floor targets, banning calculators from maths tests and introducing a spelling, punctuation and grammar test – have been entirely behind it.

And perhaps the most obvious change that a government can make to teaching and learning – the introduction of a new national curriculum – cannot have contributed here, as none of the pupils taking the 2015 tests had experienced the new national curriculum introduced by the previous (Coalition) government.

So it is a bit of a mystery. Perhaps readers of this blog can explain why the figures have jumped. I am certainly curious about them, and would like to investigate further. For if anything is to be underlined from recent ministerial interpretations of figures, it is the need continually to ask questions.

Warwick Mansell, one of CPRT’s regular bloggers, is a freelance journalist and author of ‘Education by Numbers: the tyranny of testing’ (Methuen, 2007).

This is not the first time that our bloggers have had cause to challenge the government’s use of evidence. Click here for further comment.

For other blogs by Warwick Mansell click here and/or download CPRT’s book Primary Colours.

Filed under: academies, assessment, Cambridge Primary Review Trust, Department for Education, evidence, KS2 tests, Nick Gibb, standards, UK Statistics Authority, Warwick Mansell

December 17, 2014 by Robin Alexander

From phonics check to evidence check

In ministerial speeches ‘evidence-based policy’ is now almost as routine as ‘I care passionately about …’ and is as likely to be greeted with hollow laughter. So it’s to the credit of the House of Commons Education Select Committee that it has undertaken an enquiry into the use of evidence by the Department for Education, asking DfE to list the evidential basis of a number of policies before inviting the public to comment via a web forum. Nine areas of policy were nominated for these ‘evidence checks’: phonics, teaching assistants, professional measurement metrics, summer-born children, the National College of Teaching and Leadership, universal infant free school meals, the impact of raising the school participation age, music, and the school starting age. There was a further section on DfE’s use of evidence generally, and this prompted the largest number of responses, including the following, which we reprint as our final blog of 2014. It’s longer than usual, but then you won’t be hearing from us again until January.

DfE’s use of evidence

Several contributors to this enquiry commend DfE for its commitment to evidence, but surely this is a minimum condition of good governance, not a cause for genuflection. More to the point are the concerns of Dame Julia Higgins that DfE’s use of evidence is inconsistent (or, as Janet Downs puts it, ‘slippery’) and those many other contributors across the Committee’s nine themes who find DfE overly selective in the evidence on which it draws and the methodologies it prefers.

The principal filters appear to be ideological (‘is this researcher one of us?’) and electoral (‘will the findings boost our poll ratings / damage those of the opposition?’), and such scientifically inadmissible criteria are compounded by DfE’s marked preference for research dealing in big numbers, little words and simple solutions.

In the latter context, we should be wary of endorsing without qualification the view of several contributors that the randomised controlled trial (RCT) is the evidential ‘gold standard’, trumping all other attempts to get at the truth. Education is complex and contested, and its central questions are as much ethical as technical – a challenge which the fashionable but amoral mantra ‘what works’ conveniently ignores. The RCT language of ‘treatment’ and ‘dosage’ is fine for drug trials but is hardly appropriate to an activity which is more craft and art than science, and in untutored hands the effort to make teaching fit this paradigm may reduce to the point of risibility or destruction the very phenomena it claims to test. I should add that I make these observations not as a disappointed research grant applicant but as the recipient of substantial funding from the rightly esteemed Education Endowment Foundation for a ‘what works’ project involving an RCT.

Of the nine ‘evidence check’ memoranda submitted to the Committee by DfE, those on phonics, the school starting age and the National College most conspicuously display some of the tendencies I’ve so far identified. Thus the defence and citations in DfE’s phonics statement neatly sidestep the methodological controversies and evidential disputes surrounding what is now the government’s mandated approach to teaching reading, so the contributor who applauds DfE’s grossly biased bibliography as ‘accurate’ is plain wrong.

DfE’s school starting age citations carelessly – or perhaps carefully – attribute a publication of the Cambridge Primary Review (Riggall and Sharp) to NFER, but again avoid any evidence running counter to the official view that children should be packed off to school as soon as possible; or the more nuanced finding of the Cambridge Primary Review that the real issue is not the starting age for formal schooling but the availability and quality of early years provision, wherever it takes place; or indeed the inconvenient truth that some of this country’s more successful PISA competitors start formal schooling one or even two years later than England.

As for the National College of Teaching and Leadership (NCTL), no independent evidence is offered in support of DfE’s insistence that this agency, and the models of teacher training and school improvement it espouses, justify its consumption of public funds. Only two publications are cited in DfE’s ‘evidence check’. One is NCTL’s statement of accounts; the other a DfE press release which is neither evidence nor independent. Proper evaluation of NCTL became all the more essential when DfE abolished the relatively ‘arms length’ bodies that NCTL subsumed and charged it with ‘delivering’ approved policies. Of course NCTL can be shown to be effective in relation to the delivery of policies x and y. But what if those policies are wrong?

The Committee has received many unhappy comments from parents about schools’ draconian responses to term-time absences. These highlight a further problem: there are important areas of educational policy, at both school and national level, where evidence is rarely or never on view and parents and the electorate are expected to comply with what may be little more than unsubstantiated claims. In the case of those blanket bans on term-time absence about which so many parents complain to the Committee, as with the tendency to fill more and more of children’s (and parents’) waking hours with homework (i.e. schoolwork done at home) of variable and in some cases little educational value, there appears to be a deep-seated assumption that schools have a monopoly of useful learning. The Cambridge Primary Review scotched this mistaken and indeed arrogant belief in the comprehensive research review on children’s lives outside school that it commissioned from Professor Berry Mayall. Except that the then government preferred summarily to reject the evidence and abuse the Review team rather than engage with the possibility that schools might do even better if more of them understood and built on what their pupils learn outside school.

So although the Education Committee has applied its ‘evidence check’ to nine areas of policy, it might also consider extending its enquiry in two further directions: first, by examining the evidential basis of policies and initiatives, such as those exemplified above, about which teachers, parents and indeed children themselves express concern; second, by adding some of those frontline policies which DfE has justified by reference to evidence but which are conspicuously absent from the Committee’s list.

Examples in the latter category might include: (i) the government’s 2011-13 review of England’s National Curriculum; (ii) the development of new requirements for assessment and accountability in primary schools; (iii) the rapid and comprehensive shift to school-led and school-based initial teacher education; (iv) the replacement of the old TDA teacher professional standards by the current set; (v) the strenuous advocacy and preferential treatment of academies and free schools. Each of these illustrates, sometimes in extreme form, my initial concerns about politico-evidential selectivity and methodological bias.

Thus in the 2011-13 national curriculum review ministers deployed exceptionally reductionist and naive interpretations of the wealth of international evidence with which they were provided by DfE officials and others. They resisted until the last moment overwhelming evidence about the educational centrality of spoken language. They ignored Ofsted warnings, grounded in two decades of school inspection (and indeed evidence going back long before Ofsted), about the damage caused by a two-tier curriculum that elevates a narrow view of educational basics above all else – damage not just to the wider curriculum but also to the ‘basics’ themselves. And they declined to publish or act on their own internal enquiry which confirmed the continuing seriousness of the challenge of curriculum expertise in primary schools, an enquiry which – and this much is to ministers’ credit – DfE undertook in response to, and in association with, the Cambridge Primary Review. The report of that enquiry, and the wider evidence that informed it, still awaits proper consideration. A job for the Education Committee perhaps?

Similarly, DfE, like its predecessor DCSF, has stubbornly held to its view – challenged by the Education Committee as well as numerous research studies and the Bew enquiry – that written summative tests are the best way both to assess children’s progress and hold schools and teachers to account, and that they provide a valid proxy for children’s attainment across the full spectrum of their learning.

Then, and in pursuit of what has sometimes looked suspiciously like a vendetta against those in universities who undertake the research that sometimes rocks the policy boat, DfE has ignored international evidence about the need for initial teacher education to be grounded in equal partnership between schools and higher education, preferring the palpable contradiction of locating an avowedly ‘world class’ teacher education system in schools that ministers tell us are failing to deliver ‘world class’ standards. Relatedly, DfE has accepted a report from its own enquiry into professional standards for teachers which showed even less respect for evidence than the earlier and much-criticised framework from TDA, coming up with ‘standards’ which manage to debase or exclude some of the very teacher attributes that research shows are most crucial to the standards of learning towards which these professional standards are supposedly directed.

Finally, in pursuit of its academies drive government has ignored the growing body of evidence from the United States that far from delivering superior standards as claimed, charter schools, academies’ American inspiration, are undermining public provision and tainted by financial and managerial corruption. England may not have gone that far, but new inspection evidence on comparative standards in academies and maintained schools (in HMCI’s Annual Report for 2013-14) should give the Committee considerable pause for thought about the motivation and consequences of this initiative.

In relation to the Committee’s enquiry as a whole, the experience of the Cambridge Primary Review (2006-10) and its successor the Cambridge Primary Review Trust is salutary, depressing and (to others than hardened cynics) disturbing. Here we had the nation’s most comprehensive enquiry into English primary education for half a century, led by an expert team, advised and monitored by a distinguished group of the great and good, supported by consultants in over 20 universities as well as hundreds of professionals, and generating a vast array of data, 31 interim reports and a final report with far-sighted conclusions and recommendations, all of them firmly anchored in evidence, including over 4000 published sources.

Far from welcoming the review as offering, at no cost to the taxpayer, an unrivalled contribution to evidence-based policy and practice in this vital phase of education, DCSF – DfE’s predecessor – systematically sought to traduce and discredit it by misrepresenting its findings in order to dismiss them, and by mounting ad personam attacks against the Review’s principals. Such behaviour in the face of authoritative and useful evidence was unworthy of holders of elected office and, for the teachers and children in our schools, deeply irresponsible.

It is with some relief that we note that DfE’s stance towards the Review and its successor the Cambridge Primary Review Trust has been considerably more positive under the Coalition than under Labour, and we record our appreciation of the many constructive discussions we have had with ministers and officials since 2010. Nevertheless, when evidential push comes to political shove, evidence discussed and endorsed in such meetings capitulates, more often than not, to the overriding imperatives of ideology, expediency and media narrative. This, notwithstanding the enhanced research profile applauded by other contributors, remains the default.

 

www.robinalexander.org.uk

This particular Education Committee enquiry was set up as a web forum, with a deadline of 15 December. The DfE evidence checks and the comments they provoked may be viewed here.
DfE’s use of evidence attracted 154 comments, followed by summer-born children (111), phonics (90) and the school starting age (64).

Filed under: Cambridge Primary Review Trust, Department for Education, evidence, evidence check, evidence-based policy, House of Commons Education Committee, Robin Alexander

August 27, 2014 by Robin Alexander

Does education pass the family test?

In 2010, Michael Gove renamed Labour’s Department for Children, Schools and Families (DCSF) the Department for Education (DfE), at a stroke ejecting Ed Balls’s tiresomely winsome munchkins from the Sanctuary Buildings atrium, ending baffled discussion about whether DCSF stood for comedy and science fiction or curtains and soft furnishings, and heralding a gimmick-free return to core business.

Then last week, with the Gove supremacy a receding memory but with Govine policies firmly in place for the duration, the PM announced that from November 2014 every new government policy ‘will be assessed for its impact on the family.’ The PM’s admission that too many existing policies have failed his ‘family test’ must prompt us to ask whether he had in mind the doings of the demoted Gove. After all, who needs munchkins to tell them that children’s needs and family circumstances are as inextricably the business of schools and hence DfE as are curriculum, tests and standards?

Labour appeared to understand this relationship, up to a point. So the Cambridge Primary Review found widespread support for Sure Start, EYFS, Every Child Matters, the Children Act, the Childcare Act, Every Parent Matters, the Children’s Plan and Narrowing the Gap, an impressive procession of ‘joined-up’ initiatives through which the Labour government sought to reduce childhood risk, increase childhood protection, support families and maximise educational opportunities. But CPR also reported growing and often intense opposition to the same government’s apparatus of high stakes testing, higher stakes inspection, performance tables, naming, shaming and closely prescribed pedagogy, all of which also impacted on children and families, with outcomes that remain hotly contested.

In any event, this so-called standards agenda was widely thought to exacerbate what, in her important research survey for CPR, Berry Mayall called the ‘scholarisation’ of childhood: the incursion of schooling and its demands ever more deeply into children’s lives at an ever younger age, leaving little room for a childhood unimpeded by pressures which in many other education systems, including some that perform better than the UK in the international PISA tests, start a year or even two years later than in England. When Britain came bottom of a rather different performance table, UNICEF’s comparative rating of childhood well-being in rich countries, opponents of these tendencies drew the obvious conclusion.

Hence the reaction: Sue Palmer’s best-selling ‘Toxic Childhood’, the Children’s Society Good Childhood Enquiry, and latterly the Save Childhood Movement. And hence, true to the laws of policy physics, the ministerial counter-reaction, from Labour’s ‘these people are peddling out-of-date research’ – a lamely unoriginal and transparently defensive response to unpalatable evidence – to the Coalition’s earthier recourse to personal abuse: ‘Marxists intent on destroying our schools … enemies of promise … bleating bogus pop-psychology’.

Meanwhile, the rich became richer and the poor poorer.

In relation to children and families, then, there is all too often a pretty fundamental policy disconnect. Education policy may give with one hand but take with the other; and education policy strives to narrow the gap that economic policy no less assiduously maintains and even widens, not pausing to ask why the gap is there in the first place. For surely Treasury ministers know as well as their DfE colleagues how closely the maps of income, health, wellbeing and educational achievement coincide; that unequal societies have unequal education systems and unequal educational outcomes; and that equity is a significant factor in other nations’ PISA success – though in all this we need to avoid facile cause-effect claims, and we know that fine schools can and do break the mould. Yet will the ‘family test’ be applied as stringently to the policies of Chancellor Osborne, I wonder, as to those of Education ex-Secretary Gove? Or will the social and educational fallout of austerity be written off as unavoidable collateral damage?

But I suspect that linking the policy dots is not what the new family test is about and each policy will be assessed in isolation. In any case, how many new education policies, if any, will the government introduce in the eight months before the 2015 general election? And at a time when the demography of childhood and parenting is more diverse than ever, how exactly is ‘family’ defined? Isn’t the family test both too muddled and too late?

For its part, CPRT, like CPR before it, is operating more holistically, and we have invited leading researchers to help us. One of them is Kate Pickett, co-author of The Spirit Level – that groundbreaking epidemiological study of the causes, manifestations and consequences of inequality. In one of five new CPRT research surveys, Kate is revisiting her own and CPR’s evidence on equality, equity and disadvantage and examining more recent data in order to re-assess causes, consequences and solutions. Her report will be published early in 2015. In parallel, we have commissioned research updates on children’s voice, development and learning from Carol Robinson and Usha Goswami; and on assessment and teaching from Wynne Harlen and David Hogan. Squaring the schooling/family circle, we have embarked, in collaboration with the University of York, on an Education Endowment Foundation-supported project to develop and test the power of high quality classroom talk to increase engagement and raise learning standards among those of our children who are growing up in the most challenging circumstances.

You’ll find information about all these projects on the CPRT website. We hope and believe they will pass the family test.

www.robinalexander.org.uk

Filed under: Cambridge Primary Review, childhood, Coalition Education, David Cameron, Department for Education, Family Test, Labour Education, Michael Gove, Robin Alexander, social disadvantage

August 5, 2014 by Robin Alexander

From the playing fields of Eton

Tony Little, head of Eton, has intervened in the education debate with claims that are more frequently heard in maintained schools and CPR reports: the exam system is no longer fit for purpose; copying Singapore and Shanghai is not the way to raise standards; there’s more to education than league tables; children need a broad rich curriculum as well as the basics …

No less interesting is DfE’s dismissive response: ‘We make no apology for holding schools to account for the results their pupils achieve in national tests and public examinations. Parents deserve to know that their children are receiving the very best possible teaching.’

Sounds familiar? Yes indeed: here’s DfE’s predecessor, DCSF, responding to one of CPR’s interim reports in 2008: ‘We make no apology for our focus on school standards. We want every child to achieve to the best of their abilities, succeed and be happy, and we know that parents and teachers want that too. The idea that children are over tested is not a view that the government accepts.’

Different government, same script, same scriptwriter, same populist ploy of pitting professionals against parents, same refusal to debate the issues that matter. See ‘Two worlds of education’. QED.

www.robinalexander.org.uk

Filed under: Cambridge Primary Review, Department for Education, Robin Alexander, Tony Little
