The Cambridge Primary Review Trust


October 30, 2015 by Robin Alexander

Face the music

Opera North has reported dramatic improvements in key stage 2 test results in two primary schools, one in Leeds, the other in Hull, and both in areas deemed severely deprived. ‘Dramatic’ in this instance is certainly merited: in one of the schools the proportion of children gaining level 4 in reading increased from 78 per cent in 2014 to 98 per cent in 2015, with corresponding increases in writing (75 to 86 per cent) and mathematics (73 to 93 per cent).

But what, you may ask, has this to do with opera?  Well, since 2013 the schools in question – Windmill Primary in Leeds and Bude Park Primary in Hull – have been working with Opera North as part of the Arts Council and DfE-supported In Harmony programme. This aims ‘to inspire and transform the lives of children in deprived communities, using the power and disciplines of community-based orchestral music-making.’  Opera North’s In Harmony project, now being extended, is one of six, with others in Gateshead, Lambeth, Liverpool, Nottingham and Telford. In the Leeds project, every child spends up to three hours each week on musical activity and some also attend Opera North’s after-school sessions. Most children learn to play an instrument and all of them sing. For the Hull children, singing is if anything even more important. Children in both schools give public performances, joining forces with Opera North’s professional musicians. For the Leeds children these may take place in the high Victorian surroundings of Leeds Town Hall.

Methodological caution requires us to warn that the test gains in question reflect an apparent association between musical engagement and standards of literacy and numeracy rather than the proven causal relationship that would be tested by a randomised controlled trial (and such a trial is certainly needed). But the gains are sufficiently striking, and the circumstantial evidence sufficiently rich, to persuade us that the relationship is more likely to be causal than not, especially when we witness how palpably this activity inspires and sustains the enthusiasm and effort of the children involved. Engagement here is the key: without it there can be no learning.

It’s a message with which for many years arts organisations and activists have been familiar, and which they have put into impressive practice.  To many members of Britain’s principal orchestras, choirs, art galleries, theatres and dance companies, working with children and schools is now as integral to their day-to-day activity as the shows they mount, while alongside publicly-funded schemes like In Harmony, the Prince’s Foundation for Children and the Arts pursues on an even larger scale the objective of immersing disadvantaged children in the arts by taking them to major arts venues and enabling them to work with leading arts practitioners.  Meanwhile, outside such schemes many schools develop their own productive partnerships with artists and performers on a local basis.

Internationally, the chance move of a major German orchestra’s headquarters and rehearsal space into a Bremen inner-city secondary school created first unease, then a dawning sense of opportunity and finally an extraordinary fusion of students and musicians, with daily interactions between the two groups, students mingling with orchestra members at lunch and sitting with them in rehearsals, and a wealth of structured musical projects.

But perhaps the most celebrated example of this movement is Venezuela’s El Sistema, which since 1975 has promoted ‘intensive ensemble participation from the earliest stages, group learning, peer teaching and a commitment to keeping the joy of musical learning and music making ever-present’ through participation in orchestral ensembles, choral singing, folk music and jazz. El Sistema’s best-known ambassador in the UK – via its spectacular performances at the BBC Proms – is the Simon Bolivar Youth Orchestra, and it is El Sistema that provides the model for In Harmony, as it does, obviously, for Sistema Scotland with its ‘Big Noise’ centres in Raploch (Stirling), Govanhill (Glasgow) and Torry (Aberdeen).

By and large, the claims made for such initiatives are as likely to be social and personal as musical, though Geoffrey Baker has warned against overstating their achievements and even turning them into a cult. Thus Sistema Scotland’s Big Noise is described as ‘an orchestra programme that aims to use music making to foster confidence, teamwork, pride and aspiration in the children taking part’.  There are similar outcomes from Deutsche Kammerphilharmonie Bremen’s move into the Tenever housing estate, with dramatic improvements reported in pupil behaviour and the school’s reputation transformed from one to be avoided to one to which parents from affluent parts of the city now queue to send their children.

Similarly, the initial NFER evaluation report on In Harmony cites ‘positive effects on children’s self-esteem, resilience, enjoyment of school, attitudes towards learning, concentration and perseverance’ with, as a bonus, ‘some perceived impact on parents and families including raised aspirations for their children, increased enjoyment of music and confidence in visiting cultural venues, and increased engagement with school.’  Children and the Arts sees early engagement with the arts through its Quest and Start programmes as a way of ‘raising aspirations, increasing confidence, improving communication skills and unlocking creativity.’ Such engagement is offered not only in ‘high-need areas where there is often socio-economic disadvantage or low arts access’ but also, through the Start Hospices programme, to children with life-limiting and life-threatening illnesses and conditions.

The SAT score gains from Opera North’s In Harmony projects in Leeds and Hull add a further justificatory strand; one, indeed, that might just make policymakers in their 3Rs bunker sit up and take notice.  For while viewing the arts as a kind of enhanced PSHE – a travesty, of course – may be just enough to keep these subjects in the curriculum, demonstrating that they impact on test scores in literacy and numeracy may make their place rather more secure.

This, you will say, is unworthily cynical and reductive. But cynicism in the face of policymakers’ crude educational instrumentality is, I believe, justified by the curriculum utterances and decisions of successive ministers over the past three decades, while the reductiveness is theirs, not mine. Thus Nicky Morgan excludes the arts from the EBacc, but in her response to the furore this provokes she reveals the limit of her understanding by confining her justification for the arts to developing pupils’ sense of ‘Britishness’, lamely adding that she ‘would expect any good school to complement [the EBacc subjects] with a range of opportunities in the arts’.  ‘A range of opportunities’ – no doubt extra-curricular and optional – is hardly the same as wholehearted commitment to convinced, committed and compulsory arts education taught with the same eye to high standards that governments reserve for the so-called core subjects.  Underlining the poverty of her perspective, Morgan tells pupils that STEM subjects open career options while arts subjects close them.

What worries me no less than the policy stance – from which, after all, few recent Secretaries of State have deviated – is the extent to which, in our eagerness to convince these uncomprehending ministers that the arts and arts education are not just desirable but essential, we may deploy only those justifications we think they will understand, whether these are generically social, behavioural and attitudinal (confidence, self-esteem) or in the realm of transferable skills (creativity, literacy, numeracy), or from neuroscience research (attention span, phonological awareness, memory). The otherwise excellent 2011 US report on the arts in schools from the President’s Committee on the Arts and Humanities falls into the same trap of focussing mainly on social and transferable skills, though it does at least synthesise a substantial body of research evidence on these matters which this country’s beleaguered advocates of arts education will find useful.

Let me not be misunderstood: the cognitive, personal and social gains achieved by El Sistema, Children and the Arts, In Harmony and similar ventures are as impressive as they are supremely important for children and society, especially in cultures and contexts where children suffer severe disadvantage.  And if it can be shown that such experiences enhance these children’s mastery of literacy and numeracy, where in the words of CPRT’s Kate Pickett, they encounter a much steeper ‘social gradient’ than their more affluent peers, then this is doubly impressive.

But the danger of presenting the case for arts education solely in these terms, necessary in the current policy climate though it may seem to be, is that it reduces arts education to the status of servant to other subjects, a means to someone else’s end (‘Why study music?’ ‘To improve your maths’) rather than an end in itself; and it justifies the arts on the grounds of narrowly-defined utility rather than intrinsic value. It also blurs the vital differences that exist between the various arts in their form, language, practice, mode of expression and impact.  The visual arts, music, drama, dance and literature have elements in common but they are also in obvious and fundamental ways utterly distinct from each other. They engage different senses, require different skills and evoke different responses – synaptic as well as intellectual and emotional. All are essential. All should be celebrated.

This loss of distinctiveness is perhaps unwittingly implied by the evaluation of the only Education Endowment Foundation (EEF) project in this area. EEF evaluates ‘what works’ interventions designed to enhance the literacy and numeracy attainment of disadvantaged pupils (including CPRT’s own dialogic teaching project) and its ‘Act, Sing, Play’ project has tested the relative impact of music and drama on the literacy and numeracy attainment of Year 2 pupils. It found no significant difference between the two subjects. So, in the matter of using the arts as a way to raise standards in the 3Rs, do we infer that any art will do?

So, yes, the power of the arts, directly experienced and expertly taught, is such that they advance children’s development, understanding and skill beyond as well as within the realms of the auditory, visual, verbal, kinaesthetic and physical. And yes, it should be clearly understood that while the arts can cultivate affective and social sensibilities, when properly taught they are in no way ‘soft’ or intellectually undemanding, and to set them in opposition to so-called ‘hard’ STEM subjects, as Nicky Morgan did, is as crass as claiming that creativity has no place in science or engineering. But until schools have the inclination and confidence to champion art for art’s sake, and to make the case for each art in its own terms, and to cite a wider spectrum of evidence than social development alone, arts education will continue to be relegated to the curriculum’s periphery.

For this is a historic struggle against a mindset that is deeply embedded and whose policy manifestations include a national curriculum that ignores all that we have come to know about the developmental and educative power of the arts, and indeed about their economic as well as cultural value, and perpetuates the same ‘basics with trimmings’ curriculum formula that has persisted since the 1870s and earlier.

That’s why the Cambridge Primary Review argued that the excessively sharp differentiation of ‘core’ and ‘foundation’ subjects should cease and all curriculum domains should be approached with equal seriousness and be taught with equal conviction and expertise, even though, of course, some will be allocated more teaching time than others. This alternative approach breaks with the definition of ‘core’ as a handful of ring-fenced subjects and allows us instead to identify core learnings across a broader curriculum, thereby greatly enriching children’s educational experience, maximising the prospects for transfer of learning from one subject to another, and raising standards.

Seriousness, conviction, expertise: here we confront the challenge of teaching quality. Schemes like Sistema, In Harmony and those sponsored by Children and the Arts succeed because children encounter trained and talented musicians, artists, actors and dancers at the top of their game.  These people provide inspirational role models and there is no limit to what children can learn from them. In contrast, music inexpertly taught – and at the fag-end of the day or week, to boot – not only turns children off but also confirms the common perception that music in schools is undemanding, joyless and irrelevant. Yet that, alas, is what too many children experience. For notwithstanding the previous government’s investment in ‘music hubs’, Ofsted remains pessimistic as to both the quality of music teaching and – no less serious – the ability of some school leaders to judge it and take appropriate remedial action, finding them too ready to entertain low expectations of children’s musical capacities.

But then this is another historic nettle that successive governments have failed to grasp. In its final report the Cambridge Primary Review recommended (page 506) a DfE-led enquiry into the primary sector’s capacity and resources to teach all subjects, not just ‘the basics’, to the highest standard, on the grounds that our children are entitled to nothing less and because of what inspection evidence consistently shows about the unevenness of schools’ curriculum expertise. DfE accepted CPR’s recommendation and during 2010-12 undertook its curriculum capacity enquiry, in the process confirming CPR’s evidence, arguments and possible solutions. However, for reasons only DfE can explain, the resulting report was never made public (though as the enquiry’s adviser I have seen it).

In every sense it’s time to face the music.

As well as being Chair of the Cambridge Primary Review Trust, Robin Alexander is a Trustee of the Prince’s Foundation for Children and the Arts.

 www.robinalexander.org.uk

Filed under: arts education, assessment, Cambridge Primary Review Trust, creativity, disadvantage, evidence, music education, national curriculum, Robin Alexander, tests

September 4, 2015 by Warwick Mansell

Test of truth

Are ministers continuing to misuse data in promoting their favoured policies?

Nick Gibb, the schools minister, seems not to have taken on board the implications of a recent letter from the statistics watchdog, stemming from one of my previous CPRT blogs, about how primary schools’ test results should be interpreted and presented.

When the provisional 2015 Key Stage 2 results for England were published last week, Mr Gibb was quoted in the DfE’s press release celebrating big gains overall in average results since Labour left office in 2010. The minister also highlighted, again, the performance of academies, and in particular that of sponsored academies – typically struggling schools whose management is transferred to an outside body which signs a contract with the Secretary of State – as improving faster than the national average.

However, in doing so he ignored a warning from the UK Statistics Authority (UKSA) about over-interpretation of data. The DfE release also seemed to be heavily skewed in favour of a particular narrative, when, as I suggest below, other interpretations are available. And the national data themselves seem to raise questions about what, in reality, has driven the big recent jumps in pupil performance.

That UKSA intervention came after I wrote my CPRT blog in February and followed this up with one for NAHT which argued that seemingly big improvements in sponsored academy KS2 results last year may have had nothing to do with academy status. Rather, I argued, they seemed to follow a national trend, whereby schools of all types with low statistical starting points had improved faster than the national average.
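To see why such a pattern needs no appeal to governance at all, here is a minimal simulation of my own (the performance levels and noise sizes are invented assumptions, not DfE data): when results contain ordinary year-to-year noise, the schools with the lowest starting points tend to post the biggest gains purely through regression to the mean, whatever type of school they are.

```python
# Sketch: regression to the mean in school results. Purely illustrative;
# the levels and noise below are assumptions, not drawn from DfE statistics.
import random

random.seed(1)
N_SCHOOLS = 1000

# Each school has a stable underlying level plus independent yearly noise.
true_level = [random.gauss(75, 8) for _ in range(N_SCHOOLS)]
year1 = [t + random.gauss(0, 5) for t in true_level]
year2 = [t + random.gauss(0, 5) for t in true_level]

gains = [y2 - y1 for y1, y2 in zip(year1, year2)]
avg_gain = sum(gains) / N_SCHOOLS

# The 10% of schools with the lowest year-1 results: the kind most likely
# to be converted into sponsored academies.
cutoff = sorted(year1)[N_SCHOOLS // 10]
low_start_gains = [g for y1, g in zip(year1, gains) if y1 <= cutoff]
avg_low_gain = sum(low_start_gains) / len(low_start_gains)

print(f"average gain, all schools:         {avg_gain:+.2f} points")
print(f"average gain, lowest-starting 10%: {avg_low_gain:+.2f} points")
# Typical output: roughly +0.0 across all schools, but several points
# positive for the lowest-starting tenth, though nothing actually changed.
```

Nothing in the simulation knows anything about academies or local authorities; the apparent out-performance of the low-starting group is a statistical artefact of selecting on a noisy measure.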

I wrote to the UKSA citing the two blogs and arguing that a DfE statistical release published in December 2014, on which ministers had relied to support their academies policy, should have investigated whether improvements in sponsored academy results came not as a result of the schools’ governance structures, but simply reflected a broader statistical trend for all types of schools.

Ed Humpherson, UKSA director general for regulation, wrote to DfE in July to suggest that while ministers were entitled to use the contents of DfE’s December 2014 statistical release when they commented on the academies policy, the paper itself should have made clear that ‘the differences in the rates of improvement [of academies versus other schools] were not necessarily caused by school type.’ He also recommended that future statistical publications should see DfE ‘commenting on limitations’ when interpreting these statistics, in order to ‘make it clearer to Ministers and to other users that the statistics could not be used to infer a causal link between school type and either attainment or rates of improvement.’

Last week came the first test of how DfE and ministers would react to this advice, with the first statistical publication revealing this year’s KS2 results, and the accompanying press release. Mr Humpherson’s warning seems to have been taken on board to some extent in the DfE statistical release, but – perhaps unsurprisingly – not at all by the minister.

The new DfE statistical release has a section on academy performance, as was the case last year. Again, it notes how sponsored academies improved faster than the average for all schools. This time, though, it says that when interpreting differential rates of improvement between types of school ‘it should be noted that the extent to which a school improves is related to a range of factors. Schools with the lowest previous outcomes tend to see the largest improvements…’

For me, this does not go far enough in stating clearly, in line with UKSA, that differences in improvement rates between schools of different types may be nothing to do with whether the institution is an academy or not.

Yes, this extra line of interpretation is an improvement on last year, and in that sense should be welcomed. However, it appears not to have been clear enough for Mr Gibb, whose press release claims: ‘The results…show that sponsored primary academies…are improving more quickly than those run by local authorities.’ Most controversially, Mr Gibb is also quoted as saying: ‘These results vindicate our decision to expand the valuable academies programme into primary schools.’

So, Mr Gibb is inferring a causal link between school type and results, seemingly against the advice of the UKSA.

As mentioned in previous blogs, this is not a purely political or statistical debate with only abstract implications. No, this possibly erroneous and misleading interpretation is likely to have profound implications on the ground, as struggling primary schools are pushed, often controversially, towards sponsored academy status on evidential grounds which still seem dubious.

Of course it may be that this year’s sponsored academy results do not fit the statistical pattern of previous years. It may be that they have improved substantially, while other previously low-performing local authority schools have not. We will not know for sure if that is the case until all school-by-school results are published towards Christmas. But such a phenomenon seems unlikely, based on what has happened in the recent past.

We also already have further data for 2015 which cast Mr Gibb’s pronouncements in the press release in a somewhat different light from that intended. In the DfE release, Mr Gibb talks not only of major improvements since 2010, with 90,000 more pupils achieving the expected levels in maths and literacy, but of the results in different local authority areas. The narrative with regard to the latter is almost entirely negative. In fact, throughout this release, the only messages to come through are that ministers and their policies are proving successful; that the types of schools favoured by ministers in their reforms are proving successful; and that particular local authorities – yes, that’s government, but not the national government presided over by ministers – are underperforming and so are facing a ‘crackdown’.

Remarkably, there is no mention at all that other actors in this annual statistical drama – children, their schools and teachers, and their parents – may have played a part in improving results.

In relation to local authorities, the release features a table of ‘best performing local authority areas’ and ‘worst performing local authority areas’, but the text focuses only on the latter, with Mr Gibb promising to write to directors of LAs at the bottom of the rankings to get them to ‘explain how they intend to improve the teaching of reading and arithmetic in the primary schools under their control’.

There are several ways to unpick that last phrase, by the way. For example, do local authority directors really have much influence over teaching content? Is ‘arithmetic’ all that mathematics amounts to now? Have local authorities really ‘controlled’ schools since the 1988 Education Reform Act, introduced by the Conservatives supposedly to stop LA control happening? But we must move on.

The interesting thing is that, within these latest statistics, DfE did publish LA-by-LA figures which point to some large improvements in recent years. So, two authorities have improved their headline percentage of pupils achieving level four in reading, writing and mathematics by 12 points since 2012. In Hull, the figure rose from well below the national average, at 67 per cent, in 2012, to 79 per cent, just below the national figure. In Portsmouth, the gain was also 12 points, from 65 to 77 per cent. Another five authorities – Redcar, Herefordshire, Suffolk, East Sussex and Hounslow – improved by at least nine percentage points across the three years. Overall, five of the seven fastest-rising authorities, on this measure, had below-average results in 2012 so have either closed the gap with the national average or have surpassed it.

Some of them, including Hull, it is true, do have a higher than average number of academies. Yet outside one very small authority – Rutland, where performance tends to jump around every year – the fastest-rising LA from 2014 to 2015 on this headline measure was South Tyneside, where results surged by seven percentage points. DfE data reveal that South Tyneside has only one sponsored primary academy. Meanwhile, the academy chain widely seen as the most successful in England – Ark Schools – posted average headline results which, at 72 per cent, were a point lower than the lowest-performing local authorities nationally. Will Mr Gibb now be writing to Ark?

It is possible, then, to see from the above statistics how an alternative narrative could have been crafted, perhaps based on ministerial praise for local authority areas which have risen on the Government’s chosen measures. As ever, interpretation of statistics can depend on what the interpreter chooses not to highlight.

One final set of questions presents itself from the press release’s statistics. What do the last few years of generally improving national data actually mean?

Of course, the implications of the press release, as voiced by Mr Gibb, are clear. Results have improved strongly since 2010. This shows, said Mr Gibb, that ‘the government is delivering on its one nation vision for education’ and that ministerial policies are paying off. The national data behind this claim show that the proportion of pupils achieving the expected level 4 in all of reading, writing and maths rose from 62 per cent in 2009 to 80 per cent this year.

But to repeat: why has this happened? I’m not convinced that any of the three policies listed in the DfE press release – introducing higher floor targets, banning calculators from maths tests and introducing a spelling, punctuation and grammar test – have been entirely behind it.

And perhaps the most obvious change that a government can make to teaching and learning – the introduction of a new national curriculum – cannot have contributed here as none of the pupils taking the 2015 tests have experienced the national curriculum introduced by the previous government.

So it is a bit of a mystery. Perhaps readers of this blog can explain why the figures have jumped. I am certainly curious about them, and would like to investigate further. For if anything is to be underlined from recent ministerial interpretations of figures, it is the need continually to ask questions.

Warwick Mansell, one of CPRT’s regular bloggers, is a freelance journalist and author of ‘Education by Numbers: the tyranny of testing’ (Methuen, 2007).

This is not the first time that our bloggers have had cause to challenge the government’s use of evidence. Click here for further comment.

For other blogs by Warwick Mansell click here and/or download CPRT’s book Primary Colours.

Filed under: academies, assessment, Cambridge Primary Review Trust, Department for Education, evidence, KS2 tests, Nick Gibb, standards, UK Statistics Authority, Warwick Mansell

June 19, 2015 by Warwick Mansell

Can data really define ‘coasting’?

For me, this is the question of the moment, with the Education and Adoption Bill, whose first section is on the charmingly-worded but as-yet-undefined term ‘coasting schools’, having started its passage through Parliament.

The bill promises to sweep a new category of these schools into the reach of the ‘intervention’ powers of the Secretary of State, which include issuing academy orders forcing schools into the arms of new sponsors.

In a blunt exchange at Education Questions in the House of Commons this week, Nicky Morgan, the Education Secretary, reminded her Labour shadow, Tristram Hunt, that the definition of coasting schools would not come until part-way through the passage of the bill, at its committee stage. But I’ve already had a steer on its likely content. The Department for Education’s press office has told me that the ‘coasting’ definition will focus on data, and specifically the school’s performance over time. The idea, I was told – these are not my words – was to home in on schools which have failed to fulfil their pupils’ potential.

If ‘coasting schools’ are to be defined entirely in terms of results data, I think this will be the first time that formal intervention powers by central government will have been triggered completely by assessment statistics. This already appears to be a departure both from a promise reportedly made by David Cameron before the election and from the contents of the Conservatives’ election manifesto.

In March, the Daily Mail warned that ‘coasting schools’ would be targeted under a new Conservative government, with the Prime Minister quoted – depressingly, though predictably given our experience of the past 20 years of education policy-making – as ‘waging all-out war on mediocrity’.

However, the definition of ‘coasting’ suggested in that piece was an Ofsted judgement. Schools falling in the inspectorate’s ‘requires improvement’ category would ‘automatically be considered’ for turning into academies. Only if they could demonstrate clear plans for improvement, as judged by the Regional Schools Commissioners – England’s new cadre of officials appointed by the Secretary of State, taking decisions in private – would they avoid a change of leadership. The manifesto backed this up, saying: ‘Any school judged by Ofsted to be requiring improvement will be taken over…unless it can demonstrate that it has a plan to improve rapidly.’

But there were indications post-election that the definition was changing. Now it appears that results statistics are going to be the key driver. And that, of course, has big implications.

First, I think it has repercussions for the very controversial language used. A little diversion might be in order here, into the perhaps simpler realm of football.

Imagine, say, a football team, without any great history of success, which gets promoted to the Premier League one season. In its first year in the top tier, it finishes, say, 12th. This is seen as a big achievement, as the club beats many longer-established, richer outfits and comfortably avoids the relegation that comes with finishing 18th or lower. The following season, results are even better, with a 10th place finish the reward. The next two seasons, consistency seems to have been achieved, with 11th and 13th places secured.

However, any outsider looking only at the club’s end of season position over the years might conclude that it has been drifting. Someone could almost call it a ‘coasting’ club in its last two seasons, based on data alone. But while the possible reaction to the club’s statistical direction of travel – sack the manager – may or may not be right, any implication that it was ‘coasting’, and therefore not trying, would be to over-interpret the results. For faced with that ‘coasting’ slur, the club’s manager and any of his coaching staff or players would be incensed. The manager arrives at his desk at 6.30 every morning, hardly has a holiday in the summer, and his attention to detail on the training ground is phenomenal.

But the manager does not have total control over the performances of his players and is up against other teams who may be trying similarly hard. He argues that, in a league where he will never have the budgets of the big clubs, survival in the Premier League is success. While results, then, might suggest non-progress, this is based on anything but a sense that the manager is just taking things easy: it is a real triumph.

In contrast, there was the case of a real Premier League football club recently, again having established itself comfortably in mid-table following promotion, where the manager was said by the club’s board to be too laid-back about training. He was replaced by a former player, who has marginally improved its overall position. ‘Coasting’ might have been a more appropriate word in that case, if the characterisation of the former manager’s attitude was right.

My point is that data alone will never tell you whether a football club, or indeed a school, is ‘coasting’ or not. ‘Coasting’ suggests a lack of effort but all we have, with results data, is a statistical end product: the output numbers. Teachers could be working phenomenally hard, and yet failing to improve results as much as outsiders might wish, because schools, in reality, do not have full control over results. These are, inevitably, subject to unpredictability, from the motivation and ability of pupils to ‘perform’ on the big day to the vagaries of marking. And there may be a sense of a zero-sum game: ‘below-average’ schools will always be penalised, even if all schools are working very hard, if the indicators used are based on comparing one school’s results to others’.

Are we really happy, I wonder, to bandy around a word, with all its dismissive implications for the professionals whom our system has spent years training and paying, and to whom we entrust our children, when we are unsure of its accuracy in individual cases?

So to use the results as indicators of underlying effort is as lazy – is this ‘coasting’ policy merely following the assumptions of its accustomed comfort-zone? – as it is potentially misleading. And, in implicitly being brazenly unconcerned about who gets labelled in this way, policy-makers seem to compound the insult that the word ‘coasting’ undoubtedly provokes in many in the profession.

I have to say, surveying the school accountability regime as I do, that I find it very hard to believe that many, if any, schools can truly be said to be ‘coasting’. There are, surely, already too many penalties for those schools which fail to improve their pupils’ results, starting with the head losing his or her job following a failing Ofsted, for any of them to take it easy, I reason. And surveys of teacher workload surely make unarguable the case that most professionals are putting in very long hours in term-time – and often adding to them in the holidays – often under considerable pressure.

And yet here we have the phrase ‘coasting schools’ backed not only by the Prime Minister and his Education Secretary, but written on the face of an education bill, in its first clause.

Individuals whom I respect, working more closely with schools than I am, have countered that there are some institutions which are not working as hard as they could to provide the best possible education for their pupils. Fair enough. But my point remains: data alone will not tell us which ones they are, because there is no straight read-across from outcome data to teacher commitment and motivation. This seems to me to be another example of policy-makers making heroic assumptions about what can be read into results statistics alone.

We will have to wait until we have a definition in full – if, indeed, during this bill’s passage we get all the details which will be used in reality by those taking decisions on schools – in order to judge the technical reliability of the datasets being used. But with the futures of more schools poised to hinge on results statistics, this is likely to place even greater weight on, for example, marking reliability. Can it withstand the pressures being placed upon it? Again, the assumption is always that it can. But national curriculum tests and GCSEs, for example, have not been designed with the intention that institutions’ existence could rest on them.

A final implication should be obvious to anyone who is interested in the unintended consequences of assessment-driven accountability. Allowing schools to be placed as, in the language of the bill, ‘eligible for intervention’ – in other words, available for a management takeover – on the basis of results data alone will, surely, accentuate teaching to the test. With so much riding on performance on a particular set of indicators, the incentive for schools to concentrate even more narrowly on doing whatever it takes to maximise performance on those particular indicators will be underlined. If, on the other hand, the statistical definition of ‘coasting’ is not precise, teaching to particular indicators may be more tricky but then Regional Schools Commissioners stand to be accused of arbitrariness in selecting which schools count as ‘coasting’.

To ministers and those defending these plans, this is all to the good. The ‘war on mediocrity’ really will force institutions and those working in them to raise their game, with the implication that countless previous reforms in the same vein have not fully succeeded in doing so. Labelling schools, then, as ‘coasting’ – even if the label is in some cases inaccurate – is not a problem and will just reinvigorate professionals who need a bit of a push. And focusing on particular indicators is fine, as these centrally-defined metrics will just spur teachers to prioritise aspects of education which are important.

To this observer, who sees teachers for the hard-working, often stressed individuals they are, and wonders about the message being sent to this and the next generation of professionals about their efforts and about the alienation of policy-making from its implications on the ground, there is a sense of despair.

As ever, and as evidenced and articulated by the Cambridge Primary Review, the hope is that professionals can still educate pupils well in spite of policy-making, rather than because of it.

Warwick Mansell, one of CPRT’s regular bloggers, is a freelance journalist and author of ‘Education by Numbers: the tyranny of testing’ (Methuen, 2007).

For other blogs by Warwick Mansell click here and/or download CPRT’s book Primary Colours.

Filed under: 'coasting schools', accountability, assessment, Cambridge Primary Review Trust, data, Education and Adoption Bill, evidence, metrics, Nicky Morgan, Warwick Mansell

March 13, 2015 by David Reedy

Are we nearly there yet?

February saw a flurry of government announcements about assessment in English schools.

On 4 February information about reception baseline assessment was published. In summary this states that from September 2015 schools may use a baseline assessment of children’s attainment at the beginning of the reception year. DfE has commissioned six providers, which are listed in the document. Schools can choose the provider they prefer. This is not compulsory but the guidance states:

Government-funded schools that wish to use the reception baseline assessment from September 2015 should sign up by the end of April. In 2022 we’ll then use whichever measure shows the most progress: your reception baseline to key stage 2 results or your key stage 1 results to key stage 2 results.

From September 2016 you’ll only be able to use your reception baseline to key stage 2 results to measure progress. If you choose not to use the reception baseline, from 2023 we’ll only hold you to account by your pupils’ attainment at the end of key stage 2.

The Early Years Foundation Stage Profile (EYFSP) stops being compulsory in September 2016 too.

DfE is therefore essentially ensuring that emphasis is placed on a narrow measure of attainment in language, literacy and mathematics (with a few small extra bits in most of the six cases), rather than an assessment which presents a much more holistic view of a child’s learning and development. There is a veiled threat implied in the information quoted above: if a school doesn’t use one of these baselines, progress will not be taken into account when the school is judged as good or not. Not doing the baseline might be an advantage to a school whose children would do very well on it and then make only expected progress, yet still achieve high scores at the end of KS2; the reverse applies where children would score very low on the baseline. Are schools now going to gamble on whether to do these or not?
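To make the gamble concrete, here is a toy illustration of my own (the scale and every number are hypothetical assumptions, not taken from DfE guidance). A school with a high-attaining intake looks strong on attainment alone but merely average on progress; a school with a low-attaining intake looks weak on attainment but strong on progress.

```python
# Toy sketch of the baseline gamble. All numbers are invented; 'expected
# progress' is assumed to be an average baseline-to-KS2 gain of +30 points.
EXPECTED_GAIN = 30

def judgement(baseline: int, ks2: int, uses_baseline: bool) -> str:
    """The measure a school is held to under each choice."""
    if uses_baseline:
        # Judged on progress relative to the nationally expected gain.
        return f"progress {ks2 - baseline - EXPECTED_GAIN:+d} vs expectation"
    # Judged on end-of-KS2 attainment alone.
    return f"attainment {ks2}"

# School A: affluent intake - high baseline, high KS2, only expected progress.
print("A without baseline:", judgement(60, 90, uses_baseline=False))  # looks strong
print("A with baseline:   ", judgement(60, 90, uses_baseline=True))   # merely average

# School B: disadvantaged intake - low baseline, modest KS2, strong progress.
print("B without baseline:", judgement(20, 60, uses_baseline=False))  # looks weak
print("B with baseline:   ", judgement(20, 60, uses_baseline=True))   # looks strong
```

On these invented numbers, school A is better off refusing the baseline and school B is better off taking it: exactly the kind of strategic calculation an accountability measure should not invite.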

There are other serious issues leading to further uncertainty for schools. Almost all the recommended schemes are restricted mainly to language, literacy and mathematics, and therefore progress and the school’s effectiveness would be judged on a narrow view of what primary education is for. Five of the six chosen systems do not explicitly draw on parents’ and carers’ knowledge of their children and thus will be based on incomplete evidence. As TACTYC has pointed out, there are fundamental concerns about reliability and validity.

Comparisons between schools and overall judgements would be compromised when there are six different ways to measure the starting points of children in reception. It is inconsistent to allow schools to choose between six providers at baseline but to allow only one choice at ages 7 and 11.

Finally, since progress from the baseline will first be measured at age 11 in summer 2023, there will be at least two general elections before then. Will education policy on assessment remain static until then? On current experience that is highly unlikely.

Alongside this inconsistency and uncertainty about the reception baseline the government published its response to the consultations about the draft performance descriptors for the end of KS1 and KS2.

The responses were significantly more negative than positive, with the vast majority of respondents indicating that these descriptors were not good enough and would not be able to do the job they were designed to do. Indeed nearly half thought the descriptors were not fit for purpose.

At the same time, and no doubt as a result of the consultation, DfE announced an Assessment without Levels Commission with the remit of ‘supporting primary and secondary schools with the transition to assessment without levels, identifying and sharing good practice in assessment’.

This is clearly intended to address the significant uncertainty about ongoing and summative assessment at the end of key stages, where schools continue to struggle to understand what DfE’s thinking actually is now that levels have been abolished.

Schools in England are in a cleft stick. Do they choose to do one of the baseline tests, which will take considerable time to administer one to one, without knowing whether it will be used in seven years’ time or be of use next week to help plan provision? Can they afford to wait for the assessment commission to recommend an approach to assessment without levels, or do they get on with it and possibly end up with a system that doesn’t fit with what is recommended?

Thus, to answer the question in this blog’s title: on the basis of the evidence above, the answer is ‘Who knows?’

What a contrast to the situation in Wales where, also in February, Successful Futures, the review of the curriculum and assessment framework for Wales led by Professor Graham Donaldson, was published.

This states:

Phases and key stages should be removed in order to improve progression, and should be based on a well-grounded, nationally described continuum of learning that flows from when a child enters education through to the end of statutory schooling at 16 and beyond.

Learning will be less fragmented… and progression should be signaled through Progression Steps, rather than levels. Progression Steps will be described at five points in the learning continuum, relating broadly to expectations at ages 5, 8, 11, 14 and 16…. Each Progression Step should be viewed as a staging post for the educational development of every child, not a judgement.

What a sensible and coherent recommendation for assessment policy. Thus Wales may very well end up with a coherent, agreed, national framework for both mapping progress and judging attainment at specific ages within a broad understanding of the overall aims of education.

Perhaps it is not surprising that Professor Donaldson’s review drew significantly on the Cambridge Primary Review’s Final Report in coming to its conclusions.

Maybe England’s policy makers should too.

Assessment reform is a CPRT priority. For a round-up of CPR and CPRT evidence on assessment see our Priorities in Action page. This contains links to Wynne Harlen’s recent CPRT research review, relevant blogs, CPRT regional activities, CPR and CPRT evidence to government consultations on assessment, and the many CPR publications on this topic.

Filed under: assessment, Cambridge Primary Review Trust, David Reedy, England, Wales

February 13, 2015 by Iain Erskine

Planning, teaching, assessing: journey to coherence

In 2003, Fulbridge Primary School came out of Special Measures and in 2012 it was judged ‘outstanding’ in every Ofsted inspection area. Along the way, we were assessed by Creative Partnerships and in 2008 we gained the status of a National School of Creativity. In 2013, we converted into an Academy. In December 2014, we were invited to be a Whole Education Pathfinder school. Most significantly however, we became a member of the Cambridge Primary Review Trust’s Schools Alliance in 2014 and adopted the principles, priorities, vision, aims and curriculum domains of the Cambridge Primary Review.

Once we left the Special Measures Club we decided that more of the same would not work, so we embarked on a curriculum and school development journey that can fairly be called never-ending. On this journey we have been lucky enough to learn from the likes of Roger Cole, Mick Waters, Mathilda Joubert, Alan Peat, Lindy Barclay and Andy Hind. But it’s our decision to accept the invitation to work with the Cambridge Primary Review Trust that will have the biggest impact.

Before the Cambridge Primary Review we had been working to develop a curriculum based on creativity, first hand experiences and the local environment. This suited our school, its pupils, teachers and community.  But when the CPR final report appeared we discovered that it encapsulated both what we had been aspiring towards and what we had not yet addressed. So it not only aligned with what we were already doing but also offered us a way forward that would lead to further improvements. In this we heeded the parting comment of our lead Ofsted inspector: ‘Remember: “outstanding” is not perfect’.

So what have we done since becoming a member of CPRT’s Schools Alliance?

From September 2014 we started teaching, assessing and planning by reference to CPR’s eight curriculum domains: arts and creativity; citizenship and ethics; faith and belief; language, oracy and literacy; mathematics; physical and emotional health; place and time; science and technology.  These are not unlike DfE’s seven early years areas of learning and development – and indeed the CPR report made it clear that its domains were intended to encourage curriculum continuity from early years to primary and from primary to secondary – so we decided to adopt them throughout the school, from nursery to year 6. This meant that there would be significant changes to our assessment processes too, because assessment without levels was introduced nationally at the same time.

To demonstrate genuine commitment to a broad and balanced curriculum we wanted to assess children’s learning in every domain, so a great deal of thought, research and work went into creating an approach which provides effective assessment without losing the exciting and innovative curriculum that we created, which we believe, in CPR’s words, ‘engages children’s attention, excites and empowers their thinking and advances their knowledge, understanding and skill.’

The time to make changes is when you are doing really well; don’t leave it until things start going wrong. The master of this principle was of course Alex Ferguson at Manchester United, hence the unparalleled success that the Red Devils have enjoyed over many years. So we too have adopted that principle in the hope of creating a Theatre of Dreams at Fulbridge as he did at Old Trafford.

September 2014 brought major changes and initiatives such as the new national curriculum, the SEND code of practice and of course the new assessment requirements and we too changed many of our structures. Meanwhile we have had a new 240-place building constructed which allows us to move from a 3 to 4 form entry school.

We are an enthusiastic Google Apps school, so all the new structures were created in Google Drive on Excel sheets, a format that allows everyone to contribute and add to the master document that will cover all our short, medium and long term planning. This process proved to be a great way to ensure participation and ownership by all staff. Alongside this we are working with Pupil Asset, who have created a bespoke tracking system that will tell you – if you really want to know – whether a child with size ten feet, blue eyes and ginger hair is over or underperforming compared to national averages.

Planning, teaching and assessing are the keys to everything that happens in our classrooms. We took the government’s proposed freedoms as a genuine invitation and made sure that each part of the cycle linked to the others. Thus, we use the same criteria to plan, teach and assess. To start the process we look at what we want to assess, having merged the CPR’s eight curriculum domains with the new national curriculum. We have created areas of assessment within each domain, aligning them with the attainment targets from the primary curriculum. In addition, we looked at how this linked to the topics and themes we teach, taking away parts of the new curriculum we didn’t want to use and adding any parts that were missing – the most serious omission being oracy.

We followed the same process of aligning curriculum domains and assessment strands in our EYFS Development Matters statements. Planning, teaching and assessing are now coherently and consistently applied and practised from nursery to year 6. During the current school year we are establishing what works and what fits, modifying elements as necessary so that by the end of the year we will have refined and embedded a system that we can take forward.

In basing all we are doing on the Cambridge Primary Review, we know that what we are doing is based on sound evidence, which makes a refreshing change when we think back to some of the initiatives that successive governments have introduced.

To support all these changes, our website was updated. Links to the CPRT website were easily made, but ensuring that the site’s curriculum area reflected all we are doing as a member of CPRT’s Schools Alliance took more time. After consulting staff and Governors, our new Ethos and Aims statement was uploaded onto the site. This adapts the CPR educational aims to reflect our overall approach and the character of our school community.

Iain Erskine is Head Teacher of Fulbridge Academy, Peterborough and a member of the Cambridge Primary Review Trust Schools Alliance. This is the second in a series of occasional blogs in which Alliance members write about their schools and we provide links to enable you to discover how their vision works in practice.

For further information about Fulbridge Academy, click here.

For other blogs about featured CPRT Schools Alliance schools, click here.

Filed under: assessment, Cambridge Primary Review Trust, curriculum, Fulbridge Academy, Iain Erskine, Schools Alliance

November 21, 2014 by CPRT

CPRT publishes new report on assessment

CPRT has commissioned a number of reports on research that bears on its eight priorities. The first of these, by Professor Wynne Harlen, has now been published. Entitled Assessment, Standards and Quality of Learning in Primary Education, it may be viewed/downloaded here. A three-page briefing/executive summary may also be viewed and/or downloaded.

21 November 2014

Filed Under: assessment, Cambridge Primary Review Trust, research, Wynne Harlen

October 10, 2014 by David Reedy

Teaching or testing: which matters more?

On 26th September the Schools Minister, Nick Gibb, was extremely pleased to announce that the results of the phonics check for 6 year olds in England had improved considerably: 18 per cent more children had reached the ‘expected standard’ in 2014 than in 2012 when the test was introduced. A government spokesman stated that ‘100,000 more children than in 2012 are on track to become excellent readers’.

As primary teachers are aware, the phonics check has become a high stakes test. School results are collated and analysed in depth through RAISEOnline and made available to Ofsted inspectors, who are explicitly told to consider these results as evidence of the effective teaching of early reading in the current framework for Ofsted inspections.

The CPR final report in 2010 pointed out that primary children in England were tested more frequently than in many other countries, including some that rank higher in the international performance league tables. Since then the difference has become even more marked. Further tests have been introduced – the phonics check and a grammar strand in the tests for 11 year olds – with the intention of introducing a similar grammar strand for 7 year olds in 2016.

Politicians like Nick Gibb like to claim that tests like these raise standards, yet CPR found that the evidence of a causal relationship between tests and raised standards was at best oblique. It continues to be unconvincing. Scores in the tests rise, certainly. But what high stakes tests do is ‘force teachers, pupils and parents to concentrate their attention on those areas of learning to be tested, too often to the exclusion of much activity of considerable educational importance’ (CPR final report, page 325).

This is particularly true of the phonics check with its 20 phonically-regular real words and 20 non-words to be decoded, with 80 per cent accuracy required if it is to be passed. Indeed, as Alice Bradbury points out, there is considerable disquiet that the check was introduced by politicians as a means of forcing teachers to change the way they teach early reading.

In his rather approving analysis of the test results David Waugh said, ‘I know many teachers who now concentrate a lot of time on teaching children how to read invented words to help them pass the test.’ This has been my experience too.

Thus the test promotes a distortion of reading development. Teachers in primary classrooms spend extra time on teaching children how to read made-up words, diminishing the time for reading real words and teaching the other strategies needed for accurate word reading (whole word recognition of irregular words, the use of context for words such as read, for example), let alone comprehension and the wider experience of different kinds of text.

Increased test scores do not infallibly demonstrate improved standards. Wynne Harlen confirms this in the forthcoming review of research on assessment and testing which CPRT has commissioned as one of its 2014 research updates of evidence cited by CPR (to be published shortly: watch this space). It is therefore hardly surprising that results of the phonics check have improved as teachers become familiar with the demands of the test and adapt their teaching in line with them. Yet here we have a test that undermines the curriculum and is unlikely to give any useful information about children’s reading development; a government which is committed to increasing the number of tests young children are subject to despite evidence of their negative effects; and an opposition that has given no indication that it will change this situation if elected in 2015.

In 2010 the Cambridge Primary Review cited assessment reform as one of its eleven post-election policy priorities for the incoming government. As we approach the 2015 election assessment reform remains, in my view, as urgent a priority as it was in 2010.

David Reedy, formerly Principal Primary Adviser in Barking and Dagenham LA, is a CPRT co-director and General Secretary of UKLA.

  • To find out how to contribute to the debate about primary education policy priorities for the 2015 general election, see Robin Alexander’s blog of 25 September.