These value-added scores were intended to allow schools to demonstrate the progress made by children during Key Stage 2 (ages 7-11).
SIZE OF SCHOOL
-The small size of many primary schools results in wild fluctuations in the score from year to year, even with consistent teaching. Data are published for schools with as few as 11 pupils in a year group and this means that, even when the quality of a school's educational provision remains the same, its indicators will give the impression of change. A more realistic minimum Key Stage 2 year group size to give reliable measures of schools would be 50.
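The statistical point above can be illustrated with a small simulation (a sketch, not the DfES methodology): pupils are drawn each year from the same underlying ability distribution, so any year-to-year movement in the cohort mean is pure sampling noise, and it is much larger for a cohort of 11 than for one of 50.

```python
import random
import statistics

def cohort_mean_spread(cohort_size, n_years=200, seed=1):
    """Year-to-year spread of a school's mean score when pupils are
    drawn from an unchanging population (i.e. constant teaching quality)."""
    rng = random.Random(seed)
    yearly_means = [
        statistics.mean(rng.gauss(100, 15) for _ in range(cohort_size))
        for _ in range(n_years)
    ]
    return statistics.stdev(yearly_means)

small = cohort_mean_spread(11)  # year group of 11 pupils
large = cohort_mean_spread(50)  # year group of 50 pupils
# The small cohort's mean fluctuates far more from year to year,
# giving a false impression of changing school quality.
```

In theory the spread of the mean scales with 1/sqrt(cohort size), so a year group of 11 is roughly twice as noisy as one of 50.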
QUALITY OF KEY STAGE 1 DATA
-There are issues with the quality of the end of Key Stage 1 (age 7) data used to calculate value-added scores. Internal marking, and possible differences between infant schools and all-through primary schools, make the validity of comparisons between different schools questionable.
It might be helpful if the DfES were to investigate whether infant school KS1 results tend to be better than those of comparable all-through primary schools, and to publish their conclusions. If no statistically significant difference were found, this might also allay many teachers' concerns.
-Value-added scores are assigned to individual schools on the basis of pupils who were attending that school when the Key Stage 2 tests were taken. But this does not take account of pupil turnover and the fact that, for many schools, a high proportion of their pupils were not on the school roll for all of the preceding four years. As it stands, value-added measures are published if data are available for half the pupils in the last year before they leave primary school, including those who have arrived from other schools. In other words, the published value-added measures can be based on fewer than half the pupils that the school taught in Years 3 to 6. For many schools, a great deal of effort, perhaps disproportionate effort, is put into working with transient pupils.
In future, the DfES should use a stability indicator that can be applied to junior schools (7-11), and only publish value-added results for schools whose stability indicator is 90% or more, ie high stability.
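The recommended publication rule can be sketched as follows. This is an illustration of the 90% threshold described above, not the DfES's definition; the pupil records and the four-year criterion for "stable" are assumptions.

```python
def stability_indicator(pupils):
    """Fraction of the final-year cohort who were on the school roll
    throughout Years 3-6 (hypothetical definition for illustration)."""
    stable = sum(1 for p in pupils if p["years_on_roll"] >= 4)
    return stable / len(pupils)

def should_publish(pupils, threshold=0.90):
    """Publish value-added results only for high-stability schools,
    per the report's recommended 90% threshold."""
    return stability_indicator(pupils) >= threshold

# A cohort of 50 in which 46 pupils stayed for all four years (92% stable)
cohort = [{"years_on_roll": 4}] * 46 + [{"years_on_roll": 2}] * 4
```

Under this rule the cohort above would be published (92% >= 90%), whereas a school where only half the final-year pupils had been on roll throughout would not.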
ADVERSE IMPACT ON THE MOST ABLE PUPILS
-One of the most serious concerns with the DfES's methodology is that it exhibits a prominent ceiling effect, which adversely affects the most highly achieving children. Schools with high-achieving pupils at the end of Key Stage 1 are prevented from achieving high value-added scores at the end of Key Stage 2. This results partly from the fact that a 7-year-old who achieves Level 3 cannot show value-added if they achieve Level 5 at the age of 11, and partly because Level 5 is the highest level that can currently be achieved at the age of 11.
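The arithmetic behind this ceiling effect can be made explicit with a simplified model (an illustration only, not the DfES formula), using the conventional expectation of roughly two levels of progress between KS1 and KS2:

```python
LEVEL_CAP_KS2 = 5      # Level 5 is the highest level awarded at age 11
EXPECTED_PROGRESS = 2  # conventional expectation: about two levels KS1 -> KS2

def value_added(ks1_level, ks2_level):
    """Naive value-added: actual progress minus expected progress.
    A simplified sketch, not the Department's actual calculation."""
    return (ks2_level - ks1_level) - EXPECTED_PROGRESS

# A pupil at Level 3 aged 7 can score at most Level 5 aged 11,
# so their value-added can never be positive:
best_possible = value_added(3, LEVEL_CAP_KS2)
```

Here a Level 2 pupil who reaches Level 5 scores +1, but a Level 3 pupil is capped at 0 however well they do, so schools with many high attainers at KS1 cannot post high value-added scores.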
-As published, the results do not compare like with like. It is unfair to hold schools to account based on comparisons between schools in affluent areas and those in deprived areas. Contextual factors may have a significant effect on a school's success. These include differences in parental support, neighbourhood influences, peer-group effects and the proportion of children whose first language is not English.
The report concludes that although value-added information is an essential tool for schools, the publishing of value-added indicators in their current form is misleading and should be discontinued. Value-added measures should not be published at the end of KS2 in 2004. Even if the issues listed were addressed, it would be several years before value-added measures could be published, and then only for some schools. The report recommends that the whole issue of performance tables for primary schools should be re-examined.
David Hart, general secretary of the NAHT, comments as follows: 'Performance tables for 11-year-olds are fatally flawed. From the outset, the NAHT has remained opposed to the publication of performance tables on the grounds that they are unfair, are misleading and have a strongly negative effect on assessment and on the breadth and balance of the curriculum.
In particular, the NAHT cannot understand how parental choice is being properly informed when absent pupils, pupils with statements, and units containing pupils with significant learning difficulties are counted in the calculation of percentages for the tables even though they have no positive result to contribute to the threshold scores.
Whilst we accept that a value-added system is likely to be less unfair, we remain very concerned at the way it is being calculated by the Department for inclusion in the performance tables.
The NAHT continues to press for the abandonment of all forms of performance tables, as is now the case in Scotland, Wales and Northern Ireland. But if they are to remain in England, government and heads should work together on an acceptable set of value-added scores that tell the truth about how schools are performing.'
The NAHT represents well over 30,000 school leaders, including virtually every Special School head, 85% of all primary school heads, and over 40% of all secondary school heads.