Valuing what we measure instead of measuring what we value?

Analysing the issues discussed in my first two posts around inequitable educational provision in England has led me to examine my own assumptions closely, and to question the educational paradigm we currently operate in. My next series of posts may at face value seem unconnected, but I hope ultimately to bring them together into a coherent set of observations and recommendations.

Firstly, some questions arising from a simple comparison of data from two real schools. My question is this: which school is providing a better education for the students it serves?

Which school is providing a better education?


From this initial data, the answer to that question must be the comprehensive school, with its intake of below-national-average prior attainment and high disadvantage. Despite this intake, a very similar percentage of its students reach top-third and Russell Group universities as at the selective school. The selective school has prior attainment massively higher than average and lower-than-average FSM6, yet is getting no one into Oxbridge.

However, if we add more information the evaluation changes.

[Destinations data: table 2]

If we look at KS4 and KS5 data in isolation from destination data, there is seemingly no comparison – the comprehensive school is failing. For example, 99% of students from the selective school achieved 5A*-C including English and Maths, compared to 47% from the comprehensive, and the Progress 8 scores from 2016 also paint very different pictures. The comprehensive school did not see any students gain AAB in facilitating subjects.

Which is outstanding and which requires improvement?

Despite the above KS4 and KS5 data, the comprehensive school has excellent numbers going on to good universities when compared to national figures (this is a big school, with a large sixth form). This looks set to continue this year, even with the 2016 cohort and its very low P8 scores.

You could suggest that the students securing these destinations are either the ones who did well at KS4, or perhaps students who joined the sixth form from other schools. Maybe the school serves a very divided catchment, and it is the ‘nice middle class kids’ who successfully secure places at good universities whilst others are left to fail. There are endless questions and variables at play here. But what excuse could the selective school possibly have for its KS5 destinations figures? Surely it is a travesty that 99% of its students achieved 5ACEM, yet no one went to Oxbridge and only 18% went to a top-third university despite positive KS5 VA?

The point I am making is that answering the question of which school is better depends on what you measure and what you value.

This is relevant to our current context because, in the real world, the comprehensive school has recently been judged by OFSTED as requiring improvement, whilst the selective school continues to be judged as outstanding.

Waypoints or destinations?

I would argue that what we as a system currently measure and value are waypoints on the way to a desired destination. They are not end goals in themselves, yet through OFSTED we treat and judge them as such. When we link this with the real issue around the ability of schools with low prior attainment to offer an equitable education to their students, it creates a strong disincentive for school leaders to take on our most challenging schools.

Disillusionment and Russian roulette

Consider what it must be like to work in each of the secondary schools above: where are stress levels higher? Where is morale higher? If I told you that the Headteacher of the comprehensive school recently retired early, disillusioned with our education system, would you be surprised?

To steal a phrase from @leadinglearner, would a Headteacher considering moving to the comprehensive school be playing Russian roulette with their career?

Questions about our schools arising from Progress 8. Post 1.

If we link the average prior KS2 attainment of a secondary school to its Progress 8 score we get the following graph.


If a school’s KS2 prior attainment is below the national average, it is far more likely to have a negative Progress 8 score.

Compare the number of schools in the bottom-left quadrant to the bottom-right. Progress 8 compares each pupil against others nationally with the same prior attainment, so by design a school’s average intake should make no difference – this is not what we would expect to see given the Progress 8 methodology. I want to understand this better, as I think it is a significant factor that should be of interest to school leaders and policy makers trying to develop equitable and socially just schools and school systems. This will be the first in a series of posts.
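To make the point about methodology concrete, here is a minimal sketch of the Progress 8 logic: each pupil’s score measures value added over peers with the same prior attainment, so a school full of high attainers gains no automatic advantage. The KS2 bands and expected Attainment 8 figures below are invented for illustration, not DfE values.

```python
# Minimal sketch of the Progress 8 logic (illustrative only).
# Per pupil: P8 = (actual Attainment 8 - national average A8 for pupils
# with the same KS2 prior attainment) / 10. School P8 = mean of pupil scores.

# Hypothetical national average A8 by KS2 band - invented, not DfE values
EXPECTED_A8 = {"low": 30.0, "middle": 48.0, "high": 65.0}

def pupil_p8(actual_a8: float, ks2_band: str) -> float:
    """Value added relative to peers with the same prior attainment."""
    return (actual_a8 - EXPECTED_A8[ks2_band]) / 10

def school_p8(pupils: list[tuple[float, str]]) -> float:
    """School Progress 8: the simple mean of its pupils' scores."""
    return sum(pupil_p8(a8, band) for a8, band in pupils) / len(pupils)

# Two very different intakes, identical value added by construction:
selective = [(67.0, "high"), (63.0, "high"), (65.0, "high")]
comprehensive = [(32.0, "low"), (50.0, "middle"), (26.0, "low")]
print(round(school_p8(selective), 2))      # 0.0
print(round(school_p8(comprehensive), 2))  # 0.0
```

Both hypothetical schools land on a P8 of zero despite wildly different raw attainment – which is exactly why the real-world correlation between average intake and P8 in the graph above demands an explanation.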

Education Datalab (and many others) have already highlighted this trend, and I would highly recommend the post and links here:

These posts will use the Education Datalab posts, and others, as a springboard to further analysis.

A prior attainment premium for secondaries?

If we plot P8 against FSM6 outcomes, the trend is much less conclusive.


Given that current policy and inspection incentives and frameworks focus on closing the Pupil Premium gap, does this suggest that a ‘prior attainment premium’ for secondaries would be appropriate? We know the pupil premium gap exists right from the early years, producing a correlation between FSM6 and KS2 attainment – yet this data tells us that a school’s average prior attainment is a bigger indicator of the progress its secondary students make than its FSM6 rate. This may be because the Pupil Premium is leading to a narrowing of the progress gap, but I am unsure of that. It is a complex question to which I do not have an answer, but something to keep in the back of your mind when looking at the further analysis.

It’s not the low prior attaining students causing this trend

If we look at the P8 scores of low, middle and high prior attaining students against the average prior attainment for their school, we get the graphs below. Please forgive the use of the word ‘ability’ in the graph headers.

Low Prior Attainment


Middle Prior Attainment


High Prior Attainment


Note the steepness of the trend lines and the number of schools in the bottom-right and bottom-left quadrants of each of these graphs.

So it is actually the middle and high prior attaining students whose progress is most affected by attending schools with lower average prior attainment.

It’s not just curriculum issues

Have a look at the graphs below:

English Bucket


Maths Bucket


Ebacc bucket



Open Bucket


The trend may be steepest in the Ebacc bucket, suggesting curriculum choice is one factor, but it certainly does not explain what is happening on its own.
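The four buckets in the graphs above mirror the slot structure of Attainment 8, which a short sketch makes concrete. This is a simplification under stated assumptions: the grades are invented, and real-world eligibility rules (such as the condition for English to count double) are omitted.

```python
# Simplified sketch of the four Attainment 8 buckets (invented grades;
# real eligibility rules, e.g. for double-weighting English, are omitted).
def attainment8(english: float, maths: float,
                ebacc: list[float], open_: list[float]) -> float:
    """English and Maths each fill a double-weighted slot; the best
    three EBacc grades and best three Open grades fill the rest."""
    ebacc_top3 = sum(sorted(ebacc, reverse=True)[:3])
    open_top3 = sum(sorted(open_, reverse=True)[:3])
    return 2 * english + 2 * maths + ebacc_top3 + open_top3

# A pupil with grade 5 in English and Maths, mixed grades elsewhere:
print(attainment8(5, 5, ebacc=[6, 4, 4, 3], open_=[7, 5, 2]))  # 48
```

Because English and Maths are double-weighted and the EBacc bucket takes three slots, weaknesses in different buckets drag on the total to different degrees – which is why the per-bucket graphs above are worth examining separately.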

A real equity and social justice issue that needs real action

There is a real issue around the ability of schools with low prior attainment to offer an equitable education to their students, and it is something policy makers must look at quickly and seriously.

Again, @edudatalab have highlighted some possible factors in the links above, and @leadinglearner talks brilliantly about the consequences for school leaders, e.g.

Low prior attaining schools still being unfairly penalised by OFSTED

A real and present danger is that, because Progress 8 is perceived to ameliorate prior attainment issues, the incentive for Headteachers to play “Russian roulette” with their careers and take on the greater challenge of low prior attaining schools is further reduced. This issue MUST be recognised by OFSTED.

My recommendation is that contextualised scores are used by OFSTED when judging effectiveness. This does not mean we will have lower aspirations for any students, only that schools are judged fairly.

Policies to improve equity and social justice in our school system

If we judge the success of a school system around issues of equity and social justice, this provides further evidence that we are currently failing.

What is absolutely clear is that the Pupil Premium alone is not enough to enable schools with low prior attaining intakes to combat the multiple complex factors behind these worrying progress trends. Government must look seriously at the evidence and develop policy drivers that will actually make a difference. Enough with the grammar school nonsense.

Sponsored academies bucking the trend?

It is interesting to note the trend line for sponsored academies (in red), who seem to buck this trend somewhat – though note the variation in the red sponsored academy dots. Is this evidence that sponsored academies are doing something meaningful, or simply a result of massive variation in the data?

In my next post I will look more closely at what the evidence tells us about the factors behind this unacceptable situation, and tentatively suggest some possible solutions.