Finding that water flows downhill is not all that surprising. But it can be very much worth knowing how fast that water moves, and how badly wrong we can be if we assume the waters are still.
Everyone knows New Zealand’s measures of school performance are a bit of a nonsense. But it has been hard to pin down just how bad those measures are.
NCEA pass rates give one measure of performance – one that hopelessly intermingles a school’s contribution with that of the community it serves. Reaching the upper end of the NCEA league tables is a lot easier for schools serving the children of highly educated parents. Meanwhile, stellar contributions from less privileged schools can go unrecognised.
Too often, a school’s decile ranking is taken as a measure of the school’s quality.
Education Review Office reports provide some insight into what is going on in a school, but they are not easy for parents to parse. And while they can offer a reasonable qualitative assessment, they do not provide a strong basis for comparing outcomes across schools.
This week, the Initiative released the results of more than a year’s work in Statistics New Zealand’s data lab building a better measure of school performance. When we accounted properly for the effects of factors outside a school’s control, we found that most schools performed very comparably to one another. We also found that a huge proportion of decile 1 and 2 secondary schools – over 40% – were in the top 25% of schools overall. These schools would otherwise be unfairly overlooked, because their raw NCEA scores do not put them near the top of the charts.
Building the measure
Building a better measure of school performance requires sound information about the circumstances students bring with them into the classroom. That is now possible thanks to the effort Statistics New Zealand has put into linking the back-end administrative databases held by the various Ministries.
The Ministry of Education has data on every grade awarded to every student going through NCEA. The Census provides information on the education of every person in the country, regardless of whether they went through NCEA. Inland Revenue knows everyone’s income. The Ministry of Justice knows who has spent time in prison. The Ministry for Social Development has details on receipt of benefits. Oranga Tamariki knows about notifications of child abuse.
Statistics New Zealand’s Integrated Data Infrastructure allows qualified and approved researchers to link together anonymised records across those different databases to build a rather comprehensive picture of the circumstances facing every student who has gone through NCEA.
The Initiative’s Joel Hernandez spent the past year linking those background records for the approximately 400,000 students who completed NCEA over the past decade. Each student was linked not only to the grades they achieved in each NCEA standard, but also to their parents’ incomes and education, criminal records, benefits history, and more.
Linking the data let us figure out how much of the difference in NCEA outcomes between students was likely due to differences in their parents’ incomes, differences in their parents’ education, differences in the other family background measures we included – and how much was due to the secondary school they attended.
We then had a far more accurate basis for assessing a school’s performance – one that adjusts for differences in student backgrounds rather than unfairly rewarding or penalising schools for the characteristics of the communities they serve.
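Conceptually, the adjustment can be sketched as a regression of student outcomes on family background plus a school fixed effect; the estimated school effects are the background-adjusted performance measure. The Python snippet below is a minimal illustration on synthetic data – the variable names and data are assumptions made for the example, not the Initiative’s actual IDI code, which accounts for many more background factors.

```python
# A minimal sketch of a background-adjusted school performance measure,
# run on synthetic data. Illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_students, n_schools = 5000, 50

df = pd.DataFrame({
    "school": rng.integers(0, n_schools, n_students).astype(str),
    "parent_income": rng.normal(60_000, 20_000, n_students),
    "parent_degree": rng.integers(0, 2, n_students),  # 1 = a parent holds a degree
})

# Simulate outcomes driven mostly by family background, plus a small school effect.
school_effect = rng.normal(0, 0.1, n_schools)
df["ncea_score"] = (
    0.00001 * df["parent_income"]
    + 0.5 * df["parent_degree"]
    + school_effect[df["school"].astype(int)]
    + rng.normal(0, 1, n_students)
)

# Regress outcomes on background plus school fixed effects. The estimated
# school coefficients (relative to an omitted baseline school) are the
# "adjusted" performance measure.
model = smf.ols("ncea_score ~ parent_income + parent_degree + C(school)", data=df).fit()
adjusted = model.params.filter(like="C(school)")
print(adjusted.sort_values(ascending=False).head())
```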
The results confirm that water does flow downhill: the differences between schools in NCEA outcomes are a lot smaller when you account appropriately for differences in the communities that different schools serve. Measured properly, the middle 80% of schools perform very comparably to one another.
In Figure 7 below, the blue curve tracks unadjusted performance at NCEA Level 1. It is steep, showing substantial differences in relative performance between the top schools and the rest. The red curve strips out the factors outside a school’s control – and shows that most schools are rather comparable in performance, with little difference between the school ranked 100th from the bottom and the school ranked 400th.
There are still differences across schools. The difference between a high-performing school and a school in the middle of the distribution is comparable to the difference between growing up in a household where both parents have graduate degrees and one where neither parent has completed secondary school.
And while more traditional NCEA league tables show relatively few lower decile schools at the top of the ranks, our better measure helped us find the stars that really improve outcomes for the kids and communities they serve. On the unadjusted measure, the vast majority of decile 1 and 2 schools are in the bottom tier, and the vast majority of high decile schools appear to be top performers.
But when we stopped punishing lower decile schools for factors outside their control, and stopped rewarding higher decile schools for serving more privileged communities, we saw a substantial change. More than 40% of decile 1 and 2 schools were in the top performance tier; less than 40% of decile 9 and 10 schools were in that top tier.
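To see how such a tier comparison might be computed, here is a small sketch on randomly generated data. The decile and performance figures are invented for illustration, so the low-decile share comes out near the mechanical 25% rather than the 40%-plus found in the real results.

```python
# Rank schools into quartiles on the adjusted measure, then ask what share
# of decile 1 and 2 schools lands in the top performance tier.
# All figures here are randomly generated for illustration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
schools = pd.DataFrame({
    "decile": rng.integers(1, 11, 400),     # 1 (least privileged) to 10
    "adjusted": rng.normal(0.0, 1.0, 400),  # background-adjusted performance
})

schools["tier"] = pd.qcut(schools["adjusted"], 4,
                          labels=["bottom", "lower-mid", "upper-mid", "top"])
low_decile = schools[schools["decile"] <= 2]
print((low_decile["tier"] == "top").mean())  # ~0.25 on random data
```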
Why does this matter?
The point of the exercise was not to build a far more accurate league table. Rather, it was to help improve school performance across the distribution.
Building on the work we have completed, it would be rather simple for the Ministry of Education to regularly provide every school and every school board in the country with detailed information on their own school’s performance. Some schools might not know what a superb job they are doing for communities that other schools fail; other schools might not recognise just how much their own results are padded by the privilege of the communities they serve.
The Ministry and the Education Review Office could use the results to investigate which differences in school practice produce the differences in outcomes we found in the data lab. Good practice could then be shared, lifting performance across the board. While most schools perform comparably to one another, that hardly means there is no room for improvement: the difference between the best performing schools and the broad set of middling performers remains large.
At least as importantly, this kind of measure could be used to evaluate the results of the different policies the Ministry of Education tries out from time to time, or of interventions by the Education Review Office. ERO intervenes when things start going badly at a school, but does not have great ways of following up to see what works and what does not.
We have a lot more research scheduled, building on the work Joel has undertaken. We want to know how schools affect students’ later life outcomes, from tertiary enrolment through to employment. We want to know whether schools that do well on average do well for the different communities they serve, or whether there are schools that do a particularly good job with some groups of students rather than others. And we want to provide examples of the kinds of reports the Ministry should be providing to every school.
Most importantly, we want others to help us in this work. All the code that Joel developed is available in the Statistics New Zealand IDI wiki for other researchers to build on. Matching and cleaning the data is a long and onerous task that is too often repeated by many different groups of researchers. Building on the work we have already done both helps in checking that our work is accurate, and makes it much easier for others to get started. We very much hope that the country’s economics departments will consider using what we have started as the basis for Master’s thesis projects.
Everyone knows that water flows downhill. But putting a lot of time into thinking about just how water flows downhill, and the consequences of it, can lead to hydroelectric generation. We hope that our work can have similarly electrifying consequences.
A copy of the report, In Fairness to Schools, can be found here