Will parents be able to choose the right school using National Standards data, which even the Prime Minister admits is "ropey"? Or is the Government rushing out the information for another reason? Geoff Cumming reports

When a national snapshot of how well children are doing at primary schools was released last week, authorities had a ready-made reason: parents wanted more information on "attainment".

On the same day that the first batch of National Standards data suggested too many youngsters had learned to write by cellphone, the Education Ministry released a Colmar Brunton survey about the factors parents use to choose a primary school.

One answer from the online questioning of 68 parents and a half-dozen focus group meetings was, "Yes, we'd like more information about attainment please."

It's a fair bet the answer would have been different had parents been asked: what if that data was, in the Prime Minister's words, ropey - likely to be misleading, calculated in ways that vary from school to school, based sometimes on gut-feeling and sometimes on a few tests unrelated to the National Standards?

And what if the data measured only three areas - reading, writing and numeracy - rather than the many other, more creative things that schools do to inspire a love of learning?

A little knowledge can be a dangerous thing, and schools fear that handing this ropey data to parents, politicians and the media could see them caned again and again for failings that are often beyond their control.

But whether yesterday's release of information showing how individual schools compare themselves to National Standards will send schools to the headmaster's office may depend on whether parents, politicians and the media engage their brains.

The problem starts with the National Standards themselves - descriptions of what students should be able to do in reading, writing and mathematics as they progress through levels 1 to 8, the primary and intermediate years.

The system, based on the national curriculum, is still in its infant years, launched in 2009 with undue haste and limited consultation, says Auckland Primary Principals' Association president Jill Corkin.

"Many of us believe the standards themselves are a bit airy-fairy and not specific enough. When you have a standard that says 'work towards level 3', what does 'work towards' actually mean?"

It's over to schools to assess where their pupils are at in relation to these standards, and different schools have different ways of assessing. Some use test results, others "Overall Teacher Judgments" - jargon for a teacher's verdict on a particular pupil's capabilities, based on broader observation than a one-off test.

This variation is no bad thing - schools are encouraged to choose teaching methods to suit their environment and, in introducing National Standards, the ministry deliberately avoided setting a uniform test because of experience overseas, where national tests led to counter-productive behaviour by schools and parents.

We're only seeing the data now because of a political decision to make it publicly available - the ministry acknowledges it will take years to raise the quality of the information to a standard where valid comparisons can be made by parents.

The ministry crunches the highly variable information sent by schools into four categories: those well below the standard, below standard, at the standard and well above the standard. The national picture painted by this data is this: 76 per cent of children are at or above the national standard for reading, 72 per cent for mathematics and 68 per cent for writing.

More girls than boys are shining at reading and writing; more Maori and even more Pacific kids are heading for Struggle Street. What the data doesn't show is that many of these "under-achieving" kids live on Struggle Street already and that's not the school's fault.

So what's the point of the number crunching and release of unreliable data that parents and the media will use to compare schools - so-called league tables - and that politicians may use to pursue agendas?

At the moment, around 20 per cent of students leave school without attaining even NCEA Level 1 - not enough literacy and numeracy to hold down a job, say Government MPs. They claim identifying struggling students earlier will help towards the goal of having 85 per cent of students leave school with at least NCEA Level 2 by 2016 (which, when you do the maths, doesn't add up - children who will sit NCEA Level 2 in 2016 are already in year 8, so league tables may be too late to help them).

Secretary for Education Lesley Longstone says that even though the data is unreliable, it is still a useful start. It gives the ministry a basis to target resources; under-achieving schools will be identified and supported earlier.

In time, analysis of the data will help schools to identify what works and where they sit in relation to national norms, Longstone says. "Most schools are quite happy to be held accountable for their results and are very happy for their results to be transparent."

She notes it took 10 years to develop full confidence in the assessment process for NCEA in secondary schools.

Corkin says few schools would argue with having a national baseline to compare themselves to, if it was accurate and the data pinpointed where improvements could be made. "But I really doubt that we've identified the problem correctly and I'm concerned that this very superficial data will just exacerbate that poor identification."

She cites research showing that the long-term tail of under-achievement - the 20 per cent - has more to do with socio-economic factors and the home environment than what goes on in the classroom. "The national data doesn't tell us anything we don't know already and it doesn't tell us how we're going to fix it."

What the data does confirm is the correlation between poor socio-economic areas and under-achievement in schools, though drill down and there is far more variation in attainment between low decile schools than in higher decile areas.

Corkin predicts the league tables will have more impact in higher than in lower decile areas. "Choice is fine where people have the ability to make that choice but they need mobility and in some communities that's not possible. I think it will do more damage to high-decile schools if by chance it shows parents the school is not sitting where they think it should be."

An expert on educational assessment and measurement, Associate Professor Alison Gilmore of Otago University, doubts the information will be particularly useful to parents or the Government. "Parents actually need something much more nuanced than information on a website that captures only part of the school experience. There are other, less tangible indicators - it comes down to teachers and the school's recognition of individual learning needs.

"Assessment is a very powerful instrument if used in a formative way to make improvements to children's learning. That's a far cry from using the information to make judgments about quality of teaching or quality of the school. They should be putting more resources into teachers and principals ..."

For parents who took part in the ministry research, learning assessment was one of several factors they rated as "most important" when it comes to choosing a school. Of equal importance were teaching quality, the school's "culture" and atmosphere, multicultural policies and student wellbeing and support.

Then came things like school leadership, peer influence (having friends from earlier schooling alongside), the school's ethnic mix, class sizes and socio-economic decile ratings. Of less importance were cost, location, facilities and resources.

Parents formed their impressions through various means: word-of-mouth, media reports, and by visiting schools and talking to teachers, the principal and other parents.

The survey did find support for league tables, with some parents liking the idea of comparing their school against others in the same decile and getting a more meaningful assessment of how their child is progressing. But some wanted a broader measure of overall student wellbeing, while a few would "monitor the hell out of the system".

The Government decision to release the ropey school-by-school data now is linked to its desire to increase "parental choice", a cornerstone policy of National and coalition partner Act. It may also be sending a message to reluctant schools (nearly 10 per cent failed to co-operate this year) to fall into line.

It's what happens next that worries opponents, who call the release of data a "naming and shaming" exercise.

"The rhetoric is around choice and charter schools but I'm unsure what the Government has in mind," says Corkin. "The big question is where is it going - where is the ministry heading with it?"

Claims that under-achieving schools will experience an exodus as parents exercise their choice are more than union scaremongering - many schools experienced this in the 1990s when the news media seized on Education Review Office reports which catalogued school failings.

"Some will be afflicted more than others, in fact for some it is likely to be fatal," says Professor Martin Thrupp of Waikato University's education department.

"There is concern that there's a privatisation agenda - that one solution may be to open up [more] charter schools. I don't know whether that's a well-formed strategy but I think the Government would achieve much more with professional development. This is a low-trust approach."

Thrupp, who spent several years in England researching accountability trends, is equally concerned about how the numbers game could change behaviour within schools.

"Schools aiming to improve their positions on the tables may turn away kids who aren't going to enhance their position, such as special needs kids or slow starters, which is what happened in England."

Another trend found overseas was a narrowing of the curriculum, he says. "Schools opt for a more direct focus on literacy and numeracy, but it's the wider curriculum areas that hook kids into learning.

"In England, they call this the 'drill and kill' curriculum - you just hammer some things again and again."

Thrupp doubts that the data will become much more meaningful and assessments more standardised over time. More likely is a rerun of the early years of NCEA, when internal assessments saw huge variations between secondary schools and some encouraged less able pupils into unit standards and "soft" credits to boost results.

Longstone rejects the notion that schools will avoid or neglect children with disabilities and she believes moderation and other initiatives will iron out any manipulation of the system.

The ministry has a five-year plan to improve the quality and reliability of data, involving more moderation between and within schools and computer programs designed to improve consistency.

Longstone says that in time, schools in similar socio-economic deciles will be able to better measure the progress children make over the course of a year, develop a more rounded picture of their progress and identify what works in improving achievement.

"But parents shouldn't look at this in isolation, because it is not the whole story. They should look at ERO reports, talk to teachers and principals and to other parents. This is just an additional piece of information."