And the ACT’s relative rates of progress have fallen further behind the national average in recent years. The 2010-12 cohort made around two months less progress than the national average in numeracy, and close to the national average in reading. But the 2014-16 cohort made five months less progress than the national average in numeracy, and four months less in reading. A similarly worrying trend appears in year 3 reading results.
At secondary schools, on a like-for-like basis, ACT students make three months less numeracy progress than the national average between year 7 and year 9. And the low rates of progress are evident across several levels of school advantage (our report does not cover high-advantage schools, which in the ACT account for around one third of students, so these results do not apply to them).
What might be causing the ACT’s poor performance on student progress? Our analysis can’t prove the root cause, and more research is needed. But one issue raised in a 2017 ACT Auditor-General’s report was that ACT teachers were not using student performance data to determine what students are ready to learn next. The Auditor-General also recommended a better balance between autonomy for individual schools on one hand, and consistency across schools on the other.
The ACT should learn from other jurisdictions, such as Queensland, whose primary schools consistently make above-average progress in reading and numeracy. NSW is good at stretching the top secondary students, and Victoria’s disadvantaged schools consistently make above-average student progress.
The most worrying pattern revealed in our report is consistent across all states and territories. Students in Australia’s low-achieving schools make only half as much progress in numeracy from year 7 to year 9 as students in high-achieving schools, and 30 per cent less progress in reading. Most of these low-achievement, low-progress schools are also disadvantaged. This challenges the argument that high-achieving schools are cruising and make the slowest progress.
While some disadvantaged schools beat the odds, many deliver a lot less than a year’s worth of growth each year. Governments must find a way to boost learning in these schools if Australia is to reach the Gonski 2.0 goal of “at least a year of growth for every student every year”.
To become an adaptive, constantly improving education system, we must learn from what works best. Often the focus is on outcomes, particularly using NAPLAN. NAPLAN does not capture everything that matters in school education, but it is the only test in Australia that enables us to compare student progress across every school, vital information for policy makers and researchers seeking to improve the system. But we also need better data on teaching, so policy makers can make the links between government policy, teacher practice, and student progress.
States and territories must do more to learn from one another, while facing up to their own weaknesses and building on their own strengths. Governments across the country need to investigate why students make more progress in some states and territories than others, with the goal of identifying the teacher practices and school policies that produce the best results for our children.
Julie Sonnemann is School Education Fellow and Peter Goss is School Education Program Director at the Grattan Institute. The new Grattan report, Measuring student progress, is available at www.grattan.edu.au