Education minister Julia Gillard repeatedly claimed that the publication of NAPLAN results on the MySchool website would enable parents to find out how schools are performing. This claim rests on two assumptions: that schools with similar socio-educational backgrounds can be meaningfully compared, and that the variation in NAPLAN performance across schools within each group of “like schools” is entirely due to “school performance”.
The MySchool website technical document shows that the school socio-educational background index, ICSEA, used to classify schools into “like schools” groups explains, at best, about 68% of the variation in students’ achievement measures. Is the remaining 32% of the variation entirely due to school or teacher effectiveness? Research carried out by Professor Brian Byrne from the University of New England showed that the teacher effect accounted for only 8% of the variation in students’ achievement. Research findings from the United States likewise showed that the teacher effect typically accounted for between 3% and 16% of the variation. Given these figures, it is clear that ICSEA does not remove all of the student-level factors that contribute to students’ achievement. In fact, variation in NAPLAN performance across schools within each group of “like schools” is most likely due in large part to factors unrelated to school or teacher performance.
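The arithmetic behind this argument can be made concrete with a small simulation. The sketch below is purely illustrative and is not MySchool’s actual model: it assumes student scores are the sum of three independent components sized to match the figures above (a socio-educational component at roughly 68% of total variance, a teacher component at roughly 8%, and individual-level noise making up the rest). It shows that even after the socio-educational index is removed, the teacher component is only about a quarter of what remains.

```python
# Illustrative sketch only (assumed variance shares, not MySchool's model):
# score = SES component (~68% of variance) + teacher component (~8%)
#         + individual noise (remaining ~24%).
import random
import statistics

random.seed(0)

N = 100_000
samples = []
for _ in range(N):
    ses = random.gauss(0, 0.68 ** 0.5)      # ~68% of total variance
    teacher = random.gauss(0, 0.08 ** 0.5)  # ~8% of total variance
    noise = random.gauss(0, 0.24 ** 0.5)    # remaining ~24%
    samples.append((ses, teacher, noise))

total_var = statistics.pvariance([s + t + n for s, t, n in samples])
# Variance left over once the SES index has been removed:
residual_var = statistics.pvariance([t + n for _, t, n in samples])

residual_share = residual_var / total_var   # ≈ 0.32 of total variance
teacher_share = 0.08 / (0.08 + 0.24)        # teacher effect is only ≈ 0.25
                                            # of that residual

print(round(residual_share, 2), round(teacher_share, 2))
```

Under these assumed numbers, attributing the whole 32% residual to schools or teachers overstates the teacher contribution roughly fourfold; the rest is student-level variation that like-school grouping never removes.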
The report “Reporting and Comparing School Performances”, prepared by the Australian Council for Educational Research and often quoted by Ms Gillard as the basis for the validity of using NAPLAN results for school comparisons, in fact cautioned that “fluctuations in student cohort from one year to the next are large enough to swamp the effect of any improved teaching that may be occurring”. The report recommended the use of student growth as a better measure of student learning, rather than simply comparing a school’s results from one year to the next, as the current MySchool website does. However, the report failed to point out that, while many researchers have concluded that measures of student growth provide more reliable measures of a school’s effectiveness, many years of data are needed to achieve the accuracy required to identify individual schools or teachers who are more effective than others. Professor Andrew Leigh from the Australian National University used student test data from two testing occasions, and found the margin of error associated with individual teachers’ effectiveness measures to be very large.
The key message here is not that teachers do not make a difference to student achievement. In fact, most research findings demonstrate that the difference in achievement gains between having a good teacher and a poor teacher is about one year of student growth. We do need to distinguish good teachers from poor ones. However, the large margins of error in students’ test scores, such as NAPLAN results, prevent us from obtaining accurate measures of school or teacher effectiveness.
It would be irresponsible for the government and education researchers to tell the public that school performance can be judged from the information provided on the MySchool website. It appears that the so-called transparency agenda is actually a ploy for putting pressure on all teachers, and not just those who are underperforming, since there is no way that good or poor teaching can be identified using NAPLAN results, with or without “like schools” grouping. Even if such a ploy raises accountability, I do not believe it is ethical. In addition, the technical challenges in analysing NAPLAN data, particularly for measuring trends, are acknowledged by education measurement experts. The government should come clean about the limitations of NAPLAN tests. That would be real transparency. And we owe that to parents.
Dr Margaret Wu is an Associate Professor at the Assessment Research Centre, University of Melbourne.
Margaret’s main area of research is the development of item response modelling and its applications, particularly in the context of large-scale assessments such as international and national assessments. Margaret also has a keen interest in computer-delivered assessments and the use of technology in educational assessment.