Online NAPLAN results under fire
REPORTS are out today alleging issues with the validity of NAPLAN's online testing.
These reports and ACARA's response thus far leave questions as to whether online and paper-based NAPLAN tests can be meaningfully compared, education industry experts say.
"It is impossible to know the degree to which the test is valid and equivalent to older versions,"Associate Professor Ladwig said.
James Ladwig, Associate Professor at the University of Newcastle's School of Education, is internationally recognised as a leading researcher in school reform, the sociology of education and educational policy.
"It is well known that student performances in online test situations are not equivalent to pen and paper tests," Prof. Ladwig said.
"ACARA has not yet released any of the technical information needed to assess the performance of the online version. As a consequence it is impossible to know the degree to which the test is valid and equivalent to older versions.
"One of the main rationales for the online test regime was to produce results sooner.
"Defending the current release as being on the same timeline as previous years seems disingenuous given this rationale."
Fellow Associate Professor Jihyun Lee from the UNSW School of Education has specialised in large-scale standardised testing, including NAPLAN and PISA.
She is currently a TJA Research Fellow in the PISA division at the OECD, and will be working at the OECD headquarters in Paris for the remainder of this year.
"An independent body of measurement experts should review the most recent NAPLAN data," Professor Lee said.
"A concern has been raised about the comparability between online and pen-and-paper test results of NAPLAN.
"I was cautiously optimistic that the online testing would be successful given that the basic statistical methodology has been around since the 1980s. Large-scale online testing has proven to be valid in several overseas programs since 1990s.
"However, if ACARA were to release the NAPLAN results in two separate forms, this would suggest a failure to ensure comparability. An independent body of measurement experts should review the most recent NAPLAN data," she said.
Education policy expert Dr Steven Lewis, Alfred Deakin Postdoctoral Research Fellow at Deakin University, said transparency is needed to build trust.
Dr Lewis acknowledges that "NAPLAN can help us understand schooling at the system level." However, he highlights that "any lack of statistical comparability, be it perceived or actual, between the online and pen-and-paper tests jeopardises its utility as a trusted means of comparison."
"Such a lack of comparability could mean that comparisons cannot be made between schools using different modes of testing in 2018, or between a single school's year-to-year performance if the school has piloted the online delivery format," he said.
"My research has shown the profound impact of NAPLAN data, and comparisons of these data, on how schooling is understood and practised by teachers, schools and systems. Unless there is transparency around the statistical procedures and experts used by ACARA to make the data 'valid and comparable', there cannot be a complete trust in the evidence and comparisons these data provide.
"To this end, I would call on ACARA to release this information to help forestall what is arguably a growing mistrust, amongst the public and education professionals alike, in relying on NAPLAN and standardised test data to inform teacher practice and student learning."