During the election period, ACT released a long list of education policies, including a platform similar to Australia’s ‘My School’.
In its policy document, ACT said this platform would “help parents understand how their school is performing compared with other schools that serve students of similar backgrounds.”
This policy goes hand-in-hand with ACT’s policy to require schools to administer e-asTTle tests twice a year and report the results.
Now that National, ACT and New Zealand First are confirmed to be forming the next government, we can expect to see some movement on these policies – so what exactly is ‘My School’, and how might it work in a New Zealand context?
An online information service
According to the ‘My School’ factsheet, presented by the Australian Curriculum, Assessment and Reporting Authority (ACARA), the website is “an information service” developed so parents and communities can access information about their school and others across Australia. It was implemented in 2008 as part of a movement to create “greater transparency and accountability for the performance of schools… to help ensure that every Australian child receives the highest-quality education.”
In Australia, ACARA is responsible for administering the website. It also collects and distributes data from schools across Australia.
Contrary to ACT policy, however, ‘My School’ emphasises that it is not a “league table”. ACARA states that “simple ‘league tables’ that rank and compare schools with very different student populations can be misleading and are not published on the ‘My School’ website.”
What information is available?
Each school has a profile on the website which outlines the school’s educational outcomes, student population, and capacity and capability.
Educational outcomes are measured using the National Assessment Program – Literacy and Numeracy (NAPLAN). NAPLAN was implemented in 2008 to improve the comparability of student results across states and territories. It is also developed and administered by ACARA. Other educational outcome measures available on ‘My School’ include measures of student improvement, vocational education and training, school-based apprenticeships and traineeships, senior secondary outcomes and post-school destinations (where available).
Student population data includes the level of socio-educational advantage (SEA) of a given school’s student body, the proportion of indigenous students, students from non-English-speaking backgrounds, and attendance data.
School capacity and capability includes information on the type of school, year-ranges, student and staff numbers, school financial information and location. Financial information includes recurrent income and capital expenditure for each funding source. Both government and non-government schools are required to provide this information. Funding information is provided to enable fairer comparisons between schools.
Schools also provide a short description to be displayed in their profile. This statement outlines the school’s “context and character”.
NAPLAN data could be compared across schools with statistically similar student bodies until 2020. After 2020, ‘My School’ shifted its focus to reporting on “student progress”, comparing a school’s average NAPLAN performance with that of students from similar socio-educational backgrounds two years earlier.
As of writing, ‘My School’ no longer offers comparisons between schools.
Issues, concerns and criticisms of ‘My School’
In 2014, a review of ‘My School’ was presented to the Australian Government Department of Education. It outlined some main issues including:
- The inherent complexities of presenting NAPLAN data
- Comparisons between schools
- Interpreting data and information
The ‘My School’ website also ran into usability issues. Evidence suggests that the function and use of the website are not well understood by parents or educators. Difficulties distilling a complicated dataset into usable information for the public meant the website was underutilised. These concerns led to the overhaul of the website in 2020.
Another concern was the use of the Index of Community Socio-Educational Advantage (ICSEA) to determine “statistically similar” schools. The report acknowledged that although work was put in to make the measure more robust, it was not designed to capture key, qualitative data. Similar concerns may arise when using New Zealand-developed measures of socio-educational advantage.
Although ‘My School’ discourages the creation of ‘league tables’, many media outlets used ‘My School’ in this way, leading to negative impacts on school reputations, which in turn created difficulties in attracting and retaining staff and students. Additionally, these media-generated “league tables” could be misleading and lack important context.
Due to these practices, some schools reportedly asked some students to stay home during NAPLAN testing, for fear the school would be disadvantaged by poor results.
Australians have also raised concerns about equating NAPLAN scores and ‘My School’ data with school quality. Stewart Riddle, Associate Professor at the School of Education at the University of Southern Queensland, noted that:
“My School does not give direct information about school culture, community connections and values, which are all important considerations when thinking about what makes for a “good” school.
“In short, parents should not read too much into NAPLAN results and My School information.”
The problem with NAPLAN
One of the biggest concerns was the use of NAPLAN as the primary measure of school performance; similar concerns may apply to the use of e-asTTle reporting in the New Zealand context. In Australia, NAPLAN had several unintended consequences, including:
- Teaching and learning becoming focused on NAPLAN rather than other areas
- Students’ wellbeing being negatively impacted, with stress reportedly causing lower performance on NAPLAN
- Lowered educator morale
- A time lag between NAPLAN administration and reporting, which limited the usefulness of the test as a diagnostic tool
As reported in the 2020 review of NAPLAN, standardised literacy tests proved problematic in several respects. Student writing became “formulaic”, as did teaching and learning in schools.
Similar to ACT’s rhetoric, standardised testing in Australia was meant to raise national achievement levels. But there is evidence that NAPLAN has been counterproductive, as outlined in the 2020 review.
Teachers also reported that NAPLAN and ‘My School’ created performance pressures which further increased workload, ill-health and workforce shortages.
Australia’s young people are also facing some of the lowest levels of wellbeing and mental health in the OECD, some of which has been attributed to the standardised testing system.
In 2020, during the height of the pandemic, NAPLAN was cancelled by the Australian Federal Government in an acknowledgement of its detrimental effects on wellbeing.
The New Zealand context
Both National and ACT have pledged they will introduce standardised assessments at least twice a year, with results being reported to parents.
Although National has not specified what its standardised test will look like, ACT’s policy specifically mentions e-asTTle tests, whose results would be reported on its proposed platform. This would position e-asTTle in New Zealand similarly to NAPLAN in Australia. Currently, e-asTTle is not compulsory, but many schools use it. On Te Kete Ipurangi, e-asTTle is described as primarily for the assessment of students in years 5–10. It states: “e-asTTle provides teachers and school leaders with information that can be used to inform learning programmes and to apply teaching practice that maximises individual student learning.
“Schools using e-asTTle have found it to be a great tool for planning purposes, for helping students to understand their progress, and for involving parents in discussions about how well their children are doing.”
Teachers using e-asTTle can already compare their classes’ results to national datasets.
However, e-asTTle and asTTle tests and results are not without their own issues. Similar to NAPLAN, asTTle writing tests have been associated with a limited scope and a narrowing of how reading and writing are taught and learned.
Other research, however, has found that e-asTTle assessment practices enabled teachers to shift toward more effective pedagogy. A master’s thesis by Susan Carnegie-Harding found that utilising the e-asTTle tool accelerated student achievement and progress in reading and mathematics. But Carnegie-Harding also found that use of the e-asTTle tests alone was not enough to close educational disparities based on ethnicity or socio-economic status.
Finally, teachers report that e-asTTle, like NAPLAN in Australia, creates stress and pressure on students that may cause them to underperform. Teachers note that standardised testing offers only “snapshots” of student achievement and may not accurately reflect a student’s skill or ability.