In Australia, tools commonly used to examine a student’s level of achievement in mathematics include the National Assessment Program – Literacy and Numeracy (NAPLAN)43; the Progressive Achievement Test in Mathematics (PAT-M)44; and the On-Demand Testing Program45. All three employ a specific type of Item Response Theory called the Rasch Model46. In this approach, a specially constructed test is administered to a student, and the number of correct responses is counted. This provides a measurement of the student’s ‘latent trait’, which can be thought of as their overall position on a continuum, along with a margin of statistical error. The statistical model allows this to be done using a relatively short test (say, 40 questions)47.
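Under the Rasch model, the probability of a correct response depends only on the gap between the student’s latent trait and the item’s difficulty, which is what allows a short test to locate a student on the continuum. A minimal sketch of that relationship (the function name and values are illustrative, not taken from any of the programs above):

```python
import math

def rasch_probability(theta, difficulty):
    """Rasch model: probability that a student with latent trait
    `theta` answers an item of the given `difficulty` correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

# A student sitting exactly at an item's difficulty has a 50% chance
# of answering it correctly; the chance rises as ability exceeds
# difficulty and falls as difficulty exceeds ability.
```

The expected number of correct responses on a fixed item set is the sum of these probabilities, so the observed count (plus a margin of error) can be inverted to estimate the trait.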
The measurement approach used by Maths Pathway is fundamentally different, because it is constructed for a very different purpose: to target teaching to the point of need. Knowing a student’s overall level of achievement (with or without statistical error) is not enough to achieve this aim, because two students sitting at the same overall level can have very different learning needs. Instead, Maths Pathway gathers data directly on each distinct learning objective across the entire curriculum, spanning levels 1 through to 10A, without relying on statistical inference.
This data makes it possible to determine the set of learning objectives that sit within the individual student’s zone of proximal development (ZPD), but it necessitates a much longer test (typically hundreds of questions). In practice, students complete this diagnostic assessment in multiple sessions spread across a semester, and adaptivity based on non-statistical inference keeps the test to a manageable length. This adaptivity uses the known, clear relationships between connected learning objectives; for example, a student who demonstrates multi-digit addition need not also be tested on single-digit addition.
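The kind of non-statistical inference described above can be sketched with a small prerequisite graph. This is a hypothetical illustration (the objective names and data structure are ours, not Maths Pathway’s actual implementation): demonstrating a harder objective lets its prerequisites be marked achieved without asking, and missing a prerequisite lets its dependents be skipped.

```python
# Hypothetical prerequisite graph: objective -> objectives it
# directly depends on. Real curricula would have many more links.
PREREQUISITES = {
    "multi_digit_addition": ["single_digit_addition"],
    "long_multiplication": ["multi_digit_addition"],
}

def infer_achieved(objective, achieved):
    """Mark `objective` and all of its transitive prerequisites as
    achieved, without asking any further questions."""
    achieved.add(objective)
    for prereq in PREREQUISITES.get(objective, []):
        infer_achieved(prereq, achieved)
    return achieved

def dependents_to_skip(objective):
    """Objectives that can be skipped once `objective` is known to be
    not yet achieved (everything that depends on it)."""
    skip = set()
    for obj, prereqs in PREREQUISITES.items():
        if objective in prereqs:
            skip.add(obj)
            skip |= dependents_to_skip(obj)
    return skip
```

Each inference of this kind removes questions from the diagnostic without introducing any statistical assumption.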
The result is a detailed learning profile for each student. That profile consists of a vast number of learning objectives, arranged into distinct levels following the structure of the Australian Curriculum: Mathematics. For each student and each learning objective, there is direct evidence showing either that the student has achieved the objective or that they have not yet done so. This learning profile is updated each learning cycle (typically 16 times per year) to incorporate new direct evidence of what the student has achieved.
It is worth noting that this approach amounts to ‘criterion-referenced’ measurement, which has been suggested as a more suitable approach than the Rasch Model for educational measurement because it does not rely on an assumption of ‘unidimensionality of ability’48. This assumption is that a single underlying trait determines a student’s responses to all assessment items, rather than, for example, one trait governing responses to algebraic items and another governing responses to geometric items.
Despite having no unidimensional latent trait, Maths Pathway learning profile data can still be abstracted to a single ‘overall level number’, using ACARA’s arrangement of learning objectives into levels in the Australian Curriculum: Mathematics.
This overall level of achievement is derived by summing the proportion of learning objectives achieved at each level. This accounts both for gaps from lower levels and for competencies from higher levels, and it means that the overall level of achievement increases whenever the student learns something new.
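The derivation above can be sketched in a few lines (the profile layout is an assumption for illustration; the report does not specify the underlying data model):

```python
def overall_level(profile):
    """Overall level of achievement: the sum, across curriculum
    levels, of the proportion of learning objectives achieved.

    `profile` maps level number -> (objectives_achieved, objectives_total).
    """
    return sum(achieved / total for achieved, total in profile.values())

# A student with everything up to level 4, half of level 5 and a
# quarter of level 6 sits at 4 + 0.5 + 0.25 = 4.75 overall.
```

Because every newly achieved objective raises one of the proportions, the overall number can only move upwards as the student learns.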
An explanation of the data analysis within this report
In 2019, there were 67,210 students using Maths Pathway. Not all of those students, however, have been included in every mean or distribution calculation. In general, only students with sufficiently ‘complete’ data across the whole year are used. This equates to 57,478 students (85% of the total population of Maths Pathway students). The following section breaks down the analysis performed on Maths Pathway data in order to calculate the key findings within this report.
Our data comes from 300 schools representing 3,774 teachers and 67,210 students. The learnings from this data are fed back into our model so we can better serve every teacher and student.
The diagnostic assessment that every student undertakes when they begin using Maths Pathway establishes their achievement level in line with the Australian Curriculum: Mathematics. Year 7 students in 2019 who had sufficiently complete data were used in the measurement of this metric, equating to a total of 23,439 students. For the analysis of students from low-ICSEA backgrounds, 63,378 students were measured; a small number of schools do not have ICSEA data and are therefore excluded.
Measuring growth rates
Student growth rate refers to academic progress made over a defined period: the amount of new mathematics that a student learns in that period, expressed as a proportion of the average amount a student would need to learn each year between Year 1 and Year 10 in order to have mastered all of the mathematics content and skills up to and including the end of level 10 by the end of Year 10. Prior growth rate is measured by taking a student’s diagnosed level when they began using Maths Pathway and dividing it by the number of years they have attended school. Note that, in the interests of estimating the impact conservatively, this is a best-case estimate of prior growth, because it assumes that students learn all of their mathematics at school and none before starting school. In reality, prior growth rates are probably lower.
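Since mastering levels 1 through 10 over ten years corresponds to one level per year, both measures reduce to simple ratios. A sketch under that reading (the function names are ours):

```python
def prior_growth_rate(diagnosed_level, years_at_school):
    """Best-case estimate of prior growth: assumes the student's
    entire diagnosed achievement was learned at school, and none of
    it before starting school."""
    return diagnosed_level / years_at_school

def growth_rate(start_level, end_level, years_elapsed):
    """Growth over a defined period, as a proportion of the
    one-level-per-year pace needed to master levels 1-10 by the end
    of Year 10."""
    return (end_level - start_level) / years_elapsed

# A Year 7 student diagnosed at level 4.2 after 6 years of schooling
# has a best-case prior growth rate of 0.7 levels per year.
```

Any mathematics actually learned before school would inflate `diagnosed_level` without adding to `years_at_school`, which is why this estimate can only overstate prior growth.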
43. Australian Curriculum, Assessment and Reporting Authority 2018, Mathematics, <https://www.australiancurriculum.edu.au/f10-curriculum/mathematics/>
44. Australian Council for Educational Research 2018, Progressive Achievement Tests in Mathematics, <https://www.acer.org/pat/tests/mathematics>
45. Victorian Curriculum and Assessment Authority 2018, On demand testing, <http://www.vcaa.vic.edu.au/Pages/prep10/ondemand/index.aspx>
46. Rasch, G 1980, Probabilistic models for some intelligence and attainment tests, University of Chicago Press, Chicago
47. Goldstein, H 1979, ‘Consequences of Using the Rasch Model for Educational Assessment’, British Educational Research Journal, vol. 5, no. 2, pp. 211–220
48. Goldstein, H 1979, ‘Consequences of Using the Rasch Model for Educational Assessment’, British Educational Research Journal, vol. 5, no. 2, pp. 211–220