
Comparison of aligning and scaling of subjects

Sarah182

Herpes Member
Hey,

Recently I found myself on the threads where BOS revealed the raw marks in 2003 (here) and 2004 (here).

Firstly, in 2003 Software Design and Development had a scaled mean of 25.6 and Physics had a scaled mean of 30.3, yet Ragerunner's results indicate that they align similarly. The same occurs with Mathematics and English Advanced: he received a LOWER raw mark for Mathematics, yet it aligns to the same mark as his English Advanced mark, even though the 2003 scaling report suggests that English Advanced scales much better than Mathematics.

Another thing I observed in 2004 was that History Extension indicated poor aligning: a mark of 42/50 translated to only 43/50, yet it is an extension subject and its scaling is fairly good, with a scaled mean of about 34-35. I just found it odd.

So could someone explain to me what this means? That subjects with lower scaling can actually align better than subjects with higher scaling? That all subjects actually align in a similar way? Because we have insufficient data couldn't it be true that a subject like Business Studies might actually align better than, say, Maths Extension 1 (it is an extreme but I'm sure you understand the point I'm trying to make)?

These are just a few of the questions running through my head.
It would be rather interesting if the Board ever releases the raw band cut offs one year so we could see how all subjects actually align.

Thanks guys :)
 

dp624

Active Member
Aligning is not done relative to the strength of a candidature. Rather, it's done against a set standard which key examiners agree on. This can obviously differ from the strength of a cohort.
Aligning also changes with the difficulty of an exam; it is independent of the cohort sitting it.
Scaling is independent of the difficulty of an exam, but depends on the subject's candidature instead.
 

Lazarus

Retired
Mmm... some interesting questions.

In order to be able to draw some meaningful conclusions, you need to keep in mind the different purposes of aligning and scaling.

Aligning marks can be done for any course and is primarily done to report achievement in terms of the course standards and achieves the following:

(a) It doesn't matter whether the exam is too hard or too easy - if everyone scores very high raw marks or very low raw marks, the effect is undone;

(b) The mark reported indicates where a student's performance fits on the performance bands - a mark of 88 means someone has demonstrated most of the competencies in the description for band 5 for that course but has not demonstrated the minimum level of competence required for the band 6 standard;

(c) Aligned marks for a course in one year can be compared to aligned marks in that same course in previous years, because the standards are the same - but not to aligned marks in other courses, because the standards are different.
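As a rough illustration of points (a)-(c), aligning can be pictured as piecewise-linear interpolation between the raw band cut-offs (which the judges set each year, and which move when the exam is harder or easier) and the fixed aligned band boundaries. This is only a sketch of the idea, not the Board's actual procedure, and the raw cut-off values below are invented:

```python
# Toy sketch of aligning: map a raw exam mark onto the fixed performance
# bands by interpolating between that year's raw band cut-offs.
# The RAW_CUTOFFS values are hypothetical; only the ALIGNED_ANCHORS
# (band 2 starts at 50, ..., band 6 at 90) are fixed from year to year.

RAW_CUTOFFS = [0, 25, 38, 52, 66, 81, 100]      # invented raw marks at band starts
ALIGNED_ANCHORS = [0, 50, 60, 70, 80, 90, 100]  # fixed aligned band boundaries

def align(raw):
    """Map a raw mark to an aligned mark via piecewise-linear interpolation."""
    for i in range(len(RAW_CUTOFFS) - 1):
        lo, hi = RAW_CUTOFFS[i], RAW_CUTOFFS[i + 1]
        if raw <= hi:
            frac = (raw - lo) / (hi - lo)          # position within this band
            a_lo, a_hi = ALIGNED_ANCHORS[i], ALIGNED_ANCHORS[i + 1]
            return round(a_lo + frac * (a_hi - a_lo), 1)
    return 100.0
```

On this picture, a hard exam simply means lower raw cut-offs: the same aligned mark of 90 would then correspond to a lower raw mark, which is exactly the "undoing" described in (a).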

Scaling marks must be done for all courses and is primarily done to rank students for tertiary admission and achieves the following:

(a) It allows for a ranking for every student to be determined in a way that is both objective and rational - research shows that this rank is the best single indicator of a student's success in the first year of university.

(b) Because the purpose is to rank students, the scaling process only looks at where students have ranked in the state and the gaps between students in each course - so again it doesn't matter whether the exams are too hard or too easy, because the raw marks just act as 'placeholders' which show where students are positioned in a course relative to the state.

(c) Remembering that the focus is on ranking students objectively, you'll notice that it's easier to get a high rank in a course with 'less able' students and harder to get a high rank in a course with 'more able' students - because you're competing against students with different levels of ability. So courses where you have to be really competitive to get a high rank are scaled upwards, and vice versa.

(d) Once calculated, scaled marks in one course can be compared and added to scaled marks in every other course - they are still 'placeholders' for the positions of students, just like the raw marks were, but they've been corrected to take into account the competition faced by those students. So all marks for all courses are on the same scale. The aggregation of these scaled marks can tell you a student's overall rank, but doesn't tell you anything else.
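The idea in (b)-(d) can be caricatured in a few lines. This is emphatically not UAC's actual algorithm (which is iterative and works on ranks rather than means); it is only a toy showing the direction of the adjustment: a course's marks act as placeholders for positions, and they are shifted according to the overall strength of the students taking that course:

```python
# Toy sketch of scaling: shift a course's marks so their mean matches the
# average overall strength of its candidature, preserving rank order.
# 'Overall strength' is stood in for here by each student's mean mark
# across all their courses - a crude proxy, for illustration only.

def scale(course_raw_marks, cohort_overall_means):
    """course_raw_marks[i] and cohort_overall_means[i] belong to student i."""
    n = len(course_raw_marks)
    target_mean = sum(cohort_overall_means) / n  # strength of the candidature
    raw_mean = sum(course_raw_marks) / n
    shift = target_mean - raw_mean               # positive for a strong cohort
    return [m + shift for m in course_raw_marks]
```

Note that the shift depends only on who is in the course, not on how hard its exam was; a uniform shift leaves every student's rank within the course untouched, which is the "placeholder" property described in (b) and (d).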

Sarah182 said:
So could someone explain to me what this means? That subjects with lower scaling can actually align better than subjects with higher scaling? That all subjects actually align in a similar way? Because we have insufficient data couldn't it be true that a subject like Business Studies might actually align better than, say, Maths Extension 1 (it is an extreme but I'm sure you understand the point I'm trying to make)?
You can't really ask whether Business Studies aligns "better" than Mathematics Extension 1 - what is "better"? It is the same as asking: does a raw mark in Business Studies put you higher on the Business Studies standards than a raw mark in Mathematics Extension 1 puts you on the Mathematics Extension 1 standards?

The aligned marks only mean what is written in the performance band descriptors for that course.

The raw band cut-offs for Business Studies could very well be lower than those for Extension 1 - for example, if the exam was very hard, everyone would receive lower raw marks, and the cut-offs would have to be lower than normal in order to make sure the marks are aligned to the right standards.

But that doesn't mean anything either, except that the exam was very hard in the context of the Business Studies standards - you can't even say whether it was harder than the Extension 1 exam.

Cut-offs will be low when the exams are very hard or the standards are very high, and vice versa.

Scaling will be negative for a course when most of the students taking the course tend to be ranked at the lower end in all of their courses (making it easier for an average student to obtain a high rank).

Sarah182 said:
Firstly in 2003 Software design and development had a scaled mean of 25.6 and Physics had a scaled mean of 30.3 yet Ragerunner's results indicate that they align similarly.
To take this example - the scaled mean of SDD, which is close to the average of 25, shows that most of the students taking that course tended to be ranked in the middle of the state for all their courses, so the competition was about average and very little adjustment needed to be made.

The scaled mean of physics was higher and shows that the competition was higher than in SDD - most of the students in physics tended to be ranked relatively higher in all of their courses than the students in SDD.

The fact that the two courses aligned similarly simply means that the difference between the scale on which achievement was described in the SDD standards and the scale on which achievement was measured by the SDD exam was similar to the difference between the scale on which achievement was described in the physics standards and the scale on which achievement was measured by the physics exam.

You can't draw any conclusions from the fact that the aligning and the scaling for the two courses were not the same - it's only pure chance if it ever is the same (for example, in 2008 with Mathematics Extension 1, see here).

Sarah182 said:
These are just a few of the questions running through my head.
It would be rather interesting if the Board ever releases the raw band cut offs one year so we could see how all subjects actually align.
It would be interesting. It would also make the process transparent, allow for greater public confidence and enhance student understanding.
 

Sarah182

Herpes Member
Hey guys, thanks so much for the replies; they gave me a much better understanding of what all of it means.

So aligning is a measure of how well the student fulfils the course outcomes, whereas scaling is a ranking of their place relative to other students in the state?

All this reading and discussion of raw marks has made me much more aware of how important it would be if we weren't left in the dark about raw marks.
I hope for the year when the BOS reveals the raw band cut-offs. Imagine what a breakthrough it would be for teachers and students alike.
I have noticed that even most teachers don't understand the aligning and scaling process.
 

cem

Premium Member
Sarah182 said:
Hey guys, thanks so much for the replies; they gave me a much better understanding of what all of it means.

So aligning is a measure of how well the student fulfils the course outcomes, whereas scaling is a ranking of their place relative to other students in the state?

All this reading and discussion of raw marks has made me much more aware of how important it would be if we weren't left in the dark about raw marks.
I hope for the year when the BOS reveals the raw band cut-offs. Imagine what a breakthrough it would be for teachers and students alike.
I have noticed that even most teachers don't understand the aligning and scaling process.

That won't happen, as the BOS don't want the public to realise that people with raw marks in the teens are being reported as over 50, or as passing their courses.

They also won't do it in case it reveals that they are actually lowering standards in order to say that standards are rising. For example, the first time they did this about 1% of students in Modern History got Band 6, whereas this year it was closer to 10%. Does that mean this year's group are brighter than students 7 or 8 years ago, or is the BOS pulling the wool over our eyes by lowering the cut-offs to let more people into the bands? The judges make recommendations to the BOS, but they are never told if their recommendation is actually the one used. So the judges might recommend a cut-off of 86.3, yet the BOS might lower it to 77.3 to have more students in Band 6 and not tell anyone.
 
