Sharing and securing learners' performance standards across schools
Variability of standards across schools - an intractable problem?
Here in England, we are on the eve of the publication of A Level results (and some other Level 3 qualifications). Government policy has reverted to traditional external assessments. A large part of the reason is the loss of confidence during the pandemic, when Centre Assessment Grades (2020) and then Teacher Assessed Grades (2021) were used instead. In both cases, consistency in marking and grading was hindered by the lack of a clear and simple way for teachers to complete internal assessments with confidence that they were aligned to national standards. This was especially true for 'fuzzy' subjects.
The seemingly inevitable return to formal examinations has disappointed many who recognise some of their limitations. However, until the standardisation issue is properly resolved, it is hard to see a way out.
We have known for some time that Adaptive Comparative Judgement (ACJ) can be an alternative approach to assessment that, in some situations, overcomes the limitations of traditional approaches. For reasons outlined in an earlier post, however, it has not been in a position to properly assist with standardising at scale.
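For readers less familiar with the mechanics, the sketch below shows the basic idea behind comparative judgement: many pairwise "which is better?" decisions are combined into a single shared quality scale, here using a simple Bradley-Terry model fitted with the standard iterative update. The portfolio IDs, prior strength and iteration budget are invented for illustration; this is a sketch of the general technique, not RM Compare's actual algorithm.

```python
# Illustrative sketch: turning pairwise judgements into a shared scale
# with a Bradley-Terry model and its standard iterative (MM) update.
# Portfolio IDs and numbers are invented; this is not RM Compare's code.

from collections import defaultdict
from itertools import combinations
import math

# Each tuple records one judgement: (winner, loser). In a cross-school
# exercise the portfolios being compared would come from many schools.
judgements = [
    ("school_A_01", "school_B_07"),
    ("school_B_07", "school_C_03"),
    ("school_A_01", "school_C_03"),
    ("school_C_03", "school_A_02"),
    ("school_B_07", "school_A_02"),
]

items = sorted({portfolio for pair in judgements for portfolio in pair})
wins = defaultdict(float)         # total wins per portfolio
pair_counts = defaultdict(float)  # comparisons per unordered pair

for winner, loser in judgements:
    wins[winner] += 1
    pair_counts[frozenset((winner, loser))] += 1

# A weak uniform prior (a fraction of a "tied" comparison between every pair)
# keeps estimates finite for portfolios that win or lose every judgement.
EPS = 0.1
for a, b in combinations(items, 2):
    pair_counts[frozenset((a, b))] += EPS
    wins[a] += EPS / 2
    wins[b] += EPS / 2

# Iterate the Bradley-Terry update: p_i <- wins_i / sum_j n_ij / (p_i + p_j)
strength = {item: 1.0 for item in items}
for _ in range(200):
    updated = {}
    for i in items:
        denom = sum(
            n / (strength[i] + strength[j])
            for pair, n in pair_counts.items()
            if i in pair
            for j in pair
            if j != i
        )
        updated[i] = wins[i] / denom
    # Rescale so the strengths stay on a comparable footing each iteration.
    total = sum(updated.values())
    strength = {k: v * len(items) / total for k, v in updated.items()}

# Report on a log scale, which is how parameter values are usually presented.
for item, s in sorted(strength.items(), key=lambda kv: -kv[1]):
    print(f"{item}: {math.log(s):+.2f}")
```

The 'adaptive' part of ACJ sits on top of this: the next pair shown to a judge is chosen to be as informative as possible, which reduces the number of judgements needed and is part of what makes pooling work across schools attractive.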
How much simpler it would all be if teachers had – as a matter of normal practice – access to, and familiarity with, work from a national sample of schools, not just their own classroom?
This question has been a key focus for our Team and some of our partners for a while now. A leading light in this research is Richard Kimbell (Emeritus Professor, Goldsmiths, University of London).
Professor Kimbell has been at the heart of ACJ research for many years and is presenting his latest findings at the ACER Research Conference next week. In it, he will explain how the schools standardisation challenge could be unlocked with ACJ, and will share some of the transformational implications for teaching and learning. The presentation will also cover the latest findings from an ongoing research project in Ireland which we are supporting.
You can read the presentation abstract here.
Problem solved?
Er... not quite! We still have a lot to learn here, and we continue to take an iterative approach. As part of our work we often talk about focusing on user problems and 'Jobs to be Done' (JTBD). We do this by relentlessly validating assumptions with our users. This has already informed some early thinking regarding RM Compare on Demand.
As always, please get in touch if you would like to join us. While we do not have all the answers right now, perhaps this problem might not be quite as intractable as we think.