The potential for 'When-ready' assessment with RM Compare On-Demand

The benefits and challenges of taking a 'stage not age' approach to learning have been well explored for many years. When it comes to assessment, we know, for example, that not all 16-year-olds in an exam hall are at the same developmental stage, since some candidates are up to 12 months older than others. The disadvantages for summer-born students in this situation are hardly surprising.

Many of the challenges here, at least until now, have been logistical. Education systems have been built to deliver efficiently at scale, accepting a trade-off in personalisation. Synchronous summative exams are a key part of this system.

We know that the way we choose to assess directly influences our approach to both curriculum and pedagogy. We also know that technological advances mean that this 'one size fits all' approach to assessment is no longer inevitable. The arrival of 'when-ready' systems will have a profound effect.

'When-ready' assessment

The examination we take to qualify for a driving licence is sometimes held up as a good example of a 'when-ready' process. In this case the instructor, through regular interaction with the learner, decides when the student is ready to be examined. This decision reflects the individual stage the learner has reached, not the number of lessons they have had. The student is then booked into a testing slot 'when ready'.

A similar approach can be found in a number of areas online. I have personally sat a number of technical online assessments. In each case I book myself into an available slot when I think I am ready. In most cases these examinations are proctored in an attempt to counter malpractice.

While the two cases above are 'when-ready', they are not completely 'on-demand'. I am still reliant on a testing slot being available when I am ready to take the test.

The on-demand challenge

A truly on-demand assessment is not reliant on booking testing slots. Instead it can be taken anytime, anywhere. In other words, it is genuinely 'when-ready'.

We have of course all taken on-demand online tests at some stage. The underlying 'auto-marking' has improved dramatically in recent years through the addition of AI capability. Even so, we are still a long way from replicating human assessment capability, especially in 'fuzzy' subject areas.

This is the challenge RM Compare On-Demand has set out to meet. How can we build a truly 'human' on-demand assessment system that is not limited to 'non-fuzzy' artifacts?

A truly 'human', on-demand assessment system that is not limited to 'non-fuzzy' artifacts

The RM Compare On-Demand approach will deliver fast, reliable assessment at scale in an entirely new way. In doing so it will unlock transformative approaches to curriculum and pedagogy.

We are working hard and making substantial progress. Get in touch if you would like to know more.