Curriculum and the Future of Assessment - The Architecture of Curriculum Alignment (Part 5)

Over this series we’ve built an argument step by step:

  • Different curriculum models (content, product, process) need different assessment logics.
  • Curriculum for Wales (CfW), as illuminated by Camau i’r Dyfodol, is best understood as a process curriculum.
  • Adaptive Comparative Judgement (ACJ), via RM Compare, is an agnostic engine: it can serve any of these models depending on how it’s configured.
  • When curriculum and assessment sit in different “rows” of the matrix, misalignment and assessment drag undermine even the best‑intentioned reforms – as Scotland’s Curriculum for Excellence (CfE) vividly shows.

In this final post, we bring those threads together in the Welsh context.

What would it take for Wales to truly align CfW and its assessment architecture – and to escape the gravitational pull of English‑style assessment?

Wales is not just “England with tweaks”

It’s important to name something that often sits between the lines. For Wales (and Scotland), curriculum reform is not just a technocratic exercise; it’s part of a broader national project. CfW is:

  • purpose‑led and progression‑based;
  • designed to reflect Welsh language, culture and civic identity;
  • explicitly intended to move away from a narrow, performance‑driven, exam‑centred model of schooling.

That already sets Wales apart from England’s more tightly specified, product‑driven approach. But public debate doesn’t always reflect this. Welsh performance is routinely compared to England’s using the same metrics – particularly PISA scores and high‑stakes exam outcomes – as if the two systems were pursuing identical goals through identical means.

There are two problems with that:

  • PISA is narrow and lagged: it samples 15‑year‑olds in three domains and tells us little about the wider purposes of CfW, or about progression over the full 3–16 span.
  • CfW is still bedding in: recent PISA cohorts were educated largely under pre‑CfW arrangements or in the very early stages of the new curriculum.

In other words, when critics point and say “Wales is failing because its PISA scores are lower”, they’re effectively judging a process‑led, nation‑building curriculum by product‑style, cross‑sectional measures that don’t reflect its full intent. It’s another case of apples and oranges.

The more constructive question is: Is Wales building an assessment architecture that matches the curriculum it has chosen?

What Camau tells us about the Welsh starting point

Camau i’r Dyfodol gives us a clear answer to the question that logically comes first – which curriculum model does CfW embody? It aligns most fully with a process model. When teachers worked with that model explicitly – through design teams, complementary expertise and Stenhouse’s three‑part structure of aims, processes and assessment – they reported:

  • more responsive, inclusive and engaging teaching;
  • deeper learning at a slower, more thoughtful pace;
  • greater professional satisfaction and a stronger sense of purpose.

But Camau also surfaced the risks:

  • CfW is a low‑definition curriculum by design; without explicit theory, it can feel like “just the bones”.
  • Many teachers feel ill‑equipped for local curriculum making and naturally revert to familiar product‑style planning and assessment when under pressure.
  • System messages (accountability, reporting, external scrutiny) don’t always fully reflect the process model, sending ambiguous signals about what really matters.

In that context, assessment becomes the decisive lever. If Wales now rebuilds its assessment architecture on a product logic, CfW will be bent slowly back into the shape of the old system. If it builds assessment on a process logic, aligned with CfW’s principles of progression, the curriculum has a chance to become what it was intended to be.

The role of holistic assessment: flywheel or new wall?

This is where RM Compare enters the picture. As we saw in Part 3, ACJ can behave very differently depending on the job it is asked to do:

  • It can act as a high‑precision marking engine in a product system, turning professional judgements into stable ranks and grades for high‑stakes decisions.
  • It can act as a process‑aligned sense‑making tool in a curriculum like CfW, helping teachers see progression in rich artefacts of learning and co‑construct shared exemplars and developmental descriptions.

Technically, it’s the same platform. Strategically, these are different worlds.
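For readers curious about the mechanic underneath both uses: ACJ aggregates many small pairwise “which is better?” judgements into a single rank order. The sketch below illustrates that idea with a simple Bradley–Terry fit in Python. It is an assumption-laden toy, not RM Compare’s actual algorithm – the item names, the iterative update and the stopping rule are all choices made for the sketch.

```python
def bradley_terry(items, judgements, iterations=200):
    """Illustrative Bradley-Terry fit: estimate a quality score per item
    from pairwise judgements, then return items ranked best-first.

    judgements: list of (winner, loser) pairs from holistic comparisons.
    """
    # Tally wins per item and how often each unordered pair was compared.
    wins = {i: 0 for i in items}
    pairs = {}
    for winner, loser in judgements:
        wins[winner] += 1
        key = frozenset((winner, loser))
        pairs[key] = pairs.get(key, 0) + 1

    # Iterative minorisation-maximisation update of item "strengths".
    p = {i: 1.0 for i in items}
    for _ in range(iterations):
        new_p = {}
        for i in items:
            denom = 0.0
            for key, n in pairs.items():
                if i in key:
                    j = next(x for x in key if x != i)
                    denom += n / (p[i] + p[j])
            # Items never compared keep their current strength.
            new_p[i] = wins[i] / denom if denom > 0 else p[i]
        total = sum(new_p.values()) or 1.0
        p = {i: v * len(items) / total for i, v in new_p.items()}

    return sorted(items, key=lambda i: p[i], reverse=True)
```

The point of the sketch is that the engine itself is indifferent to what the judgements mean: the same arithmetic produces a rank whether judges were comparing exam scripts against a narrow rubric or rich artefacts against CfW’s principles of progression. The curriculum logic lives entirely in the task design and the judging criteria, not in the maths.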

For Wales, the key design choices around RM Compare include:

Task design

Are tasks open and rich enough to allow diverse, authentic responses aligned with CfW’s purposes, or tightly constrained to mimic traditional exam items?

Judging criteria

Are judges guided by process‑based descriptors (e.g. CfW’s principles of progression, locally agreed indicators of deeper thinking, creativity, collaboration), or by narrow, product‑style rubrics that reduce everything to a handful of easily scored features?

Uses and stakes

Are ACJ outputs used primarily for formative, professional and system‑learning purposes (exemplars, moderation, developmental profiles), with any high‑stakes use clearly bounded?

Or do they feed directly into high‑stakes grading and accountability, effectively turning RM Compare into a smarter exam machine?

Messaging and support

Are teachers supported and trained to use ACJ as part of curriculum making – a way of seeing and discussing CfW‑aligned quality in real work?

Or is it presented mainly as a way to “mark faster” and “get reliable scores”, reinforcing a product mindset?

If Wales chooses the second option in each pair, RM Compare risks becoming just another wall: a technically impressive but ultimately product‑style assessment infrastructure that reinforces the very logics CfW was designed to move beyond.

If it chooses the first option, RM Compare can become a flywheel: a piece of infrastructure that helps Wales scale the professional judgement and nuanced understanding that a process curriculum requires.

What winning could look like

To “win” this latest battle is not to beat England on its own terms – by chasing higher positions in league tables designed for a different system. It is to:

  • Hold the course on CfW’s process‑based, purpose‑led vision, resisting the temptation to retreat to simpler, more test‑friendly models at the first sign of pressure.
  • Build an assessment architecture that makes progression visible in rich, authentic work and supports teachers as curriculum makers, rather than treating them as deliverers of pre‑packaged test prep.
  • Use tools like RM Compare consciously, aligning their configuration and governance with CfW’s underlying model, rather than letting them default to product‑style uses that are convenient for accountability but corrosive to the curriculum’s integrity.

In practical terms, that might mean:

  • National and regional ACJ projects that focus on building CfW‑aligned exemplars, progression maps and shared professional understanding, rather than just generating scores.
  • System‑level use of ACJ data to learn about patterns of progression and curriculum realisation, not simply to rank schools.
  • Clear communication that CfW will not be judged solely – or primarily – by imported product metrics, but by a broader, more coherent set of indicators that reflect its purposes.

Closing the Integrity Gap

The Integrity Gap opens whenever curriculum, pedagogy and assessment are built on different foundations. In England, there is high integrity – but around a product model. In Scotland, CfE showed what happens when a process‑leaning curriculum hits an unchanged product wall. Wales, through CfW and Camau, has the chance to do something different.

The central message of this series is that tools like RM Compare matter, but only as part of a wider alignment story. ACJ will not make CfW process‑aligned by itself; it will simply amplify whatever conception of quality and progression the system asks it to embody.

If Wales can:

  • make its process model explicit,
  • align assessment purposes and practices with that model, and
  • deploy RM Compare as a genuinely process‑supporting infrastructure,

then it can do more than defend itself against cross‑border criticism. It can offer a working example of what a coherent, purpose‑led, progression‑based national curriculum looks like when the architecture of alignment is built from the ground up.

That would be a win worth having.