When Curriculum and Assessment Clash: Lessons from Scotland - The Architecture of Curriculum Alignment (Part 4)
So far in this series we’ve:
- mapped three curriculum models (content, product, process) and their aligned assessments;
- seen how Camau i’r Dyfodol helped Wales understand Curriculum for Wales (CfW) as a process curriculum;
- and explored how Adaptive Comparative Judgement (ACJ), via RM Compare, is a model‑agnostic engine that can serve any of these models depending on how it’s configured.
In this post, we look at what happens when those pieces don’t line up. The central claim is stark:
No assessment method – ACJ included – can “save” a reform if it is working to a different model from the curriculum.
To see this in action, we turn north, to Scotland.
Scotland’s Curriculum for Excellence: a vision stalled at the wall
Curriculum for Excellence (CfE) set out a broad, ambitious vision: a curriculum focused on capacities, skills and values, not just content coverage and exam performance. On paper, it leaned clearly towards a process model:
- holistic aims for “successful learners”, “confident individuals”, “responsible citizens” and “effective contributors”;
- an emphasis on rich experiences, interdisciplinary learning and teacher agency;
- a strong narrative about teachers as curriculum makers.
However, in the Senior Phase, the qualifications and assessment system remained firmly in the product column:
- traditional high‑stakes exams continued to dominate;
- progression and accountability were framed largely in terms of exam passes and grades;
- public, media and political attention focused on performance in those qualifications.
In other words, CfE’s curriculum and its assessment spine were written in different languages. Teachers were asked to design and deliver a process‑leaning curriculum, but were judged – and their learners’ futures decided – using product‑style exams.
The predictable happened:
- In younger years, some schools experimented with more process‑aligned practice.
- As learners approached exam years, teaching narrowed and reverted to familiar exam‑preparation behaviours.
- The broader capacities of CfE were squeezed into the margins, where they didn’t threaten the timetable or the grade profile.
Reviews of CfE have since identified this curriculum–assessment misalignment as a central reason the reform did not fully realise its promise. The vision was not inherently flawed; it was undermined by an assessment infrastructure that never stopped behaving like the old system.
The general pattern: what misalignment does
Scotland’s story is specific, but the pattern is general. Whenever you have:
- a process or broader competency‑based curriculum in principle, and
- a product‑style assessment and accountability regime in practice,
three things tend to happen:
- Assessment drag. High‑stakes assessments act like gravity: under pressure, schools orient teaching towards what “counts” in those measures, even if it contradicts the curriculum narrative.
- Mixed signals and professional cynicism. Teachers hear one message in curriculum documents (“be creative, local, holistic”) and another in the data conversations (“how many met the standard?”, “what are your target grades?”). Over time, they learn which message is really non‑negotiable.
- Reversion to habit. When theory is weakly shared and system support is misaligned, most people revert to what they know – familiar product‑style planning and assessment – especially in high‑stakes phases.
The technology used for scoring doesn’t change this dynamic. If the underlying uses and consequences of assessment are product‑style, those logics will dominate, no matter how sophisticated the tools.
Where ACJ fits in this picture
ACJ, as implemented in RM Compare, is sometimes treated as if it were inherently “progressive” or inherently “more valid”. But as we saw in Part 3, it is really a flexible mechanism for aggregating professional judgement. That flexibility means it can be mis‑used just as easily as it can be used well.
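To make the “flexible mechanism” point concrete, the statistical core of ACJ‑style ranking can be sketched in a few lines. This is a minimal illustration using the Bradley–Terry model, a standard way of turning pairwise “which is better?” judgements into a rank order; it is not RM Compare’s actual implementation, and the judging data below is invented for the example.

```python
def bradley_terry(items, wins, iterations=100):
    """Estimate a quality score for each item from pairwise wins.

    wins[(a, b)] is how often judges preferred a over b. This is the
    minorisation-maximisation update for the Bradley-Terry model - the
    statistical idea underlying ACJ-style rank orders.
    """
    scores = {i: 1.0 for i in items}
    for _ in range(iterations):
        new = {}
        for i in items:
            total_wins = sum(wins.get((i, j), 0) for j in items if j != i)
            denom = sum(
                (wins.get((i, j), 0) + wins.get((j, i), 0))
                / (scores[i] + scores[j])
                for j in items if j != i
            )
            new[i] = total_wins / denom if denom else scores[i]
        # Normalise so scores stay on a comparable scale between iterations.
        total = sum(new.values())
        scores = {i: s * len(items) / total for i, s in new.items()}
    return scores

# Hypothetical judging data: three pieces of pupil work, A, B and C.
wins = {("A", "B"): 4, ("B", "A"): 1,
        ("B", "C"): 3, ("C", "B"): 2,
        ("A", "C"): 5, ("C", "A"): 0}
scores = bradley_terry(["A", "B", "C"], wins)
ranked = sorted(scores, key=scores.get, reverse=True)
```

Note that the mechanism itself is silent about purpose: the same `ranked` list can feed a high‑stakes grade boundary or a low‑stakes moderation conversation. That choice of use, not the mathematics, is what the three scenarios below turn on.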
Consider three scenarios:
1. Process curriculum, product use of ACJ
- The curriculum (like CfW) emphasises rich experiences, progression and teacher agency.
- ACJ is used mainly to generate high‑stakes rank orders and grades against narrow criteria.
- Results feed directly into accountability, league‑table‑like comparisons or high‑consequence decisions.
- In this case, ACJ behaves like a very efficient exam machine. It sharpens the product signal inside a process curriculum. Teachers quickly infer that optimising these scores is what really matters, and assessment drag intensifies.
2. Product curriculum, process‑like use of ACJ
- The curriculum and its stakeholders expect precise, comparable grades against well‑defined standards.
- ACJ is used loosely and exploratorily, against low‑definition criteria, with no agreed descriptors or reporting structure.
- The result is weak signals, inconsistent decisions and low trust in the outcomes – the mirror image of the first scenario.
3. Process curriculum, process‑aligned use of ACJ
- The curriculum defines progression in rich, developmental terms.
- ACJ is used to examine and cluster diverse authentic work against process‑based descriptors, to build shared exemplars and narrative descriptions.
- Outcomes are used primarily for professional learning, curriculum refinement and low‑stakes system insight, with any high‑stakes uses carefully delimited and transparent.
- Here, curriculum and assessment are pulling together. ACJ becomes part of the infrastructure that makes a process curriculum visible and actionable, rather than a subtle force pulling it back towards product logics.
A simple misalignment matrix
| Curriculum model (in policy) | ACJ use in practice | Alignment? | Likely result |
|---|---|---|---|
| Process | Product‑style: mainly high‑stakes ranking/grades | No | Assessment drag; reversion to test‑driven teaching; process aims eroded. |
| Product | Loosely process‑like: exploratory, low‑definition | No | Weak signals; inconsistent decisions; low trust in results. |
| Process | Process‑aligned: progression descriptors, exemplars, rich evidence | Yes | Assessment reinforces curriculum; shared understanding of quality; more coherent practice. |
What this means for system design
The big lesson is that assessment design decisions cannot be made in isolation from curriculum theory. When considering tools like RM Compare, system leaders need to ask:
- Which curriculum model are we actually operating? (Not just what the branding says.)
- What purposes will ACJ serve – grading, moderation, exemplar building, professional learning, system monitoring?
- How will tasks, judging criteria and reporting structures embody our chosen model?
- What safeguards will prevent gradual drift back towards product‑style uses that undermine a process curriculum?
If those questions aren’t asked and answered explicitly, the path of least resistance is almost always towards product‑style use, because that is what most accountability frameworks and inherited cultures expect.
In the final post of this series, we’ll bring the strands together in the Welsh context: OECD’s view of CfW, the politics of PISA and cross‑border comparison, and how a consciously aligned use of ACJ could help Wales “win across Offa’s Dyke” by building an assessment architecture that truly matches its curriculum ambitions.