The road to localisation
We have started on the journey to 'localise' RM Compare. What is it and why is it important?
It sounds obvious, but getting Items into RM Compare needs to be a simple, quick and reliable process. We have completed a significant upgrade to the service.
This novel piece of research brought together two innovative products to explore the relationship between content originality and value.
After much promise and some lead time, we are pleased to announce the release of Licence Centre and Connector functionality into the RM Compare system.
On Monday I travelled to London to attend the regional roadshow for the ongoing DfE Curriculum and Assessment review. I was joined by over 150 others from across the sector to learn more about the review and to offer contributions to the ongoing work.
November is always one of the busiest months in the calendar, and this year is no exception. We are attending and participating in a number of events this month. At the same time, we are continuing work on the new connector functionality - more news soon!
As part of its 120th anniversary celebrations, the ISEB worked with the RM Compare Team to launch a groundbreaking creative writing competition. The Board were keen to use Adaptive Comparative Judgement, following a previous project that demonstrated a number of advantages over traditional assessment approaches.
In the rapidly evolving world of educational technology, a groundbreaking study published in Assessment & Evaluation in Higher Education is challenging our approach to digital assessments. The research, titled "From academic integrity to assessment validity: a paradigm shift in digital assessment" by Phillip Dawson and colleagues, argues for a fundamental change in how we view and implement online assessments.
We have a number of 'firsts' in this edition including the announcement of our inaugural symposium, our first venture into podcasts and our shortlisting at the prestigious Learning and Technologies Awards.
Join education assessment expert Victoria Merrick on Tuesday 26 November 2024 at the Amazon UK Head Office for a transformative symposium exploring the challenges and opportunities in principled curriculum and assessment design across all subject areas.
A recent trial conducted by the Australian Securities and Investments Commission (ASIC) found that AI performed significantly worse than humans in summarising complex documents. At RM Compare, we see these findings not as a setback, but as an opportunity to reinforce our commitment to human-centric assessment solutions.
In an era where artificial intelligence is reshaping various industries, the value of human-generated content has become a topic of intense discussion. Reddit's recent valuation and data licensing deals provide fascinating insights into this evolving landscape, offering valuable lessons for educational assessment platforms like RM Compare.
We support a broad range of research projects. Our engagement in each case is context-specific; however, there are some general principles.