Let me recall a tale of two Parents’ Evenings. My colleague attended the usual fare at his son’s primary school and learnt that his son was working at level 1a in reception literacy. When he probed into what that really meant, and what he could do as a parent, the teacher was somewhat flummoxed. She struggled to articulate what really mattered because she was so tied up in a mass – or mess – of data and levels.
My experience at my young daughter’s school was different: I found out what she could do, what she could understand and what she needed to do more of – in school and at home. Alongside her competence with letter sounds and number, I found out how confident she was with others and whether she was happy. No levels, no sub-levels, no data targets. It was a wholly different emphasis – free from obsessing about data…and it worked. It is a simple but profound difference.
We should ask: when did we lose our way and get lost down the rabbit hole of sub-levels and worse? When did we lose sight of assessment being for learning?
Schools, like Wroxham Primary School, were already leading the way with their methods of assessment – see here – so we were emboldened and seized the opportunity to get rid of National Curriculum levels over a year ago. We wanted to ditch ‘ego-involving’ levels and unleash the real power of formative feedback (see this Dylan Wiliam video here).
Happily, I can say that at my school we have dumped NC levels and we are not looking back. For the last year at Huntington School we have been implementing our new Key Stage 3 ‘assessment beyond levels‘ model. We devoted a great deal of time and support to building a new curriculum and assessment model concurrently last year. The CPD input was significant, but essential – totalling around twenty hours.
We thought that it was imperative that we overhaul both the curriculum and the assessment model in one go, given the changes in our cohort and their needs, alongside the prospective changes to GCSEs and more. We wanted more challenge and to end the annual rigmarole of last-ditch GCSE interventions.
We began by establishing our overarching principles of KS3 assessment:
- The primary objective of our assessment system is to improve students’ learning;
- A move towards a system of formative assessment
- A move away from simply charting attainment, and towards charting progress relative to a student’s starting point;
- A focus on the Growth Mindset ideals;
- An end of year examination in Years 7, 8 and 9;
- Flexibility to allow for subject level distinctions.
Defining the Butterfly
Initially, we stripped away the superficial labels of levels and decided what our students needed to know, do and understand by the end of their time at Huntington, be it at 16 or 18. We then planned backwards to ensure such knowledge and skills were addressed at Key Stage 3 – with fewer assessments and deeper learning. We aimed to address the ‘big ideas’ in each subject (the threshold concepts).
We then went about creating outcomes and apt assessment at a subject level to ensure that we honed the knowledge and skills that mattered. Crucially, we needed to set the standard of excellence and let that guide our curriculum and assessment design. We were inspired by the work of Ron Berger and ‘Austin’s Butterfly’. Take a look at this video and see how formative, meaningful feedback can supplant the poverty of being assigned merely a level a little higher than the one that came before:
We knew that every subject had to ‘define their butterfly’ – to set the standard for every piece of work – from an essay in English, to an opus in Music. We then started to think about what steps each student would need to take to aspire to the best standard possible, just like Austin made his way to excellence.
Crucially, this process needs to be manageable for teachers, so it was best to group students to compare the progress of students relative to similar peers. We therefore decided to use the following broad groupings (using existing RAISE data): High, Middle and Low starters (of course, these labels were a short-hand to help guide teachers and were not, and will not, be shared with students). This was no cap on progress or achievement – it just gave us a benchmark and workable groupings. They were ‘starters‘, so they could freely move groupings relative to their progress.
We again used the image of ‘Austin’s butterfly’ to rationalise how different students would have different starting points and how progress was always relative to their current starting point:
‘Student A’ may start at stage one of these symbolic butterflies, whereas ‘Student B’ may well begin at stage 4. With this rationale, we looked at how we could chart on-going progress. Now, we could easily fall back down the rabbit hole in the vain pursuit of an exact science. Of course, these judgements are relative and they are ‘best fit judgements‘ – made by the teacher who has the best knowledge of their student – not relying on some abstract national criteria. If we are to beat teachers over the head with data demands and expect ceaseless progress, like some Stalinist state task-master, then we can kiss goodbye to useful assessment.
We created a simple taxonomy for our progress descriptors, influenced by the very useful NAHT Assessment Commission Report. Each of the three groups in the cohort will make progress that is relative to their starting point. There are four descriptors of progress that relate to where we expect students to be at that stage of Key Stage 3 (and which can be reviewed annually):
- Exceeding expected progress
- Meeting expected progress
- Working towards expected progress
- Underperforming against expected progress
We have used this information to support the tracking of students, with teachers having the responsibility to focus on strategies for improvement at a subject level. Rather than a heavy-handed grip on endless ‘interventions’ from school leaders, our ‘interventions’ are focused on good, responsive teaching.
Of course, there have been months of work to help teachers understand the new assessment framework and to gain consistency of reporting. We needed to do a lot of tweaking with the progress descriptors. What the difference is between ‘meeting‘ and ‘working towards‘ in any given subject takes time to define and we will continue to work on that. It is not a simple business, but it means that teachers get to understand what they are assessing (Tim Oates, from Cambridge Assessment, has said that teachers need to become assessment experts – right down to the questions we ask our students – and he is right. We need to commit the time and effort to deepen our expertise).
It is of course a little harder to equate progress across a huge range of subjects. Students may take longer to ‘meet’ expected progress in Music over English, for example, but we accept those subject differences and we have sought to better understand them. Indeed, that is the stuff of real learning – complex differences and nuanced understanding – not the endless, inexorable rise from sub-level to sub-level that has characterised much of our assessment (learning and progress is gloriously messy – like a butterfly – it takes a great deal of ‘catching’).
The focus of our work this year has been a robust programme to ensure consistency ‘within‘ departments and to understand the strengths and flaws of the new assessment system and our new KS3 curriculum. To achieve something like consistency we ensured a programme of subject specific training, work scrutiny and departmental moderation was undertaken. As Tom Sherrington put it:
“At the core of this issue – of students being set work that is too easy or having substandard work accepted – is that teachers lose sight of expected standards. They don’t define what the finished Butterfly should look like for each learner, taking account of their age and prior attainment. The key to real assessment of this kind and the real standard setting that it allows, is to have routine moderation processes.”
Good moderation is – and will continue to be – the thing.
Daisy Christodoulou, Research and Development Manager at Ark, has also written about the limitations of vague and abstract criteria.
I think Sherrington and Christodoulou combine here to hammer the nail of effective assessment at KS3 squarely on the head. Teachers, including myself, have been mired in the maze of APP and level criteria and we were disempowered from exercising our own best judgement. I know, with an accuracy that supersedes a sub-level, whether one of my students is progressing as they should, or whether their work meets what I expect of them. I can make that judgement and compare similar students in my school. That is the liberating power of going beyond levels – we can exercise our best judgement and then focus on giving students formative feedback on how to improve.
Key Stage 3 Examinations
We report on progress once per term to parents. Our only reporting of attainment to parents is that of an end of year exam. Now, I know the mere mention of exams can spark dystopian visions of a factory model, but, done well, they can support our students to learn better. Teachers at Huntington were unanimous: in a world of terminal GCSE examinations, students needed to learn how to revise and how best to exploit their memory, and this was actually the essential stuff of learning. The ‘testing effect‘ would also help in reducing stress and anxiety in the future.
Subject by subject we designed an appropriate end of year assessment. Some adapted previous materials, others grabbed useful pilot material from exam boards, while some started from scratch. As Tim Oates playfully put it – we became “assessment kleptomaniacs”. In all honesty, this has proved the most challenging aspect of our new curriculum and assessment model and more work will no doubt be required.
We took the opportunity to extrapolate grades using the new 1 to 9 system in some subjects, like English and maths, whereas others report with percentages, such as PE and Food Technology (we are reporting to parents with a clear explanation of what we are doing). Though clearly these attainment grades are estimates, they give a flavour of the future attainment of our students. We will develop them iteratively in future and we will continually communicate to parents our developments.
In truth, the actual attainment isn’t the essential element (it is just a guide-post) – it is the act of learning to learn in a different style and to prime their memory to do it well.
This is a messy period for schools and parents. Year 9 students about to embark on their GCSEs next year will gain numbers and letters on their examination certificates. We shouldn’t hamstring our new KS3 curriculum with an obsessive focus on GCSE skills and gradings. It is seductive to think that waiting for GCSE descriptors, and pulling them down into Key Stage 3, will make our students better prepared…but it won’t. We are waiting for a false dawn. We are only waiting for some abstract language that doesn’t change how anybody teaches in reality.
We should trust our instincts as subject experts: if students learn what we consider they need to know, do and understand now, then they will flourish in their GCSE examinations, regardless of vague descriptors of skills.
Waiting for ‘answers‘ from new GCSEs will only give us some technical information about shifting some of the content and tweaking the type of assessment we undertake, e.g. practical work, field work etc.; however, the core skills and knowledge our students need at both GCSE and A Level – the subject specific ‘threshold concepts’ – will not change a great deal. We should trust ourselves to design a great KS3 curriculum and the GCSE attainment will take care of itself.
No doubt, we are still in the early days of getting our KS3 curriculum and assessment model right. I believe it will take three years to fully take root and for teachers to fully understand the standards we have set our students. We will create banks of student work in English, portfolios in Food Technology, video footage in Physical Education, audio file collections in Music, and more. We will continue to build that shared language that Daisy Christodoulou describes. Good moderation – teachers working collaboratively to focus deeply on student outcomes – is the key to better assessment for learning.
We have been fooled for too long about the supposed accuracy of NC levels. Between schools there was never really any consistency – moderation was piecemeal. Better to admit that fallacy and instead aim for consistency on a micro-level – allowing schools to track their own progress with some authenticity and to ignore the game of comparing ourselves against other schools. Attainment at KS4 and beyond will always be there to account for the success of our KS3 curriculum anyway.
Like anything worthwhile, it takes effort. Designing new systems will have failings and it will take further development, but it is about quickly learning from those failures and evolving the model to better fit its purpose: helping our students learn better.
I have been surprised by how quickly levels have become forgotten. I was asked by a couple of my Year 8 students back in October about their level of attainment. I explained to them, and the class, our approach to formative assessment and concentrating on simply getting better. I wasn’t asked again – they were busy getting on with the work.
We have received a tiny number of complaints from parents (we introduced the new assessment model last year, at information evenings, to a very positive response). We report termly on effort and progress and, most crucially, give each child a subject specific ‘strategy for improvement’, so parents understand the purpose of the new model. They are getting good qualitative feedback – specific and useful – just like I did at my daughter’s Parents’ Evening.
Sometimes we can get so fearful of deviating from the established norm that we forget how much freedom we can exercise in schools. In losing NC levels, we have in truth lost very little. I think each teacher has, given their investment of time and energy, learnt a great deal about good assessment for learning, and in future we will learn much more about our students than a paltry sub-level could ever tell us.
– I have written about the original rationale for going beyond NC levels here.
– The superb Tim Oates speaks about moving beyond levels and focusing on ‘fewer things in greater depth’ here.