Earlier this week I wrote a brief post explaining how research evidence from the Education Endowment Foundation (EEF) appeared to challenge research undertaken by Debra Myhill and her team at Exeter University on the success of her ‘grammar for writing’ programme. See here.
Now, as an English teacher, ‘grammar for writing’ is a relatively familiar approach to teaching grammar. It explicitly focuses upon teaching grammar in the context of texts and literature, using the tricky meta-language of grammar with students, and it is one approach I use as part of my current teaching and planning. For example, students may study and learn about the impact of modal verbs by reading political speeches or some dystopian fiction. This approach runs counter to an arguably more traditional approach of decontextualising the study of grammar, which may mean grammar drills and a linear model of discrete lessons focusing upon aspects of grammar. Of course, many teachers use a broad mixed economy – which is more akin to my current stance.
Each to their own if it works.
My post, which sparked some controversy, was not about the minutiae of pedagogy, but about how we need to be critical and circumspect in how we use research evidence as teaching professionals. I was accused of saying research is pointless because it provides conflicting evidence, therefore everything is relative and flawed. This accusation is way off the mark. I think research, and teachers engaging in such reflective practice, is empowering and can have a positive impact on our practice, so that we may become better teachers for our students. Simply the act of reading the aforementioned studies had me thinking about my practice more deeply.
I did raise points of caution. We should scrutinise the evidence for the ‘how’, the ‘who’ and the ‘why’. My previous post was a case in point. I needed to heed my own advice and scrutinise the evidence really rigorously. I hadn’t read every detail of both studies – only headlines and sections of the research.
In my first post I observed that the EEF study was independent, whereas the original research involved Myhill and, importantly, the educational publisher Pearson (who obviously had a vested interest), and was therefore less objective. This was seized upon by some. After probing further into the original research, I found that Pearson were not involved in the original study that lauds the success of this teaching method. Instead, the original research was conducted and funded by the Economic and Social Research Council (ESRC) – with Pearson not involved until a later stage. This probing into the ‘who’ of the research is significant, of course.
The findings of this early ESRC study covered 32 classes, with comparison and intervention groups in an RCT. Students in the comparison group improved their writing results by 11%; students in the intervention group improved theirs by 20%. The study applied a mixed-methods approach, with qualitative interviews and lesson observations adding to the evidence.
In stark contrast, the EEF study found no more than a negligible improvement with larger, whole-class teaching. The scale of the research was larger, with a sample of circa sixty schools and students in Year 6. Debra Myhill and her team were also part of the implementation of this study. It found: “Grammar for Writing is not effective in improving general writing among Year 6 pupils when delivered as a whole class intervention over four weeks“, but that the approach was more effective for small-group teaching of lower-ability students.
My post clearly drew comparisons about the efficacy of the ‘grammar for writing’ approach, without knowing every finer detail of both research studies. I have had numerous replies to the post since. One explained that it was unsurprising that the studies threw up different findings, as it was like comparing ‘apples and pears’. The original study by Myhill was a year long and comprised Year 8 groups, whereas the other, published by the EEF, was undertaken in primary schools with Year 6 students over the course of a month. Assuming they are both rigorous studies, the variables are clear and possibly hugely significant: different key stages, different teachers, different lengths of study etc.
An EEF research study on feedback that I undertook in my school found little improvement in the grammar of students’ writing over a short period (the focus wasn’t solely grammar, and students did improve their writing for ‘purpose and audience’). I wasn’t surprised – you don’t make significant progress with such skills in a matter of weeks; they take years of development, through repetition and endless honing! If I spent a year on my intervention I might fairly expect different results.
My first post, highlighting the discrepancy between the two research studies, actually led to me having to do what I advocated: paying close attention to such evidence, with a critical eye, then looking to interpret the information relative to my school circumstances. Perhaps ‘grammar for writing’ works better at secondary school because of a fistful of variables, such as teacher subject knowledge, group size etc.? Perhaps the nature of the Myhill pedagogy is more appropriate and suitable for the secondary phase? I need to do this work with my department – design our own pedagogical approaches to teaching grammar and evaluate our evidence with rigour.
In short, I need to find some research answers of my own. I also know working with organisations like the EEF would help immensely. No school is an island unto itself.
The point is that we can usefully engage with research. It unequivocally adds to our repertoire of professional knowledge. One piece of research poses a thesis; this is often followed by an antithesis (like the two Myhill studies). We then synthesise this knowledge – ideally with a good working knowledge of the methodology of rigorous research (support from the EEF and other such bodies is a boon for teachers in this area) – and we move forward. A good old dialectical approach to knowledge. As teachers, we need to engage with and critically evaluate the research evidence. I feel like I’m repeating myself – but I think the point is worth making.
In engaging with these two research studies I have learnt a lot in a week. I will think a little harder about teaching grammar next half-term. In short, I’m learning stuff about being a better teacher. That has to be a good thing.