A Cautionary Tale of Educational Evidence – Part 2

In Uncategorized by Alex Quigley | 7 Comments


Earlier this week I wrote a brief post explaining how research evidence from the Education Endowment Foundation (EEF) appeared to challenge research undertaken by Debra Myhill and her team at Exeter University on the success of her ‘grammar for writing’ programme. See here.

Now, as an English teacher, ‘grammar for writing’ is a relatively familiar approach to teaching grammar. It explicitly focuses upon teaching grammar in the context of texts and literature, using the tricky meta-language of grammar with students, and it is one approach I use as part of my current teaching and planning. For example, students may study and learn about the impact of modal verbs by reading political speeches or some dystopian fiction. This approach runs counter to an arguably more traditional approach of decontextualising the study of grammar, which may mean grammar drills and a linear model of discrete lessons focusing upon aspects of grammar. Of course, many teachers use a broad mixed economy – which is more akin to my current stance.

Each to their own if it works.

My post, which sparked some controversy, was not about the minutiae of pedagogy, but about how we need to be critical and circumspect in how we use research evidence as teaching professionals. I was accused of saying that research is pointless because it provides conflicting evidence, and that therefore everything is relative and flawed. This accusation is way off the mark. I think research, and teachers engaging in such reflective practice, is empowering and can have a positive impact on our practice, so that we may become better teachers for our students. Simply the act of reading the aforementioned studies had me thinking about my practice more deeply.

I did raise points of caution. We should scrutinise the evidence for the ‘how’, the ‘who’ and the ‘why’. My previous post was a case in point. I needed to heed my own advice and scrutinise the evidence rigorously. I hadn’t read every detail of both studies – only headlines and sections of the research.

In my first post I observed that the EEF study was independent, whereas the original research involved Myhill and, importantly, the educational publishers Pearson (who obviously had a vested interest), and that it was therefore less objective. This was seized upon by some. After probing further into the original research, I found that Pearson were not involved in the original study that lauds the success of this teaching method. Instead, the original research was conducted and funded by the Economic and Social Research Council (ESRC) – with Pearson not involved until a later stage. This probing into the ‘who’ of the research is significant of course.

The findings of this early ESRC study covered 32 classes, with comparison and intervention groups in an RCT. Students in the comparison group improved their writing results by 11%; students in the intervention group improved their results by 20%. The study applied a mixed-methods evidence base, with qualitative interviews and lesson observations adding to the evidence.
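The headline gap between those two groups can be sketched as a quick back-of-the-envelope calculation (the 11% and 20% gains are the figures reported above; the calculation itself is purely illustrative, not part of the study's analysis):

```python
# Reported improvement in writing results from the ESRC study (as cited above).
comparison_gain = 0.11    # comparison group: +11%
intervention_gain = 0.20  # intervention group: +20%

# The headline effect is the difference in gains between the two groups.
difference = intervention_gain - comparison_gain
print(f"Difference in improvement: {difference:.0%}")  # prints 9%
```

A 9-percentage-point difference in gains is the kind of headline figure that invites scrutiny of the ‘how’ and ‘who’ behind it, rather than settling the question on its own.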

In stark contrast, the EEF study found no more than a negligible improvement with larger, whole-class teaching. The scale of the research was larger, with a sample of circa sixty schools, with students in Year 6. Debra Myhill and her team were also part of the implementation of this study. It found: “Grammar for Writing is not effective in improving general writing among Year 6 pupils when delivered as a whole class intervention over four weeks“, but it was more effective for small-group teaching of lower-ability students.

My post clearly drew comparisons about the efficacy of the ‘grammar for writing’ approach without knowing every finer detail of both research studies. I have had numerous replies to the post since. One explained that it was unsurprising that the studies threw up different findings, as it was like comparing ‘apples and pears’. The original study by Myhill lasted a year and comprised Year 8 groups, whereas the other, published by the EEF, was undertaken with Year 6 students in primary schools over the course of a month. Assuming they are both rigorous studies, the variables are clear and possibly hugely significant: different key stages, different teachers, different lengths of study etc.

An EEF research study on feedback that I undertook in my school found little improvement in the grammar of students’ writing over a short period (the focus wasn’t solely grammar, and students did improve their writing for ‘purpose and audience’). I wasn’t surprised – you don’t make significant progress with such skills quickly; they take years of development, through repetition and endless honing! If I spent a year on my intervention I might fairly expect different results.

My first post, highlighting the discrepancy between both research studies, actually led to me having to do what I advocated: paying close attention to such evidence, with a critical eye, then looking to interpret the information relative to my school circumstances. Perhaps ‘grammar for writing’ works better at secondary school because of a fistful of variables, such as teacher specialist subject knowledge, group size etc.? Perhaps the nature of the Myhill pedagogy is more appropriate and suitable for the secondary phase? I need to do this work with my department – design our own pedagogical approaches to teaching grammar and evaluate our evidence with rigour.

In short, I need to find some research answers of my own. I also know working with organisations like the EEF would help immensely. No school is an island unto itself.

The point is that we can usefully engage with research. It unequivocally adds to our repertoire of professional knowledge. One piece of research poses a thesis; this is often followed by an antithesis (like the two independent Myhill studies). We then synthesise this knowledge – ideally with a good working knowledge of the methodology of rigorous research (support from the EEF and other such bodies is a boon for teachers in this area) – and we move forward. A good old dialectical approach to knowledge. As teachers, we need to engage with and critically evaluate the research evidence. I feel like I’m repeating myself – but I think the point is worth making.

In engaging with these two research studies I have learnt a lot in a week. I will think a little harder about teaching grammar next half-term. In short, I’m learning stuff about being a better teacher. That has to be a good thing.


  1. I enjoyed your blog as usual Alex. As a teacher educator from the ‘other’ of the ‘two cultures’, I welcome the rise of evidence-based practice in education, but not as it is being presented, and I agree with your thoughts. Responding to ‘educational evidence’ is what I did as a science teacher (on a minute-by-minute basis) for many years and what I hope I am continuing to do as a science teacher educator. You are right: paying close attention to evidence with a critical eye is absolutely vital. In my endeavours I hope to help trainee teachers ‘see’ the value of evidence-informed teaching and learning, where the evidence is from a primary source: the kids themselves. Thank you for inspiring me to respond to your ideas.

  2. In that great big picture book in the sky that knoweth all, the most complex systems are chaotic, having no rhyme or reason. Education is a one-off in terms of complexity, and because all the parameters are variable, it is known as a complex adaptive system (CAS). Look it up on Wikipedia for more. Nothing is easy to test and there will be no silver bullets to be found.
    Evidence-based practice in education is not pointless nor worthless, but it certainly needs to be derived from long-term longitudinal study. Four weeks of testing might show some promise, but only sufficient to merit longer work in depth. Where evidence arrives in such depth and variety that the conclusions are inescapable, then perhaps something like a Royal College can make recommendations.
    What worries me is that, for a very long time, money has flowed to organisations that don’t follow these rules – by the sound of it, the EEF qualifies. And where you have highly reputable research bodies funded for the long term, such as Prof Bill Boyle’s team at Manchester, how can a new government simply choose to switch off the outcomes because they don’t fit? Read Boyle on the EBacc, for example, here.
    Considered judgements too are of value – ref either the Tomlinson or Rose review – but when politics trumps them every time, nonsense is made.
    The new call for evidence-based practice comes in some insane hope that it will back politicians into a corner and will triumph. I fear for our new white knight, and ask that we rest him quickly. Instead, Manchester, Durham, Pearson, NFER et al. need to identify and share their big data sets, and permit some open research into that data.
    From big data studies of complex systems, we know, for example, that you need over 50,000 employees in an organisation for social partnerships to work. John Lewis is admired for its success and employs 80,000. Of course units can be much smaller than the whole, but creating small free schools cut adrift from agreed best practice is evidentially a recipe for disaster: some will succeed, but many can’t help but fail. We know that profit- and performance-related pay works, providing all benefit in the social group, but it never works in a CAS when employee is pitted against employee. So why is PRP now mandatory in the state sector?
    Would it ever now be possible to run clinical trials in a state school where teachers are no longer impartial?
    Lots of words perhaps, but a serious call to arms. I would love to see parliament set in statute that the implementation of whim by those elected be subject to criminal investigation for malfeasance.

  3. Pingback: A Cautionary Tale of Educational Evidence – Part 2 | HuntingEnglish | The Echo Chamber

  4. Thank you for this follow-up blog post. The additional information significantly helps to contextualise the original post and to refine my understanding of what you meant to say in the first post.

    All research is long term and intensive. This is why it is important that any emerging conclusions are carefully probed and considered, with follow-up research to test understanding, before widespread implementation. I also think that clear understanding is very difficult without a conceptual framework to slot the research findings into. For example, in this case, how do students understand ‘grammar’ and how does their understanding of it affect how they embed it into their own writing? If research is designed to shed light on this type of question, it probably has more chance of contributing towards improving learning… maybe.

  5. Hi Alex. This is excellent stuff, and a superb example. A bit like the ‘Hattie on homework’ stuff, people want simplicity in a complex world. Grammar for Writing works. Grammar for Writing doesn’t work. Well, obviously enough, it depends… You’ve illustrated that really well. It makes me think that, given the money and time resources we’ve got, we can’t rely on large-scale trials to reveal fundamental truths even if they are very well defined. I’m more inclined to focus on small-scale local explorations, as with Lesson Study etc. Factor in the murky water of ‘values’ and you’ve got trouble presenting the outcomes of studies like those on Grammar for Writing as hard evidence that should inform practice in any one direction. It’s so interesting – and I agree that this does not dismiss the value of research; it just puts it in a more realistic context.

  6. Pingback: This much I know about…the trouble with educational research | johntomsett
