The Research Says…

In Uncategorized by Alex Quigley · 16 Comments

(Image via Vanderbilt Edu)

‘The research says…’ is something I have heard uttered many times in the last couple of years, and I say it myself too. With healthy scepticism, I have been wary of the authority this statement wields and how it may be received uncritically by some. It is time to examine the questions that attend the statement ‘the research says’ – such as ‘who says it, and with what authority?’, ‘why are they saying it?’ and ‘why does it need saying?’

I must declare my vested interest: I work closely with ResearchEd and the Institute for Effective Education at the University of York – both institutions that seek to promote the use of research in education. I am involved in leading an Education Endowment Foundation funded initiative – the RISE Project (Research-leads Improving Students’ Education) – which seeks to evaluate the impact of research-informed school improvement.

When I say ‘the research says…’ I am making a judgement about what I deem good research, troublesome though that is. I am of the opinion that you can apply scientific methods, with good judgement, to find ‘better bets’ about what works in schools and the classroom. My view is that such quantitative research should be balanced against qualitative research – they can and should be compatible. There is no perfect method, but we can better avoid spurious judgements. Of course, we should be mindful that we all pick our own experts, even when we appear to be selecting evidence to test our assumptions.

Given our instinctive biases and our tendency to select evidence that supports our own beliefs, we should ask a series of critical questions when we survey any research:

– Who produced this research and why?

– When was it produced, and does this have an effect upon the findings?

– What outcome measures have been selected and why? What research methods have been undertaken? Are the controls sufficient to move beyond spurious correlations?

– What related research challenges or supports these findings?

I understand that there are some fears that such evidence may be used by policy makers to justify ill-considered changes in education. Well, let’s be clear that policy has been made without any evidence at all throughout the history of modern education, so we are likely tilting at windmills in pursuing this argument.

For me, a knowledge and understanding of what ‘the research says’ is an empowering notion that allows teachers to make better informed decisions about their pedagogy, and to challenge and help shape any school or national policies being implemented that will affect our practice.

I do not see research evidence as a ready-made easy answer (perhaps we should be happy with asking better questions) – education is far too complex for that – but, instead, it signals a commitment to inquiry and evaluation, checking our biases and testing our instincts and assumptions. Research evidence can only ever tell us about what has happened in the past and in other contexts than ours, but if we ignore evidence on those very same grounds we risk rejecting useful information that can better inform our own decision making.

With the growing interest in research, I see edu-companies beginning to integrate the language of research evidence into their slick sales packages. They have an unofficial checklist: a glossy website; a beguiling logo; teacher and student testimonials declaring the outstanding impact of said product; research case studies from satisfied customers; a range of pseudo-scientific terms to convey the authority of expertise; and, finally, an expensive training package.

Does this seem familiar? Have you read the direct emails and intercepted the glossy leaflets? Of course, some products have robust and reputable research supporting their efficacy, but too many do not.

Like adverts since time immemorial, the use of scientific language abounds to convey a sheen of expertise. Given the interest in research evidence, the language of neuroscience has become the flavour of the moment. The seductive allure of neuro-scientific language is well documented here. In the last year I have explored a lot of products, their supposed research and their exaggerated claims. Once you scrape away the mask of language – the obligatory graphs and data – far too many products are found wanting under any scrutiny of their claims.

Professor Rob Coe, who leads the charge for a culture of schools being evidence-informed and evidence-led, has cited the many dangers that attend what he describes as the latest buzz phrase – being “evidence based”. Instead, he appeals for a culture of evidence and evaluation in schools – a professional, critical and wise application of evidence and judgement – see here. When did he write this timely manifesto?

In 1999. This alone should give us pause.

So when I utter ‘the research says…‘ I am offering up a critical challenge to our intuitions and biases about what works, but I recognise some of the abuses that attend this claim. We should listen closely when this phrase is uttered and we should remain critical, but we can be hopeful that testing our ideas, and our intuitions and biases, against a range of research evidence can make for better decision making in schools.


Related reading:

I have written about the crucial difference between correlation and causation in research evidence – see here.

I have written about the role of a Research-lead as a Devil’s Advocate (incorrigible lapsed Catholic that I am) here.



  1. I completely agree about the need to couple quantitative & qualitative research. Whilst I agree that research should be robust, valid & reliable, I feel there is a risk of it becoming sterile. There is a need to provide the narrative, to ultimately tell the story of the young people who have been impacted by the research.

  2. It’s kind of amusing how we seek to sterilize the humanness from situations via ‘evidence’ (a noble cause for sure, given how susceptible we all are to our own beliefs) but then forget the humanness required in the interpretation and application of evidence-based practices. In school/uni I used to make a game of taking the same data and twisting the words around it to create different but seemingly logical conclusions. This isn’t so hard, and each conclusion had supporters.

    This is from Alfie Kohn’s latest blog: “In a series of four experiments reported in the March 2015 issue of the Journal of Personality and Social Psychology, Justin Friesen, Troy Campbell, and Aaron Kay found that a lot of people, upon encountering facts that contradict a political or religious belief they hold, don’t “revise [their] belief to be more in line with the new information” but instead flick away the facts by reframing the issue as a moral one. And the more that people’s convictions are threatened, the more likely they are to rely on unfalsifiable beliefs.”

    As a student all I ever wanted was for the methods my teachers used to be justified fully in their own minds as worth our time, or at least worth the time to try them out. Research on teaching/learning is fantastic, and I follow a lot of what is done, but it’s dangerous to force practices onto our teachers or to leave them drowning in a load of “shoulds” about the way they teach each minute. Who knows what we can all learn and discover if all teachers felt free to find, apply and discuss the research that inspires their individual practice? I’m writing more on re-empowering teachers on my blog.

    1. Author

      For me, teachers understanding the research, with all its attendant flaws, is ultimately very empowering. Then a teacher can eschew and challenge the ‘should’ and better work out the ‘could’ from the evidence.

      1. Exactly Alex! This brings new energy and possibility into the game which, for many, is feeling stagnant and… well… oppressive. Whether or not things really are stagnant & oppressive is another matter, I personally believe we’ve never had more opportunity to flourish!, but the *belief* that things are limiting is, I’m finding, rather real and widespread across schooling systems as a whole.

        1. Author

          I agree with your diagnosis. I think that we can resist much in our own schools, especially if our school leadership protects us from the crap. Money is tight, but we still have much agency over how we teach and lead.

  3. I do remember back in the distant past that education changes were led by major reports, the results of which were seismic changes in the landscapes of schools and colleges, and hugely influential for teaching pedagogies.
    The move towards comprehensive education was very much a moral decision, as was the unification of 16+ exams under GCSEs. The ‘cancellation’ of assisted places back in the late seventies was a political decision more than moral, as have been the emergence of academies and Free schools over the last decade. In many senses over the past parliament, political will has triumphed over moral purpose; setting management free from the legitimate constraints of local public oversight inevitably was going to corrupt more than a few – human nature is sadly thus.
    The cris de coeur for evidence based decision making has been an (as yet) vain attempt to restrain national government from its own excesses. It was in Blair’s second parliament that Labour publicly wished it had been more radical in its first term and thus taught Cameron and Co to go for broke this time round.
    The evidence has now stacked up that the PFI initiatives that have built so many schools and hospitals are the next (financial) crime against which to hold governments to account; but hold the passionate advocates of Literacy and Numeracy initiatives to account first please. With messianic zeal, and with Ofsted a willing inquisitor, the state forced its schools down a cul de sac from which many will not recover. Where now a liberal arts education?
    Nuthall, Hattie, EEF and the Sutton Trust shine a light on what is happening in our classrooms, and across the world other research agencies have also rallied the troops to roll back the neocon agenda by using evidence.
    It’s the unremitting awfulness of current non-evidence based policy implementations that bothers me; when Michael Rosen rails against the latest national curriculum dictat for the teaching of Poetry, I know he is right and as an independent headteacher can trim accordingly. But DfE won’t budge an inch, and I fear for young children whose love of rhyme will be destroyed by those seeking to ensure they ‘make progress’ in artificial and unhelpful ways.
    In short, worrying whether we have enough evidence prior to making choices is a ‘Wicker Man’ – like choosing petrol over diesel engines when all the motorways are being shut around us. Social mobility is enabled for all by great education, but what works in Reigate or Rochdale is hugely influenced by local circumstances and it’s local government working with national government that will make the difference.
    Only one of the political parties has this clause in their manifesto, but could Labour be trusted to deliver? What does the evidence from the past show?!

    1. Author

      You raise a whole host of issues there, James! I don’t attempt to hide the fact that much evidence is used, abused and ignored. For me, if we can take the next decade to empower teachers with knowledge and understanding of the evidence, then we have a chance to cohere a voice of the profession to better challenge baseless initiatives.

      I can’t say I know enough about the poetry matter you describe to comment properly.

      1. Here’s Michael Rosen’s blog on the poetry issue – as ever beautifully written.

  4. Pingback: The Research Says… – HuntingEnglish | The Echo Chamber

  5. Your blog prompted me to return to the EEF site to have a look at their toolkit again, and I wondered if there had been some sleight of hand involved or maybe I’m remembering wrong. I was sure there was something on there about assessment for learning. Now there’s only ‘feedback’, with AfL mentioned within the general text. I say this because I returned specifically for the purpose of looking again at the number of ‘rigour’ points awarded for assessment for learning, which I recall as being rather high. For ‘feedback’ there are 3 padlocks, which indicates a strong research base. The thing which makes it a little disconcerting is that this research base is apparently ‘2 or more rigorous meta-analyses’. But meta-analyses have serious limitations, and I can’t really see a justification, in any of the literature cited, for a benefit of 8+ months. In fact one of the most recent articles is Randy Bennett’s, which emphasises the very lack of much solid, reliable evidence for assessment for learning.

    My point here being not to criticise feedback or AfL but to wonder if even with the filtering and supposed usefulness of something like the EEF toolkit, teachers are really going to be any the wiser or any less likely to jump on the bandwagon just because it has the most stars (or padlocks)?

    1. Author

      The problem always rests with the fact that you need to make some things simply understandable, but that hides many nuances. The padlocks and months of progress are just another imperfect example. I think feedback has a lot of sound studies, some with very large impact, so when you club it all together the large months of progress ‘feels about right’. It doesn’t tell us ‘how to do feedback’, or even exactly what it is – but it leads us down the road. Alas, bandwagons are laden down that road!

      1. Yes – I think you put that very well. I would prefer a policy of not jumping on any bandwagon – often much more harm is done than good, in terms of the energy requirement for new initiatives and the morale-sucking impact when they somehow aren’t the magic answer.

        1. Author

          I think so much in education takes simple, wise principles, like high expectations, then brands and monetises them. We could do with all having a sound knowledge of the available evidence, coupled with the wisdom of our experience, to derail and resist the bandwagon!

  6. Pingback: Strike A Pose – HuntingEnglish
