(Image via Vanderbilt Edu)
“The research says…” is something I have heard uttered many times in the last couple of years, and I say it myself too. With healthy skepticism, I have been wary of the authority this statement wields and how it may be received uncritically by some. It is time to examine the questions that attend the statement ‘the research says…’ – like ‘who says it, and with what authority?’, ‘why are they saying it?’ and ‘why does it need saying?’
I must declare my vested interest: I work closely with ResearchEd and the Institute for Effective Education at the University of York – both institutions that seek to promote the use of research in education. I am involved in leading an Education Endowment Foundation funded initiative – the RISE Project (Research-leads Improving Students’ Education) – which is investigating the impact of research-informed school improvement.
When I say ‘the research says…’ I am making a judgement about what I deem good research, troublesome though that is. I am of the opinion that you can apply scientific methods, with good judgement, to find ‘better bets’ about what works in schools and the classroom. My view is that such quantitative research should be balanced against qualitative research – the two can and should be compatible. There is no perfect method, but we can better avoid spurious judgements. Of course, we should be mindful that we all pick our own experts, even when we appear to be selecting evidence to test our assumptions.
Given our instinctive biases and our tendency to select evidence that supports our existing beliefs, we should exercise a series of critical questions when we survey any research:
– Who produced this research and why?
– When was it produced, and does this affect the outcome?
– What outcome measures have been selected and why? What research methods have been undertaken? Are the controls sufficient to move beyond spurious correlations?
– What related research challenges or supports these findings?
I understand that there are some fears that such evidence may be used by policy makers to justify ill-considered changes in education. Well, let’s be clear: policy has been made without any evidence at all throughout the history of modern education, so we are likely tilting at windmills to pursue this argument.
For me, a knowledge and understanding of what ‘the research says’ is an empowering notion that allows teachers to make better-informed decisions about their pedagogy, and to challenge and help shape any school or national policies that will affect our practice.
I do not see research evidence as a ready-made easy answer (perhaps we should be happy with asking better questions) – education is far too complex for that – but, instead, it signals a commitment to inquiry and evaluation, checking our biases and testing our instincts and assumptions. Research evidence can only ever tell us about what has happened in the past and in other contexts than ours, but if we were to ignore evidence on those very grounds we would risk rejecting useful information that can better inform our own decision making.
With the growing interest in research, I see edu-companies beginning to integrate the language of research evidence into their slick sales packages. They have an unofficial checklist: a glossy website; a beguiling logo; teacher and student testimonials declaring the outstanding impact of said product; research case studies from satisfied customers; a range of pseudo-scientific terms to convey the authority of expertise; and, finally, an expensive training package.
Does this seem familiar? Have you read the direct emails and intercepted the glossy leaflets? Of course, some products have robust and reputable research supporting their efficacy, but too many do not.
Like adverts since time immemorial, the use of scientific language abounds to convey a sheen of expertise. Given the interest in research evidence, the language of neuroscience has become the flavour of the moment. The seductive allure of neuroscientific language is well documented here. In the last year I have explored a lot of products, their supposed research, and their exaggerated claims. Once you scrape away the mask of language – the obligatory graphs and data – far too many products are found wanting under any scrutiny of their claims.
Professor Rob Coe, who leads the charge for a culture of schools being evidence-informed and evidence-led, has cited the many dangers that attend what he describes as the latest buzz phrase – being “evidence based”. Instead, he appeals for a culture of evidence and evaluation in schools – a professional, critical and wise application of evidence and judgement – see here. When did he write this timely manifesto?
In 1999. This alone should give us pause.
So when I utter ‘the research says…’ I am offering up a critical challenge to our intuitions and biases about what works, but I recognise some of the abuses that attend this claim. We should listen closely when this phrase is uttered and we should remain critical, but we can be hopeful that testing our ideas, and our intuitions and biases, against a range of research evidence can make for better decision making in schools.
I have written about the crucial difference between correlation and causation in research evidence – see here.
I have written about the role of a Research-lead as a Devil’s Advocate (incorrigible lapsed Catholic that I am) here.