As we rush headlong toward a general election in early May, the news defies the promise of Spring with the usual bombast about teachers failing our students and the latest silver bullet that will solve everything. Now, every teacher knows that what makes the news in education is some scandal, silliness or worse. Rarely do people have time to dig beneath the headlines, so it is important that we ask some critical questions about the headline news.
The BBC has happily scandalized a nation of parents by sensationalizing the latest OFSTED report on how state schools are failing the least able:
The criticism claims that schools’ assessment, tracking and target setting appears insufficient. School leaders are “complacent”. Sir Michael Wilshaw fired similar shots only a couple of years ago, only the advice for schools then was blunt and not very useful – well, you could describe it as insufficient too. I have criticized some of the lazy policy that attends provision for Gifted and Talented, or the Most Able, students here, citing the OFSTED report that preceded the latest broadside.
The report goes on to loosely criticize many aspects of schooling, from mixed ability teaching to failing students from poorer backgrounds. Each criticism would require a lengthy blogpost explaining the many flaws and nuances that render these criticisms rather dubious. We could easily call into question the accuracy of the OFSTED inspector judgements that underpin reports like this.
One way to dig beneath the headlines is to look for some green shoots of evidence-led reporting that challenge some of the tabloid tub-thumping. One such report that didn’t make the BBC headlines was the excellent new report by the Edudatalab. One striking finding challenges the edifice of linear progress of our students that props up OFSTED’s criticisms of Secondary Schools failing to ensure the progress of our most able students.
Their excellent report, ‘Seven Things You Might Not Know About Our Schools‘, calls into question how our accountability system – primarily OFSTED – uses the orthodoxy of linear progress to make judgements of Secondary School success or failure. They use an illuminating graph to expose the reality that progress between key stages and across phases is a brilliantly complicated process – something that you will struggle to capture in short inspection processes:
They ask the pertinent question:
“But do children normally take such smooth learning journeys as they acquire knowledge and understanding in a subject as our accountability system assumes? And is it reasonable to deem children as ‘on target’ or ‘in need of intervention’ using this approach?”
This subtle and complex question won’t make headline fodder, but we should share it and challenge easy headlines. Some schools surely can do better, other schools are ok, some great – but such complexity doesn’t make front page copy. I look forward to Edudatalab presenting their complex case for the progress of Most Able students from poor backgrounds. I hope it arrives before Tristram Hunt’s ‘Most Able Fund’ is initiated so that policy is founded on better evidence than just OFSTED inspections.
Whilst scanning the BBC website I also found this story from a week or so back that was doing the rounds on Twitter and inspiring much support:
When you dig beneath the headlines you can see that the report claiming the classroom environment makes for massive gains in Primary School children’s learning was commissioned – shock horror – by IBI, an architecture and technology firm. Yes, a reputable university conducted the study, and they refer to lots of technical statistical analyses – such as ‘multilevel statistical modeling’. The report – here – looks incredibly attractive, as you might imagine. It also offers us attractive notions, like the idea that hiring in a design firm will cure all of our educational ills.
It is, of course, a little too good to be true. I cannot find much evidence of controls. The randomization process of the trial appears tucked away too. The report, quite inexplicably, eliminates all measures of teacher performance. The report explains away teacher impact, but then reveals that all the statistical improvement is based upon teacher self-reported grades (in the form of sub-levels). I’m sure people involved in the trial may have some awareness of the problems that attend consistency of teacher reporting with levels.
They make huge claims for the well-designed classroom space. They assert it accounts for 16% of the variation in overall progress for the students involved in the study. The study involved a mere 27 Primary Schools. I will readily admit I am not an expert statistician, but when they talk about air quality impacting on learning so significantly I smell the faint whiff of… dubious, over-cooked headlines. I would like to see Edudatalab turn their gaze on this report. We shouldn’t ring up the architects and ignore teacher quality just yet!
As we all know, making the news doesn’t make it true.
5 thoughts on “Making the News: Damn Lies and Some Data”
Thank you for challenging the headlines. I sometimes despair that despite over 100 years of universal education in this country, our electorate, even those who bother to vote, appear to be so willing to accept these bland sensational headlines as the true facts. Hope more people will share this post and THINK.
Pingback: Making the News: Damn Lies and Some Data | Uxbridge College Teaching and Learning
Another great post, Alex.
I couldn’t get the link to the Edudatalab’s ‘7 things…’ report to work, but this takes you to the pdf (I hope!):
I read this with interest but didn’t realise that you have a new book due out imminently. I’ve pre-ordered it and I’m really looking forward to reading it and using your ideas to inform my teaching practice.
I read Closing The Vocabulary Gap when that came out in 2018 and it helped me as a student teacher. In fact I reviewed it as part of my professional development portfolio. I’m sure that Closing The Reading Gap will have a similar effect.