Of the 200 or so soundbites I tweeted from IATEFL 2017, the one that produced the longest discussion thread was Scott Thornbury’s declaration that “it’s a well-known fact that teachers don’t read research.” And Thornbury wasn’t the only one to mention it.
The idea that research is research, practice is practice and never the twain shall meet (to paraphrase Kipling) was cited, lamented, and baked into sales pitches by a number of speakers and commentators throughout the week.
Why don’t teachers read research? Lack of time, lack of money for that time, lack of perceived relevance, and the simple “inconclusiveness” of research findings themselves, which came as a shock to many.
But I teach in a private language school. I could play a Liszt rhapsody with the fingers I’d have left over after counting the teachers I personally work with who read primary research.
And I’ll be honest: I don’t read research either.
Opening the door
At least, I didn’t, until I started doing Delta Module 1. As with every great learning experience I’d ever had, the best thing I can say about my otherwise excellent tutor is that she opened a door for me and helped me step through.
I started reading the research. At first it was what Thornbury referred to as mediators, that is, those who help us interpret the science: methodology books like The A-Z of ELT, the work of Jack Richards, Pearson’s How to… series.
Then, in Delta Module 2, I took a different approach. I wanted to know where these “mediators” were getting their stuff from. So I really started digging into actual research.
You know, like the stuff you find on the internet.
And that, of course, is the problem.
I’ve got a university degree and years of experience as a teacher, but when it came to actually interpreting and making use of what Thornbury termed the “inconclusiveness” of research findings, I was sailing without a rudder.
My choices were skewed by a number of factors:
- I only accessed papers freely available from Google search
- I had limited time
- Lack of knowledge meant I had little way of evaluating the truth of any claim or measuring it against other research
And I drew my own conclusions. In the end, consciously or unconsciously, I’m sure I was guilty of cherry-picking, or of invoking the inconclusiveness of research in support of my own more opinionated or decisive conclusions.
Research in practice
When I’m not teaching, my other job (ah, that cringe-worthy phrase!) is in the world of marketing, where everybody shouts about the need for data, data-driven insights, etc. But the fact is that most marketing writers and bloggers (myself included) are guilty of the exact same things: cherry-picking statistics, circulating unquestioned facts and leaping to conclusions.
In drawing this comparison I’m not saying that the worlds of education and marketing are the same. In both fields there are serious people working to understand and explain how their respective domains work, using scientific methods and experiments to back things up. But there are many others, sometimes with the best of intentions, sometimes without, who draw less than scientific conclusions from the data they find, in many cases because they have no training in reading, interpreting and making sense of research.
Think of some of the big ideas in (ELT) education, all of which, explicitly or implicitly, came up at IATEFL 2017: grit, growth mindset, resilience, learner autonomy, plurilingualism, English-mediated instruction, materials-light teaching, etc.
And those ideas have superseded (which is not to say disproved, but rather pushed, at least for the moment, off the attention-shelf) those of previous years: grammar-translation, audiolingualism, the lexical approach, the direct method, Universal Grammar, etc.
These are the ideas that inform the methodology books, past and present, that we read. What should I, as a teacher, make of them? And the fact that many of them thrive for one decade, only to be overturned in the next?
One of my Delta tutor’s favorite expressions was “throw out the baby with the bathwater” because (and I’m certainly not the first to say this) it so perfectly describes what many of those publishing and practicing in ELT seem so good at doing.
All of this is just to say it’s understandable that teachers, if they give it thought at all, are intimidated by, or skeptical of, what they can get from reading research.
A basis for skepticism
It’s a skepticism that was interestingly echoed by the very methodology writers that Thornbury interviewed in his informal survey. “JS” commented:
I’ve never found much formal “research” very helpful to my own classroom work. I am not “anti-research” but I do carry a suspicion of many statistical studies in teaching.
And he wasn’t alone.
In other words, despite all their years of writing about teaching, these ELT mediators can sound very much like the practitioners on the other side of the research-practice divide they’re meant to bridge!
A problem in search of a metaphor
But there’s a danger in suggesting, as I did in the title, that there’s a great divide between research and practice, because it suggests that they’re opposites. Research lacking in application vs. practice without any grounding in research findings. Evidence-based abstractions vs. unscientific gut feelings.
Maybe a more honest take is that we’re all on a continuum between the two extremes, and that we need to all try to pull each other toward the middle.
How can we do that? A few proposals:
- Take off paywalls for access to research papers produced by public/state universities. If you’re supported by government funds, you’re supported by taxpayer funds, so let us see and think about what you’re doing.
- Make mediated research available (in the form of methodology books) in every school and language center. As Scott Thornbury said elsewhere, many schools might do better to just dump the year’s coursebook budget into the Cambridge Teaching Library guides (and yes, as series editor, he admitted his choice of texts was biased). It would be a better investment for all concerned and a good start. Because even if the attitude some methodology writers offered toward research is, as Geoff Jordan wrote, “disquieting”, they still represent a jumping-off point for learning more about ― and beginning to question ― theory.
- More support (and funding?) for research mediation efforts like ELT Research Bites, a voluntary project that summarizes research for teachers, with follow-up from teachers about what research they made use of in their own classrooms, in other words, how the research fit into their practice.
- Give teacher-learners a reason to research. Mine was the expensive and time-consuming Delta, but independent, smaller-scale projects would work as well, including individual attempts at action research.
- More time devoted in training and on teacher certification courses (Delta, etc.) to how we can assess and make use of research. There should be a strong push away from the Harmer and Scrivener basic teaching books that got you through the Celta ― or even time devoted to a critical reappraisal of them.
- Accept and indulge some inconclusiveness. On the one hand, universities and researchers should push back against commercial attempts to make a fast buck on the latest edu-fad before it gets disseminated into methodology and teacher training. On the other, as Thornbury (somewhat controversially) suggested, I think we should be sensitive when seeking to debunk other people’s cherished or time-strengthened beliefs, drawing a line between what is in fact harmful or detrimental to learning and what’s simply been disproven but may have some sort of non-negative placebo effect. (And I’m saying this as someone whose knee-jerk response to any positive discussion of learning styles is, “You heard of the Forer effect?”)
Some of these ideas might draw research and practice closer together.
None of them, however, would solve the problem of time, or of pay and return on investment for that time.
But that’s a topic for another post.