My 2009 copy of Why Don’t Students Like School by Dan Willingham is among the most dog-eared and annotated books I own. Along with E.D. Hirsch’s The Knowledge Deficit (2006) and Doug Lemov’s Teach Like a Champion (2010), I’m hard-pressed to think of another book in the last twenty years that had a greater impact on my teaching, thinking, or writing about education. I’m clearly not alone: Dan’s book has been translated into thirteen languages.
A second edition has just been published. There’s plenty of new and refreshed material, but the strength of the book—its proof point, actually—is how much has not changed from its first printing. Willingham set out to put between two covers a set of enduring principles from cognitive science (“People are naturally curious, but they are not naturally good thinkers”; “factual knowledge precedes skill”; “proficiency requires practice,” and so on) that can reliably inform and shape classroom practice—a rich vein of ore that Willingham began to mine in his “Ask the Cognitive Scientist” columns for The American Educator starting nearly twenty years ago. His many admirers will appreciate the opportunity to refresh their familiarity with the book. But the primary beneficiaries may be younger teachers encountering it for the first time.
I recently talked to Dan about his masterful and accessible book, its origin and impact, the importance of explaining the findings of cognitive science to teachers, and the decision to bring out a second edition. Here’s our conversation, edited for concision:
Since Why Don’t Students Like School describes nine enduring principles from cognitive science, why do we need a new version?
That was one thing that went through my mind when I started the second edition. I explicitly said I picked these principles because I think they’re so well established. I said in so many words, “I don’t expect to be writing another edition of this book in five years.” I don’t have to eat crow on any of those points, but there are two major features of the book that are pretty new. One is that the scientific literature on intelligence has been updated. We used to think intelligence was mostly a matter of genes, maybe 70 percent genes and 30 percent environment. By 2009, our understanding had flipped that completely. Since then, new, more powerful techniques for analyzing genetic data have been developed, and this new work indicates that, if anything, it’s now more like 80 percent environment, 20 percent genes.
I also added a chapter on technology, discussion questions, and a glossary. Everything’s kind of updated. The other reason that people might want the second edition is to have some assurance. So this joker wrote this book in 2009? How do I know that he still thinks this is right?
When you first conceived of the book, what was your goal? Are you closer to saying “Mission accomplished” in 2021?
You will appreciate the story of the genesis of this book. It grew out of a conversation with E.D. Hirsch, Jr., who has something of a love affair with cognitive psychology. One time he and I were having lunch and he was sort of rhapsodizing about how important cognitive psychology was. And I said, “Don, I think you’re going too far. The truth is most of what’s important in education we don’t know that much about. I can write for you on half a page of paper a set of principles that I’m confident of.” He said, “Well, I’d like to see that half a page of paper.” And this is why the book ended up being nine principles that I think are both useful to teachers and have enough supporting data that cognitive psychologists are pretty sure that they’re true for all kinds of different kids, learning all kinds of different subject matter in all kinds of different contexts.
Speaking of Hirsch and “mission accomplished,” it feels as if there’s much greater acceptance of the idea that thinking skills depend on factual knowledge, which was one of your principles, than ten or fifteen years ago, when so many people were speaking confidently about how content mattered less than teaching “twenty-first-century skills” like problem solving, critical thinking, and collaboration. Is that just confirmation bias on my part, or has there really been a discernible shift in education thinking and practice?
I’ve never published it, but I actually collected some data on teachers’ views of knowledge. What I found was that, if you just ask teachers, on a scale of one to ten, how important is it for kids to know things, the median is pretty high. Most teachers think it’s important that kids know things. But I do think that was held to be kind of separate from thinking skills. I would love to see data on the point you just raised—the extent to which teachers today understand the relationship between knowledge and the kinds of things that all of us most want students to be able to do. So your perception is mine too, but like you, I’m extremely suspicious that I’m subject to confirmation bias. We really have very little idea of what teachers are actually doing.
I still hear too many people in our field promoting the idea of teaching kids to “think like a scientist,” or “think like a historian” rather than teaching them what the historian or scientist knows so they can develop those skills. Does that drive you nuts?
If things drove me nuts, I’d probably be back in my lab doing what I started my career doing. I’m quite serious. When I first told my dad I’m shutting down my lab and I’m just going to write about education for a while, he said something to the effect of, “Well, we’ll see how much stamina you have for that.” My dad was a psychometrician. He worked in various posts for the College Board and Educational Testing Service. He was very much like the guy in the lab who never talked to the public, and he really liked it that way. He just thought that it was an absolutely dreadful thing to have to try and convince people who didn’t necessarily see the world the way that you do that you’ve got an interesting way of viewing the world and that they should talk to you. So none of it really drives me crazy.
To my mind, one counterintuitive big idea that undergirds your entire book is that the human mind is designed not for thinking but for avoiding thinking. Honestly, I would have thought that teachers might get a bit prickly about that. Can you briefly explain what you mean?
When you explain it, it’s pretty intuitive. Basically, memory is extremely reliable. You think that your memory isn’t very good, but that’s because you call on it so many hundreds of times a day. And when it doesn’t perform as you think it ought to, it’s frustrating. The way cognitive psychologists define a “problem” is simply that you have a goal and you haven’t currently accomplished the goal. So things like, “I wish I were outside, but I’m currently inside,” or “I wish I had a pizza. I don’t have a pizza” are classified as problems. They don’t seem like problems to you because you just consult memory about what to do. If I want to be outside, I should walk out the door. If I want a pizza, I call the pizza place. So the idea is that the more of those simple solutions you have in memory, the better off you are because those are very useful and effective. Problem-solving is time-consuming and exhausting; it takes real effort. So you want to have as many of these simple solutions to routine problems as you can, so that you’ve got the mental energy and the mental space to work on things that are really novel and really important to you.
What about teaching kids to be creative thinkers?
There’s a very clear trade-off between creativity and the sort of problem-and-solution pairing I’m talking about. If your solution to “I wish I had a pizza” is always “Call the pizza place,” you may never think, “I could try making pizza at home.” You’re always jumping to your established solution. But most of the advice about creativity is not very good. People say, “Think outside the box.” I could try using a lettuce leaf instead of a coffee filter when I make coffee in the morning. That’s creative. I’m thinking outside the box. But it’s probably not a very good idea. There’s an enormous advantage to making coffee the way I always make coffee, which has always worked pretty well in the past. It’s not obvious that investing the time and effort in hoping for a better outcome is going to be worth it. So the trick with creativity is knowing when it’s worth it to question the way others have always done things.
You’ve got a new chapter on ed tech in the second edition. Can you give us the most important takeaway?
When I first started in education research about twenty years ago, the prevailing sentiment seemed to be “this is going to change everything.” First Smartboards would change everything, then Chromebooks would, then open-source software, and so on. I know there were more thoughtful people out there, but a lot of what I heard boiled down to “the potential advantages of personalized instruction that digital technologies offer are so powerful, there’s pretty much no way to implement them incorrectly.” That’s not what people said explicitly, you understand, but all anyone talked about was the potential advantages, like just-in-time quizzing and instruction, or the integration of video and audio with text; they talked about them not as potential advantages but as things that would happen. It’s not crazy. It could have been true that these advantages were just overwhelming, and that the particulars of how you implemented them just wouldn’t matter much, sort of like early railroads may have been terribly inefficient, but they were still a huge advantage over roads and wagons. What we’ve learned in the last twenty years is that you can implement tech solutions that are lousy. That doesn’t sound like much of an advance in knowledge, but we had to figure that out. So now we have the much, much tougher research question in front of us: When does it work? What are the must-have features that allow us to realize the potential of these tools?