Editor’s note: This essay is an entry in Fordham’s 2023 Wonkathon, which asked contributors to answer this question: “How can we harness the power but mitigate the risks of artificial intelligence in our schools?”
Many of my educator colleagues have recently expressed concern about the rise of AI in our classrooms. I haven’t agreed. In fact, AI programs like ChatGPT do not worry me. I’m increasingly convinced that no futuristic version of education will replace our classrooms or our teachers. And the precise reason I am not worried is that AI, in a sense, has already been tried in our classrooms.
Hasn’t it?
I mean, what do we call a decades-long educational system that eliminates recess, sports, art, history, and science for many marginalized community schools in the name of higher reading and math scores? Or the National Reading Panel’s finding that reading for pleasure does not impact “achievement”? What would you call the literal scripting of standards-based instruction around test preparation?
More recently, how would we describe a virtual classroom where one non-credentialed teacher monitors nearly a hundred kids plugged into an online curriculum?
If that is not intelligence of an artificial nature, I do not know what is.
And here’s the kicker: They’ve all failed—and failed spectacularly.
But why did these systems of AI fail? Why are they failing now when put into practice? The answer is simple and needs no code or algorithm: kids are not computers, and what we must teach them to become healthy, happy, and well-educated adults has never been less artificial.
AI cannot teach compassion
During the pandemic, a wave of articles and books was published about the need for social and emotional learning. But what the general public may not realize is that, just prior to Covid, legislators were not interested in social and emotional learning. I know, because I was a policy fellow trying unsuccessfully to find champions for a bill in California to fund SEL. Then came Covid, and the sight of kids in front of computers expanded the SEL conversation. Suddenly, my colleagues and I found ourselves in state senators’ offices instead of in the hallway talking to staff.
Why? Because parents and educators alike now see what limits technology like AI actually has. Parents want their children to grow in their social and emotional intelligence, not just academics. Computers simply cannot teach compassion, and despite well-advertised efforts, they never will.
AI cannot teach work ethic
During the pandemic, the realities of remote learning made work completion optional in district after district. Soon, educators and educational writers were noting a brand-new term: learning loss. It wasn’t simply a loss of access to technology during Covid. If it had been, only students without computers during class would have experienced learning loss. The simple fact is that AI cannot, nor can any technology by itself, replace a human teacher instilling the values of a work ethic in their students.
AI cannot teach stewardship for the environment
Climate change and STEAM science are at the top of many priority lists nationwide. And not just for employment’s sake, either.
AI can assign students tasks about environmental issues. AI can teach and even assess content about environmental issues. But AI cannot teach the type of stewardship for the environment that real humans model in organizations nationwide. That takes real humans, in the very real and natural world, providing hands-on experiences. Computer screens do not have hands.
AI cannot teach the appreciation of music, theater, dance, or any other art
As a colleague recently told me, “Art is what brings them to school.” I couldn’t agree more. My best-attended classes during Covid were not mine; they were with our PE coach and our partners in STEAM, who gave kids a chance to do hands-on science at home. But why can’t AI just teach these things somehow, someway? Because AI operates in the virtual world, and art is something we experience in the real, tactile one. We can produce art virtually, but the experience of it requires a physical presence. And not just by ourselves. Art requires a shared experience with another real human being. Whether they are the watcher, the dance partner, the viewer, or the eater, art requires not a person and an AI; it requires people, in the plural, sharing their feelings about the art. AI has no feelings.
AI will not be equitable, nor will it affirm anyone’s identity or culture or empower anyone
For AI to do a modern teacher’s job, it would have to complete all of the impossible tasks that a real human does just by being human. And there is still more: a teacher must also consider equity, culture, and identity in the classroom.
In Race After Technology, Ruha Benjamin gives compelling evidence for why technology, including AI, can’t do our job humanely. She expertly illustrates that technology’s track record is not something anyone should be happy about, whether you are in a minority or a majority. Moreover, AI does not, and never will, possess the human ability to understand our differences, why those differences are beneficial, or how they so often become grounds for bias that benefits one group over another.
I feel very sure that artificial intelligence isn’t going to teach our kids anytime soon—unless, that is, we fail to learn from our earlier experiments with turning children into testing robots. That part does scare me. Because if we no longer prioritize very real human qualities like appreciation, equity, stewardship, perseverance, and compassion, somebody, somewhere, wanting to earn a buck is going to give it a try. And it won’t take long until we terminate what is best in us as a society.
But the good news is that doing so would be a very human decision—one which we, not AIs, control.