The War on the Imagination

There were numerous news items last week about surveillance and algorithmic decision-making in the workplace. These kinds of stories aren’t new, of course. But as Elon Musk has infiltrated the White House, instructing his minions to eliminate as much of the federal workforce as possible; and as the Trump Administration seeks to roll back the last sixty years of civil rights legislation; and as Silicon Valley’s pro-AI/anti-DEI mindset has infiltrated management everywhere else, the plans for artificial intelligence are increasingly clear: crush labor.
Crush labor, and then evade all accountability for this destructive decision-making.
I recently saw someone say (citation needed) that we might view this as the ongoing fallout from the pandemic, when, for a brief period, some workers were able to win some concessions from their employers: the ability to work from home, most notably. There has been an uptick in unionization and in public support for unions in the US too, reversing the past few decades' trends for both. So we shouldn't be surprised that the response from management has been to fire people, to threaten to fire people, and to threaten to replace people with robots (or in today's parlance, with chatbots or AI agents).
Dan Meyer astutely points out in his latest newsletter that while many ed-tech entrepreneurs and investors might want to replace teachers with robots, and while many schools might give it the ol' college try – "When There’s No School Counselor, There’s a Bot," The Wall Street Journal recently reported – this simply won't work. Ed-tech can't replace teachers because the industry has no clue what teachers actually do. Instead, Meyer argues, "it's wasting them."
Again, this is hardly new. Back in 1926, when educational psychologist Sidney Pressey invented the first teaching machine, he too claimed that the automation of education was a modern imperative, one that would "free the teacher from much of the present-day drudgery of paper-grading, drill, and information-fixing" and "leave the teacher more free for her most important work, for developing in her pupils fine enthusiasms, clear thinking, and high ideals." The latter is vague, but aspirational; the former, an assertion that reveals a belief he shared with so many entrepreneurs and reformers, past and present: that much of teaching is already mechanical.
This fantasy of ed-tech — this century-old fantasy — has always struck me as a huge failure of imagination: why double down on mechanizing the classroom? If the work is drudgery for teachers, it's likely drudgery for students too. (Indeed, students at The Ohio State University, where Pressey taught, recognized this immediately and proposed building a device that would automate the tasks on their side of the teaching-machine equation.) Why is this “the work” of school?
Arguably, much of the push for even more technology, even more automation, even more control, even more surveillance in the classroom is a consequence of the pandemic too. Recall the astonished recognition – it was ever-so-brief – that teaching was the most important and most difficult job and that teachers needed not just to be praised but to be compensated much much much better. And now, that’s been erased — purposefully; and conservatives and technologists and politicians alike feel it necessary to put teachers (read: women) back “in their place.”
During the pandemic, schools had the opportunity to radically rethink what education might look like – as a workplace for teachers, to be sure, but also as a place of growth and exploration and learning for students. Instead, many opted to double down on the worst aspects of school, to embrace some of the worst sorts of surveillance technologies – test-proctoring software, most egregiously.
There was — again, briefly — widespread revulsion about education technology during the pandemic. So dull, so dehumanizing. And now? Now we're handing everything over to AI – a complete and total surrender to "the digital" at the expense of "the human," letting the demands of the technology industry dictate pedagogy and research and assessment rather than respond to the needs of teachers or students or parents or communities.
“The only war that matters is the war against the imagination” — Diane di Prima
Two incredibly moving, incredibly important stories from this past week on what teachers do: "The Teacher in Room 1214" – a profile of Ivy Schamis, a teacher at Marjory Stoneman Douglas High School in Parkland, Florida, and the lengths to which she has gone to support the students in her class who survived the mass shooting at their school in 2018. And "As Immigrant Students Flee in Fear of ICE Raids, Teachers Offer Heartfelt Gifts" – gifts to prepare them to continue their education elsewhere; gifts to remind the students they are loved, they are welcome, as the Trump Administration violates their constitutional rights to an education.
We can't automate this work, obviously. It’s not a task or a routine to be engineered. It’s relational, emotional. But we shouldn’t just shrug and automate around this work either, as Sidney Pressey suggested we might: mechanizing teaching and testing so that teachers have more time to care. Why are these the working conditions for teachers and students? Why are these the conditions of life? Why are we building systems to streamline schooling in the midst of an onslaught of violence and hatred and suffering?
As the world has become increasingly precarious — environmentally, economically, politically — it’s no wonder that students have embraced AI, something that instantly produces an essay, an answer; something that promises to fulfill the tasks of school without the burden of any more risk. As teachers continue to be grossly overworked, tasked with more and more digital bureaucracy, and are told again and again that their teaching practices are deficient, their lesson plans even dangerous, it's no wonder that AI slinks its insidious way into schools with the promise of productivity, objectivity, and infallibility.