Nothing But Flowers

Downy woodpecker (Image credits)

The plan for December was to focus on a big writing project (okay, a book proposal), mostly pausing these weekly emails in order to write other things; but here we are, mid-month, and I've accomplished very little towards that end. (You're still getting the odd newsletter, so I guess that's something.) In some ways, I think my lack of progress reflects a problem that everyone has faced for the past couple of years now: "artificial intelligence" has dominated almost every conversation, hijacking everyone's plans for today and tomorrow. And I very much do not want to write a book about "AI" and education. It's. All. So. Dumb. More broadly – indeed, more importantly – I truly wish it were something that none of us had to address. Not in writing. Not in speaking. Not in thinking. Not in working.

Oh sure sure, there are still those utterly enamored with the gadgetry, making all sorts of promises: harder, better, faster, stronger. There's nothing to show for "AI" other than venture brouhaha, but still the apostles of "AI" will not shut up. I will say, at this stage, they're starting to sound a little desperate, pleading and posturing and gesticulating enthusiastically, because while they keep talking, few of us are actually buying any of it. (I know it might seem like everyone is. But I promise you: most people know that "AI" is bad bad news and do not want it in their classroom / bedroom / workplace / kitchen / car / church / etc / etc / etc.)

Dan Meyer, recounting a recent debate on the topic of AI's transformative potential, writes of "this idea that we made a massive technological breakthrough three years ago and now we can improve schools like never before. I wish it were true." My wish, I think, would be that we'd reject techno-solutionism altogether and that we would invest – literally, figuratively – in one another.

"AI" is a divestment in one another. It is the surrender to extraction and exploitation.

And not just in some sort of imagined future. Now. Today.

"The fear is everywhere," reads the title of a new report from researchers at UCLA that chronicles how ICE activity has dramatically (and negatively) affected schools – no surprise, as approximately 5 million children in the US live in a household with at least one undocumented individual. More bullying. Lower attendance. Reduced health and well-being of everyone in the community. "Close to 40,000 foreign-born child care workers have been driven out of the profession in the wake of the Trump administration’s aggressive deportation and detainment efforts," Hechinger Report recently noted, pointing to a recent New America study on the impact of ICE on both the child care workforce and on mothers' employment. These anti-immigrant initiatives are all "AI" initiatives, and the technologies of prediction and punishment that undergird ICE also undergird ed-tech – systems of ranking, rating, assessing, control. Never forget: "Ed-tech is cop shit."

And maybe some people have forgotten. Maybe some people simply do not care. Maybe some people don't feel like they're in danger – they don't have to worry about children or child care. ("Women's work.") They don't have to worry that someone will ask for their papers. Maybe some people are comfortable standing on the necks of others in the hopes it will raise them above that (arbitrary) line that marks certain lives, certain jobs, certain bodies as worthy, marketable, safe, good. I don't get it. I really don't. I don't understand how you can rationalize the terror and cruelty in your nifty "magic school." But clearly a few (loud) people still do.

"Maybe the 'A' in AI Sometimes Stands For Asshole," Leah Reich suggests. Maybe!


This is ed-tech.


A couple of months ago, Derek Thompson argued that "everything is television." (One reason, no doubt, why Neil Postman's analysis of schools and society remains so vital.) But I'd argue as well that "everything is gambling."

There have been a number of stories recently about the growth of prediction markets and the rise of betting on everything – not just on sports, but on awards, on elections, on war. On everything, every possible outcome of every possible decision. It's all being datafied. It's all being financialized. "Place your bets."

"Everyone is gambling and no one is happy," Kyla Scanlon writes in her latest newsletter. Americans, she argues (based on some Harvard polling data), are worried about the economy, worried about democracy, worried about one another – distrustful of everything and everyone, and "AI" is only making things worse.


"AI is Destroying the University and Learning Itself" by Ronald Purser.

In classrooms today, the technopoly is thriving. Universities are being retrofitted as fulfillment centers of cognitive convenience. Students aren’t being taught to think more deeply but to prompt more effectively. We are exporting the very labor of teaching and learning—the slow work of wrestling with ideas, the enduring of discomfort, doubt and confusion, the struggle of finding one’s own voice. Critical pedagogy is out; productivity hacks are in. What’s sold as innovation is really surrender. As the university trades its teaching mission for “AI-tech integration,” it doesn’t just risk irrelevance—it risks becoming mechanically soulless. Genuine intellectual struggle has become too expensive of a value proposition.

(I had a bunch of other links on "AI" and education to share, but when I typed them all out, I noticed that they were all written by men. FFS. This whole technofascist regime, this whole resurgence of "AI," is very much wrapped up in a backlash to the past sixty years of civil rights, isn't it? So how convenient that LLMs are going to generate verbiage and imagery that reflects Reddit.)


I've been listening to the podcast series from MIT's Teachlab, Homework Machine. (Recommended.) In one episode, Justin Reich describes an assignment he has his students do, identifying the pedagogical framework that underpins various pieces of ed-tech: is it instructionist? or is it constructionist? But I wonder if that's (sadly and often) granting far too much intentionality, too much "theory" to an industry that has for the past decade or so been decidedly against theory, privileging instead the "unreasonable effectiveness of data."

And I wonder too if, particularly when it comes to "AI," the model isn't about teaching and learning as we have conceived of these practices for the past century, but a return to a pre-scientific credence in revelation: "AI" as oracle. There is no rational explanation needed – no theory or proof necessary. Only faith. (And, um, a lot of money.)

(One of the best books I've read this year, incidentally: God Human Animal Machine by Meghan O'Gieblyn.)

It might feel like the end of the world. Indeed, the apostles of "AI" might lean into the millenarian rhetoric, hoping to compel us to fear and obey. But endings are also beginnings...


"This was a shopping mall. Now it's covered with flowers."


Today's bird is the downy woodpecker – Poppy and I saw one on our walk along the Hudson River the other day. The famously annoying Woody Woodpecker – that laugh, my god – was (according to Wikipedia) inspired by a noisy acorn woodpecker, although Woody's appearance more closely resembles a pileated woodpecker. So many woodpeckers. And phew, as Woody and AI evangelists both serve to remind us, so many noisy assholes.