Matter and Dystopia

Mountain bluebird (Image credits)

I’m not sure if we can call it an official launch – there’s no actual product yet – but a new education company, Matter and Space, unveiled its website last week. The company, co-founded by former SNHU president Paul LeBlanc, MOOC pioneer George Siemens, and clinical psychologist Tanya Gamby, promises “a revolutionary approach to human development” – “human-centered learning for the age of AI.”

Matter and Space has released two very expensive-looking videos – one in which the co-founders explain the company’s mission, and the other a six-minute science fiction story written and directed by Lee Loechler, meant to showcase “what is possible” with AI (and maybe, if you use your imagination, with the forthcoming Matter and Space technology).

These videos, even without any actual technology in hand, are worth talking about: they are extraordinarily rich texts for exploring what I’ve described elsewhere as the “ed-tech imaginary” – the powerful narratives that shape how we understand the history, present, and future of education, alongside the role that technology has played, currently plays, and might someday play if we can actually get it to work right and – let’s be honest – even if we don’t. Indeed, this imaginary helps underscore that Matter and Space’s unreleased product – whatever it someday does – will always be intertwined with, and even secondary to, this initial storytelling: the promises of the future that the product says it will unlock.

Much of the discussion of any new technology relies heavily on this imaginary – it’s so clear in all the discussions of AI right now – as we’re supposed to overlook what cannot quite be done at present and focus instead on what the tool will surely someday be able to accomplish. No surprise, then, that much of the commentary by Matter and Space’s co-founders is in the future tense: “what we’re building,” not “what we’ve built.” I do appreciate, I suppose, a company at least gesturing towards a new and different future – or, at the very least, demonstrating bolder ad copy than the press releases from universities bragging they’ve signed a contract with a big generative AI player.

And yet even here, so much is bound up in a resolute faith that AI will take care of things; it hasn’t yet, but it will. “This is the worst AI you’ll ever use,” AI evangelist Ethan Mollick likes to say – although Cory Doctorow’s insights about software's slide towards “enshittification” should serve to remind us that it’s more likely this is, in fact, the best it’ll ever be.

“Matter and Space isn’t just about what you learn – it’s about who you become,” the website reads. The startup isn’t unique here (although it positions this as a differentiator). Promises about the future are as foundational to education as they are to technology, and learning is always, in some way, a liminal time and space. All manner of educational enterprises – from schools to startups to “Draw Tippy” ads in the back of comic books – urge people to imagine a future self. They sell people a future self, a future world.

Southern New Hampshire University, where Paul LeBlanc served as president from 2003 to 2024, during the school’s most transformative years, was particularly adept at this salesmanship. In those decades, SNHU rapidly expanded from a residential university to a largely online one – one of the largest in the US, at that. Although SNHU remains a not-for-profit, it adopted the aggressive advertising tactics of the for-profit college industry, spending far more per student on recruitment than on actual instruction. (You’ve surely seen the ads.)

Perhaps Matter and Space is a continuation of this – the slick videos suggest as much, although it’s not readily apparent who the target audience for this marketing might be – other than the investors and entrepreneurs in attendance at the ASU+GSV Summit, where the videos were first circulated.

“We started with the question, ‘what could learning look like in the age of AI if we were unconstrained by the way in which it happens today,’” LeBlanc tells the camera. “Then everything becomes possible.” And yet, the constraints of the past and present are not so simply waved away – neither in their materiality nor in the mind. The echoes of education’s past run throughout Matter and Space’s futurism: tropes about education designed for “the masses” and not the individual; the promise (a la Sidney Pressey in the 1920s) that technology will handle the “knowledge transfer,” leaving the teacher to “really take care” of students “as human beings”; the idea that if we can just gather and track enough data on a student, we can know the “entirety of the person” – an idea expressed by Columbia University’s Ben Wood (also in the 1920s), when he argued that “to get accurate and significant information about students, and to record it in a way that will be available and meaningful and directive at each step in the educational ladder, is a duty fully coordinate with, and certainly prerequisite to the proper discharge of the duty to teach students.”

As I wrote in Teaching Machines, “to know students, for Wood, meant to test students – via content examinations, psychological analyses, personality assessments, and intelligence and aptitude tests.” He recognized early on the impulse to extract as much data as possible about students’ thinking, knowledge, and behavior. In the promotional video, George Siemens becomes the latest in a long line of scientists to restate this objective: “our goal at Matter and Space is to know our learners better than they’ve ever been known by an educational or any other institution.” Echoes of Ben Wood, to be sure – and, more recently, echoes of Knewton CEO Jose Ferreira, who a decade ago famously boasted that his learning analytics company collected millions of data points on students every day, and that it could tell a student that, for example, “you learn math best in the morning between 8:40 and 9:13 am” or “you learn science best in 42 minute bite sizes.”

Ironically, perhaps, it was Ferreira’s assertion to Wired that his “robot tutor can essentially read your mind” that prompted Siemens to declare, in a now-deleted blog post, “Adios Ed Tech.” In it, he decried startups that

require the human, the learner, to become a technology, to become a component within their well-architected software system. Sit and click. Sit and click. So much of learning involves decision making, developing meta-cognitive skills, exploring, finding passion, taking peripheral paths. Automation treats the person as an object to which things are done. There is no reason to think, no reason to go through the valuable confusion process of learning, no need to be a human. Simply consume. Simply consume. Click and be knowledgeable.

One might ask what’s changed in the last ten years – and one wouldn’t be shocked if the resounding answer, offered without much thought, were “the technology, duh.” But I’m not sure that’s an entirely satisfactory answer. The technology – fill in the blank with whichever one – has promised for decades and decades and decades and decades now that we are on the cusp of robot tutors for everyone.

Perhaps what’s changed are the aesthetics? Perhaps what’s shifted are the politics? And these options seem far more foreboding.


“I got my copilot today,” the young Lakota boy announces. “Everyone came to watch.” He is surrounded by half a dozen other people, although they remain out of focus. “I’m the first in my family to have one,” he says, pushing his new smart glasses up his nose.

The subtitle informs us this is the Pine Ridge Indian Reservation. It does not tell us that, still today, around 60% of the homes there have no electricity, water, or sewage systems. It does not tell us that this reservation, located in southern South Dakota, is by many accounts the poorest place in America, where the life expectancy of men like the boy in this story is a mere 48 years. It does not explain how the Oglala Oyanke – “the scattered ones” – ended up there, how they danced the last Ghost Dance, how hundreds of them were massacred at Wounded Knee. It does not mention the long history of anthropologists and filmmakers using their image, taking their stories, commodifying their knowledge systems. There is no history, no structural analysis in this film. The fuzzy figures in the opening image aside, there are no elders, no teachers.

But there is AI.