Just So AI Stories

In 1902, Rudyard Kipling published Just So Stories for Little Children, a collection of short origin stories about how various animals acquired their best-known features: "How the Leopard Got His Spots" and so on. Kipling had first told these tales to put his daughter to sleep, and "you were not allowed to alter those by one single little word," he later explained. "They had to be told just so; or Effie would wake up and put back the missing sentence. So at last they came to be like charms." Their comforting familiarity – their form and their content – lulled the child to sleep, for these are stories, Daniel Karlin argues on the OUP blog, "that answer the kinds of question children ask, in ways that satisfy their taste for primitive and poetic justice."

It's easy to see Just So Stories as an expression of Kipling's Lamarckism – his belief that offspring inherit the traits their parents have acquired through use: if a curious young elephant comes home with his trunk stretched long by a crocodile, for example, that characteristic would subsequently be passed on to all elephants born thereafter.

The publication of Charles Darwin's On the Origin of Species in 1859 had certainly challenged Lamarck's earlier theories of evolutionary adaptation; but as Kipling's Victorian-era works suggest – as we can see in Just So Stories and in The Jungle Book (and elsewhere) – the popularity of Lamarckism persisted long after the introduction and acceptance of Darwinism. Indeed, Lamarck's ideas about improvement and inheritance had particular resonance among Victorians, like Kipling, who sought to justify racist social hierarchies with "science."

"Ontogeny recapitulates phylogeny" – not Lamarck's words but a core tenet of neo-Lamarckism nonetheless – became "the search for signs of apish morphology in groups deemed undesirable," as Stephen Jay Gould observes in The Mismeasure of Man. The primitive nature of the child — as Karlin alludes to above in explaining Kipling’s appeal; the childish nature of the “primitive.”

Gould continues a few pages later,

Even Rudyard Kipling, the poet laureate of imperialism, used the recapitulationist argument in the first stanza of his most famous apology for white supremacy:
Take up the White Man's Burden
Send forth the best ye breed
Go, bind your sons to exile
To serve your captives' need:
To wait, in heavy harness,
On fluttered folk and wild –
Your new-caught, sullen peoples,
Half-devil and half-child.

Gould, for his part, is sometimes credited with turning the title of Kipling's famous collection into a dismissal: a "just-so" story as an untested, even untestable tale that neatly explains the world – one that accounts for natural and/or cultural phenomena quaintly but not scientifically. And it was, as Gould used it, a dismissal of sociobiology in particular, which he argued sought to reduce all human culture and behavior to evolutionary adaptation – speculation too untestable to ever count as a science.

(Wondering if the evolutionary psychologists are still mad at Gould about this? Check out the Wikipedia entry on the phrase.)

I’d been thinking about “just-so” stories this week — without really thinking about Kipling until I sat down to write this newsletter; without knowing, I do confess, the whole disciplinary drama surrounding the lower-case version of the phrase.*

More accurately, I guess, I'd been thinking about "just-so" stories and "folk science" – disputed science and stories, along with various academic fields that have scientific aspirations. About narratives that sound "just so" and that, even if we suspect they're not quite accurate, serve as adequate-enough explanations – if nothing else, they fit with other stories we've been told. "Feels true." Evolutionary psychology, for sure. Economics. Computer science. And so on.

We're bombarded with "just-so" stories about technology – stories that rarely get questioned but, to my ear at least, sound just a little too simple, a little too pat. That robots are coming for our jobs. That things are changing faster than ever. That the Internet helps us connect. That suddenly we're awash in information; that suddenly we're grappling with uncertainty. That computers help us think. That the brain is a computer, just not a very good one.

I'm less interested in any explanation – scientific or science-sounding or otherwise – as to why "just-so" stories are so appealing. These sorts of stories are easy to tell and easy to receive, whether or not you want to think we're hard-wired to tell them because they offer some sort of evolutionary benefit. (I'm rolling my eyes here. I really do need to take a break from reading evo-psych stuff.**) Rather, I like to scrutinize the histories and power of these stories – how they work, how they're wielded, and why.

But I do wonder: is the development of artificial intelligence neo-Lamarckian? It's certainly all bound up in eugenics – no surprise with that invocation of "intelligence" – as Ruha Benjamin has recently argued. Surely we can look to history to understand the ramifications of the politics of "intelligence" and hierarchy – to see how, for example, neo-Lamarckism and eugenics have been used to decree certain people "children" in order to strip them of their rights.


How the Camel Got His Hump

Heard of the "80-20" rule of AI-created teaching materials – I guess this is a thing? – where 80% of the stuff is useful and 20% needs tweaking? Yeah, this is a perfect example of a "just-so" story. (And it's more like a "20-80" rule, Dan Meyer graciously suggests.)

According to MIT Technology Review, "This AI system makes human tutors better at teaching math." No comment.

"Kids are learning how to make their own little language models," MIT Technology Review reports. How twee.

Via The Atlantic: "Schools Without ChatGPT Plagiarism." Marx would like us to ask, is it the structure? Or the superstructure?

How the Rhinoceros Got His Skin

"I Fired My Therapist and Hired ChatGPT" – a typical post on LinkedIn.

Google's CEO says that more than a quarter of the code at the company is now written by AI. No word on what percentage of everyone's time at Google is now devoted to zhuzhing it up.

Via 404 Media: "Inside an AI Training for Doctors." Meanwhile, "Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said," according to the AP.

How the Whale Got His Throat

"Classic Christmas song gets authorized Spanish reworking thanks to ‘responsible’ AI," TechCrunch reports. "Responsible AI" and Brenda Lee.

"Claude AI gets bored during coding demonstration, starts perusing photos of National Parks instead." "Gets bored" not "errors out."

"Facebook is Auto-Generating Militia Group Pages as Extremists Continue to Organize in Plain Sight" – via Wired.

Pretty sure I said this a few weeks ago, but Ars Technica reminds us that "40 years later, The Terminator still shapes our view of AI."

How the Alphabet Was Made

"Do we really need chatbots?" asks Max Read.

"Apple Intelligence is here, but it still has a lot to learn," says The Verge. As opposed to all the other AI shit that's out here, working so flawlessly.

Via Fast Company: "Generative AI promised creativity; instead, it's about summarization." I keep saying this, I know, but I definitely need to write more about this fascination with summarization.

An infinite AI-generated podcast. Because god forbid y'all pick up a book and read.

The Cat that Walked By Himself

Robert Downey Jr. vs AI. See also: Robert Downey Jr. in a play about AI.

"AI Slop is Flooding Medium" – I'm shocked, shocked! that a platform full of tech bros (who couldn't be bothered to set up their own domains to blog) is now full of AI slop.

Via Medium: "It's Time to Re-Design How We Think" – the case for replacing the scientific method with design thinking. My god, that's dumb.

"There is no Artificial Irony" by Benjamin Riley.

The Butterfly that Stamped

"Margaret Atwood says she's 'too old' to worry about AI." Same, dear.

Thanks for subscribing to Second Breakfast. Paid subscribers will hear from me on Monday – although it'll be a short one as I'll be recovering from handing out water and Gatorade to 55,000 runners at the NYC Marathon. But I do want to say a few words about hope and agency on the eve of the Presidential Election.

*I'm plenty familiar with Kipling's racism, what with being British. And dammit, I do catch myself uttering the final line of "Gunga Din" far too frequently. "Cultural literacy" can be terrible, can't it?

**I started re-reading The Diamond Age yesterday, and good grief, I’d forgotten that it’s set in a neo-Victorian future. Man, if you say this book inspired you to build your ed-tech company, I have some big big concerns.