Fitness Technology and the Templated Body

Some very rough thoughts about "smart watches" and the "templated self"

Cyborg anthropologist Amber Case argues that digital technologies, particularly social media, push us into "templated" identities by constraining the ways in which we can behave and present ourselves online. This "templated self" is "produced through various participation architectures," she writes, "the act of producing a virtual or digital representation of self by filling out a user interface with personal information. …The shape of a space affects how one can move, what one does and how one interacts with someone else." But as these technologies extend their influence from the digital back to the material world, one might ask whether they have material consequences — that is, if or how these technologies constrain our physical bodies in templates as well. How do they try to *shape* our bodies to conform to the expectations of profiles, algorithms, and (LOL) AI?

Confession: I wear a "fitness tracker" every day, and I have for quite some time, although I haven't always used this phrase to describe it. "Wearable." "Smart watch." "Activity tracker." Wrist-size "Skinner box." Giddy-up gadgetry. Whatever.

Much of this device's job is to monitor my movement and engineer my health and cajole me when I deviate from the patterns and signals it thinks I should be making.

I do try to be critical and skeptical; but there it is, strapped to my arm. So know that I am implicated in all of this that I write about here; and while I can say that my interest in these devices at first coincided with a curiosity about how behaviorism permeates all technologies — not just the educational technologies that I was researching at the time — my intellectual orientation did not position me outside of their influence.

Image credits: Age Barros...

Although I can define "good health" for myself — or, better yet, in concert with my doctor — my fitness tracker takes my user profile — namely, my age, my weight, my gender — along with all the data that it captures in order to "nudge" me towards certain kinds of behaviors. These are the behaviors that the corporation behind the device has decided are the markers and metrics and signals of "good health," such as that omnipresent (but at the end of the day, pretty arbitrary) demand for 10,000 steps a day.

While fitness trackers promise "personalization" in their recommendations — ah, here we have the echoes, always the echoes, of my work in ed-tech — what they actually provide is *standardization* against which one's individual movements and data are measured and judged. These templates have been built on aggregate data — this isn't simply a flaw in how the technology industry tries to shape our bodies; it's a problem in larger health and wellness circles. Think BMI, for example, a notoriously terrible biomarker first proposed in the nineteenth century by Adolphe Quetelet — a measurement derived from the height and weight data of European men to quantify "l'homme moyen," the average man. This remains the "ideal" against which all of our bodies — no matter our sex or race, for example — are evaluated.


Another confession: I first bought a fitness tracker after reading David Sedaris's essay on his Fitbit, in which he chronicles the way in which the device challenged him to take more steps every day. While this might sound at first like a signal of "good health," Sedaris is clear that what it actually demonstrates is something else altogether (and my buying a tracker after reading him surely says something about my own weird beliefs about discipline, behavior modification, and exercise):

I look back on the days I averaged only thirty thousand steps, and think, Honestly, how lazy can you get? When I hit thirty-five thousand steps a day, Fitbit sent me an e-badge, and then one for forty thousand, and forty-five thousand. Now I’m up to sixty thousand, which is twenty-five and a half miles. Walking that distance at the age of fifty-seven, with completely flat feet while lugging a heavy bag of garbage, takes close to nine hours—a big block of time, but hardly wasted. I listen to audiobooks, and podcasts. I talk to people. I learn things: the fact, for example, that, in the days of yore, peppercorns were sold individually and, because they were so valuable, to guard against theft the people who packed them had to have their pockets sewed shut.

At the end of my first sixty-thousand-step day, I staggered home with my flashlight knowing that I’d advance to sixty-five thousand, and that there will be no end to it until my feet snap off at the ankles. Then it’ll just be my jagged bones stabbing into the soft ground. Why is it some people can manage a thing like a Fitbit, while others go off the rails and allow it to rule, and perhaps even ruin, their lives?

Why is it, indeed.


These devices are, in so many ways, the heirs to B. F. Skinner's "psycho-technologies," offering feedback and rewards for good behavior. The Apple Watch, the device I moved to after losing my little Fitbit one too many times, encourages the wearer to "close your rings" — that is, complete three movement challenges every day: stand up and move once an hour (the stand ring); move at or above the pace of a brisk walk or complete an activity using the watch as a gauge (the exercise ring); and burn a certain number of calories per day (the move ring). As with Sedaris's Fitbit, users get badges for completing these goals consistently and repeatedly, day in and day out, and are encouraged to extend their "move" goal weekly, always increasing the number of calories they're burning. There are no "rest days" in the Apple Watch framework, a complaint that many users have registered (prompting many suggestions on how to take a break without actually losing your activity "streak" — operant conditioning indeed).

If, as The Verge and others argue, Apple fails at "recovery," it's perhaps because the company does not see "rest" as the problem it needs to solve for wearers. The problem — and this is certainly a narrative espoused not just by the tech sector but by plenty in public health — is too much rest. We are too sedentary, too stationary — and as such, this is the template into which the Apple Watch wearer is pressed. The "templated self" designed by the Watch is a self that needs to be prompted to get up and move around. It's a body that cannot be trusted to move of its own accord.


Case argues that digital templates shape and even circumscribe how we express our identities online. Fitness technologies are poised to do something similar, altering not just our bodies through "fitness" — ostensibly that's the goal, right? — but shaping and even circumscribing how we view our bodies and our health (and the identities tied to such).

While much of this is dependent on the kinds of data-fication and optimization that the tech sector extols — and okay, that capitalism has fostered since at least the advent of scientific management — it's also intertwined with companies' brands. And brands certainly shape and reflect our identities. This is why, for example, many athletes prefer Garmin over Apple, as its watches shout "athlete" not "lifestyle." ("Lifestyle" is a slur, I guess.) Indeed, my physical therapist/running coach asked me once when I was going to get a Garmin, even though, as I went through the motions of the lunge exercises she prescribed to strengthen my glutes, I was clearly wearing an Apple Watch on my wrist. "Real runners," or some such bullshit, wear Garmin.

Apple did release the "Ultra" in the fall of 2022 — a bigger device with a longer battery life — signaling its intention to cater to a user willing to pay for "the ultimate sports watch." I didn't buy an Ultra, despite being pretty invested in that whole damn Apple ecosystem. New to the sport of running and eager to immerse myself in it — both my body and my identity — I bought a Garmin. But it's not just the exterior look of the device and all that it symbolizes that leads to this templated identity. Different devices track different data; they present it in different ways; their algorithms highlight different trends and make different predictions; their nudges push you towards different goals, with different values.

There are plenty of questions we should ask about this: what are the consequences of compliance and non-compliance with their algorithms — mentally and physiologically? Is the data that's collected and presented to us accurate? Is it meaningful? Is it actionable? What is the imagined body of the user, the imagined identity? What exactly do these devices think we should be "optimizing" for? Are these really our goals? What happens if our bodies — or when our bodies — do not match the template?

Because they won't.

All of us deviate from "the norm" — such is the nature of statistics. Even as the algorithms on these fitness trackers purport to give us insights into our own performance, in the quest to improve our own health, they draw on aggregate data and models based on other bodies — bodies that, for a variety of reasons society and/or science deems "healthy." (As a post-menopausal woman, I can assure you that these algorithms don't know jack shit about me, Audrey; but they really don't know much about post-menopausal women in general because there just isn't a ton of research on us, particularly as athletes.)

If we turn over our decision-making to these devices, to these algorithms — how much should we eat, for example, based on the number of calories our fitness tracker says we're burning — we are not only letting these devices shape our physicality, we are letting them dictate what that physicality should be. Did your watch give you a bad sleep score, even when you're certain you slept pretty well? (This happens to me all the time, as my wrist tattoos prevent the heart rate monitor from functioning consistently. My watch will often say I didn't sleep at all — and from there, all my "readiness" scores are trash.) How did that make you feel? Did you go for a run because your watch said "run" even if you felt a little niggle in your knee? Does all the data make you less informed, make you second-guess your own perception of your body?

Image credits: Josh Marshall...

Fitness technologies shape how we think about fitness; they shape how we think about our movement — why we move, how we move, and so on. We covet the gadgets that promise to give us more and more data and deeper and better insights, supposedly to learn more about ourselves. And yet, we are simultaneously un-learning to trust ourselves (or to trust professionals — our teachers and coaches), waiting for the "nudge" and the badge to compel us to move.