Of course this isn’t just a book about hedging, that would be silly. There’s also haymaking, shepherding, walling, beekeeping, weaving, tanning, basketry, thatching, plowing, and the making of everything from ponds to quicklime, because Alex Langlands is obsessed with preserving (and if necessary recovering) the skills of the rural past. He wants you to understand what’s been lost to industrialization, and how our contact area with the world has shrunk, and why doing things with your body is part of being human, and … oh wait I’m sorry I nodded off, because I’ve written this all like twelve times already.
So why am I telling you about a book on how to do things by hand that you can do far more quickly and efficiently with a machine?
Well.
Langlands frames his book around the concept of cræft, which (as you can probably guess from that æsc) is the Old English origin of our modern “craft”. The ancestral word is richer and more complicated than the modern one, though, pointing to far more than handmade tchotchkes and beer with too much hops. The Dictionary of Old English explains:
“Skill” may be the single most useful translation for cræft, but the senses of the word reach out to “strength”, “resources”, “virtue”, and other meanings in such a way that it is often not possible to assign an occurrence to one sense in [modern English] without arbitrariness and loss of semantic richness.
Like the modern “craft”, it does convey a sense of ability, especially when it comes to one’s livelihood: the students in Ælfric’s Colloquy use cræft as well as weorc when discussing what they do all day. But it can also mean might or power: when the Old English Orosius tells us that the strength of the Medes failed them in battle, for example, it’s Meða cræft that gefeoll,1 and when Judah Maccabee’s foes join the fray, they begin to fight mid cræfte. Of course, there are semantic connections among these varied meanings: the ideas of physical strength and physical skill blend into one another at the edges, and a word for a thing you’re good at doing with your hands can also be used for a thing you’re good at doing with your mind. (After all, we still refer to writing as a “craft”.) And ideally you’re fairly talented at whatever set of things provides your livelihood! So we can say that Old English cræft broadly means something like “a person’s ability to bring his will to bear on the world, and his skill in doing so”.
There’s one more meaning, though, and it appears more or less exclusively in the writings of Alfred the Great: cræft as spiritual or mental excellence.2 Anglo-Saxon scholars had mostly used cræft as a way of rendering Latin ars, but when King Alfred translated Boethius into Old English he used cræft for Latin virtus, virtue as in moral excellence.3 A contemporary reader might be tempted to see this as merely an extension of the “mental skill” sense of the word (a virtuous person is one who is good at being good), but that would be misleading; the general meaning of cræft leaves the word freighted with powerful and inescapably physical implications. (Remember, too, that before the Reformation the Christian image of spiritual excellence universally emphasized asceticism, which necessarily involves the body a great deal.) Cræft as virtue is not an internal moral condition, it’s an internal activity, a kind of doing or making of the soul.
Or, as Langlands glosses his title, cræft is “a hand-eye-head-heart-body coordination that furnishes us with a meaningful understanding of the materiality of our world”.
Langlands is now a professor of archaeology at Swansea University, but he got his professional start as a circuit digger, the kind of “hired trowel” real estate developers pay to quickly catalog all the ancient remains they’re about to turn into the foundation of a new Tesco. It was not a fulfilling job — “crude and expedient” is the line he uses for his commercial excavations — and he was beginning to grow disillusioned with archaeology as a field. So naturally he did what any sensible person would do if he didn’t like his job: he applied to be on a TV show. This was in 2003, and BBC Two was advertising for people to spend a year in 1620, living on and running a historical farm using reconstructed period techniques and equipment. Langlands got the gig (along with Ruth Goodman and another archaeologist whose book I haven’t read), and had a wild year in the Stuart era and then a few more in the Victorian and Edwardian periods.
The shift from examining the archaeological record to experiencing how it was made was an eye-opener, and the success of that first program took him by surprise: “I’d often wondered to myself who on earth would want to watch a bunch of cranky, oddball re-enactors and archaeologists bimbling around in costume, pretending to be in the past”, he writes. “But I didn’t care too much because I was spending nearly every single hour of every day immersed in historical farming. I was tending, ploughing, scything, chopping, sweeping, hedging, sowing, walling, slicing, chiselling, digging, sharpening, thatching, shovelling; the list was almost endless.” And the longer he spent doing all these things (he was on three more shows), the more he realized that the skills, and the knowledge they required, were slipping away.
True cræft, in Langlands’s version, is a combination of know-how and make-do. It’s when you live on the Outer Hebrides and don’t have any trees, so you use whatever driftwood washes up as the ridgebeam for your roof. No timber for the rafters? No problem — a sufficiently strong rope, drawn tightly enough over the ridgeline and secured on both sides, makes something like a giant net on which you can lay your thatch. The straw left in the fields after harvest will do nicely for making both rope and thatch, but if (say) it’s the early twentieth century and you’ve abandoned cereal crops because cheap North American grain knocked the bottom out of the market, then you can make rope from heather and thatch with bunches of bracken. (On the Danish coast, they use seaweed.)
Or it’s when you’re an early Anglo-Saxon who wants to boil some water. A few generations ago, some Romano-Briton on the same spot would simply have bought a beautifully thrown pot from any one of a dozen proto-industrial centers across the Empire, but these days that production has slowed or stopped and the trade networks that would’ve brought them to you are kaput anyway. All you’ve got is some lousy local clay, too weathered to be easily worked. You don’t even have the fuel to fire it hot. So you add organic tempers like grass or chaff (or even dung) to make the clay more plastic, you shape it by hand without a wheel, and when you fire your pot the chaff burns away and leaves tiny voids in the ceramic. Your pot is soft, it’s brittle, it’s kind of lumpy, and fifteen hundred years from now Bryan Ward-Perkins is going to point to it as evidence that civilization collapsed when Rome fell — but it’s still a pot, and it still holds water. You’ve made ingenious use of the world around you to solve your problem. You are, in a word, cræfty.
So when Langlands says cræft, he means the way people behave under conditions of scarcity and resource constraint. And when you’re in that kind of situation, of course you have to be intimately familiar with all your materials — you have to squeeze every last drop of performance out of them! And while Langlands is interested in preindustrial techniques, this isn’t just a matter for drystone wallers and skep-making beekeepers; you can also be cræfty with machines or computers. Cræft is the Havana mechanics who keep 1950s cars running on an income of $40/month, or the engineers who fit all the computer code for the Apollo Guidance Computer into 80 kilobytes. It’s the defining feature of the Real Programmer who “tuck[ed] a pattern matching program into a few hundred bytes of unused memory in a Voyager spacecraft that searched for, located, and photographed a new moon of Jupiter”. We rightly admire these cræfty solutions for their elegance and their makers’ skills, but aside from a few weird hobbyists we don’t imitate them. You don’t spend days hauling rocks and building a wall to keep your sheep in when you have wire fencing. You don’t learn the skies so you can time your haymaking for clement weather when you can just wrap your machine-mown grass in plastic and make silage instead. And you don’t work in unreal mode when you have a 64-bit processor. Technological advances have freed up our time precisely because they’ve freed us from the need for clever, thoughtful, material-aware solutions to our problems. No one is cræfty in the midst of abundance, because they don’t have to be.
Your reaction to that last paragraph reveals where you fall in the Wizard/Prophet divide: are you pumping your fist for humanity, or are you a little sad that a kind of mastery has been lost? Is our ability to simply throw more resources at the problem and go on with our day a blessed liberation from the bonds of brute necessity, or is it a tragic separation of our thinking, making, doing selves from our world? Are our practical limitations something to be defeated or innovated around, or are they something to embrace because they are, in some sense, good for us?
Langlands is, unsurprisingly, well over on the Prophet side. He warns that “while some machines are clever, the net result of our using them is that we become lazy, stupid, desensitized, and disengaged” — it’s not that a thing made by hand is better as an object than its mass-produced counterpart (although in some cases it is, and a stone wall does last longer than a wire fence), it’s that the making changes the maker. And while he likes to warn that climate change or Peak Oil or the fragility of international supply chains make our uncræftiness a serious survival risk (think of those poor imported-pot-dependent Britons when Rome withdrew!), that’s not really the point. Even if our technological society never falters — even if we soar to greater and greater heights of prosperity and can afford to automate and mechanize more and more of our interface with the world — Langlands argues that would just mean more missing out.
Jane Psmith, “REVIEW: Cræft, by Alexander Langlands”, Mr. and Mrs. Psmith’s Bookshelf, 2025-03-24.
- As in modern German and Dutch, Old English used the ge- prefix for past participles.
- For more on Old English cræft, especially in the Alfredian corpus, see here. Langlands quotes from the late Peter Clemoes, who wrote extensively on the topic, but no obliging Kazakh has put that online for me.
- This is a fascinating word choice, because virtus is also a complicated and interesting word; it’s derived from the Latin word for man, vir, and means things like “force” but also “manliness” or “bravery” (like Greek ἀνδρεία). In the classical world, it came to mean something like moral worth or excellence in a particularly masculine way, and though it was adopted as a western Christian term for something like spiritual ἀρετή, it retained some of those echoes.
May 15, 2026
QotD: Rediscovering Cræft
May 6, 2026
QotD: Deskilling society through AI
It’s always a little dangerous to write about any rapidly-developing technology, because chances are pretty good that whatever you say will be incredibly and obviously dated within a few months. But I’m going to plant my flag anyway, because even if nothing else changes — even if there’s no meaningful advancement in LLM performance beyond the state-of-the-art right now, in March 2025 — the potential disruption is already so enormous that you can think of it as a kind of Industrial Revolution for text.
Just like in the first one, we’ve figured out how to use machines to do a broad swathe of things people used to do, swapping energy and capital in for human labor. And just like in the first one, the output isn’t necessarily better (in fact, it’s often worse), but it’s so much cheaper in terms of human time and thought and effort that the quality almost doesn’t matter. Sometimes that’s wonderful: if you desperately need to put a roof on your barn right this moment, it’s a blessing to be able to slap on some corrugated tin instead of going to the effort of thatching. When you have to write your seventeenth letter to the insurance company explaining that no, they really ought to be covering this, it’s a relief to hand the composition off to Claude instead. But do that too much and you forget how to do it yourself — or more plausibly, you never learn.
The greatest risk of AI is probably “we all get turned into paperclips”, or maybe “someone uses it to design a novel and incredibly fatal pathogen”, but the most certain risk — the one that’s already here, at least on the edges — is a great deskilling. Just as the mechanization of physical labor lost us all those traditional skills that Langlands describes, the ability to automate cognitive tasks undermines their acquisition in the first place. Why pay any attention at all to word choice and metaphor and prosody when ChatGPT can churn out that essay in a few seconds? Why worry about drafting a convincing email when you’re pretty sure your recipient is just going to ask Grok for a summary?1 Why learn to code when a machine can do it faster?
I was recently informed that someone — “not anyone you know, Mom, someone at another school” — used ChatGPT to write his essay about the causes of the Civil War. This was obviously deeply upsetting to the congenital rule-follower who reported it to me, on account of THAT’S CHEATING (you must imagine this in the whiniest she-touched-my-stuff voice possible), but it was a good teachable moment — for me, if not for the history teacher at another school. What’s the point of an essay about the causes of the Civil War, anyway? It can’t be that the teacher wants to know the answer: she can find a dozen books on the topic if she cares to look, each more cogent and thorough than anything a middle-schooler is likely to produce.2 Heck, even the Wikipedia article will probably give her a better understanding. And if it’s not for the teacher’s benefit, it’s certainly not for the benefit of any other audience, since as soon as the essay is marked and graded it’ll probably be crumpled up and tossed into the recycling bin. No, it’s for the kid.
The point of writing an essay about the causes of the Civil War is not to have an essay about the causes of the Civil War, it’s to undergo the internal changes effected by the process of thinking through, planning, drafting, and editing the darn thing. Writing forces you to put your thoughts in order, to shape whatever mass of inchoate ideas is bouncing around in your head into something clear and reasoned you can pin to the page. The thinking is the hard part; putting words to it is simple by comparison. (This book review began life as about seven hundred words of stream-of-consciousness riffing, with only the vaguest kind of structure. When I experimentally pasted it into an LLM and asked for an essay, the result was terrible.) But even the putting of words is a valuable skill: what’s the right tone here? What’s the right word? Do I want to say “writing forces you to” or “when you write you have to”? How do they feel different? Asking a machine to do this for you is like bringing a forklift to the gym.
Of course, that kid who had ChatGPT write his essay was almost certainly thinking of the assignment not as one small step in the alchemical process of self-transformation that is education but as basically equivalent to an appeal letter to the insurance company: just another dumb hoop you have to jump through in your interactions with a vast impersonal machine that doesn’t particularly want to grind you to dust but wouldn’t mind it either. And since this was at another school, he might not even be wrong. Maybe the teacher was just pasting the rubric and the essays back into ChatGPT and asking it to assign a grade.3
But there’s an even bigger problem than lying about who (or what) has done the work, which is lying about whether the work has been done at all. LLMs make lying very easy indeed. Yes, yes, sometimes they hallucinate and tell you things that are patently untrue, and that’s a bigger danger for students and other people who don’t have the background to notice when something seems off — this is all true, but it’s not what I mean.
LLMs, when working exactly as intended, enable human falsehood — because our society relies on written records as proof of work. Until recently that was fine, because writing down lies actually used to be pretty hard: putting together a convincing false report from scratch — maintenance records for the airplane you’re about to board, say, or a radiologist’s report on your brain scan — was almost as time-consuming as actually checking the things that were supposed to be checked and then documenting them, and the liar had to spend the whole time aware of their own dishonesty. (Not that this stops everyone, of course.) But now that it takes about two clicks to generate an inspector’s report for the house you’re considering buying, or the pathologist’s findings in your biopsy, how much are you going to trust that they actually looked?
LLMs can be useful tools,4 but all tools change what we make and how we make it. It’s often a good tradeoff! Sure, each individual example of simplification and automation in the name of efficiency is a tiny bit of alienation, removing the maker from the making, but it’s also a gift of time we can spend on other things: I couldn’t write this if I also had to sew my family’s clothes and wash our laundry by hand. And yet those bits pile up, and once it becomes possible to exist in the world without really needing to come into contact with it, once you can get by without ever really needing to make anything, some people just won’t. And that’s terrible! Being entirely without cræft — never bringing mind-body-soul into harmony with one another and then using them to master the world — means missing out on something deeply human.
Jane Psmith, “REVIEW: Cræft, by Alexander Langlands”, Mr. and Mrs. Psmith’s Bookshelf, 2025-03-24.
- All the “AI written/AI read” communication begins to resemble Slavoj Žižek’s perfect date:
“So my idea of a perfect date is the following one. We met. Then I put, she puts her plastic penis dildo into my … “stimulating training unit” is the name of this product. Into my plastic vagina. We plug them in and the machines are doing it for us. They’re buzzing in the background and I’m free to do whatever I want and she. We have a nice talk; we have tea; we talk about movies. What can be — we paid our superego full tribute. Machines are doing — now where would have been here a true romance. Let’s say I talk with a lady, with the lady because we really like each other. And, you know, when I’m pouring her tea or she to me quite by chance our hands touch. We go on touching. Maybe we even end up in bed. But it’s not the usual oppressive sex where you worry about performance. No, all that is taken care of by the stupid machines. That would be ideal sex for me today.”
- Well, okay, most of them.
- See footnote one again.
- Personally I’ve found them useful in three cases: (1) when I’m blanking on how to begin an email I will occasionally ask for a draft, which inevitably makes me so mad about how bad it is that I immediately rewrite it in a way that doesn’t suck; (2) when it’s Sunday night and I need a picture of a Japanese man in a business suit and a samurai helmet for a book review going up in the morning; and (3) when I can’t figure out the right search term for my question. (Turns out it was “sigmatic aorist”. Thanks, Claude.)
April 28, 2026
QotD: The cultural history of the Tidewater and Deep South regions of the United States
The first nation [as described in American Nations, by Colin Woodard] that struck my interest was Tidewater, earliest of the English nations. (El Norte and New France, as Woodard names them, are the remnants of colonial empires that predate English settlement in North America.) Founded on the shores of the Chesapeake Bay by gentlemen from southern England, and with a sizeable population influx a generation later from Royalists who had found themselves on the losing side of the English Civil War, Tidewater began with an aristocratic ethos. Its gentlemen wanted to recreate the rural manor life of the English landowners: ruling benevolently over their estates and the tenants who inhabited the associated villages, presiding over the courts and local churches, hunting and visiting their neighbors and paying for the weddings and funerals of the poor. To play the role of the peasantry in this semi-feudal system, they imported indentured servants from among the English poor. But unlike English villagers, who were engaged in a variety of subsistence farming endeavors or local forms of production in much the same way that their ancestors had been, the indentured servants of Tidewater were mostly put to work farming tobacco for export.
This may not seem like a huge difference — does it really matter if you’re growing wheat or tobacco, if you’re farming someone else’s land? — but it had profound implications for what happened after the indenture. In theory, the formerly-indentured should have taken on the role of either the English tenant farmer (think Emma’s Robert Martin) or yeoman/freeholder (a small-time landowner but not of the scale or social class to be a “gentleman”). In practice, though, the colony was a plantation economy exporting a cash crop: there was very little local manufacturing, since it was so easy for a ship from London or Bristol to sail right up to some great landowner’s dock on the river and unload whatever he might have ordered. Independent small-scale farmers simply couldn’t compete with their larger neighbors in tobacco exports, and especially not if they also had to pay rent. But luckily for them, they had something no Englishman had had for centuries: empty land nearby. Or, you know, sort of empty. (Several of the rebellions in early Virginia were fought over the colonial government’s refusal to drive the Indians off the land former servants wanted to settle.) They could just leave.
The obvious solution for the Tidewater elites — the clear way for gentlemen to maintain an aristocratic lifestyle without a peasantry tied to the land — was African slaves. And here’s the important difference between Tidewater and its neighboring nation, the Deep South: Tidewater turned to slavery in the hopes of perpetuating their social structures, while the Deep South was envisioned from the first as a slave society.
The Deep South had been founded in the 1670s by Barbados sugar planters who ran out of room on their tiny island and were now exporting their particularly brutal combination of slave gangs and sugarcane to the coastal lowlands around Charleston Harbor. (Like the Tidewater gentry, the Barbadians had originally experimented with indentured servants from Britain, but they were worked to death so rapidly that the authorities objected.) The planter class quickly became phenomenally wealthy — by the American Revolution, per capita wealth in the Deep South was four times that of Tidewater and six times either New York or Philadelphia, and the money was much more concentrated than anywhere else in the colonies — but unlike the manorial idyll of Tidewater, with its genteel pursuits and colonial capitals all but abandoned when the legislature was out of session, the Deep South planters spent as much time as possible in the city.
Charles Town (later Charleston), South Carolina, modeled on the capital of Barbados, was filled with theaters, taverns, brothels, cockfighting rings, private clubs, and shops stocked with goods imported from London. Life in the city was a constant churn of social engagements, signalling, and status competition: in 1773, a pseudonymous correspondent wrote in the South Carolina Gazette that “if we observe the Behavior of the polite Part of this Country, we shall see, that their whole Lives are one continued Race; in which everyone is endeavouring to distance all behind him, and to overtake or pass by, all before him; everyone is flying from his Inferiors in Pursuit of his Superiors, who fly from him with equal Alacrity …” The planters of the Deep South had no interest in being lords of their estates, which were managed by overseers, or indeed in their land or the people who worked it. Certainly there existed poor whites in the colonies of the Deep South, but they never entered into the conversation: where Tidewater imagined agricultural labor performed by the English “salt of the earth” but had to fall back on slaves, the Deep South always planned on slaves.
This may not seem like an important difference, especially if you’re a slave,1 but it matters a great deal for national character. Culture, after all, lives as much in a people’s values and ideals as in their daily routines: a culture that praises loyalty to clan and family will behave very differently from one that lauds fair dealing with strangers. And the Deep Southern ideal, the nation’s vision of how life ought to be, was more or less Periclean Athens: a tremendous efflorescence of wealth, art, and personal distinction for the great and the good, with no consideration whatsoever for the slaves and metics who made up the bulk of the population. A good life meant leisure and luxury, wealth and freedom, the full exploration of personal capacity for the few and who cares about the many. The Tidewater ideal, on the other hand, was basically the Shire: bucolic, rural, politically dominated by a cousinage of great families who shared a profound sense of noblesse oblige and populated by a virtuous, hardworking yeomanry who knew their place but were worthy of their betters’ respect.
Did that world actually exist? Of course not, neither here nor in its English model,2 any more than the Puritans’ commonwealth in Massachusetts Bay was a new Zion inhabited by saints. But a culture’s picture of how life ought to be determines its reaction to changing circumstance, and Tidewater pictured an enlightened rural gentry ruling benevolently over lower orders who nevertheless mattered. In contrast to the aggressively middle class northern nations, the fiercely independent Appalachians, and the elite-centric Deep South, Tidewater imagined itself as an aristocracy. And it was the only one among the American nations.
Tidewater had a disproportionate influence on the early United States, contributing far more than its fair share of early statesmen and generals as well as a healthy dose of the philosophical underpinnings for many of our founding documents. Unfortunately for the lowland Virginia gentlemen, however, they were hemmed in to the west by the hill people of Greater Appalachia: when the other nations began to expand deeper into the continent after 1789, Tidewater was stuck in its starting position. Soon the nation that had been “the South” on the national stage was dwarfed by Greater Appalachia (which more than doubled between 1789 and 1840) and especially by the Deep South (ten times larger). When the young United States began to polarize over the issue of slavery, Tidewater — by then a minority in Maryland, Delaware, North Carolina, and even Virginia3 — had to retreat to the political protection of the Deep South and began to lose its cultural distinctiveness. It never really emerged again as its own ideological force.
Jane Psmith, “REVIEW: American Nations, by Colin Woodard”, Mr. and Mrs. Psmith’s Bookshelf, 2024-02-19.
- Though it actually mattered a great deal to slaves, who were imported to the Deep South in great waves only to be worked to death; the enslaved population of Tidewater, by contrast, increased steadily over the entire antebellum period.
- Though I will point out that Akenfield suggests the total immiseration of the tenant farmers in the early 20th century has something to do with the land being owned by rich farmers and implies that the local gentry are more generous employers.
- West Virginia’s eventual secession back to the Union would put Tidewater back in the majority there.
April 22, 2026
QotD: Traditional Chinese approaches to science
Those of you who have studied physics know that the laws of motion are usually introduced through the mechanics and dynamics of point particles, or of simple objects acting under the influence of discrete and coherent forces. The reason for this is straightforward: even a tiny bit more complexity, and the system’s behaviour quickly dissolves into a morass that’s analytically intractable and computationally infeasible. The fact that the mutual gravitational influences of just three celestial objects result in chaotic dynamics has entered popular culture as the “three-body problem”. Even a simple double pendulum is impossible to predict, despite all kinds of simplifying assumptions (massless rods, no friction, no air resistance, etc., etc.).
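(An aside of my own, not from Needham or the review: the double pendulum’s unpredictability is easy to see numerically. The sketch below — standard textbook equations of motion, integrated with fourth-order Runge–Kutta — runs two pendulums that differ by a hundred-millionth of a radian and watches them end up in entirely different states. All parameter choices here are illustrative.)

```python
import math

G, L1, L2, M1, M2 = 9.81, 1.0, 1.0, 1.0, 1.0  # gravity, arm lengths, bob masses

def deriv(s):
    """Equations of motion for a frictionless double pendulum.

    State s = (theta1, omega1, theta2, omega2), angles from the vertical.
    """
    t1, w1, t2, w2 = s
    d = t2 - t1
    den = (M1 + M2) - M2 * math.cos(d) ** 2
    a1 = (M2 * L1 * w1 ** 2 * math.sin(d) * math.cos(d)
          + M2 * G * math.sin(t2) * math.cos(d)
          + M2 * L2 * w2 ** 2 * math.sin(d)
          - (M1 + M2) * G * math.sin(t1)) / (L1 * den)
    a2 = (-M2 * L2 * w2 ** 2 * math.sin(d) * math.cos(d)
          + (M1 + M2) * (G * math.sin(t1) * math.cos(d)
                         - L1 * w1 ** 2 * math.sin(d)
                         - G * math.sin(t2))) / (L2 * den)
    return (w1, a1, w2, a2)

def rk4_step(s, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = deriv(s)
    k2 = deriv(tuple(x + 0.5 * dt * k for x, k in zip(s, k1)))
    k3 = deriv(tuple(x + 0.5 * dt * k for x, k in zip(s, k2)))
    k4 = deriv(tuple(x + dt * k for x, k in zip(s, k3)))
    return tuple(x + dt / 6 * (a + 2 * b + 2 * c + e)
                 for x, a, b, c, e in zip(s, k1, k2, k3, k4))

def separation(a, b):
    """Euclidean distance between two states in phase space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Both arms horizontal and at rest: a well-known chaotic regime.
s = (math.pi / 2, 0.0, math.pi / 2, 0.0)
s_nudged = (math.pi / 2 + 1e-8, 0.0, math.pi / 2, 0.0)

dt, steps = 0.0005, 40000  # 20 simulated seconds
for _ in range(steps):
    s, s_nudged = rk4_step(s, dt), rk4_step(s_nudged, dt)

print(separation(s, s_nudged))  # vastly larger than the initial 1e-8
```

No measurement of a real pendulum is accurate to a hundred-millionth of a radian, so after a few seconds the true trajectory could be anywhere — which is the sense in which the system, though perfectly deterministic, is “impossible to predict”.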
It’s not just physics. The central technique of modern science is that of boiling something down to its absolute simplest form, understanding the simplest non-trivial case as thoroughly as possible, and only then building back up to more familiar situations. In physics we start with contrived gedankenexperimenten: “what if two particles collided in a vacuum”, and build experimental apparatuses designed to mimic these ultra-simple cases. In economics we imagine markets with a single buyer and a single seller, both perfectly rational. In political philosophy we imagine human beings in a state of nature, or societies established by a primitive contract. In biology we try to understand the functions of organisms, organs, or other systems by recursively taking them apart and trying to figure out each part in isolation. In every case, what we’re engaging in is “analysis”, ἀνά-λυσις, literally a “thorough unravelling”, understanding the whole by first understanding its parts.
This approach is totally alien to the traditional Chinese understanding of reality, which held instead that no part of the world could be understood except in its relation to the rest of the universe. You can see this in the domains of science where they did maintain a lead. Is it really a coincidence that the Medieval Chinese got frighteningly far with the mathematics of wave mechanics? Or quickly deduced the causes of the tides? Or made great strides with magnetism? In each of these cases, the physical phenomenon in question was compatible with an “organicist conception in which every phenomenon was connected with every other according to a hierarchical order”. Indeed, in all of these cases real understanding was aided by the assumption that a universal harmony underlay all things and connected all things. The tides really are in harmony with the moon, and the lodestone with the earth.
This science, founded on holism rather than on analysis, made great strides in some fields but fell behind in others. It readily imbibed action at a distance, but it could not and would not tolerate the theory of atoms. In this way it serves as a strange mirror of Medieval European science, which also loved the theory of correspondences, also loved alchemy and disdained analysis. The difference is that the glorious intellectual synthesis of Neo-Confucianism was never seriously challenged: it survived the Mongol conquest, it survived the desolation of the civil wars that preceded the Ming founding, it survived everything until communism. In contrast, the eerily similar Thomistic metaphysics of the High Middle Ages was broken apart by the Reformation, and sufficiently discredited that analytical methods could take their first tentative steps.
This is, to be clear, my own crazy theory, because Needham never really gave a solution to his own puzzle. I came up with it only as a sort of thought-experiment, because I wanted to see if I could find a solution to Needham’s puzzle that disdained material explanations in favour of intellectual tendencies, because I find such theories curiously underrated in our culture. I only half-believe this theory,1 but I find it interesting because twentieth-century Western science has in some ways come back around to the holistic view of things: from Lagrangian methods in theoretical physics, to category theory in mathematics, to systems biology and ecology. It wouldn’t be the first time that a way of viewing the world useful to one age became an impediment to reaching the next one. The question is: what are we missing today?
John Psmith, “REVIEW: Science in Traditional China, by Joseph Needham”, Mr. and Mrs. Psmith’s Bookshelf, 2023-08-14.
- The thing about material conditions is they usually are dispositive!
April 15, 2026
QotD: Archaeological evidence of human achievement misses a lot
We associate human achievement, striving, and greatness with the archaeological remains that testify to them — things like written works and monumental architecture — because often that’s our only evidence that it ever happened. But sometimes, a little clever digging (literal or figurative) can uncover glories of a barbarian past. The most obvious example, of course, is that of the Iliad and the Odyssey, products of a non-state people’s oral culture in the Greek Dark Ages and only recorded with the reintroduction of writing centuries later. How many other texts would be considered classics of world literature if only they had ever become, you know, actual texts? But let’s go beyond art: if you want to talk world-bestriding greatness more broadly, look no further than the ferociously expansive Proto-Indo-Europeans, whose obsession with “imperishable fame” left their DNA all over Eurasia and their culture and even mythology so deeply embedded in their daughter cultures that it can be convincingly reconstructed today.1 Or the Polynesians, whose expansion is arguably even more impressive given how much harder it is to travel across ocean than steppe. Sure, it’s not the Lion Gate or the Mona Lisa — or even the cuckoo clock — but the remains we do have should remind us of the other cultural achievements that have doubtless been lost like tears in the rain.
“What cultural achievements?” you may ask, eyeing the world’s few remaining hunter-gatherers, and it’s true: we judge barbarians of the past by analogy to barbarians of today.2 But that’s not entirely reasonable; there’s no reason to assume that a lack of cultural elaboration among, say, the highlanders of Papua New Guinea reflects anything about the Lapita culture, let alone about the Middle Stone Age or Neolithic Europe.3 It reminds me of the friend who once explained to me, quite seriously, that he would never work for a startup because they’re all culturally dysfunctional and have stupid products. And, you know, statistically he’s probably right: most startups suck, because if they’re any good at what they do they don’t stay startups for long.4 But we all know that different cultures are different: some groups of people see a horizon and burn with the desire to know what’s beyond it, and others don’t. Well, guess who those horizons are going to end up belonging to?
Of course there’s something nice about things that last: the written works and monumental architecture give succeeding generations something to point to and discuss, a jumping-off point for their own striving. Reading Latin is great, partly because you can read what the Romans had to say but more because you can read the same things that every educated person since the Romans has read. But that’s talking about their utility for us, not anything intrinsic to them; if the Huns or the Mongols or the Turks had come a little farther west and despoiled a little more thoroughly, it wouldn’t have retroactively detracted from the grandeur that was Rome. It would simply have turned it into a dark age because it would have left us blind.
Jane Psmith, “REVIEW: Against the Grain, by James C. Scott”, Mr. and Mrs. Psmith’s Bookshelf, 2023-08-21.
- Calvert Watkins argues for a Proto-Indo-European Ur-myth in the charmingly-titled How to Kill a Dragon: Aspects of Indo-European Poetics, which really ought to contain more stat blocks than it does.
- Or, more often, a hundred years ago, since there are vanishingly few non-state peoples left.
- I can’t get over how annoying it is that there’s an entirely different set of terms for periods of human history depending on what continent you’re discussing.
- I’m sorry if it’s uncool, but by the time you employ someone with a certification from the Society of Human Resource Managers you’re not really a startup anymore even if your office fridge is full of energy drinks.
April 6, 2026
QotD: Taylorism
In the world of management, the ideology of generic, domain-agnostic expertise first made its appearance in the late 19th century under the name of “scientific management”, or “Taylorism” after its godfather Frederick Winslow Taylor. Taylor’s insight was that the same engineering principles used to design a more economical or efficient product could just as well be applied to the shop floor itself. In his view, the workers, overseers, and production processes of a factory all combined to form a great living machine, and that machine could be optimized and made more efficient by an application of scientific attitudes.
Taylor was unpopular in his own day and is even less popular today, because his particular brand of optimization of the great living machine was all about stripping autonomy (or as Marx would say, “control and conscious direction”) from workers. But the particular kind of optimization he advocated is less important than the conceptual breakthrough that while a nail factory and a car factory might look very different on the surface, they are both governed by the same set of abstract laws: laws of time and motion, concurrency, bottlenecks, worker motivation and so on. A master of those laws could optimize a nail factory, and then go on to optimize a car factory, and could do both without knowing very much at all about nails or cars.
Who could have a problem with that? Even I don’t think it’s entirely wrong — I may have misgivings about the sheer volume of people going into fields like management consulting, but I’ll admit that there remains alpha in asking a smart and incisive outsider to take a look at your operation and tell you what seems crazy. The trouble comes with confusing that sporadic, occasional sanity-check with the actual business of leading a team of people who are working together to achieve an objective. Because, get this, it’s impossible to lead such a team without a deep understanding of the details of every person’s tasks.
It’s surreal to me that this point has to be made, yet somehow it does. If the team you lead makes nails, you need to know everything there is to know about making nails. If the team you lead operates a restaurant, you need to be an expert, not in “management”, but in restaurants. If the team you lead sells mortgage-backed derivatives, you better know a heck of a lot about finance in general, mortgages in particular, the art of sales, and the specific world of selling financial instruments. There are a thousand reasons why this is true, but consider just one: a subordinate is failing at a task, and tells you that it isn’t because he’s lazy or unqualified but because the task is unexpectedly difficult. How on earth can a manager evaluate this claim without being able to do the job himself?
There’s another, very different reason managers need to be experts in whatever it is their team is doing, and it has to do with morale. A subordinate in any sort of hierarchical organization needs to see that his superior can do his own job as well or better than he can. Almost everybody gets this. In a high-pressure commercial kitchen, if a chef or sous-chef doesn’t like the performance of one of their line cooks, they will often leap in, take over that cook’s station, and begin “expediting.” This has a dual purpose: it both relieves a genuine production bottleneck, and also acts as a showy demonstration of prowess, reminding everybody that they got to be the boss through excellence. At the better tech companies, those managing software engineers are always former engineers themselves, and often the very best of the lot. Just like a chef would do, an engineering manager needs to be able to seize a computer and begin expediting under pressure, both to solve a real problem and as a dominance display. But it’s not just about keeping the troops in line, it’s about inspiring them. Nothing motivates a soldier like seeing his commander leading the charge, weapon in hand.1
John Psmith, “REVIEW: Scaling People by Claire Hughes Johnson”, Mr. and Mrs. Psmith’s Bookshelf, 2023-08-28.
- This shows up in places you wouldn’t expect to. I was once cast in a show, and quickly came to understand that our director could (and often did) leap onto the stage, snatch a script out of somebody’s hand, and play their part better than they could. For any part. Before he did this to me, I found him annoying and bossy. Afterwards, I would follow him into the Somme.
Update, 7 April: Welcome, Instapundit readers! Have a look around — you may find some of my other posts of interest. I send out a daily summary of posts here through my Substack (https://substack.com/@nicholasrusson), which you can subscribe to if you’d like to be informed of new posts in the future.
March 28, 2026
QotD: The moment the American empire began to decline
There are two stories from the run-up to the American invasion of Iraq that I can’t get out of my head. The first is that in the final stages of war planning, the US Air Force was drawing up targeting lists for the sorties they expected to make. They already had detailed plans1 for striking Iraq’s air defense systems, but they worried that they would also be asked to disable Iraqi WMD sites. So the Air Force pulled together a special team of intelligence officers to figure out the right coordinates for all the secret factories and labs that were churning out biological weapons and nuclear materials. Try as they might, they couldn’t find them. So … they just kept on looking.
The second story comes from an anonymous source who described to Michael Mazarr, the author of this book, the basic occupation strategy that the National Security Council was settling on. The concept was that once you “cut off the head” of the Iraqi government, you would witness a “rapid and inevitable march toward Jeffersonian democracy”. What I find amazing about this is that nobody even stopped to think about the metaphor — how many things march rapidly and decisively after being decapitated?
I am of the exact right age for the Iraq War to be the formative event of my political identity.2 But even if that hadn’t been true, it still feels like the most consequential geopolitical event of my life. The United States spent trillions of dollars and caused the deaths of somewhere between half a million and a million people in Iraq alone. The goal of this was “regional transformation”, and we transformed the region all right. The war destabilized several neighboring regimes, which led them to collapse into anarchy and civil war. Consequences of that included millions more deaths and the near extinction of Christianity in the place it came from.
As an American, I didn’t feel any of this directly,3 but with the benefit of hindsight the war looks even more epochal for us. It marks, in so many ways, the turning point from our decades of unchallenged global supremacy to the current headlong charge into “multipolarity”. I know this may sound melodramatic, but I truly believe future historians will point to it as the moment that we squandered our empire. Remember, hegemonic empires work best when nobody thinks they’re an empire. True strength is not the ability to enforce your commands, it’s everybody being so desperate to please you that they spend all their time figuring out what you want, such that you don’t even have to issue edicts.
Between the fall of the Soviet Union and the Iraq War, American global dominance was so unquestioned we didn’t even have to swat down any challengers. This is a very good position for an empire to be in, because it means you don’t run the risk of blunders or surprise upset victories that make you look weak and encourage others to take a chance. Conversely, there’s a negative spiral where the hegemon has to start making demands of its clients, which makes the clients resentful and uncooperative, which in turn means that they have to be told what to do. All of this makes the hegemon-client relationship start to look less like a good “deal” and more purely extractive, which can rapidly lead the whole system to fall apart.
Iraq was the moment the American empire went into this negative cycle.
Even if you don’t agree with me about that, presumably you will agree that it was very bad for American soft power and prestige, bad for a number of friendly regimes in the area, and bad for our finances and our military readiness. So to anybody curious about the world, it seems very important to ask why we did this, why we thought it was a good idea, and how nobody predicted the ensuing debacle that seems so obvious in hindsight.
The conventional answers to these questions tend to be either “George W. Bush was dumb” or “Dick Cheney was evil”. I totally reject these as answers. Or I think at best they’re seriously incomplete: if the first Trump administration taught us anything, it’s that the US President can’t actually do very much on his own if the bureaucracy is set against him. The United States is an oligarchy, a kind of surface democracy; big decisions don’t happen without a lot of buy-in from a lot of people. More to the point, the decision to invade Iraq actually was endorsed and supported by pretty much every important politician and every institution, including the whole mainstream media and most of the Democratic Party. Blaming it on a single bad administration is too easy. It’s an excuse designed to avoid asking hard questions about how organizations filled with well-meaning people can go totally off the rails.
Fortunately, Michael Mazarr has written the definitive4 book on this very question. It’s not a history of the Iraq War and occupation: it’s a history of the decision to invade Iraq, ending shortly after the tanks went steaming across the border. It’s an exhaustively-researched doorstopper composed out of hundreds and hundreds of interviews with officials working in the innards of the White House and of various federal bureaucracies and spy agencies, all aimed at answering a single question: “What were they thinking?”
John Psmith, “REVIEW: Leap of Faith, by Michael J. Mazarr”, Mr. and Mrs. Psmith’s Bookshelf, 2025-06-30.
- Those plans were provided by the Russians, who prior to multiple rounds of NATO expansion were our allies.
- Given that almost everybody in the US mainstream, both Democrats and Republicans, was for it, this probably explains a lot about how I turned out.
- Sure, maybe someday we’ll have a fiscal crisis, but the incredible thing about America is that all the money wasted in Iraq still won’t be in the top 5 reasons for it.
- “Definitive” is publisher-speak for “very, very long.”
March 20, 2026
QotD: The lameness and sameness of modern science fiction novels
I’ll confess, though: I almost didn’t read this book. Actually, for several years I didn’t. I was vaguely aware of its existence, but I’d pretty much stopped reading new speculative fiction because I finally admitted to myself that it was pure masochism that kept me beating my head against the wall of newly-published extruded genre product when I had sixty-plus years of Hugo and Nebula nominees to choose from. Sure, every novel will reflect something of its age’s concerns (there’s a lot of nuclear war in those old Hugo winners!), but it’s gotten much worse in the last ten or fifteen years: every book that gets any buzz is so deeply inflected with questions of personal liberation from oppressive structures, so lacking in nuance and so obsessed with identity and representation, that I find it borderline unreadable. A few books like that, done well — fine, that’s part of life, that’s certainly a kind of story you can tell. But when it’s everything, when it becomes a precondition for publication, you’re left with a tragically denuded sample of the human experience. It’s not that I don’t want to read a book where I disagree with the underlying politics, it’s that an unsubtle obsession with the “correct” politics makes a book boring and cringe. One-dimensionally “right-wing” fiction written in reaction to the contemporary mainstream is just as bad — worse, perhaps, because if done well it’s the sort of thing I would really enjoy.1
Jane Psmith, “REVIEW: The Powers of the Earth, by Travis J.I. Corcoran”, Mr. and Mrs. Psmith’s Bookshelf, 2024-04-29.
- There’s nothing worse than poor execution of an incredible idea, because it means no one else will come along and do the incredible idea right. Austin Grossman’s Crooked, for instance, is Richard Nixon vs. Cosmic Horrors, which is a brilliant premise (yes, the Interstate Highway System is definitely an eldritch sigil designed to protect America, I will not accept any argument) but falls apart on the totally ahistorical version of our 37th President designed to justify making him the “good guy”. The real Nixon is such a fascinating and compelling figure — why not keep him as weird and twitchy and striving as he actually was and have him be the good guy anyway?
Or, say, the Napoleon movie.
March 13, 2026
QotD: “I was one-shotted in my teens by Guns, Germs, and Steel”
I was one-shotted in my teens by the way Guns, Germs, and Steel ✨explained everything✨ and I’ve been chasing that dragon ever since. At this point honestly half the books I’ve reviewed could probably be described as arguments against Jared Diamond. But that’s okay. I can stop any time. Just one more sweeping transdisciplinary exploration of global history. Just let me see a map of British coalfields next to a chart of GDP per capita and I promise I’ll go back to that book about esoteric writing. C’mon, bro, I won’t ever talk about the Hajnal Line again, I swear. Just let me have one more study of an under-appreciated causal factor for the differing trajectories of human societies and I’m done. I have this under control.
Jane Psmith, “BRIEFLY NOTED: Further Arguments Against Jared Diamond”, Mr. and Mrs. Psmith’s Bookshelf, 2025-11-03.
March 5, 2026
QotD: Chinese cooking
Between the foreignness and the sheer, overwhelming size of the topic, it might seem impossible to conduct an adequate survey of the history, vocabulary, and vibe of eating, Chinese-style, for Western readers. But that’s why we have Fuchsia Dunlop. She’s an Englishwoman, but she trained as a chef at the Sichuan Higher Institute of Cuisine (the first Westerner ever to do so). She’s written some of the best English-language cookbooks for Chinese food, and now she’s written this book: her attempt to communicate the totality of the subject she loves and which she’s spent her life studying. But the topic is just too damn big for an encyclopedic or even a systematic approach, and so she wisely doesn’t try. Instead she writes about the weirdest and tastiest and most emblematic meals she’s had, and ties each one back to the main topic. So the book lives up to its name. Like a banquet, it doesn’t try to give you a thorough academic knowledge of anything, but rather a feast for the senses and a feel for what a cuisine is like.
What is it like? Well, Dunlop barely manages to cover this in a 400-page book, so I hesitate even to try, but let me hit a few of the high points. First, diversity. China is a continent masquerading as a country, both in population and in geographic extent, so its cuisine is comparably diverse. Most cooking traditions have one or two basic starches; China has four or five.1 China extends through every imaginable biome, from rainforest to tundra, desert to marshlands, and much of the genius of Chinese food lies in combining the delicious bounties offered up by this kaleidoscope in interesting or unexpected ways.
One way to think of Chinese eating is that much of it is a sort of “internal” fusion cuisine. Because China was ruled from very early on by a centralized bureaucracy with a fanaticism for river transport, the process of culinary remixing has been going on for much longer than it has in most places. The Roman Empire could have been like this, but the shores of the Mediterranean all have pretty similar climates, so there were fewer ingredients to start the process with. Already very early in Chinese history, before the 7th century, we hear of the imperial city being supplied with:
oranges and pomelos from the warm South, […] the summer garlic of southern Shanxi, the deer tongues of northern Gansu, the Venus clams of the Shandong coast, the “sugar crabs” of the Yangtze River, the sea horses of Chaozhou in Guangdong, the white carp marinated in wine lees from northern Anhui, the dried flesh of a “white flower snake” [a kind of pit viper] from southern Hubei, melon pickled in rice mash from southern Shanxi and eastern Hubei, dried ginger from Zhejiang, loquats and cherries from southern Shanxi, persimmons from central Henan, and “thorny limes” from the Yangtze Valley.
If we think of chefs as artists, the Chinese ones have since ancient times had the advantage of an outrageously diverse set of paints. But these ingredients aren’t combined willy-nilly, without respect for their time or place of origin. The Chinese practically invented the concept of terroir, and their organicist conception of the universe in which everything is connected to everything else implied strict rules about which foods were to be eaten when, both for maximum deliciousness and to ensure cosmic harmony.
In the first month of spring, [the emperor] was to eat wheat and mutton; in summer, pulses and fowl; in autumn, hemp seeds and dog meat; in winter, millet and suckling pig. An emperor’s failure to observe the laws of the seasons would not only cause disease, but provoke crop failure and other disasters.
The obsessions with freshness and seasonality come to their culmination in the one area where Chinese cuisine stands head and shoulders above all others: green vegetables. In the West, “eating your greens” is a punishment, or at best a chore, and it’s easy to see why. In much of the world vegetables are bred for yield and transportability, kept in refrigerators for weeks, and then boiled until no trace of flavor remains. Dunlop and I have one thing in common: when we’re not in China, of all the delights of Chinese cooking it’s the green vegetables that we miss the most.
When I bring American friends to a real Chinese restaurant, sometimes they’re shocked that the vegetable dishes cost the same amount as the main courses. Why does a side dish cost so much? But no Chinese person would ever think of a vegetable course as a “side” dish; they’re part of the main attraction, and more often than not they’re the stars of the show. In the West, you can now get decent baak choy, but this is just one of the dozens and dozens of leafy greens that the Chinese regularly consume, many of them practically impossible to find outside Asia.
My own favorite is the sublime choy sum. I remember once getting off a transoceanic flight, starving and exhausted, and being offered a bowl of it over plain white rice. The greens had been scalded for a few seconds with boiling water, then tossed around a pan for no more than a minute — just long enough that the leaves were so tender they seemed to dissolve in your mouth, but the stems still held snap and crunch. The seasoning was subtle — maybe a few cloves of garlic, some salt, a splash of wine or vinegar. Just the right amount to bring out the deep, earthy flavors of the vegetable, to somehow make them brighter and more forward, but not to overpower them.2 It was one of the most delicious things I’ve ever eaten. I think I’ll still remember it when I am old.
Did you notice that in the previous paragraph I spent almost as much time describing the texture of the food as its flavor? That’s no coincidence. Of course the Chinese care about flavor, everybody does (except the British, ha ha), but relative to many other culinary traditions the Chinese put a disproportionate emphasis on the texture of their food as well. I’ll once again draw on a bastardized version of the Whorf hypothesis: English is a big language with a lot of borrowings, so we have a correspondingly large number of words for food textures. Imagine explaining to a foreigner the difference between “crunchy” and “crisp”, or between “soft” and “mushy”. That is already more semiotic resolution than most languages have when it comes to the mouthfeel of their food, but Chinese takes it to a whole ’nother level.
John Psmith, “REVIEW: Invitation to a Banquet by Fuchsia Dunlop”, Mr. and Mrs. Psmith’s Bookshelf, 2024-02-05.
- One of them, potatoes, has a particularly fraught history. Potatoes started seriously spreading in China right around the time of the mass famines that accompanied the collapse of the Ming dynasty. Accordingly, they got a reputation of being food for poor people. They’ve never really managed to overcome this association, and are generally shunned by the Chinese, especially in high-end cuisine, despite several government campaigns to encourage people to eat them since they’re nutritious and easy to grow in arid conditions.
- There’s a pattern in Chinese gastronomy where extremely intense, over-the-top flavors are a bit low-status, and flavors so pure and subtle they verge on bland are what the snooty people go for. This is true across regions (the in-your-face food of Sichuan is less valued than the cuisine of the Cantonese South, or the cooking traditions of Zhejiang in the East), but it’s also true within regions (in Sichuan, the food of Chongqing is much spicier than the food of Chengdu, and correspondingly lower status).
February 27, 2026
QotD: American cultural regions
There’s a long tradition of describing the outlines of these regional cultures as they currently exist — Kevin Phillips’ 1969 The Emerging Republican Majority and Joel Garreau’s 1981 The Nine Nations of North America are classics of the genre — but it’s more interesting (and more illuminating) to look at their history. Where did these cultures come from, how did they get where they are, and why are they like that? That’s the approach David Hackett Fischer took in his 1989 classic, Albion’s Seed: Four British Folkways in America, which traces the history of (you guessed it) four of them, but his attention is mostly on cultural continuity between the British homelands and new American settlements of each group,1 so he limits himself to the Eastern seaboard and ends with the American Revolution.2
Colin Woodard, by contrast, assigned himself the far more ambitious task of tracing the history of all America’s regional cultures, from their various foundings right up to the present, and he does about as good a job as anyone could with a mere 300 pages of text at his disposal: it’s necessarily condensed, but the notes are good and he does provide an excellent “Suggested Reading” essay at the end to point you towards thousands of pages worth of places to look when you inevitably want more of something. Intrigued by the brief discussion of the patchwork of regional cultures across Texas? There’s a book for that! Several, in fact.
Woodard divides the US into eleven distinctive regional cultures, which he calls “nations” because they share a common culture, language, experience, symbols, and values. For the period of earliest settlement this seems fairly uncontroversial — you don’t need to read a lot of American history to pick up on the profound cultural differences between, say, the Massachusetts townships that produced John Adams and the Virginia estates of aristocrats like Washington, Jefferson, and Madison, let alone the backwoods shanties where Andrew Jackson grew up. As the number of immigrants increased, though (and this began quite early: several enormous waves of German immigrants meant that by 1755 Pennsylvania no longer had an English majority), it doesn’t seem immediately obvious that the original culture would continue to dominate.
Woodard’s response to this concern is twofold. First, he cites Wilbur Zelinsky’s Doctrine of First Effective Settlement to the effect that “[w]henever an empty territory undergoes settlement, or an earlier population is dislodged by invaders, the specific characteristics of the first group able to effect a viable, self-perpetuating society are of crucial significance for the later social and cultural geography of the area, no matter how tiny the initial band of settlers may have been”. (The nation of New Netherland, founded by the Dutch in the area that is now greater New York City, is the paradigmatic example: both Zelinsky and Woodard argue that it has maintained its distinctively tolerant, mercantile, none-too-democratic character despite the fact that only about 0.2% of the population is now of Dutch descent.)3 But his second, and more convincing, approach is just to show you that the people who moved here in 1650 were like that and then in the 1830s their descendants moved there and kept being like that and, hey look, let’s check in on them today — yep, looks like they’re still like that. Even though between 1650 and now plenty of Germans (or Swedes or Italians or whoever) have joined the descendants of those earliest English settlers.
Most of the book is given over to the six nations — Yankeedom, New Netherland, the Midlands, Tidewater, the Deep South, and Greater Appalachia — that populated the original Thirteen Colonies and still occupy most of the country’s area. Told as the story of the distrust or open bloody conflicts between various peoples, American history takes on a ghastly new cast: have you ever heard of the Yankee-Pennamite Wars, fought between Connecticut settlers and bands of Scots-Irish guerillas over control of northern Pennsylvania? Or the brutal Revolution-era backcountry massacres committed not by the Continental Army or the redcoats but by warring groups of Appalachian militias? What about the fact that Pennsylvania’s commitment to the American cause was made possible only by a Congressionally-backed coup d’état that suspended habeas corpus, arrested anyone opposed to the war, made it illegal to speak or write in opposition to its decisions, and confiscated the property of anyone suspected of disloyalty (if they weren’t executed outright)? Gosh, this is beginning to sound like, well, literally any other multiethnic empire in history. (It also offers some fascinating points of divergence for alternate history.)4
Jane Psmith, “REVIEW: American Nations, by Colin Woodard”, Mr. and Mrs. Psmith’s Bookshelf, 2024-02-19.
- Including plenty of Jane Psmith-bait like discussion of who was into boiling (the East Anglians who adopted coal early and moved to New England) vs. roasting (the rich of southern England who could afford wood and moved to the Chesapeake Bay), discussions of regional vernacular architecture, the distinctive sexual crimes each group obsessed about (bestiality in New England, illegitimacy in Chesapeake) and so on — I love this book.
- Incidentally, if you’ve only read Scott Alexander’s review of Albion’s Seed, do yourself a favor and read the actual book. Yes, Scott gives a perfectly cromulent summary of the main points, but it’s such a gloriously rich book, full of so many stories and details and painting such a picture of each of the peoples and places it treats, that settling for the summary is like reading the Wikipedia article about The Godfather instead of just watching the darn movie.
- Yes, there is a book for this, and it’s apparently Russell Shorto’s The Island at the Center of the World, which I have not read and don’t particularly plan to.
- Off the top of my head:
- The Deep South tried to get the United States to conquer and colonize Cuba and much of the Caribbean coast of Central America as future slave states;
- There were a wide variety of other secession movements in the run-up to the Civil War, including a suggestion that New York City should become an independent city-state that was taken seriously enough for the Herald to publish details of the governing structure of the Hanseatic League;
- In 1784 the residents of what is now eastern Tennessee formed the sovereign State of Franklin, which banned lawyers, doctors, and clergymen from running for office and accepted apple brandy, animal skins, and tobacco as legal tender. They were two votes away from being accepted as a state by the Continental Congress.
February 18, 2026
QotD: Defending the borders of the Roman Empire
As Luttwak notes, modern historians and military theorists have a tendency to sneer at linear defense lines.1 In fact, some historians of ancient Rome actually blame the decline and eventual collapse of the empire on all the “wasted” energy spent building frontier fortifications. The argument against such “cordon” defenses is that for a given quantity of military potential, spreading it out equally along a perimeter and trying to guard every spot equally dilutes your strength. This makes it easy for an attacker (who picks the time and location of the battle) to concentrate his forces, create a local advantage, and break through.
The thing is, approximately none of this logic applied in the Roman situation. First of all, as we’ve already noted, a huge fraction of the threats the Romans faced were “low-intensity”: border skirmishes, slave raids, pirates and brigands, that sort of thing. Static fortifications, walls and towers, are often more than sufficient for dealing with these problems. Paradoxically, that actually increases the mobility and responsiveness of the main forces. If they aren’t constantly running back and forth along the border dealing with bandits, that means they can respond with short notice to “high-intensity” threats (like major invasions and rebellions) that pop up, and are probably better rested and better provisioned when the emergency arrives. So, far from diluting their strength, a lightly-manned series of linear fortifications actually enabled the Romans to concentrate it.
Secondly, those linear fortifications can also be very useful when that major invasion shows up, even if they are overrun. A defense system doesn’t have to be impenetrable in order to still be very, very useful. One thing it can do is buy time, either for the main army to arrive or for some other strategic purpose. The defenses can also act to channel opposing forces into particular well-scouted avenues of attack, or change the calculus of which invasion routes are more and less appealing. Finally, in the process of setting up those defenses, you probably got to know the terrain extremely well, such that when the battle comes you have a tactical advantage.
[…]
The third, and perhaps most important, reason why the Roman frontier fortifications were actually very smart is that they were carefully designed to double as a springboard for invasions into enemy territory. Luttwak coins the term “preclusive defense” to describe this approach. The basic idea is that an army can take bigger risks — pursue a retreating foe, seize a strategic opportunity that might be an ambush, etc. — if it knows that there are strong, prepared defensive lines that it can retreat to nearby. Roman armies were constantly taking advantage of this, and moreover taking advantage of the fact that the system of border fortifications was also a system of roads, supply lines, food and equipment storage depots, and so on. The limes were not a wall that the Romans huddled behind, they were a weapon pointed outwards, magnifying the power that the legions could project, helping them to do more with less.
John Psmith, “REVIEW: The Grand Strategy of the Roman Empire by Edward Luttwak”, Mr. and Mrs. Psmith’s Bookshelf, 2023-11-13.
- I, an ignoramus, assumed this was all downstream of the Maginot line’s bad reputation, but Luttwak says it’s actually the fault of Clausewitz.
February 9, 2026
QotD: The pre-modern versus the modern concept of “self”
The Canadian philosopher Charles Taylor once wrote a very long book about how the essential quality of secularization is the transition from what he calls “the porous self” to “the buffered self”. In pretty much every premodern society, people believe that their psyches are subject to benign or malign or simply alien influence from external forces and entities — gods, demons, faeries, curses, the evil eye, or lwa. Contra many popularizers of Taylor, the crucial distinction isn’t that these forces are supernatural in nature, it’s that the boundary between the inmost self and the outside world is vague and semi-permeable, and therefore that any one of our thoughts or desires might have arisen through outside influence.
In contrast, most modern societies believe in a self that is “buffered”. In this view there are a few limited, low-bandwidth ways that the external world can act on one’s innate nature, for instance via drugs or other body chemistry, and even these are often seen as revealing or disclosing previously hidden innate characteristics of one’s personality rather than as imposing something alien. Taylor argues quite convincingly that these two ways of viewing the self — porous vs. buffered — inexorably produce two different ways of viewing society and the world: premodern and modern. For example: if selves are porous, then we need to be extremely vigilant against the invasion or violation of our minds by hostile spirits, and we must be suspicious of what we want, because it might not really be what we want, but rather what something else wants through us. Conversely, if selves are buffered then our desires are just part of who we are, and in order to be true to ourselves, we need to explore them and act upon them.
It may have been reasonable to believe in a buffered self back in the days before the internet, but recent developments have made it clear that (as in so many things) the primitive superstitions were actually correct, and the enlightened modern view was just a lamer and dumber kind of superstition.1 Science fiction has long been fascinated with stories of infohazards — images or jokes or snippets of cognition that act like a Gödel sentence for the human mind and leave people braindead or mind-controlled. But such things long since slipped the shackles of fiction — we now have internet creepypasta that induces girls to become murderers and a genre of pornography that turns boys into girls.2 The noösphere is a vast ocean, and its abyssal depths teem with lifeforms and thoughtforms that seek to possess you and live out their blasphemous unlife through your mortal husk.
John Psmith, “REVIEW: Demons, by Fyodor Dostoevsky”, Mr. and Mrs. Psmith’s Bookshelf, 2023-07-17.
- Or maybe society is already correcting itself on this point. Many like to make fun of the “fragility” and “snowflake” nature of Gen Z, and I’ve argued before that these critics miss the point that they’re actually being “flexed on” (in the parlance of our times) because loudly asserting an exaggerated harm is a power move (think: upper class women in an honor culture claiming to feel threatened, and how that’s actually itself a threat).
But here’s a different take on it: maybe “trauma” as it’s popularly conceptualized is actually modernity groping its way back to a porous understanding of the self! We no longer believe in spirits or curses, but our psyches are self-evidently susceptible to immaterial external influence, so we create a new concept that aligns empirical psychic porosity with the dominant metaphysical and ideological currents.
- I had a long debate with myself on whether to include either of those links. Do I really want to expose more people to an infohazard? Ultimately I decided to do it because this stuff is already so widespread. In both cases I’ve linked to a page that links to the subject matter in question rather than linking directly, so you have one more chance to bail out.
January 25, 2026
QotD: Dostoevsky’s views on revolutionaries in Demons
In a novel about political radicalism you might expect the ideas to take center stage, but here they’re treated as pure comic relief (if you’ve read The Man Who Was Thursday, the vibe is very similar). The guy who wants to kill all of humanity and the guy who wants to enslave all of humanity have some seriously conflicting objectives (and don’t forget the guy who just wants to kill himself and the guy who refuses to say what his goal is), yet they all belong to the same revolutionary society. The leader of their society takes it to an extreme: he has no specific ideas at all. His political objectives and philosophical premises are literally never mentioned, by him or by others. What he has is boundless energy, an annoying wheedling voice,1 and an infinite capacity for psychological cruelty. But all these impressive capacities are directed at nothing in particular, just at crushing others for the sheer joy of it,2 at destruction without purpose and without meaning.
Does that seem unrealistic? That ringleader was actually based on a real-life student revolutionary named Sergey Nechayev, whose trial Dostoevsky eagerly followed. Nechayev wrote a manifesto called The Catechism of a Revolutionary; here’s an excerpt from that charming document:
The revolutionary is a doomed man. He has no personal interests, no business affairs, no emotions, no attachments, no property, and no name. Everything in him is wholly absorbed in the single thought and the single passion for revolution … The revolutionary despises all doctrines and refuses to accept the mundane sciences, leaving them for future generations. He knows only one science: the science of destruction … The object is perpetually the same: the surest and quickest way of destroying the whole filthy order … For him, there exists only one pleasure, one consolation, one reward, one satisfaction – the success of the revolution. Night and day he must have but one thought, one aim – merciless destruction.
The ideas don’t matter, because at the end of the day they’re pretexts for desires — the desire to dominate, the desire to obliterate the world, the desire to obliterate the self, the desire to negate.3 Just as in their parents’ generation the desire for status came first and wrapped itself in liberal politics in order to reproduce and advance itself, so in their children the desire for blood and death reigns supreme, and the radical politics serve only as a mechanism of self-justification and a lever to pull. This is not a novel about people, and it’s also not a novel about ideas. It’s a novel about desires, motives, urges, and the ways in which we construct stories to make sense of them.
John Psmith, “REVIEW: Demons, by Fyodor Dostoevsky”, Mr. and Mrs. Psmith’s Bookshelf, 2023-07-17.
- To Dostoevsky’s own surprise, when he wrote the main bad guy of the story, the character turned out to be a very funny, almost buffoonish figure. He may be the most evil person in literature who’s also almost totally comic.
- Dostoevsky is notorious for dropping hints via the names of his characters — applied nominative determinism — and this one’s name means something like “supremacy”.
- Or as another famous book about demons once put it:
I am the spirit that negates
And rightly so, for all that comes to be
Deserves to perish wretchedly;
‘Twere better nothing would begin.
Thus everything that your terms, sin,
Destruction, evil represent —
That is my proper element.
January 15, 2026
QotD: Process knowledge
Dan Wang, in his wonderful essay on how technology grows, describes process knowledge as the sine qua non of industrial capitalism, more fundamental than the machines and factories that everybody sees:
The tools and IP held by these firms are easy to observe. I think that the process knowledge they possess is even more important. The process knowledge can also be referred to as technical and industrial expertise; in the case of semiconductors, that includes knowledge of how to store wafers, how to enter a clean room, how much electric current should be used at different stages of the fab process, and countless other things. This kind of knowledge is won by experience. Anyone with detailed instructions but no experience actually fabricating chips is likely to make a mess.
I believe that technology ultimately progresses because of people and the deepening of the process knowledge they possess. I see the creation of new tools and IP as certifications that we’ve accumulated process knowledge. Instead of seeing tools and IP as the ultimate ends of technological progress, I’d like to view them as milestones in the training of better scientists, engineers, and technicians.
The accumulated process knowledge plus capital allows the semiconductor companies to continue to produce ever-more sophisticated chips. […] It’s not just about the tools, which any sufficiently-capitalized firm can buy; or the blueprints, which are hard to follow without experience of what went into codifying them.
Process knowledge lives in people, grows when people interact with other people, and spreads around when skilled individuals relocate between cities or companies. But this also means it can wither and die, can be lost forever, either when old workers shuffle off to the Big Open Plan Office in the Sky, or when an ecosystem no longer has the energy or complexity to sustain a critical mass of skilled workers in a particular vocation. Some East Asian societies have gone to extreme lengths to retain process knowledge, for instance by deliberately demolishing and rebuilding a temple every 20 years.
In fact this is far from the most extreme thing East Asian societies have done to retain the process knowledge that lives within their workers! There are some components of an ecosystem, whether natural or technological, that are especially important keystone species. In the technological case, these species can be unprofitable at the current scale of an ecosystem, or inefficient, or they might not make economic sense until one or more of their customers exist, but those customers might not be able to exist until the keystone species does. Venture capital is very practiced at solving this kind of Catch-22, but in the East Asian economic boom it was national governments that actively sheltered keystone industries until they could get their footing, thus making entire ecosystems possible. A wonderful book about this is Joe Studwell’s How Asia Works, but if you can’t read it, read Byrne Hobart’s thorough review instead.
Process knowledge is so powerful, and the ecosystem it enables so vital, that it can break the assumptions of Ricardo’s theory of trade. Steve Keen has a perceptive essay about how the naive Ricardian analysis treats all capital stock as fungible and neglects the existence of specialized machinery and infrastructure. But naive defenders1 of trade liberalization often make an exactly analogous error with respect to the other factor of production — labor. Workers are not an undifferentiated lump; they are people with skills, connections, and expertise locked up in their heads. When a high-skill industry moves offshore, the community of experts around it begins to break up, which can cripple adjacent industries, stymie insights and breakthroughs, and make it almost impossible to bring that industry back.
John Psmith, “REVIEW: Flying Blind by Peter Robison”, Mr. and Mrs. Psmith’s Bookshelf, 2023-02-06.
- Like all coastal-Americans, I am generally in favor of trade liberalization, but I’m consummate and sophisticated about it, unlike Noah Smith.