January 28, 2025
QotD: Was Einstein a science-denier?
Albert Einstein was a charmingly blunt man. For instance, in 1952 he wrote a letter to his friend and fellow physicist Max Born in which he admitted that even if the astronomical data had gone against general relativity, he would still believe in the theory:
Even if there were absolutely no light deflection, no perihelion motion and no redshift, the gravitational equations would still be convincing because they avoid the inertial system … It is really quite strange that humans are usually deaf towards the strongest arguments, while they are constantly inclined to overestimate the accuracy of measurement.
In a few short sentences Einstein completely repudiates the empiricist spirit which has ostensibly guided scientific inquiry since Francis Bacon. He doesn’t care what the data says. If the experiment hadn’t been run, he would still believe the theory. Moreover, should the data have disconfirmed his theory, who cares? Data are often wrong.
This is not, to put it mildly, the official story of how science gets made. In the version most of us were taught, the process starts with somebody noticing patterns or regularities in experimental data. Scientists then hypothesize laws, principles, and causal mechanisms that abstract and explain the observed patterns. Finally, these hypotheses are put to the test by new experiments and discarded if the results contradict them. Simple, straightforward, and respectful of official pieties. The Schoolhouse Rock of science. Or as Einstein once described it:
The simplest idea about the development of science is that it follows the inductive method. Individual facts are chosen and grouped in such a way that the law, which connects them, becomes evident. By grouping these laws more general ones can be derived until a more or less homogeneous system would have been created for this set of individual facts.
See? It’s as easy as that. But then, Einstein finishes that thought with: “[t]he truly great advances in our understanding of nature originated in a manner almost diametrically opposed to induction”.
Was Einstein a science-denier? I’m obviously kidding, but this is still pretty jarring stuff to read. How did he get this way? Einstein’s Unification is the story of the evolution of Einstein’s philosophical views, disguised as a story about his discovery of general relativity and his quixotic attempts at a unified field theory. It’s a gripping tale about how Einstein tried to do science “correctly”, experienced years of frustration, almost had priority for his discoveries snatched away from him, then committed some Bad Rationalist Sins at which point things immediately began to work. This experience changed him profoundly, and maybe it should change us too.
John Psmith, “REVIEW: Einstein’s Unification, by Jeroen van Dongen”, Mr. and Mrs. Psmith’s Bookshelf, 2024-05-27.
January 27, 2025
QotD: The bureaucratization of university administration
On the consolidation of power within the administrative bureaucracy:
The character of the college as a micro-community of academics is being doubly subverted: from within, by the rapid growth of bureaucratic roles taken up by professional administrators, and from without, by a university seeking to centralise control and elide differences among the colleges. The more uniform the overall environment becomes, the more rapidly it will suffer from the bad decisions inevitably yet to be made.
On the metastasis of overpaid, officious administrators:
“The content of this letter is extremely important, so please read it carefully.” It isn’t often that the university speaks to its employees in this way. This was a follow-up email from the former pro-vice-chancellor for strategy and planning, David Cardwell. He wanted academics to complete his Time Allocation Survey by tabulating how many hours were spent across a vast suite of possible activities. It is characteristic of contemporary Cambridge that the strongest rhetoric it can muster is directed toward this self-serving bureaucratic exercise. Cardwell rubbed shoulders with four other pro-vice-chancellors, all enjoying a salary that is several multiples of a typical university academic’s, and surpasses the Prime Minister’s.
This administrative overgrowth is, by the way, a historical novelty:
All of this is new: until 1992, the role of vice-chancellor was covered in short stints by the Heads of House, who paused their college governance while the rest of Cambridge got on with what they were here to do. Now we have not only career administrators at the helm, but their five deputies, for an annual cost of around £1.5 million. All the while, the university fails to find the money to keep important subjects alive, such as the centuries-old study of millennia-old Sanskrit.
I’ve been pointing this out for years. Until very recently, administrative functions in universities were largely filled by senior academics: you got bullied into shouldering the unwelcome burden because somebody had to do it, and you drew the short straw. There is nothing that a serious person despises more than paperwork, especially a scholar, who would much rather be happily buried in whatever esoterica he has made his field of study. Forcing academics into administrative roles ensured that the people filling those offices were incentivized to keep the paperwork to an absolute minimum; the last thing they wanted was to create more of the hateful stuff.
Enter, some decades ago, the professional administrators. Initially, these usually had some sort of academic qualification, and still largely do – albeit typically in fake non-disciplines, “public administration” or what have you – but they were not in any sense scholars. They were managers. Give us your burden, they said; we’ll do all the annoying paperwork for you, and you can concentrate on your very important research, you very important scholars, you. Thus the professoriate, like gullible fools, handed over the keys to the kingdom.
Unlike professors, managers are incentivized to create as much administrative complexity as they can: the more administration there is to perform, the more administrators the institution needs, and the larger the fiefdoms senior administrators can command. Since admin typically has control of the budget, they were easily able to appropriate the necessary funds. The result has been the explosive proliferation of useless eaters with lavish salaries and ridiculous titles like Senior Vice Assistant Dean for Excellence or Junior Associate Student Life Provost. At many universities, administrators now exist in a 1:1 ratio with the student body.
Admin have sucked shrinking university budgets dry, with real intellectual consequences: they aren’t going to fire themselves, and they sure aren’t taking a salary cut, so to make up budget shortfalls academic programs with low enrolment get the axe. Butterfield’s reference to the closure of the Sanskrit program is an example of this; there are many such examples, and they are increasingly common. To brains built out of buzzwords and spreadsheets, everything is either a marketing technique or a revenue stream, and if a program isn’t popular enough to subsidize their summer vacations in Provence or social-justicey enough for them to brag to their beach friends about how progressive they are, it serves no purpose.
This ability of university administrations to close down programs illustrates something else, which is that they are the real power on campus. The academics are mere employees: they will teach whichever students the admin decides to admit, will teach those students whatever the admin says to teach them, will not teach what the admin tell them not to teach, will teach in whatever manner admin decides is best, and will evaluate the results of that instruction in whatever fashion admin mandates they be evaluated.
As members of the managerial class, university administrators are drones of the global managerial hive mind, and instinctively exert a homogenizing influence. Old, parochial practices must be jettisoned in favour of standardizing the institutions they manage.
As for our age-old titles – of lecturer, senior lecturer, reader and professor – these were replaced with American titles so as to be “more intelligible” to a global audience. … To conjure up a world of “assistant professors” and “associate professors”, who in fact have no supporting relationship to “the professors”, makes a mockery of that venerable system.
Administrators dislike horizontal social relationships amongst faculty. Peer-to-peer network architectures are hard to control; they prefer a server-terminal model, with management as the server through which all communications pass, and professors as the terminals, who can be regulated through systems of permissions. Thus, they set about dissolving those institutions that facilitate conviviality amongst the faculty:
It was telling that a few years ago the authorities silently closed down the University Combination Room, the 14th-century hall in which academics could freely convene outside their individual colleges.
Administration is also sneaky, adopting governance practices that minimize whatever legacy powers the professoriate still possesses:
Although in theory Cambridge academics are self-governing, the move to online voting, with minimal announcement, allows for many university policies to be driven through by those who want them enacted.
Butterfield understands full well that the problems are hardly unique to Cambridge:
All this I say of Cambridge. But these issues go right across the university sector … The public need to trust and respect the elite academic institutions they fund; but that respect is waning, as stories continue to reveal politicised teaching, grade inflation, authoritarian campus policies and lurid, even laughable, research grants. The ambitions of our whole education system are ultimately pegged to the achievements at the very pinnacle of academia. If Cambridge can’t resist decline, who can?
The obvious answer is: no one can resist this. Not Cambridge, not Oxford, not University College London, not Harvard, not Princeton, not MIT, and not Whittier College. The problem is too systemic; the rot too deep. Decades of administrative consolidation of power have subsumed the ivory tower into an appendage of the global asset management system. Generations of ideological infection by the mind virus of cultural Marxism, wokeism, critical social justice, gay race communism, whatever you want to call it, have poisoned the minds of too many of the faculty. Generations of steadily declining standards, an inevitable consequence of massively increased enrolment which of unavoidable necessity heavily sampled the fat middle of the IQ distribution, have thinned out the influence of the bell curve’s rarefied tail to statistical irrelevance. After all of this, the only way to save the university is to purge it, of the great mass of low-performing affirmative action students, of the diversity hire academics who substitute clumsy sermonizing for the scholarship they can neither understand nor perform, and most especially of the great tumorous mass of useless administrators.
Such a purge, to be effective, would need to be thorough. To be thorough, it would remove almost everyone in the system. This would be the same as destroying the system. To save the patient, one would have to kill the patient.
Therefore no such purge will take place.
Instead, the system will crumble, buckle under its own weight, and eventually collapse.
As, in fact, it is doing.
John Carter, “Crumbling DIEvory Towers”, Postcards From Barsoom, 2024-10-25.
January 26, 2025
QotD: The map is not the territory, state bureaucrat style
… most bureaucrats aren’t evil, just ignorant … and as Scott shows, this ignorance isn’t really their fault. They don’t know what they don’t know, because they can’t know. Very few bureaucratic cock-ups are as blatant as Chandigarh, where all anyone has to do is look at pictures for five minutes to conclude “you couldn’t pay me enough to move there”. For instance, here’s the cover of Scott’s book:
That’s part of the state highway system in North Dakota or someplace, and though again my recall is fuzzy, the reason for this is something like: The planners back in Bismarck (or wherever) decreed that the roads should follow county lines … which, on a map, are perfectly flat. In reality, of course, the earth is a globe, which means that in order to comply with the law, the engineers had to put in those huge zigzags every couple of miles.
No evil schemes, just bureaucrats not mentally converting 2D to 3D, and if it happens to cost a shitload more and cause a whole bunch of other inconvenience to the taxpayers, well, these things happen … and besides, by the time the bureaucrat who wrote the regulation finds out about it — which, of course, he never will, but let’s suppose — he has long since moved on to a different part of the bureaucracy. He couldn’t fix it if he wanted to … which he doesn’t, because who wants to admit to that obvious (and costly!) a fuckup?
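The geometry here is worth a quick check, because the zigzags aren't a mystery once you do the arithmetic. A minimal sketch, with numbers I've assumed purely for illustration (a six-mile-wide survey strip at roughly North Dakota's latitude, corrected every 24 miles, per the usual US rectangular-survey convention – none of these figures come from Scott or from the post itself):

```python
import math

# How much do two "straight" north-south survey lines converge on a globe?
# Meridian spacing shrinks with cos(latitude), so a rectangular grid laid
# out on flat paper needs periodic jogs ("correction lines") on the ground.
# All figures below are illustrative assumptions, not from the quoted post.

R_MILES = 3959.0  # mean Earth radius in miles

def jog_feet(width_miles: float, run_north_miles: float, latitude_deg: float) -> float:
    """Feet of sideways jog needed after running north between two meridians
    that start width_miles apart at latitude_deg."""
    lat1 = math.radians(latitude_deg)
    lat2 = lat1 + run_north_miles / R_MILES
    width_after = width_miles * math.cos(lat2) / math.cos(lat1)
    return (width_miles - width_after) * 5280

# A 6-mile strip run 24 miles north at 47°N:
print(f"{jog_feet(6, 24, 47):.0f} feet of jog")  # ~200 feet
```

A couple of hundred feet per correction, repeated across a whole state, is exactly the sawtooth pattern you see in the photo.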
Add to this the fact that most bureaucrats have been bureaucrats all their lives — indeed, the whole “educational” system we have in place is designed explicitly to produce spreadsheet boys and powerpoint girls, kids who do nothing else, because they know nothing else. Oh, I’m sure the spreadsheet boys and powerpoint girls know, as a factual matter, that the earth is round — we haven’t yet declared it rayciss to know it. But they only “know” it as choice B on the standardized test. It means nothing to them in practical terms, so it would never occur to them that the map they’re looking at is an oversimplification — a necessary one, no doubt, but not real. As the Zen masters used to say, the finger pointing at the moon is not, itself, the moon.
Severian, “The Finger is Not the Moon”, Rotten Chestnuts, 2021-09-14.
January 25, 2025
QotD: “Big budget cuts!”
So why is there such a big disconnect in the media? Why are there headlines about cutting and slashing when government is growing by every possible measure?
For the simple reason that the budget process in Washington is pervasively dishonest, as I’ve explained in interviews with John Stossel and Judge Napolitano. Here are the three things you need to know.
- The politicians created a system that automatically assumes big increases in annual spending, called a baseline.
- When there’s a proposal to have spending grow slower than the baseline, the gap between the proposal and the baseline is called a cut.
- It’s like being on a diet and claiming progress because you’re gaining two pounds each month rather than five pounds.
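To see the trick in miniature, here is a toy calculation (every dollar figure is invented for illustration):

```python
# Toy example of baseline budget scoring; all figures invented.

last_year = 1000.0       # last year's spending, in billions (invented)
baseline_growth = 0.05   # the baseline assumes 5% automatic growth
proposal_growth = 0.03   # the "cut" proposal still grows spending 3%

baseline = last_year * (1 + baseline_growth)   # 1050.0
proposal = last_year * (1 + proposal_growth)   # 1030.0

# Spending actually rises by $30 billion...
print(f"actual change: +{proposal - last_year:.0f}B")
# ...but scored against the baseline it becomes a $20 billion "cut".
print(f"scored as: -{baseline - proposal:.0f}B 'cut'")
```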
Defenders of this system argue that programs should get built-in increases because of things such as inflation, or because of more old people, which leads to more spending for programs such as Social Security and Medicare.
It’s certainly reasonable for them to argue that budgets should increase for these reasons.
But they should be honest. Be forthright and assert that “Spending should climb X percent because …”
Needless to say, that won’t happen. The pro-spending politicians and interest groups like the current approach because it allows them to scare voters by warning about “savage” and “draconian” spending cuts.
Daniel J. Mitchell, “The Media’s Pervasively Dishonest Coverage of Trump’s New Budget”, International Liberty, 2020-02-10.
January 24, 2025
QotD: Star Trek’s transporter
Some great men are born great, some achieve greatness, and some have great catchphrases said to them. James Doohan is an honorary member of that last category. He was the guy who spent four decades on the receiving end of the request to “Beam me up, Scotty” – if not on TV, where no character on Star Trek ever actually uttered the words, at least in real life, where fans would cheerfully bark the injunction across crowded airport concourses in distant lands, and rush-hour freeway drivers would lurch across four lanes of traffic to yell it out the window at him. Elvis is said to have greeted him with the phrase, and Groucho, too. There are novels with the title, and cocktails. On Highway 375 to Roswell, New Mexico, you can stop at the Little A-Le-Inn and wash down your Alien Burger with a Beam Me Up, Scotty (Jim Beam, 7 Up and Scotch).
It wasn’t supposed to be the catchphrase from the show: that honor was reserved for Gene Roddenberry’s portentous sonorous orotund grandiosity – the space-the-final-frontier-boldly-going-where-no-man’s-gone-before stuff. The beaming was neither here nor there: it was a colloquialism for matter-energy transit, or teleportation – or, more to the point, a way of getting from the inside of the space ship to the set of the planet without having to do a lot of expensive exterior shots in which you’ve got to show the USS Enterprise landing and Kirk, Spock et al disembarking. Instead, the crew positioned themselves in what looked vaguely like a top-of-the-line shower, ordered Scotty to make with the beaming, and next thing you know they were standing next to some polystyrene rocks in front of a backcloth whose colors were the only way of telling this week’s planet from last week’s. “Beaming” was the special effect – the one that saved Star Trek from having to have any others.
Mark Steyn, “Beam Movie Actor”, Steyn Online, 2020-02-15.
January 23, 2025
QotD: The origins of strategic airpower
In my warfare survey, I have a visual gag where for a week and a half after our WWI lecture, every lecture begins with the same slide showing an aerial photograph (Wiki) of the parallel trenches of the First World War because so much of the apparatus of modern warfare exists as a response, a desperate need to never, ever do the trench stalemate again. And that’s where our story starts.
Fighting aircraft, as a technology in WWI, were only in their very infancy. On the one hand the difference between the flimsy, unarmed artillery scout planes of the war’s early days and the purpose-built bombers and fighters of the war’s end was dramatic. On the other hand the platforms available at the end of the war remained very limited. Once again we can use a late-war bomber like the Farman F.50 – introduced too late to actually do much fighting in WWI – as an example of the best that could be done. It has a range of 260 miles – too short to reach deep into enemy country – and a bomb load of just 704lbs. Worse yet it was slow and couldn’t fly very high, making it quite vulnerable. It is no surprise that bombers like this didn’t break the trench stalemate in WWI or win the war.
However, anyone paying attention could already see that these key characteristics – range, speed, ceiling and the all-important bomb-load – were increasing rapidly. And while the politicians of the 1920s often embraced the assumption that the War to End All Wars had in fact banished the scourge of war from the Earth – or at the very least, from the corner of it they inhabited such that war would now merely be a thing they inflicted on other, poorer, less technologically advanced peoples – the military establishment did not. European peace had always been temporary; the Peace of Westphalia (1648) and the Congress of Vienna (1815) had not ended war in Europe, so why would the Treaty of Versailles (1919)? There had always been another war and they were going to plan for it! And they were going to plan in the sure knowledge that the bombers the next war would be fought with would be much larger, faster, longer ranged and more powerful than the bombers they knew.
One of those interwar theorists was Giulio Douhet (1869-1930), an Italian who had served during the First World War. Douhet wasn’t the only bomber advocate or even the most influential at the time – in part because Italy was singularly unprepared to actually capitalize on the bomber as a machine, given that it was woefully under-industrialized and bomber-warfare was perhaps the most industrial sort of warfare on offer at the time (short of naval warfare) – but his writings exemplify a lot of the thinking at the time, particularly The Command of the Air (1921). Figures like Hugh Trenchard in Britain or Billy Mitchell in the United States were driving similar arguments, with similar technological and institutional implications. But first, we need to get to the ideas.
Like many theorists at the time, Douhet was thinking about how to avoid a repeat of the trench stalemate, which as you may recall was particularly bad for Italy. For Douhet, there was a geometry to this problem; land warfare was two-dimensional and thus it was possible to simply block armies. But aircraft – specifically bombers – could move in three dimensions; the sky was not merely larger than the land but massively so as a product of the square-cube law. To stop a bomber, the enemy must find the bomber, and in such an enormous space finding the bomber would be next to impossible, especially as flight ceilings increased. In Britain, Stanley Baldwin summed up this vision by famously quipping, “no power on earth can protect the man in the street from being bombed. Whatever people may tell him, the bomber will always get through.” And technology seemed to be moving this way as the possibility of long-range aircraft carrying heavy loads at high altitudes became more and more a reality in the 1920s and early 1930s.
Consequently, Douhet assumed there could be no effective defense against fleets of bombers (and thus little point in investing in air defenses or fighters to stop them). Rather than wasting time on the heavily entrenched front lines, stuck in the stalemate, they could fly over the stalemate to attack the enemy directly. In this case, Douhet imagined these bombers would target – with a mix of explosive, incendiary and poison gas munitions – the “peacetime industrial and commercial establishment; important buildings, private and public; transportation arteries and centers; and certain designated areas of civilian population”. This onslaught would in turn be so severe that the populace would force its government to make peace to make the bombing stop. Douhet went so far as to predict (in 1928) that just 300 tons of bombs dropped on civilian centers could end a war in a month; in The War of 19– he offered a scenario in which, in a renewed war between Germany and France, the latter surrendered under bombing pressure before it could even mobilize. Douhet imagined this, somewhat counterintuitively, as a more humane form of war: while the entire effort would be aimed at butchering as many civilians as possible, he thought doing so would end wars quickly and thus result in less death.
Clever ideas to save lives by killing more people are surprisingly common and unsurprisingly rarely turn out to work.
Bret Devereaux, “Collections: Strategic Airpower 101”, A Collection of Unmitigated Pedantry, 2022-10-21.
January 22, 2025
QotD: The Who
The Who’s case for being the greatest rock band in history, and it has one, depends on the band having been a four-piece act in which all four pieces had the absolute maximum of performing ability and musical personality. To find any equivalent — maybe Zeppelin comes close — you would probably have to quit rock and go rummaging through the jazz section.
But I’ll tell you right now, there ain’t no Moon over there. I mean, good Lord: OF COURSE Keith Moon and John Entwistle were a difficult rhythm section for a guitarist to play in front of. Have you listened to those records? Professionals have talked about how watching Moon play up close was an exercise in constant suspense — you would see him take off at the start of the bar and go roaming around the drum kit and wonder how he could possibly make it back in time. He usually did make it — when he wasn’t so zonked he was falling off his stool, which is also a thing that happened sometimes.
This intricate, frantic quality is what made Moon the most inimitable of the great rock drummers — someone whose style you could recognize in a matter of seconds if he were playing on biscuit tins — but the difficulty of playing in front of a notional “timekeeper” so adventurous, and particularly doing it in concert, ought to be self-evident.
The standard advice for a rock guitarist in this predicament would be to make sure he had a very steady, unadventurous bass player to anchor the group. And the bassists for many excellent groups do, in fact, secretly stick to four or five notes they’re real comfortable with. But Entwistle offered Moon-like challenges as part of a rhythm section, albeit without inducing the same terror. At any moment his left hand might start leaping like a salmon on the fretboard, and if he played half notes in one bar, this was no guarantee he wouldn’t be doing startling, blinding sixteenths in the next.
That’s what makes Who records Who records; that’s what lifts the best ones above even the empyrean level of Townshend’s songwriting. But it meant, as Pete explained in his apology, that he could never step out and “shred” as a guitarist. The entire structure of the traditional rock group was topsy-turvy with the Who, and Townshend, whose ego is at least as big as the next fellow’s (spoiler: it’s bigger), was forced in some regard to be the responsible one, the custodian of the rhythm.
Colby Cosh, “Leave Pete Townshend alone!”, National Post, 2019-11-29.
January 21, 2025
QotD: Raw democracy
In a democracy, the majority rules and individual rights are irrelevant. If the majority votes that half of your income be confiscated before you can even buy groceries, oh well. If the majority votes that you must educate your children in a certain location because you live on a certain side of an arbitrary line, oh well. If the majority votes that you must be disarmed and defenseless against violent criminals, oh well. If the majority votes that your religion be designated an “outlaw religion” and that you and all other practitioners be committed to mental institutions, oh flipping well.
(And this is what our political, economic and media elites want to export across the globe?)
Doug Newman, “An Understatement: The Founding Fathers Hated Democracy”, The Libertarian Enterprise, 2005-08-14.
January 20, 2025
QotD: Brainwashing
I’ve always had a fascination with “brainwashing”. It turns out that the human mind is, indeed, pretty plastic out on the far edges, and so long as you don’t care about the health and wellbeing of the object of your literal skullfuckery, you can do some interesting things. For instance, a book on every dissident’s shelf should be The Rape of the Mind: The Psychology of Thought Control, Menticide, and Brainwashing, by Joost Meerloo. You’ll need to get it used, or on Kindle (the usual caveats apply). Meerloo was a Dutch (or Flemish or Walloon, I forget) MD who was briefly detained by the Gestapo during the war. They had nothing more than a cordial chat (by Gestapo standards), but they obviously knew what they were doing, and the only reason Meerloo didn’t get Der Prozess for real was that they didn’t feel the need at that time. He escaped, and the experience charted the course of his professional life.
Like Robert Jay Lifton’s Thought Reform and the Psychology of Totalism (another must-read), I read Meerloo years ago, so my recall of the details is fuzzy, but the upshot is obvious: The techniques of “brainwashing” have been known since at least the Middle Ages, and they’re still the same. Suspected witches in the Early Modern period, for instance, got Der Prozess, and though the witch hunters also had recourse to the rack and thumbscrews and all the rest, none of it was really necessary — isolation, starvation, and sleep deprivation work even better, provided you hit that sweet spot when they’re just starting to go insane …
I’m being deliberately flip about a horrible thing, comrades, because as no doubt distasteful as that is to read, the fact is, we’re doing it to ourselves, everywhere, all the time. Not the starvation part, obviously, but we eat such horribly unnatural diets that our minds are indeed grossly affected. Want proof? Go hardcore keto for a week and watch what happens. Or if that’s too much, you can simulate the experience by going cold turkey off caffeine. I promise you, by the end of day two you’d give the NKVD the worst dirt on your own mother if they sat a steaming hot cup of java in front of you.
Severian, “Kickin’ It Old Skool”, Founding Questions, 2021-10-04.
January 18, 2025
QotD: On Auguste Rodin’s Fallen Caryatid
“For three thousand years architects designed buildings with columns shaped as female figures. At last Rodin pointed out that this was work too heavy for a girl. He didn’t say, ‘Look, you jerks, if you must do this, make it a brawny male figure’. No, he showed it. This poor little caryatid has fallen under the load. She’s a good girl — look at her face. Serious, unhappy at her failure, not blaming anyone, not even the gods … and still trying to shoulder her load, after she’s crumpled under it.
“But she’s more than good art denouncing bad art; she’s a symbol for every woman who ever shouldered a load too heavy. But not alone women — this symbol means every man and woman who ever sweated out life in uncomplaining fortitude, until they crumpled under their loads. It’s courage, […] and victory.”
“‘Victory’?”
“Victory in defeat; there is none higher. She didn’t give up […] she’s still trying to lift that stone after it has crushed her. She’s a father working while cancer eats away his insides, to bring home one more pay check. She’s a twelve-year old trying to mother her brothers and sisters because Mama had to go to Heaven. She’s a switchboard operator sticking to her post while smoke chokes her and fire cuts off her escape. She’s all the unsung heroes who couldn’t make it but never quit.”
Robert A. Heinlein, Stranger in a Strange Land, 1961.
January 17, 2025
QotD: Foraging for supplies in pre-modern armies
We should start with the sort of supplies our army is going to need. The Romans neatly divided these into four categories: food, fodder, firewood and water, each with its own gathering activity (called by the Romans frumentatio, pabulatio, lignatio and aquatio respectively; on this note Roth op. cit. 118-140), though gathering food and fodder would be combined whenever possible. That’s a handy division and also a good reflection of the supply needs of armies well into the gunpowder era. We can start with the three simpler supplies, all of which were daily concerns but also tended to be generally abundant in the areas where armies operated.
For most armies in most conditions, water was available in sufficient quantities along the direction of march via naturally occurring bodies of water (springs, rivers, creeks, etc.). Water could still be an important consideration even where there was enough to march through, particularly in determining the best spot for a camp or in denying an enemy access to local water supplies (such as, famously, at the Battle of Hattin (1187)). And detailing parties of soldiers to replenish water supplies was a standard background activity of warfare; the Romans called this process aquatio and soldiers so detailed were aquatores (not a permanent job, to be clear, just regular soldiers for the moment sent to get water), though generally an army could simply refill its canteens as it passed naturally occurring watercourses. Well-organized armies could also dig wells or use cisterns to pre-position water supplies, but this was rarely done because it was tremendously labor intensive; an army demanded so much water that many wells would be necessary to allow the army to water itself rapidly enough (the issue is throughput, not well capacity – you can only lift so many buckets of water in an hour from a single well). For the most part armies confined their movements to areas where water was naturally available, managing, at most, short hops through areas where it was scarce. If there was no readily available water in an area, agrarian armies simply couldn’t go there most of the time.
Like water, firewood was typically a daily concern. In the Roman army this meant parties of firewood foragers (lignatores) were sent out regularly to whatever local timber was available. Fortunately, local firewood tended to be available in most areas because of the way the agrarian economy shaped the countryside, with stretches of forest separating settlements or tended trees for firewood near towns. Since an army isn’t trying to engage in sustainable arboriculture, it doesn’t usually need to worry about depleting local wood stocks. Moreover, for our pre-industrial army, they needn’t be picky about the timber for firewood (as opposed to timber for construction). Like water gathering, collecting firewood tends to crop up in our sources when conditions make it unusually difficult – such as if an army is forced to remain in one place (often for a siege) and consequently depletes the local supply (e.g. Liv. 36.22.10) or when the presence of enemies made getting firewood difficult without using escorts or larger parties (e.g. Ps.-Caes. BAfr. 10). Sieges could be especially tricky in this regard because they add a lot of additional timber demand for building siege engines and works; smart defenders might intentionally try to remove local timber or wood structures to deny them to an approaching army as part of a scorched earth strategy (e.g. Antioch in 1097). That said, apart from sieges, firewood availability, like water availability, is mostly a question of where an army can go; generals simply did not stay long in areas where gathering firewood would be impossible.
Then comes fodder for the animals. An army’s animals needed a mix of both green fodder (grass, hay) and dry fodder (barley, oats). Animals could meet their green fodder requirements by grazing at the cost of losing marching time, or the army could collect green fodder as it foraged for food and dry fodder. As you may recall, cut grain stalks can be used as green fodder and so even an army that cannot process grains in the fields can still quite easily use them to feed the animals, alongside barley and oats pillaged from farm storehouses. The Romans seem to have preferred gathering their fodder from the fields rather than requisitioning it from farmers directly (Caes. BG 7.14.4) but would do either in a pinch. What is clear is that much like gathering water or firewood this was a regular task a commander had to allot and also that it often had to be done under guard to secure against attacks from enemies (thus you need one group of soldiers foraging and another group in fighting trim ready to drive off an attack). Fodder could also be stockpiled when needed, which was normally for siege operations where an army’s vast stock of animals might deplete local grass stocks while the army remained encamped there. Crucially, unlike water and firewood, both forms of fodder were seasonal: green fodder came in with the grasses in early spring and dry fodder consists of agricultural products typically harvested in mid-summer (barley) or late spring (oats).
All of which at last brings us to the food, by which we mostly mean grains. Sources discussing army foraging tend to be heavily focused on food and we’ll quickly see why: it was the most difficult and complex part of foraging operations in most of the conditions an agrarian army would operate in. The first factor that is going to shape foraging operations is grain processing. [S]taple grains (especially wheat, barley and later rye) make up the vast bulk of the calories an army (and its attendant non-combatants) are eating on the march. But, as we’ve discussed in more detail already, grains don’t grow “ready to eat” and require various stages of processing to render them edible. An army’s foraging strategy is going to be heavily impacted by just how much of that processing they are prepared to do internally.
This is one area where the Roman army does appear to have been quite unusual: Roman armies could and regularly did conduct the entire grain processing chain internally. This was relatively rare and required both a lot of coordination and a lot of materiel in the form of tools for each stage of processing. As a brief refresher, grains once ripe first have to be reaped (cut down from the stalks), then threshed (the stalks are beaten to shake out the seeds) and winnowed (the removal of non-edible portions), then potentially hulled (removing the inedible hull of the seed), then milled (ground into a powder, called flour, usually by the grinding actions of large stones), then at last baked into bread or a biscuit or what have you.
It is possible to roast unmilled grain seeds or to boil either those seeds or flour in water to make porridge in order to make them edible, but turning grain into bread (or biscuits or crackers) has significant nutritional advantages (it breaks down some of the plant compounds that human stomachs struggle to digest) and also renders the food a lot tastier, which is good for morale. Consequently, while armies will roast grains or just make lots of porridge in extremis, they want to be securing a consistent supply of bread. The result is that ideally an army wants to be foraging for grain products at a stage where it can manage most or all of the remaining steps to turn those grains into food, ideally into bread.
As mentioned, the Romans could manage the entire processing chain themselves. Roman soldiers had sickles (falces) as part of their standard equipment (Liv. 42.64.2; Josephus BJ 3.95) and so could be deployed directly into the fields (Caes. BG 4.32; Liv. 31.2.8, 34.26.8) to reap the grain themselves. It would then be transported into the fortified camp the Romans built every time the army stopped for the night and threshed by Roman soldiers in the safety of the camp (App. Mac. 27; Liv. 42.64.2) with tools that, again, were a standard part of Roman equipment. Roman soldiers were then issued threshed grains as part of their rations, which they milled themselves (or made into a porridge called puls) using “handmills”. These were not small devices, but roughly 27kg (59.5lbs) hand-turned mills (Marcus Junkelmann reconstructed them quite ably); we generally assume that they were probably carried on the mules on the march, one for each contubernium (tent-group of 6-8; cf. Plut. Ant. 45.4). Getting soldiers to do their own milling was a feat of discipline – this is tough work to do by hand and milling a daily ration would take one of the soldiers of the group around two hours. Roman soldiers then baked their bread in their own campfires (Hdn 4.7.4-6; Dio Cass. 62.5.5), though generals also sometimes prepared food supplies in advance of operations via what seem to be central bakeries. This decentralization was part and parcel of the unusual sophistication of Roman logistics; it enabled a greater degree of flexibility for Roman armies.
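Those figures imply a substantial daily labor bill, which is easy to sketch out. The legion strength below is my own assumed round number, and I am reading “a daily ration” as the tent-group’s combined grain; only the eight-man contubernium, the 27kg mill and the roughly two hours of milling come from the text above:

```python
# Back-of-the-envelope: the daily milling burden for one legion.
# Assumed: ~4,800 men at paper strength. From the text: contubernia of
# 6-8 men sharing one 27kg handmill, ~2 hours to mill a daily ration.

legion_strength = 4800
contubernium_size = 8
hours_per_mill_per_day = 2
mill_mass_kg = 27

handmills = legion_strength // contubernium_size          # 600 mills
milling_man_hours = handmills * hours_per_mill_per_day    # 1,200 man-hours/day
mill_train_tonnes = handmills * mill_mass_kg / 1000       # ~16 tonnes on the mules

print(handmills, milling_man_hours, round(mill_train_tonnes, 1))
```

Over a thousand man-hours of grinding every single day, plus sixteen-odd tonnes of milling equipment in the baggage train: that is the price of being able to eat bread from plundered grain.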
Greek hoplite armies do not seem generally to have been able to reap, thresh or mill grain on the march (on this see J.W. Lee, op. cit.; there’s also a fantastic chapter on the organization of Greek military food supply by Matthew Sears forthcoming in a Brill Companion volume one of these years – don’t worry, when it appears, you will know!). Xenophon’s Ten Thousand are thus frequently forced to resort to making porridge or roasting grains when they cannot forage supplies of already-milled flour; they try hard to negotiate for markets on their route of march so they can just buy food. Famously, the Spartan army, despoiling ripe Athenian fields, runs out of supplies (Thuc. 2.23); it’s not clear what sort of supplies were lacking but food and fodder seems the obvious choice, suggesting that the Spartans could at best only incompletely utilize the Athenian grain. All of which contributed to the limited operational endurance of hoplite armies in the absence of friendly communities providing supplies.
Macedonian armies were in rather better shape. Alexander’s soldiers seem to have had handmills (note on this Engels, op. cit.) which already provides a huge advantage over earlier Greek armies. Grain is generally (as noted in our series on it) stored and transported after threshing and winnowing but before milling because this is the form in which it has the best balance of longevity and compactness. That means that granaries and storehouses are mostly going to contain threshed and winnowed grains, not flour (nor freshly reaped stalks). An army which can mill can thus plunder central points of food storage and then transport all of that food as grain, which is more portable and keeps better than flour or bread.
Early modern armies varied quite a lot in their logistical capabilities. There is a fair bit of evidence for cooking in the camp being done by the women of the campaign community in some armies, but also centralized kitchen messes for each company (Lynn op. cit. 124-126); the role of camp women in food production declines over time but there is also evidence for soldiers being assigned to cooking duties in the 1600s. On the other hand, the Army of Flanders seems to have relied primarily on external merchants (so sutlers, but also larger-scale contractors) to supply the pan de munición ration-bread that the army needed, essentially contracting out the core of the food system. Parker (op. cit. 137) notes the Army of Flanders receiving some 39,000 loaves of bread per day from its contractors on average between April 1678 and February of 1679.
That created all sorts of problems. For one, the quality of the pan de munición was highly variable. Unlike soldiers cooking for themselves or their mess-mates, contractors had every incentive to cut corners and did so. Moreover, much of this contracting was done on credit and when Spanish royal credit failed (as it did in 1557, 1560, 1575, 1596, 1607, 1627, 1647 and 1653, Parker op. cit. 125-7) that could disrupt the entire supply system as contractors suddenly found the debts the crown had run up with them “restructured” (via a “Decree of Bankruptcy”) to the benefit of Spain. And of course that might well lead to thousands of angry, hungry, unpaid men with weapons and military training, which in turn led to disasters like the Sack of Antwerp (1576), because without those contractors the army could not handle its logistical needs on its own. It’s also hard not to conclude that this structure increased the overall cost of the Army of Flanders (which was astronomical) because it could never “make the war feed itself” in the words of Cato the Elder (Liv 34.9.12; note that it was rare even for the Romans for a war to “feed itself” entirely through forage, but one could at least defray some costs to the enemy during offensive operations). That said, this contractor-supplied bread also did not free the Army of Flanders from the need to forage (or even pillage) because – as noted last time – their rations were quite low, leading soldiers to “offset” their low rations with purchases (often using money gained through pillage) or foraging.
Of course, added to this are all sorts of foodstuffs that aren’t grain: meat, fruits, vegetables, cheeses, etc. Fortunately an army needs a lot less of these because grains make up the bulk of the calories eaten and even more fortunately these require less processing to be edible. But we should still note their importance because even an army with a secure stockpile of grain may want to forage the surrounding area to get supplies of more perishable foodstuffs to increase food variety and fill in the nutritional gaps of a pure-grain diet. The good news for our army is that the places they are likely to find food (small towns and rural villages) are also likely to be sources of these supplementary foods. By and large that is going to mean that armies on the march measure their supplies and their foraging in grain and then supplement that grain with whatever else they happen to have obtained in the process of getting that grain. Armies in peacetime or permanent bases may have a standard diet, but a wartime army on the march must make do with whatever is available locally.
So that’s what we need: water, fodder, firewood and food; the latter mostly grains with some supplements, but the grain itself probably needs to be in at least a partially processed form (threshed and sometimes also milled), in order to be useful to our army. And we need a lot of all of these things: tons daily. But – and this is important – notice how all of the goods we need (water, firewood, fodder, food) are things that agrarian small farmers also need. This is the crucial advantage of pre-industrial logistics; unlike a modern army which needs lots of things not normally produced or stockpiled by a civilian economy in quantity (artillery shells, high explosives, aviation fuel, etc.), everything our army needs is a staple product or resource of the agricultural economy.
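How many tons daily? A rough sketch of the scale, using per-head rates I’ve assumed purely for illustration (roughly 1.5kg of grain per man, a mixed fodder ration per animal, and drinking water; none of these exact figures come from the post):

```python
# Rough daily supply bill for a large pre-industrial field army.
# Every per-head rate here is an illustrative assumption.

soldiers = 40_000
animals = 10_000   # pack animals, draft animals and cavalry mounts (assumed)

grain_t  = soldiers * 1.5 / 1000                 # ~1.5 kg grain/man/day (assumed)
fodder_t = animals * (2.5 + 5.0) / 1000          # dry + green fodder (assumed)
water_t  = (soldiers * 2 + animals * 20) / 1000  # drinking water only (assumed)

print(f"grain:  {grain_t:.0f} t/day")   # ~60 t
print(f"fodder: {fodder_t:.0f} t/day")  # ~75 t
print(f"water:  {water_t:.0f} t/day")   # ~280 t
```

Even on generous rounding, that is scores of tonnes of grain alone every single day – which is why the surrounding countryside, rather than the baggage train, has to supply most of it.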
Finally we need to note in addition to this that while we generally speak of “forage” for supplies and “pillage” or “plunder” for armies making off with other valuables, these were almost always connected activities. Soldiers that were foraging would also look for valuables to pillage: someone stealing the bread a family needs to live is not going to think twice about also nicking their dinnerware. Sadly we must also note that very frequently the valuables that soldiers looted were people, either to be sold into slavery, held for ransom, pressed into work for the army, or – and as I said we’re going to be frank about this – abducted for the purpose of sexual assault (or some combination of the above).
And so a rural countryside, populated by farms and farmers, is in essence a vast field of resources for an army. How they get them is going to depend on both the army’s organization and capabilities and the status of the local communities.
Bret Devereaux, “Collections: Logistics, How Did They Do It, Part II: Foraging”, A Collection of Unmitigated Pedantry, 2022-07-29.
January 16, 2025
QotD: “At promise” youth
A new law in California bans the use, in official documents, of the term “at risk” to describe youth identified by social workers, teachers, or the courts as likely to drop out of school, join a gang, or go to jail. Los Angeles assemblyman Reginald B. Jones-Sawyer, who sponsored the legislation, explained that “words matter”. By designating children as “at risk”, he says, “we automatically put them in the school-to-prison pipeline. Many of them, when labeled that, are not able to exceed above that.”
The idea that the term “at risk” assigns outcomes, rather than describes unfortunate possibilities, grants social workers deterministic authority most would be surprised to learn they possess. Contrary to Jones-Sawyer’s characterization of “at risk” as consigning kids to roles as outcasts or losers, the term originated in the 1980s as a less harsh and stigmatizing substitute for “juvenile delinquent”, to describe vulnerable children who seemed to be on the wrong path. The idea of young people at “risk” of social failure buttressed the idea that government services and support could ameliorate or hedge these risks.
Instead of calling vulnerable kids “at risk”, says Jones-Sawyer, “we’re going to call them ‘at-promise’ because they’re the promise of the future”. The replacement term — the only expression now legally permitted in California education and penal codes — has no independent meaning in English. Usually we call people about whom we’re hopeful “promising”. The language of the statute is contradictory and garbled, too. “For purposes of this article, ‘at-promise pupil’ means a pupil enrolled in high school who is at risk of dropping out of school, as indicated by at least three of the following criteria: Past record of irregular attendance … Past record of underachievement … Past record of low motivation or a disinterest in the regular school program.” In other words, “at-promise” kids are underachievers with little interest in school, who are “at risk of dropping out”. Without casting these kids as lost causes, in what sense are they “at promise”, and to what extent does designating them as “at risk” make them so?
This abuse of language is Orwellian in the truest sense, in that it seeks to alter words in order to bring about change that lies beyond the scope of nomenclature. Jones-Sawyer says that the term “at risk” is what places youth in the “school-to-prison pipeline”, as if deviance from norms and failure to thrive in school are contingent on social-service terminology. The logic is backward and obviously naive: if all it took to reform society were new names for things, then we would all be living in utopia.
Seth Barron, “Orwellian Word Games”, City Journal, 2020-02-19.
January 15, 2025
QotD: Innovations hiding in plain sight
This sentence, from the Wall Street Journal, strikes me as being profoundly wrong:
Today, another half-century later, a coast-to-coast flight still takes you as long as it took your father in the 1970s. And with the major exception of computers, nothing in your luggage is likely to be much more useful or valuable than dad’s equivalent.
It may well be the case that aeroplanes fly at the same speeds that they did in the 1970s. I don’t know for sure, but my understanding is that supersonic speeds were banned due to noise factors. No doubt someone in the thread will clear that up. Let’s also concede the point about “major exception of computers” – like wow, let’s ignore the single greatest area of human innovation in the past 20 years.
Okay – the material your luggage is made out of is very strong and very lightweight compared to luggage in the 1970s. Your luggage will have wheels on it now. Luggage with wheels would have been a luxury item in the 1970s. The entertainment on the flight will be much better than what it was in the 1970s. Remember the single movie in the cabin? That was a feature of flying until the late 1990s. I reckon the food would be better today, too. Hard to believe, but yes.
Then what about the computers? Paper tickets? Movies on demand on your own device? Books loaded on your own device?
So it may be true that the experience of flying is very similar – hurry up and wait, fly through the air, and arrive at a destination faster than all alternatives. But many, many aspects of the experience are very different and much improved. Cheaper too.
When thinking of innovation, it’s not just gadgets and new-fangled things that we should think about – it’s improved business models and improvements in pre-existing gadgets that we should think about too.
Sinclair Davidson, “Has innovation stalled?”, Catallaxy Files, 2019-12-14.
January 14, 2025
QotD: Ritual in medieval daily life
I am not in fact claiming that medieval Catholicism was mere ritual, but let’s stipulate for the sake of argument that it was — that so long as you bought your indulgences and gave your mite to the Sacred Confraternity of the Holy Whatever and showed up and stayed awake every Sunday as the priest blathered incomprehensible Latin at you, your salvation was assured, no matter how big a reprobate you might be in your “private” life. Despite it all, there are two enormous advantages to this system:
First, n.b. that “private” is in quotation marks up there. Medieval men didn’t have private lives as we’d understand them. Indeed, I’ve heard it argued by cultural anthropology types that medieval men didn’t think of themselves as individuals at all, and while I’m in no position to judge all the evidence, it seems to accord with some of the most baffling-to-us aspects of medieval behavior. Consider a world in which a tag like “red-haired John” was sufficient to name a man for all practical purposes, and in which even literate men didn’t spell their own names the same way twice in the same letter. Perhaps this indicates a rock-solid sense of individuality, but I’d argue the opposite — it doesn’t matter what the marks on the paper are, or that there’s another guy named John in the same village with red hair. That person is so enmeshed in the life of his community — family, clan, parish, the Sacred Confraternity of the Holy Whatever — that “he” doesn’t really exist without them.
Should he find himself away from his village — maybe he’s the lone survivor of the Black Death — then he’d happily become someone completely different. The new community in which he found himself might start out as “a valley full of solitary cowboys”, as the old Stephen Leacock joke went — they were all lone survivors of the Plague — but pretty soon they’d enmesh themselves into a real community, and red-haired John would cease to be red-haired John. He’d probably literally forget it, because it doesn’t matter — now he’s “John the Blacksmith” or whatever. Since so many of our problems stem from aggressive, indeed overweening, assertions of individuality, a return to public ritual life would go a long way to fixing them.
The second huge advantage, tied to the first, is that community ritual life is objective. Maybe there was such a thing as “private life” in a medieval village, and maybe “red-haired John” really was a reprobate in his, but nonetheless, red-haired John performed all his communal functions — the ones that kept the community vital, and often quite literally alive — perfectly. You know exactly who is failing to hold up his end in a medieval village, and can censure him for it, objectively — either you’re at mass or you’re not; either you paid your tithe or you didn’t; and since the sacrament of “confession” took place in the open air — Cardinal Borromeo’s confessional box being an integral part of the Counter-Reformation — everyone knew how well you performed, or didn’t, in whatever “private” life you had.
Take all that away, and you’ve got process guys who feel it’s their sacred duty — as in, necessary for their own souls’ sake — to infer what’s going on in yours. Strip away the public ritual, and now you have to find some other way to make everyone’s private business public … I don’t think it’s unfair to say that Calvinism is really Karen-ism, and if it IS unfair, I don’t care, because fuck Calvin, the world would’ve been a much better place had he been strangled in his crib.
A man is only as good as the public performance of his public duties. And, crucially, he’s no worse than that, either. Since process guys will always pervert the process in the name of more efficiently reaching the outcome, process guys must always be kept on the shortest leash. Send them down to the countryside periodically to reconnect with the laboring masses …
Severian, “Faith vs. Works”, Rotten Chestnuts, 2021-09-07.