Mansa Musa’s good intentions may be the first case in history of failed foreign aid. Known as the “Lord of the Wangara Mines”, Mansa Musa I ruled the Empire of Mali between 1312 and 1337. Trade in gold, salt, copper, and ivory made Mansa Musa the richest man in world history.
As a practicing Muslim, Mansa Musa decided to visit Mecca in 1324. It is estimated that his caravan comprised 8,000 soldiers and courtiers — others estimate a total of 60,000 — along with 12,000 slaves carrying 48,000 pounds of gold and 100 camels bearing 300 pounds of gold each. For greater spectacle, another 500 servants preceded the caravan, each carrying a gold staff weighing between 6 and 10.5 pounds. Totaling the estimates, he carried approximately 38 tons of gold across the African continent, the equivalent today of the gold reserves in Malaysia’s central bank — more than countries like Peru, Hungary or Qatar have in their vaults.
On his way, the Mansa of Mali stayed for three months in Cairo. Every day he gave gold bars to the poor, scholars, and local officials. Mansa’s emissaries toured the bazaars paying a premium in gold. The Arab historian Al-Makrizi (1364-1442) relates that Mansa Musa’s gifts “astonished the eye by their beauty and splendor”. But the joy was short-lived. So much gold flooded the streets of Cairo that the value of the local gold dinar fell by 20 percent, and it took the city about 12 years to recover from the inflationary pressure that such a devaluation caused.
Orestes R Betancourt Ponce de León, “5 Historic Examples of Foreign Aid Efforts Gone Wrong”, FEE Stories, 2021-06-06.
March 6, 2024
QotD: Mansa Musa’s disastrous foreign aid to Cairo
March 5, 2024
QotD: Begging the question
… I hate, hate — with a burning passion — the modern use of the phrase “begs the question”. That’s NOT what it means, damn it!! “Begs the question” is a translation of the Latin petitio principii, which is a time-hallowed description of one of the most common of mankind’s logical fallacies — an “argument” that assumes the conclusion in the premises. Please don’t ever use “begs the question” in the modern sense — the fact that we don’t know what it actually means is one of the reasons it’s so depressingly common today.
Severian, “Mental Middlemen II: Sex and the City and Self-Confidence”, Rotten Chestnuts, 2021-05-06.
March 4, 2024
QotD: The “ABC” movement in wine
Back in the 1990s when I first got into wine, there was a movement against the growing homogenisation of the world’s wine called ABC: “anything but chardonnay”, which handily also stood for “anything but cabernet”. This was at a time when growers from Piedmont to Penedès were planting chardonnay (or cabernet sauvignon for reds) instead of local grape varieties.
There was a worry that in the future all wine would taste the same while the more obscure varieties would disappear. As an ABC enthusiast, I thought it far better and more interesting to drink riesling, or fiano or albariño or esgana cão (a Madeiran grape that means “dog strangler” in Portuguese because of its ferociously high acidity).
Jancis Robinson was also not a chardonnay fan. Recently I’ve been watching her Wine Course made in the 1990s on YouTube. It still holds up well, and the budget by modern standards is mind-blowing; one moment she’s in Burgundy, the next she’s in Australia.
Can we have another series like this again soon please? Each episode is devoted to a grape, and in the chardonnay one Jancis (she’s one of the few famous people it is acceptable to refer to by just her first name, like Britney or Boris) can barely contain her contempt for many wines made from the variety, describing them as “sugar water”. Nor is she keen on the world’s second favourite variety, sauvignon blanc.
I was with Jancis. In fact, I was with Jancis on most things, which points to a possible explanation for my chardonnay conversion. When I started out, I hadn’t developed my own tastes and so I was buying wine that I thought sounded sophisticated — such as Mosel riesling.
But as I’ve got older, I’m now buying bottles purely because I like them. Furthermore, I cook and entertain a lot more than I did when I was in my twenties and chardonnay, especially white Burgundy, goes with pretty much anything. If you don’t know what to order when eating out then a bottle of Mâcon-Villages will cover all your bases (the red equivalent if you’re interested is a bottle of Beaujolais).
But your average chardonnay has also got a lot better since the ’90s, or perhaps I should say that it leans more towards my tastes. I’ve been watching a lot of old episodes of Frasier recently and the chardonnay they drink is nearly orange. This style, which is still very popular in the US, is based on very ripe, some might say overripe, grapes which are then treated to a pre-fermentation maceration to get colour and body out of the skins.
Following fermentation with a yeast which accentuates tropical fruit flavours, the wine would be perked up with some tartaric acid and then either aged in new oak casks or, more likely for cheaper wines, dosed with oak chips.
The finished product would be thick and syrupy with a deep golden colour. Not very chic but a revelation in 1980s Britain when everyday white wine meant Blue Nun or Black Tower. They’re what Oz Clarke called “bottled sunshine” in his colourful slots with Jilly Goolden on BBC2’s Food and Drink programme. Like those loud waistcoats, worn with a morning suit or dinner jacket, that everyone thought were so witty, they were great fun then but a bit embarrassing now.
Henry Jeffreys, “Chard: an apology”, The Critic, 2023-11-14.
March 3, 2024
QotD: The pushback against EVs
Parts of the automotive press seem to have sensed conspiracy in this. One senior figure recently asked who exactly has been “driving the anti-electric-car agenda”, while a respected publication claimed an “increasingly vehement anti-electric-car rhetoric” had hampered consumer confidence. The truth, however, is far simpler: people aren’t buying electric cars because they’re not very good.
Don’t think me a luddite – EVs are lovely in their own right. Smooth, brisk and easy to drive, there is a certain serenity in piloting a battery-powered vehicle. But EVs don’t exist in isolation. Instead, they are competing with a century of petrol and diesel power that has established cars as providers of comfort, freedom and convenience. And while the quiet nature of an EV arguably brings more comfort than an engine, batteries offer so much less freedom and convenience than fuel tanks as to barely be worth comparing.
My old diesel Mercedes, for instance, cost £4,000 and could go from London to Aberdeen, and most of the way back, on a single tank of fuel. A typical EV would need to recharge at least twice – just on the way up. This would add perhaps 90 minutes to the journey, assuming the public plugs were working and conveniently located. That, in my book, makes an EV demonstrably inconvenient. And cries of “how often do you drive to Aberdeen?” don’t hold water, because the freedom cars bring is absolutely intrinsic to their appeal. Perhaps tomorrow I get the urge to cross the Bridge of Dee; perhaps it’s none of your business. That’s freedom for you, and EVs curtail it.
Hugo Griffiths, “Why the public isn’t buying electric cars”, Spiked, 2023-11-20.
March 2, 2024
QotD: Early siege warfare
Now the besieger’s side of the equation may seem like an odd place to start a primer on fortifications, but it actually makes a fair bit of sense, because the capabilities of a potential attacker are where most thinking about fortification begins. Siegecraft, both offensive and defensive, is a case of “antagonistic co-evolution”, a form of evolution through opposition where each side of the relationship evolves new features in response to the other: neither offensive siege techniques nor fortifications evolve in isolation but rather in response to each other.
In many ways the choice of where to begin following that process of evolution is arbitrary. We could in theory start anywhere from the very distant past or only very recently, but in this case I think it makes sense to begin with the early Near Eastern iron age because of the nature of our evidence. While it is clear that siege warfare must have been an important part of not only bronze age warfare but even pre-bronze age warfare, sources for the details of its practice in that era are sparse (in part because, as we’ll see, siege warfare was a sort of job done by lower-status soldiers who often didn’t figure much into artwork focused on royal self-representation and legitimacy-building).
But as we move into the iron age, the dominant power that emerges in the Near East is the (Neo-)Assyrian Empire, the rulers of which make a point of foregrounding their siegecraft as part of a broader program of discouraging revolt by stressing the fearsome abilities of the Assyrian army (which in turn had much of its strength in its professional infantry). Consequently, we have some very useful artistic depictions of the Assyrian army doing siege work and at the same time some incomplete but still very useful information about the structure of the army itself. Moreover, it is just as the Assyrian Empire’s day is coming to a close (collapse in 609) that the surviving source base begins to grow markedly more robust (particularly, but not exclusively, in Greece), giving us dense descriptions of siege work (and even some manuals concerning it) in the following centuries, which we can in turn bring to the Assyrian evidence to better understand it. So this is a good place to start because it is the earliest point where we are really on firm ground in terms of understanding siegecraft in some detail. This does mean we are starting in medias res, with sophisticated states already using complex armies to assault fairly complex, sophisticated fortifications, which is worth keeping in mind.
That said, it should be noted that this is hardly beginning at the beginning. The earliest fortifications in most regions of the world were wooden and probably very simple (often just a palisade with perhaps an elevated watch-post), but by the late 8th century, well-defended sites (like walled cities) already sported sophisticated systems of stone walls and towers for defense. That caveat is in turn necessary because siegecraft didn’t evolve the same way everywhere: precisely because this is a system of antagonistic co-evolution it means that in places where either offensive or defensive methods (or technologies) took a different turn, one can end up with very different results down the line (something we’ll see especially with gunpowder).
Bret Devereaux, “Collections: Fortification, Part I: The Besieger’s Playbook”, A Collection of Unmitigated Pedantry, 2021-10-29.
March 1, 2024
QotD: Canadian neuroticism
Canada remains unmatched in its ability to turn somebody else’s tragedy into a debate about our own neuroses.
Paul Wells, quoted by Mark Steyn, Western Standard 2005-01-31.
February 29, 2024
QotD: The fallacy of “engineering” economic growth
[The] ability to achieve the commercial exploitation of new scientific knowledge is heavily dependent also upon – as Deirdre McCloskey explains – people’s attitudes toward market-tested innovation, creative destruction, and progress.
Economic growth – while it is made possible by, and itself makes possible, countless impressive mechanical and technological feats – is not itself a mechanical, technological feat. Sustained economic growth cannot be engineered as can successful missions to the moon. The economic, legal, and social institutions required for there to be sustained growth are many, indescribably intricate and complex, and largely unseen (and, hence, unappreciated). To observe with one’s senses, statistics, and measuring instruments a successful economy, or even just a successful firm within a successful economy, is to observe only the surface of economic and social reality. A vast, deep ocean of complex attitudes and margins of adjustments swirls beneath.
Among the many, typically unappreciated implications of this reality is this: even if people in country B manage to acquire, by whatever means, all of the intellectual property belonging to the people of economically successful country A, the people of country B do not thereby gain any sure means of successfully “growing” their economy.
Don Boudreaux, “Quotation of the Day…”, Café Hayek, 2019-08-19.
February 28, 2024
QotD: When the rules in the dating market all changed
So far we’ve only been talking about guys, but the gals went through their own version of the same process way back in the days. Indeed, it’s because the girls changed that the guys got into PUA in the first place.
Under the old dispensation, back before the Clinton Era (1988-2001), everyone acknowledged that there were a lot of users and abusers, douchebags and parasites and losers, out there in the world. That being the case, simply being an all-around ok guy with a steady job — what the PUAs came to term “beta providers” — was, in itself, a pretty solid resume in the dating market. “Just be yourself” was every guy’s dad’s advice when it came to dating, and back then it was pretty solid, since it was assumed that the decent job etc. flowed from being a decent human being. And since every girl’s mom was telling her complementary things, the system worked … until it didn’t, and you can date the change precisely: June 6, 1998, the premiere of the HBO series Sex and the City.
[…] Everyone has met one of those “one of the guys”-type girls. They’re great fun, and while you know what I mean when I say they’re not necessarily marriage material as-is, you therefore also know what I mean when I say they really are what feminists all claim to be: Strong, confident women. They are what they are, and they know it, take it or leave it.
The problem is, most women — and, it goes without saying, all feminists — aren’t “strong, confident women”, in the same way the vast majority of guys aren’t naturally “alpha males”. That’s the dialectic I’ve been trying to get at in this series of posts. Sex and the City, as much as every episode needs to be burned and the ashes shot into deep space, was just the manifestation of a long-developing process. Thanks to all that “self-esteem” shit that started in the Seventies, sometime in the Clinton Era a critical mass of young women decided that what they needed was to be “strong” and “self-confident”. But they didn’t know how to do that, because the people telling them this were fat lesbian college professors. Then HBO, sensing a valuable market niche, got into the act …
Sex and the City […] is the gayest show in the history of television. Carrie and the Gals don’t act like women; they act the way women think men act — which is to say, they act like gay men. Recall that the late 1990s also saw an explosion of female “comedians”, whose one “joke” was some version of “I got my period today, but damn, I still crave dick.” (Sex and the City, you’ll recall, was pitched as a comedy). And that’s a serious problem, because as every straight guy has said at least once in his life, being gay would be fabulous if not for the “sex with guys” part. I mean, how awesome would it be (every young man thinks), if you could reorient your whole life around your crotch?
Severian, “Mental Middlemen III: SATC”, Rotten Chestnuts, 2021-05-06.
February 27, 2024
QotD: The role of the scholar
The first Great Commandment of scholarship is be honest; everything else is commentary. All the standards and methods that scholars have developed over the ages can be reduced to “be honest”.
Of course, fraudulent scholars have always existed, but it seems to me — not that I’ve conducted a study of the matter — that clear dishonesty by leading scholars no longer elicits widespread condemnation and no longer discredits the guilty parties to the extent that it used to. The Nancy MacLean affair [her book Democracy in Chains (2017) was an extended character assassination of Nobel Prize winning economist James M. Buchanan] is clear-cut. Thomas Piketty’s work is either blatantly dishonest or spectacularly incompetent. And many other examples might be adduced. Ideology, it seems, has overwhelmed scholars in the humanities and social sciences to an unprecedented degree.
Scholars should be seeking the truth, the whole truth, and nothing but the truth, however much they appreciate that this objective can never be attained fully. They are obliged to strive. If they clearly are not trying, and indeed are twisting and turning in the ideological wind above all, real scholars should drum them out of their professions as unworthy of recognition by genuine scholars.
Bob Higgs, Facebook, 2019-08-28.
February 26, 2024
QotD: Lockdown rebuttal
First, lockdowns were neither prudent nor essential. It’s not as if government officials considered the collateral damage to be inflicted on the economy, society, and health – not all health problems are caused by covid – by the lockdowns and then rationally concluded that the benefits of locking down outweighed these costs. No. The collateral damages were ignored. As the New York Times’s Joe Nocera and Vanity Fair’s Bethany McLean – authors of the just-released The Big Fail – write, “But there was never any science behind lockdowns – not a single study had ever been undertaken to measure their efficacy in stopping a pandemic. When you got right down to it, lockdowns were little more than a giant experiment.”1 In no universe is such a policy prudent.
Nor were lockdowns “essential”. As Nocera and McLean note,
… the weight of the evidence seems to be with those who say that lockdowns did not save many lives. By our count, there are at least 50 studies that come to the same conclusion. After The Big Fail went to press, The Lancet published a study comparing the COVID infection rate and death rate in the 50 states. It concluded that “SARS-CoV-2 infections and COVID-19 deaths disproportionately clustered in U.S. states with lower mean years of education, higher poverty rates, limited access to quality health care, and less interpersonal trust – the trust that people report having in one another.” These sociological factors appear to have made a bigger difference than lockdowns (which were “associated with a statistically significant and meaningfully large reduction in the cumulative infection rate, but not the cumulative death rate”.)
Second, the lockdowns were, contra Mr. Orrell’s claim, utterly unprecedented. Isolating individuals known to be infected, such as Typhoid Mary, is a categorically different measure than locking down whole societies. Such lockdowns were never used until China locked Wuhan down in early 2020. Here again are Nocera and McLean: “On April 8, 2020, the Chinese government lifted its lockdown of Wuhan. It had lasted 76 days – two and a half months during which no one was allowed to leave this industrial city of 11 million people, or even leave their homes. Until the Chinese government deployed this tactic, a strict batten-down-the-hatches approach had never been used before to combat a pandemic. Yes, for centuries infected people had been quarantined in their homes, where they would either recover or die. But that was very different from locking down an entire city; the World Health Organization called it ‘unprecedented in public health history’.”
It’s jarring to encounter in an essay that features many excellent arguments – as Mr. Orrell’s does – such irrational and utterly uninformed claims as Mr. Orrell offers about lockdowns.
Donald J. Boudreaux, responding to an article by Brent Orrell in Law & Liberty, 2023-10-31.
1. “COVID Lockdowns Were a Giant Experiment. It Was a Failure.” Intelligencer, October 30, 2023.
February 25, 2024
QotD: From the M1903 Springfield to the M1 Garand
The M1 Garand is correctly considered the best battle rifle of World War II. It was the only semi-automatic rifle — meaning that it fired each time the operator pulled the trigger — to be the standard issue infantry rifle of any army during the war. Other forces were equipped with bolt-action rifles — the British Lee Enfield, Soviet Mosin Nagant, Japanese Type 99, German K98k, etc. — that required the operator to manually pull back a bolt to eject the [expended cartridge], and then push it forward again to insert a fresh cartridge into the chamber. The most obvious advantage was an increased rate of fire: a semi-automatic rifleman with an M1 had an official aimed rate of fire of 24 shots per minute.1 Compare this to the 15 aimed shots that British soldiers were expected to pop off with a bolt-action Lee Enfield in a “mad minute” drill. And the Lee Enfield was one of the fastest bolt-action rifles ever produced! In a pinch, a GI could blast out a clip in a few seconds, approximating a burst from an automatic weapon.2 Furthermore, with semi-automatic fire, the shooter could stay focused on his target, whereas working the bolt generally forced the shooter off target, requiring time to reacquire a proper sight picture.
Lt. General George Patton famously called the M1 Garand “the greatest battle implement ever devised”, a quote often repeated reverently in the context of World War II nostalgia. The US Government after the war gave away millions of M1 Garands, making it a popular civilian rifle for hunting and competitive shooting.
But nostalgia aside, it is also possible to view, from the high perch of hindsight, the M1 Garand as a missed opportunity. The most advanced battle rifle of World War II ultimately looked back too much to the past rather than pointing the way to the future. During its development, senior military officials applied the perceived lessons of the Spanish American War to a rifle designed to solve the problems of the Great War. This intervention prevented the M1 Garand from becoming something closer to a modern assault rifle, with an intermediate power cartridge and higher magazine capacity. The army was in no hurry to ditch the rifle that had won World War II, meaning that the United States did not field its first true assault rifle until two decades after the concept had been invented by the Germans in 1943 and soon successfully adapted by the Soviets in 1947. The first American assault rifle, the M16, would not debut until 1965.
First, a necessary caveat: rifles were not the decisive weapon in World War II. For the most part the small arms deployed by the United States had been designed to fight World War I: the Browning Automatic Rifle (1918), the M1919 Browning Medium Machine Gun (as the date implies, first fielded in 1919) and the M2 Browning heavy machine gun, designed in 1918 and so good it is still used over a century later. In contrast, German machine guns were somewhat more recent in design: the MG 34 (as the name implies, first fielded 1934) and the MG 42. During the war, the Germans invented the first true assault rifle, the StG 44.
The secret sauce of the US Army by 1944-45 had little to do with firearms at all: it was a combination of ready mobility through motorization combined with deadly artillery and close air support, enabled by an unmatched communication system that allowed forward observers to direct and adjust fires to lethal effect. The American way of war was rooted in fleets of trucks and jeeps, networks of radios and heaps of shells. Having a nice semi-automatic rifle was ancillary to a conflict like World War II. But the M1 Garand is a useful window into the vagaries of military procurement and technological innovation, which require developers to at once predict the operational environment of the future and analyze the lessons of the past.
Throughout the 1920s, officers at the Infantry School in Fort Benning experimented with new tactics that they hoped would again allow for mobile infantry combat and avoid the trench stalemate of World War I. The basic solution was some form of “fire and maneuver”, in which one section of a unit (say a squad or platoon) would lay down a sufficient base of small arms fire to suppress the enemy so as to facilitate the other section’s advance. By alternating suppression and assault the element might leapfrog its way forward, even against entrenched enemy machine guns.
For such a tactic to work, infantry platoons needed a lot of firepower. Some might come from light machine guns, like the Browning Automatic Rifle, which was issued to individual infantry squads. But it was generally realized that individual infantrymen needed to be capable of a far higher rate of fire than could be provided by the standard issue rifle, the bolt-action M1903 Springfield, fed from a five-round magazine. To this end, the US government set about developing a semi-automatic rifle.
The charge was taken up by John C. Garand, a Canadian-born, self-taught firearms designer who worked for the Springfield Armory in Massachusetts. Garand’s solution to make the rifle self-loading was to insert a piston beneath the barrel. When the gunpowder exploded in the cartridge, the gas produced in the explosion propelled the bullet out the barrel. But some gas was bled off into a cylinder below the barrel (which gives the M1 its peculiar appearance of seeming to have two barrels); the pressure of the gas in the cylinder drove a piston.3 This piston was attached to an operating rod, which pushed the bolt of the rifle back, ejecting the spent casing, and allowing a spring in the internal magazine to insert another cartridge into the chamber before a separate spring pushed the operating rod forward and closed the bolt for the next shot, all in a fraction of a second.
Garand had initially worked on a rifle chambering the standard .30-06 cartridge used by the M1903 Springfield rifle (the .30 indicates that the bullet had a diameter of .30 inches, while the 06 indicates that the cartridge had been adopted in 1906; the round is often pronounced “thirty-ought-six”). But when a rival designer named John Pedersen, also affiliated with Springfield Armory, developed a semi-automatic design that chambered a lighter .276 (7mm) round, Garand retooled his project for the lighter round as well, producing a prototype known as the T3, with subsequent refinements labeled the T3E1 and T3E2. The smaller round meant that the internal magazine of the T3E1/2 could accept clips of 10 bullets, doubling the magazine capacity of the M1903 Springfield.
Army officials were very interested in the new round, but wanted proof that a lighter bullet would be sufficiently lethal in combat. A series of grisly ballistic tests were therefore ordered, pitting the .276 round against the traditional .30-06. In 1928, anesthetized pigs were shot through with both rounds. To the surprise and consternation of traditionalists, the .276 did far more damage in the so-called “Pig Board”. This is not paradoxical: the lighter round was more likely to “tumble”, precisely because it was lighter and so more likely to have its trajectory disrupted by bone and tissue; the tumble meant that more of the kinetic energy was expended inside the target, causing far more damage. The .30-06, meanwhile, as a more powerful round, was more likely to punch clean through, retaining its kinetic energy to keep moving forward after passing through the target. Eventually, tests of this sort would be used to sell the army on the lethality of the 5.56mm round used by the M16/M4, which has an even greater tumble, and causes even more grievous injuries. Out of concern that the fat bodies of pigs did not accurately replicate the human teenagers that the new round was designed to kill, a new test was inflicted upon goats, seen as more appropriately lean and therefore better analogs. The result was the same in favor of the .276 (a lighter .256 performed even better). With two rounds of tests vindicating the .276, the Army demanded that its new rifle chamber the .30-06. The final decision was made by Douglas MacArthur himself.
The .30-06 round itself had been the product of a painful lesson learned during the Spanish American War. Here, American troops, armed with Krag M1892 rifles, had found themselves badly outranged by Spanish troops armed with Mausers; the famous charge up San Juan Hill occurred after US troops had advanced for some distance under a hail of unanswered rifle fire. Given the importance of sharpshooting to the American military mythos, getting handily outranged and outshot by Spanish forces was a painful embarrassment. The first order of business had been to adopt the Mauser design: the M1903 Springfield was essentially a modified Mauser, as the US government had licensed a number of Mauser’s patents. By upgrading the M1903 to take the heavy .30-06 round, the Army ensured that soldiers could engage targets over a kilometer away. Beyond the deeply ingrained “lessons learned” from the Spanish American War, the mythos of the deadly American sharpshooter was strongly entrenched. Even as disruptors at Benning developed new infantry tactics that stressed volume of fire over accuracy, the phantoms of buckskinned frontiersmen sniping at British redcoats from a thousand paces still occupied the headspace of military leaders; they wanted a rifle with long distance accuracy. The sights on the M1 Garand adjust out to 1200 yards.
But MacArthur’s reasoning seems to have been primarily motivated by administrative and logistical concerns, as he cited the generic difficulties of fielding a lighter round. Some of these challenges may have been related to production and distribution of a sufficient stockpile of new caliber ammunition. There may have also been a concern with the new round complicating the logistics of line companies. The army also used .30-06 for the BAR and Browning medium machine gun, and having all of these shoot the same round in theory simplified the supply of line companies, and allowed for cross-leveling between weapons systems. Similar concerns have the US Army maintaining a policy of only having 5.56mm weapons at the squad level (thus the M4 and M249 Squad Automatic Weapon both shoot 5.56, and the SAW can shoot from M4 magazines).4 Still, MacArthur’s concerns seem unfounded in hindsight. The United States was about to produce billions of bullets during World War II. American troops were about to be so lavishly supplied that distributing two types of bullets would have been readily feasible given the soon to be proven quality of American logistics.5
With MacArthur’s edict, Garand retooled his rifle back to the .30-06 caliber, and his design was finally accepted in 1936. But the larger and more powerful round required a design change: the internal magazine now took clips of eight bullets instead of ten. Two rounds may not sound like much, but every bullet can be precious in a firefight, and this represented a 20% reduction in magazine capacity. Spread over a company-sized element, the reduced clip capacity represented over two thousand fewer rounds that a company commander could expect to fire and maneuver with. Indeed, the M1’s volume of fire proved generally insufficient to suppress the enemy on its own during the war, evidenced by the habit of equipping rifle squads with two Browning Automatic Rifles, instead of one. Marine divisions by the end of the war often deployed three BARs per squad.6
Michael Taylor, “Michael Taylor on The Development of the M1 Garand and its Implications”, A Collection of Unmitigated Pedantry, 2023-09-08.
1. TM 9-1005-222-12.
2. Thus George Wilson, a platoon leader and later company commander in the Fourth Infantry Division, describing a platoon scout stumbling upon the enemy: “The second scout emptied the eight round clip in his M-1 so fast it seemed like a machine gun … the rest of us moved very cautiously and found three dead enemy soldiers in the road.” (G. Wilson. If You Survive. Ivy, 1987).
3. John Garand’s initial design, and early production M1s, had a gas trap inserted over the muzzle to catch the gas after the bullet had exited; when this proved prone to fouling, a design modification drilled a small hole in the barrel just before the muzzle to allow the gas to bleed into the cylinder. Most M1 Garands had this “gas port” system.
4. In 2023, US Army infantry will begin transitioning to the M7 carbine and M250 Squad Automatic Weapon, which will both use a 6.8mm bullet, out of concerns that the 5.56 NATO is insufficient to penetrate body armor.
5. Editor’s Note: I agree with Michael here in principle: US logistics could have managed this. But I would also note that part of the reason American logistics were so good is that they applied MacArthur’s reasoning to everything, reusing vehicle chassis, limiting the number of different ammunition calibers and demanding interchangeable parts across the whole range of military equipment. Take one of those decisions away and the whole still functions. Take all of them away and one ends up with the mess that was German production and logistics.
6. Editor’s Note: BAR goes BARRRRRRRRRRR.
February 24, 2024
QotD: Big government
I’m Canadian and have a romantic fondness for the famous motto of the Royal Canadian Mounted Police, the one about the Mounties always getting their man. But the bigger you make the government, the more you entrust to it, the more powers you give it to nose around the country’s bank accounts, and phone calls, and e-mails, and favourite Internet porn sites, the more you’ll enfeeble it with the siren song of the soft target. The Mounties will no longer get their man, they’ll get you instead. Frankly, it’s a lot easier.
[…]
What should have died on September 11th is the liberal myth that you can regulate the world to your will. The reduction of a free-born citizenry to neutered sheep upon arrival at the airport was the most advanced expression of this delusion. So how’s the FAA reacting to September 11th? With more of the same kind of obtrusive, bullying, useless regulations that give you the comforting illusion that if they’re regulating you they must be regulating all the bad guys as well. We don’t need big government, we need lean government — government that’s stripped of its distractions and forced to concentrate on the essentials. If Hillary and Co want to argue for big government, conservatives could at least make the case for what’s really needed — grown-up government.
Mark Steyn, “Big Shift”, National Review, 2001-11-19.
February 23, 2024
QotD: Eating roti
I used chicken because I was tired of looking unsuccessfully for goat. You can get goat if you go where people from the islands live, but that would be a lot like work.
You roll the roti up and eat the curry like a bear eating a Cub Scout in a sleeping bag.
Steve H., “Roti Stuffed With Curry: Green and Mean”, Hog on Ice, 2005-01-01.
February 22, 2024
February 21, 2024
QotD: Mid-century London
Two of the novels were by, respectively, Nigel Balchin and R. C. Hutchinson, writers well regarded in their time but now mostly forgotten, while the third was by Ivy Compton-Burnett, who still has her admirers. They were quite different authors, but each had an unmistakable quality of unreconstructed English national identity, such as no writer about London — where two of the novels are set — or anywhere else in the country could now convey.
It is not that foreigners could not be found in 1949 London, which was then still a port city of some importance. In Hutchinson’s book, Elephant and Castle, set largely in the East End, one of the main characters is half-Italian, and foreigners of various nationalities have walk-on parts. But they in no way affect the strongly English character of the city. Today’s London, by contrast, seems more like a dormitory for an ever-fluctuating population than a home; even much of its physical fabric has been completely denationalized by modernist architecture of a sub-Dubai quality. It is not a melting pot, for little is left to melt into; a better culinary metaphor might be a stir-fry, the ingredients remaining unblended — though, with luck, compatible.
Theodore Dalrymple, “What Seventy Years Have Wrought”, New English Review, 2019-10-26.