Quotulatiousness

June 21, 2024

From “invention” to “tradition”

Filed under: Architecture, Britain, Europe, History — Nicholas @ 05:00

At Astral Codex Ten, Scott Alexander considers some “traditions” which were clearly invented much more recently than participants might believe:

Two NYC synagogues, one in Moorish Revival style and the other in some form of modernism (you can tell it’s not Brutalism because it’s not all decaying concrete). Like Scott, I vastly prefer the one on the left even if it isn’t totally faithful to the original Moroccan design.

    A: I like Indian food.

    B: Oh, so you like a few bites of flavorless rice daily? Because India is a very poor country, and that’s a more realistic depiction of what the average Indian person eats. And India has poor food safety laws – do you like eating in unsanitary restaurants full of rats? And are you condoning Narendra Modi’s fascist policies?

    A: I just like paneer tikka.

This is how most arguments about being “trad” sound to me. Someone points out that they like some feature of the past. Then other people object that this feature is idealized, the past wasn’t universally like that, and the past had many other bad things.

But “of the past” is just meant to be a pointer! “Indian food” is a good pointer to paneer tikka even if it’s an idealized view of how Indians actually eat, even if India has lots of other problems!

In the same way, when people say they like Moorish Revival architecture or the 1950s family structure or whatever, I think of these as pointers. It’s fine if the Moors also had some bad buildings, or not all 1950s families were really like that. Everyone knows what they mean!


But there’s another anti-tradition argument which goes deeper than this. It’s something like “ah, but you’re a hypocrite, because the people of the past weren’t trying to return to some idealized history. They just did what made sense in their present environment.”

There were hints of this in Sam Kriss’ otherwise-excellent article about a fertility festival in Hastings, England. A celebrant dressed up as a green agricultural deity figure, paraded through the street, and then got ritually murdered. Then everyone drank and partied and had a good time.

Most of the people involved assumed it derived from the Druids or something. It was popular not just as a good party, but because it felt like a connection to primeval days of magic and mystery. But actually, the Hastings festival dates from 1983. If you really stretch things, it’s loosely based on similar rituals from the 1790s. There’s no connection to anything older than that.

Kriss wrote:

    I don’t think the Jack in the Green is worse because it’s not really an ancient fertility rite, but I do think it’s a little worse because it pretends to be … tradition pretends to be a respect for the past, but it refuses to let the past inhabit its own particular time: it turns the past into eternity. The opposite of tradition is invention.

    Tradition is fake, and invention is real. Most of the human activity of the past consists of people just doing stuff … they didn’t need a reason. It didn’t need to be part of anything ancient. They were having fun.

    I’ve been thinking a lot about [a seagull float in the Hastings parade] … in the procession, the shape of the seagull became totemic. It had the intensity of a symbol, without needing to symbolise anything in particular. Another word for a symbol that burns through any referent is a god. I wasn’t kidding when I said I felt the faint urge to worship it. I don’t think it would be any more meaningful if someone had dug up some thousand-year-old seagull fetishes from a nearby field. It’s powerful simply because of what it is. Invention, just doing stuff, is the nebula that nurses newborn gods.

I’m nervous to ever disagree with Sam Kriss about ancient history, but this strikes me as totally false.

Modern traditionalists look back fondly on Victorian times. But the Victorians didn’t get their culture by just doing stuff without ever thinking of the past. They were writing pseudo-Arthurian poetry, building neo-Gothic palaces, and painting pre-Raphaelite art hearkening back to the early Renaissance. And the Renaissance itself was based on the idea of a re-naissance of Greco-Roman culture. And the Roman Empire at its peak spent half of its cultural energy obsessing over restoring the virtue of the ancient days of the Roman Republic:

    Then none was for a party;
    Then all were for the state;
    Then the great man helped the poor,
    And the poor man loved the great:
    Then lands were fairly portioned;
    Then spoils were fairly sold:
    The Romans were like brothers
    In the brave days of old.

    Now Roman is to Roman
    More hateful than a foe,
    And the Tribunes beard the high,
    And the Fathers grind the low.
    As we wax hot in faction,
    In battle we wax cold:
    Wherefore men fight not as they fought
    In the brave days of old.

(of course, this isn’t from a real Imperial Roman poem — it’s by a Victorian Brit pretending to be a later Roman yearning for the grand old days of Republican Rome. And it’s still better than any poem of the last fifty years, fight me.)

As for the ancient Roman Republic, they spoke fondly of a Golden Age when they were ruled by the god Saturn. As far as anyone knows, Saturn is a wholly mythical figure. But if he did exist, there are good odds he inspired his people (supposedly the fauns and nymphs) through stories of some even Goldener Age that came before.

June 18, 2024

Freddie deBoer contra J.J. McCullough on Conspiracy Theories

Filed under: Cancon, History, Media, Politics, USA — Nicholas @ 04:00

I saw JJ’s latest video pop up on my YouTube subscriptions page, read the headline and winced slightly. I generally like JJ’s videos even when I disagree with his presentation or interpretation, and from the title I suspected that arguing “No, ‘many conspiracy theories’ did NOT ‘turn out to be true’” would require a fair bit of, uh, curation of the theories that get discussed. Freddie deBoer — who I disagree with much more than I do with McCullough — had a similar reaction:

The latest video from conservative Canadian YouTuber JJ McCullough displays many of the attributes that make his perspective unique — he’s genuinely a right-wing figure but an arch institutionalist, a gay Millennial with the kind of vague social libertinism common to a lot of libertarian-leaning conservatives but something of a scold, a Canadian patriot who relentlessly defends the United States from the kinds of criticism of Americana that you might associate with Europe or, well, Canada — critiques of our provincialism, our consumerism, our boorish tendency to shove the rest of the world around. McCullough likes all of that stuff, more or less, while living a cosmopolitan and vaguely-arty lifestyle in groovy Vancouver. He’s perhaps best known for his war with Montreal, Francophone Canadians, and the entire province of Quebec, which fits his general esteem for a certain kind of capital-R Reasonable Anglophilia.

He reminds me, strangely, of a certain kind of secular anti-atheist, the type who still gets mad about the New Atheists despite the complete collapse of that subculture and whose own lack of belief doesn’t prevent them from waxing poetic about the glories of religion. I have a friend from grad school who grew up in an extremely repressive Christian community, and who describes leaving as an “escape”. (The kind of community where she and her sisters wore wrist-to-ankle dresses every day of their lives no matter the Oklahoma heat, weren’t allowed TV or radio, absorbed lots of corporal punishment, that sort of thing.) She has very, very little patience for people who are so annoyed by internet atheists that they become in effect advocates for religion; as she says, this kind of vague fondness for religion among the irreligious could only occur to someone who never had to live the way she did. I sort of see the same thing in McCullough — he idealizes certain aspects of America’s ethos because he has never had to live with the consequences of being surrounded by people who believe in it, who consciously or unconsciously demand that everyone else believe in it.

Anyhow, this new video is about conspiracy theories. Conspiracy theories are a good topic for understanding McCullough’s very particular ideological makeup. Conspiracy theories are famously a cross-ideological phenomenon, with both left conspiracy theories and right conspiracy theories but also conspiracy theories that don’t fit neatly into either, like 9/11 trutherism. As I said, McCullough is an institutionalist, a small-but-good government sort of guy (or so I take it) who places a great deal of value in official claims, institutions, and experts, and so he’s naturally distrustful of conspiracy theories. And he demonstrates that antipathy in this video through poking holes in a few clickbaity articles listing conspiracy theories that turned out to be true. This all amounts to feasting on a banquet of low-hanging fruit, but it’s not an illegitimate way to approach the question. I just don’t like his conclusions.

The key to McCullough’s bit here is that he doesn’t dispute that the named conspiracy theories (or “conspiracy theories”) that are asserted to be true are true. Rather, he operates by insisting that every identified conspiracy theory is in fact not a conspiracy theory according to his preferred definition. It’s not sufficient for a conspiracy theory to be broadly thought of as a conspiracy theory; it has to comport with specific rules he has devised for what a conspiracy theory entails. Effectively, that means that a conspiracy theory is only a conspiracy theory if it satisfies criteria endorsed by no one but JJ McCullough. I can’t decide if this is an isolated demand for rigor or a No True Scotsman, but either way, McCullough is here insisting on an unusually stringent definition of a conspiracy theory for the purpose of dismissing the idea that any conspiracy theories are true. And there’s a version of this that isn’t entirely wrong; there’s a tautological sense in which all conspiracy theories are false because being false is part of that definition of a conspiracy theory. But McCullough isn’t using that definition, just a particularly odd one that makes his task easier.

So the fact that cigarette manufacturers knew that cigarettes were very bad for your health but conspired to hide this fact from the public is not a conspiracy theory, according to McCullough, because other people of that era suspected that cigarettes caused lung cancer. (Actually proving that took a very long time, at least according to modern standards of causality.) I find this argument powerfully strange! You had a group of powerful people, they indisputably knew that cigarettes were very bad for your health, they indisputably conspired to suppress that information, they were fairly effective at that task. The fact that some early whistleblowers tried to raise the alarm is simply irrelevant. Check out my own proprietary formula.

Group of Powerful or Influential People + Nefarious Intent + Secrecy + Active Conspiring + Negative Consequences, Real or Potential = Conspiracy Theory

That’s a conspiracy, brother, and the tobacco company bad behavior fits. Long before information about their coverups became public knowledge, people were talking about the possibility that the tobacco companies were up to that exact bad behavior. Theorizing, you might say.

June 15, 2024

QotD: Is there more craziness these days or is it just the volume turned up to 11?

Filed under: Food, Health, Media, Quotations, USA — Nicholas @ 01:00

… Is there, in fact, more lunacy in the Current Year, or is it just louder? He argued that there’s more. I argue that there’s not. Victorians, for instance, were world-class eccentrics. Just to stick with the breakfast cereal theme, consider that Kellogg’s corn flakes were based on some weird theory of digestion that was designed to combat the scourge of masturbation. No, really — the Sylvester Graham referenced in that article is the guy behind graham crackers, which were designed for similar reasons. See also “Fletcherism”, which counted Thomas Edison among its adherents. And that’s just food! Water, electricity, magnetism, you name it, there’s some weird Victorian health fad attached to it. Throw in the peccadilloes, sexual and otherwise, of just the widespread missionary movements, and you’ve got all the crazy you can handle, and then some.

Contrast this to the Current Year, where, much like breakfast food, what seems to be a bewildering variety of lunacy can be boiled down to just a few basic types. “Wokeness” is a madlib with just two variables: ____ is either racist or sexist, pick one. (I suppose you can combine them, but you’ll notice that doesn’t happen nearly as often as you’d predict, because the blacks hate the gays and the feminists hate everyone, so going full retard ends up getting you in a lot of trouble with your coreligionists).

Severian, “Mail Bag / Grab Bag”, Rotten Chestnuts, 2021-06-11.

June 2, 2024

A definite sign of the end-times – “South Park is going into its 27th season”

Filed under: Humour, Media, Politics, USA — Nicholas @ 03:00

I’d pretty much given up on watching anything on television around the time that South Park went on the air, so I never “stopped watching it”; I simply wasn’t watching TV at all by then (although I did see Team America: World Police in the theatre). Andrew Sullivan says I’ve been missing something quite worthwhile for all this time:

South Park is going into its 27th season. And it has rarely been better. (I simply can’t believe so many people I meet say they haven’t watched in years. You’ve been missing out!) The new special on obesity — a deft masterclass of social commentary — has a brutal takedown of suburban white women jonesing for doses of Ozempic like meth-heads; a definitive — and musical! — digression into the insanity of the American healthcare system; pure, character-driven humor in a figure like Randy Marsh — a far subtler parody of the average American male than Homer Simpson; and, of course, Eric Cartman — the “big-boned” fat-ass kid whose capacity for pure evil was first truly captured in the epic “Scott Tenorman Must Die”.

You can read books on Ozempic, scan op-eds, absorb TikToks, and even listen to the Dishcast! — but nothing out there captures every single possible social and medical and psychological wrinkle of this new drug better than this hour of crude cartoons. Yes, there are fart jokes. There are always fart jokes. But fart jokes amid a sophisticated and deeply informed parody of insurance companies? Or, in other episodes, toilet humor guiding us through the cowardice of Disney, the dopey vanity of Kanye, the wokification of Hollywood, the exploitation of black college athletes, the evil of cable companies, the hollowness of hate-crime laws, the creepiness of Christian rock, or the money-making behind legal weed? Only South Park pulls this off. Only South Park gets away with all of it.

It’s a 1990s high-low formula at root, sophisticated cultural and political knowingness married to crude cartoons, silly accents, m’kay, and a talking Christmas turd, Mr Hankey. Generationally, it really marked a moment when merging these two worlds seemed the most creative option — not an abandonment of seriousness, but the attachment of a humane levity to it. South Park can be brutal, but it is never cruel. Unless you’re Barbra Streisand or Bono. And virtually every character (even Eric) is redeemable. Except Meghan Markle.

Yes, Matt and Trey have tried other things. To wit: just one of the best and most successful musicals of the 21st century, The Book of Mormon. They’ve pioneered deep-fakes. They also just renovated and relaunched a huge Denver restaurant they loved as kids, Casa Bonita, memorialized in a classic Cartman-is-evil episode. Twenty years ago, they actually created an entirely puppet-acted movie with epic sex and vomit scenes as a commentary on the war on terror, Team America; and are now teaming up with Kendrick Lamar to shoot a live-action comedy about a biracial couple where the black boyfriend interns as a slave re-enactor only to discover that his ancestors were owned by his girlfriend’s. No landmines there.

But they always return to South Park and evince no desire to transcend it — partly because it has become an entire world that can expand and contract at will: a world where Mel Gibson tweaks his nipples and smears his feces, Mickey Mouse acts like a mafia don, Michael Jackson’s nose falls off, Meghan Markle is a literal empty vessel, Christopher Reeve eats fetuses for their stem-cells, and Tom Cruise works in a fudge factory where, yes, he does a lot of the packing.

And in two decades of an acutely polarized and politicized culture, what team is South Park on? Precisely. You can’t tell, can you? — which is a staggering achievement in its own right. And it’s not about risk-aversion: the duo was targeted by Islamist terror and didn’t blink. They also took on the censors at the MPAA — savor this memo — and obliterated one of George Carlin’s “Seven Words You Can Never Say on TV” by saying “shit” 162 times in one episode.

They’ve shown Martha Stewart putting a whole turkey up her back-hole, Paris Hilton putting a whole pineapple up her front-hole, Caitlyn Jenner running over innocent pedestrians, and Jesse Jackson demanding that his big black ass be ceremoniously kissed. They’ve tackled Scientology and Mormonism; they’ve shown intergalactic Catholic priests astonished at the idea they have to stop raping young boys; and they beat Dave Chappelle by two decades with “Mr. Garrison’s Fancy New Vagina” — their take on sex reassignment.

They have done all this, taken no prisoners, and remain uncancellable. Why? Because their mockery is genuinely universal (including of themselves), their courage is real, and they remain humane.

By humane, I mean they show how you can skewer and yet still love. As a young gay man, I often winced at the careful, all-too-sensitive depictions of gay men in most movies and television, the elaborate ways in which the subculture was homogenized and prettified for straight audiences. But in South Park, I could see the gay reality as I had already witnessed it in all its bewildering variety: the right-wing elementary school teacher Mr Garrison … dating Mr Slave — a leather-daddy with a gerbil called Lemmiwinks living in his upper colon; I could see Big Gay Al get expelled from the Boy Scouts — and defend their right to do so; I could see Butters’ dad on the DL at the White Swallow bathhouse; in time, I could see Satan having a gay love affair with Saddam Hussein, because his other boyfriend was so lame. They even made AIDS funny. The offense worked because it always conveyed an actual truth about gay men, while also obviously mocking us with love. (Mr Slave was portrayed as a moral paragon next to Paris Hilton, for example, and Mr Garrison eventually ends up with Rick, a total normie.) South Park‘s role in helping America grow up on the topic of homosexuality, especially among the young male demographic that followed them, is deeply under-rated.

QotD: The Spartans do not deserve the admiration of the modern US military

Filed under: Books, Europe, History, Media, Military, Quotations, USA — Nicholas @ 01:00

The Athenian historian Thucydides once remarked that Sparta was so lacking in impressive temples or monuments that future generations who found the place deserted would struggle to believe it had ever been a great power. But even without physical monuments, the memory of Sparta is very much alive in the modern United States. In popular culture, Spartans star in film and feature as the protagonists of several of the largest video game franchises. The Spartan brand is used to promote obstacle races, fitness equipment, and firearms. Sparta has also become a political rallying cry, including by members of the extreme right who stormed the U.S. Capitol on Jan. 6, 2021. Sparta is gone, but the glorification of Sparta — Spartaganda, as it were — is alive and well.

Even more concerning is the U.S. military’s love of all things Spartan. The U.S. Army, of course, has a Spartan Brigade (Motto: “Sparta Lives”) as well as a Task Force Spartan and Spartan Warrior exercises, while the Marine Corps conducts Spartan Trident littoral exercises — an odd choice given that the Spartans were famously very poor at littoral operations. Beyond this sort of official nomenclature, unofficial media regularly invites comparisons between U.S. service personnel and the Spartans as well.

Much of this tendency to imagine U.S. soldiers as Spartan warriors comes from Steven Pressfield’s historical fiction novel Gates of Fire, still regularly assigned in military reading lists. The book presents the Spartans as superior warriors from an ultra-militarized society bravely defending freedom (against an ethnically foreign “other”, a feature drawn out more explicitly in the comic and later film 300). Sparta in this vision is a radically egalitarian society predicated on the cultivation of manly martial virtues. Yet this image of Sparta is almost entirely wrong. Spartan society was singularly unworthy of emulation or praise, especially in a democratic society.

To start with, the Spartan reputation for military excellence turns out to be, on closer inspection, mostly a mirage. Despite Sparta’s reputation for superior fighting, Spartan armies were as likely to lose battles as to win them, especially against peer opponents such as other Greek city-states. Sparta defeated Athens in the Peloponnesian War — but only by accepting Persian money to do it, reopening the door to Persian influence in the Aegean, which Greek victories at Plataea and Salamis nearly a century earlier had closed. Famous Spartan victories at Plataea and Mantinea were matched by consequential defeats at Pylos, Arginusae, and ultimately Leuctra. That last defeat at Leuctra, delivered by Thebes a mere 33 years after Sparta’s triumph over Athens, broke the back of Spartan power permanently, reducing Sparta to the status of a second-class power from which it never recovered.

Bret Devereaux, “Spartans Were Losers”, Foreign Policy, 2023-07-22.

May 27, 2024

QotD: The Cursus Honorum in the Roman republic

Filed under: Europe, Government, History, Quotations — Nicholas @ 01:00

One particular feature of Rome’s system of magistrates is that the offices were organized from a relatively early point into a “career path” called the cursus honorum or “path of honors”. Now we have to be careful here on a few points. First, our sources tend to retroject the cursus honorum back to the origins of the republic in 509, but it’s fairly clear in those early years that the Romans are still working out the structure of their government. For instance, our sources are happy to call Rome’s first magistrates in the early years “consuls“, but in fact we know[1] that the first chief magistrates were praetors. Then there is a break in the mid-400s where the chief executive power is vested briefly in a board of ten patricians, the decemviri. This goes poorly and so there is a return to consuls, soon intermixed from 444 with years in which tribuni militares consulari potestate, “military tribunes with consular powers”, were elected instead (the last of these show up in 367 BC, after which the consular sequence becomes regular). Charting those changes is difficult because our own sources, writing much later, are at best modestly confused by all of this. I don’t want to get dragged off topic into charting those changes, so I’ll just once again commend the Partial Historians podcast, which marches through the sources for this year-by-year. The point here is that this system emerges over time, so we shouldn’t project it too far back, though by 367 or so it seems to be mostly in place.

The second caution is that the cursus honorum was, for most of its history, a customary thing, a part of the mos maiorum, rather than a matter of law. But of course the Romans, especially the Roman aristocracy, take both the formal and informal rules of this “game” very seriously. While unusual or spectacular figures could occasionally bend the rules, for most of the third and second century political careers followed the rough outlines of the cursus honorum, with occasional efforts to codify parts of the process in law during the second century, beginning with the Lex Villia in 180 BC. But we ought to understand that law and others of its sort as mostly attempting to codify and spell out what were traditional practices, like the generally understood minimum ages for the offices, or the interval between holding the same office twice.

That said, there is a very recognizable pattern that was in some cases written into law and in other cases merely customary (but remember that Roman culture is one where “merely customary” carries a lot of force). Now the cursus formally begins with the first major office, the quaestorship, but there are quite a few things that an aspiring Roman elite needs to do first. The legal requirement is that our fellow – and it must be a fellow, as Roman women cannot hold office (or vote) – needs to have completed ten years of military service (Polyb. 6.19.1-3). But there are better and worse ways to discharge this requirement. The best way is being appointed as a junior officer, a military tribune, in the legions. We’ll talk about this office in a bit, but during this period it served both as a good first stepping stone into political prominence and as something more established Roman politicians did between major office-holding, perhaps as a way of remaining prominent, or to curry favor with the more senior politicians they served under, or simply because military exigency meant that more experienced hands were wanted to lead the army.

A diagram of the elected offices of the cursus honorum. Note that there were additional appointed military tribunes.

There are a bunch of other minor magistrates that are effectively “pre-cursus” offices too, but we don’t know a lot about them, and they don’t seem generally to show up as often in the careers of the sort of Romans making their way up to the consulship, though this may simply be because our sources rarely mention them, so we don’t know who was holding them in basically any year. We’ll talk about them at the end of this set of posts, because they are important (particularly for non-elites).

I should note at the outset: all of these offices are elected annually unless otherwise noted, with a term of service of one year. You never hold the same office twice until you reach the consulship, at which point you can seek re-election, after a respectable delay (which is later codified into law and then ignored), but you may serve as a military tribune several times (this was normal, in fact, as far as we can tell).

The first major office of the cursus was the quaestorship. The number of quaestors elected grows over time. Initially just two, their number is increased to four in 421 (two assigned to Rome, one to each of the consuls) and then to six in the 260s (initially handling the fleet, then later to assist Roman praetors or pro-magistrates in the provinces) and then eight in 227. There may have been two more added to make ten somewhere in the Middle Republic, but recent scholarship has cast doubt on this, so the number may have remained eight until being expanded to twenty under Sulla in 81 BC through the aptly named lex Cornelia de XX quaestoribus (the Cornelian Law on Twenty Quaestors, Sulla being Lucius Cornelius Sulla Felix).[2] It’s not clear if there was a legal minimum age for the quaestors, and we only know the ages of a few (25, 27, 29 and 30, for the curious), so all we can say is that officeholders tended to hold the office in their twenties, right after finishing their mandatory stint of military service.[3] Serving as a quaestor enables entrance into the Senate, though one has to wait for the next census to be added to the Senate rolls by the censors.

After the quaestorship, aspirants for higher office had a few options. One option was the office of aedile; there were, after 367, four of these fellows. Two were plebeian aediles and were not open to patricians, while the two more prestigious spots were the “curule” aediles, open to both patricians and plebeians. The other option at this stage for plebeian political hopefuls was to seek election as a tribune of the plebs, of which there were ten annually; we’ll talk about these fellows in a later post because they have wide-ranging, spectacular and quite particular powers.

After this was the praetorship, the first office which came with imperium. Initially there may have just been one praetor; by the 240s there are two (what will become the praetor urbanus and the praetor peregrinus). In 227 the number increases to four, with the two new praetors created to handle administration in Sicily, Sardinia and Corsica. That number then increases to six in 198/7, with the added praetors generally being sent to Spain. Finally Sulla raises the number to eight in 81 BC. The minimum age seems to have been 39 for this office.

Finally comes the consulship, the chief magistrate of the Roman Republic, who also carried imperium but of a superior sort to the praetors. There were always two consuls and their number was never augmented. For our period (pre-Sulla) the consuls led Rome’s primary field armies and were also the movers of major legislation. Achieving the consulship was the goal of every Roman embarking on a political career. This is the only office that gets “repeats”.

Finally, there is one office after the cursus honorum, and that is the censorship. Two censors are elected every five years for an 18-month term in which they carry out the census. Election to the censorship generally goes to senior former consuls and is one way to mark a particularly successful political career. That said, Romans tend to dream about the consulship, not the censorship, and if you had a choice between being censor once or holding the consulship two or three times, the latter was more prestigious.

With the offices now laid out, we’ll go through them in rough ascending sequence. Today we’ll look at the military tribunes, the quaestors and the aediles; next week we’ll talk about imperium and the regular offices that carry it (consuls, praetors and pro-magistrates). Then, the week after that, we’ll look at two offices with odd powers (tribunes of the plebs and censors), along with minor magistrates. Finally, there’s another irregular office, that of dictator, which we have already discussed! So you can go read about it there!

One thing I want to note at the outset is the “elimination contest” structure of the cursus honorum. To take the situation as it stands from 197 to 82, there are dozens and dozens of military tribunes, but just eight quaestors, just six praetors, and then just two consuls. At each stage there was thus likely to be increasingly stiff competition to move forward. To achieve an office in the first year of eligibility (in suo anno, “in his own year”) was a major achievement; many aspiring politicians might require multiple attempts to win elections. But of course these are all annual offices, so someone trying again for the second or third time for the consulship is now also competing against multiple years of other failed aspirants plus this new year’s candidates in suo anno. We’ll come back to the implications of this at the end, but I wanted to note at the outset that even given the relatively small(ish) size of Rome’s aristocracy, these offices are fiercely competitive as one gets higher up.

Bret Devereaux, “Collections: How to Roman Republic 101, Part IIIa: Starting Down the Path of Honors”, A Collection of Unmitigated Pedantry, 2023-08-11.


    1. See Lintott, op. cit. 104-5, n. 47.

    2. These dates and numbers, by the by, follow F.P. Polo and A.D. Fernández, The Quaestorship in the Roman Republic (2019).

    3. If you are wondering about how anyone can manage to hold the office before 27, given ten years of military service and 17 being the age when Roman conscription starts, well, we don’t really know either. The best supposition is that some promising young aristocrats seem to have started their military service early, perhaps in the retinues (the cohors amicorum) of their influential relatives. Tiberius Gracchus at 25 is the youngest quaestor we know of, but he’s in the army by at most age 16 with Scipio Aemilianus at Carthage in 146.

May 25, 2024

“Education” versus “learning”

Filed under: Books, Education, History, USA — Nicholas @ 05:00

At Astral Codex Ten, Scott Alexander discusses some of the ideas from Bryan Caplan’s book The Case Against Education:

Source here. Note deranged horizontal axis.

Education isn’t just about facts. But it’s partly about facts. Facts are easy to measure, and they’re a useful signpost for deeper understanding. If someone has never heard of Chaucer, Dickens, Melville, Twain, or Joyce, they probably haven’t learned to appreciate great literature. If someone can’t identify Washington, Lincoln, or either Roosevelt, they probably don’t understand the ebb and flow of American history. So what facts does the average American know?

In a 1999 poll, only 66% of Americans age 18-29 knew that the US won independence from Britain (as opposed to some other country). About 47% of Americans can name all three branches of government (executive, legislative, and judicial). 37% know the closest planet to the sun (Mercury). 58% know which gas causes most global warming (carbon dioxide). 44% know Auschwitz was the site of a concentration camp. Fewer than 50% (ie worse than chance) can correctly answer a true-false question about whether electrons are bigger than atoms.

These results are scattered across many polls, which makes them vulnerable to publication bias; I can’t find a good unified general knowledge survey of the whole population. But there’s a great survey of university students. Keeping in mind that this is a highly selected, extra-smart population, here are some data points:

  • 85% know who wrote Romeo and Juliet (Shakespeare)
  • 56% know the biggest planet (Jupiter)
  • 44% know who rode on horseback in 1775 to warn that the British were coming (Paul Revere)
  • 33% know what organ produces insulin (pancreas)
  • 31% know the capital of Russia (Moscow)
  • 30% know who discovered the Theory of Relativity (Einstein)
  • 19% know what mountain range contains Mt. Everest (Himalayas)
  • 19% know who wrote 1984 (George Orwell)
  • 16% know what word the raven says in Poe’s “The Raven” (“Nevermore!”)
  • 10% know the captain’s name in Moby Dick (Ahab)
  • 7% know who discovered, in 1543, that the Earth orbits the sun (Copernicus)
  • 4% know what Chinese religion was founded by Lao Tse (Taoism)
  • <1% know what city the general Hannibal was from (Carthage)

Remember, these are university students, so the average person’s performance is worse.

Most of these are the kinds of facts that I would expect school to teach people. Some of them (eg the branches of government) are the foundations of whole subjects, facts that I would expect to get reviewed and built upon many times during a student’s career. If most people don’t remember them, there seems to be little hope that they remember basically anything from school. So what’s school even doing?

Maybe school is why at least a majority of people know the very basics – like that the US won independence from Britain, or that Shakespeare wrote Romeo and Juliet? I’m not sure this is true. Here are some other questions that got approximately the same level of correct answers as “Shakespeare wrote Romeo and Juliet“:

  • What is the name of the rubber object hit by hockey players? (Puck, 89% correct)
  • What is the name of the comic strip character who eats spinach to increase his strength? (Popeye, 82% correct)
  • What is the name of Dorothy’s dog in The Wizard of Oz? (Toto, 80% correct)

I don’t think any of these are taught in school. They’re absorbed by cultural osmosis. It seems equally likely that Romeo and Juliet could be absorbed the same way. Wasn’t there an Academy-Award-winning movie about Shakespeare writing Romeo and Juliet just a decade or so before this study came out? Sure, 19% of people know that Orwell wrote 1984 – but how many people know the 1984 Calendar Meme, or the “1984 was not an instruction manual!” joke, or have heard of the reality show Big Brother? Nobody learned those in school, so maybe they learned Orwell’s name the same place they learned about the other 1984-related stuff.

Okay, so school probably doesn’t do a great job teaching facts. But maybe it could still teach skills, right?

According to tests, fewer than 10% of Americans are “proficient” at PIAAC-defined numeracy skills, even though in theory you need to know algebra to graduate from most public schools.

I took a year of Spanish in middle school, and I cannot speak Spanish today to save my life; that year was completely wasted. Sure, I know things like “Hola!” and “Adios!”, but I also know things like “gringo” and “Yo quiero Taco Bell” – this is just cultural osmosis again.

So it seems most people forget almost all of what they learn in school, whether we’re talking about facts or skills. The remaining pro-school argument would be that even if they forget every specific thing, they retain some kind of scaffolding that makes it easier for them to learn and understand new things in the future; ie they keep some sort of overall concept of learning. This is a pretty god-of-the-gaps-ish hypothesis, and counterbalanced by all the kids who said school made them hate learning, or made them unable to learn in a non-fake/rote way, or that they can’t read books now because they’re too traumatized from years of being forced to read books that they hate.

It’s common-but-trite to encounter people who say things like “I love learning, but I hated school” — I’ve undoubtedly said that myself many times. A weird experience was having to study a book in school that I’d already read on my own: it was like an early form of aversion therapy … here’s something you loved once, let’s make you hate it now.

May 23, 2024

The “post-national” entity formerly known as “Canada”

You have to hand it to the Trudeau family (and all their sycophantic enablers in the legacy media, of course). What other Canadian family has had such an impact on the country? By the time Justin Trudeau’s successor is invited to form a government, Canada will have changed so much, thanks to his unceasing effort to destroy the nation, that his description of us as the first “post-national” country may well have come true. At The Hub, Eric Kaufmann points the finger at Trudeau’s “Liberal-left extremism” as the motivating factor in his career:

Justin Trudeau has always had a strong affinity for the symbolic gesture, especially when the media are around to record it.

Canada is currently suffering from left-liberal extremism the likes of which the world has never seen. This excess is not socialist or classically liberal, but specifically “left-liberal”. It is evident in everything from this country’s world record immigration and soaring rents to state-sanctioned racial discrimination in hiring and sentencing, to the government-led shredding of the country’s history and memory. Rowing back from this overreach will not be the work of voters in one election, but of generations of Canadians.

The task is especially difficult in Canada, because, after the 1960s, the country (outside of Quebec) transferred its soul from British loyalism to cultural left-liberalism. Its new national identity (multiculturalist, post-national, with no “core” identity) was based on a quest for moral superiority measured using a left-liberal yardstick. Canada was to be the most diverse, most equitable, most inclusive nation in world history. No rate of immigration, no degree of majority self-abasement, no level of minority sensitivity, would ever be too much.

In my new book The Third Awokening, I define woke as the making sacred of historically marginalized race, gender, and sexual identity groups. Woke cultural socialism, the idea of equal outcomes and emotional harm protection for totemic minorities, represents the ideological endpoint of these sacred values. Like economic socialism, the result of cultural socialism is immiserization and a decline in human flourishing. We must stand against this extremism in favour of moderation.

The woke sanctification of identity did not stem primarily from Marxism, which rejected identity talk as bourgeois, but from a fusion of liberal humanism with the New Left’s identitarian version of socialism. What it produced was a hybrid which is neither Marxism nor classical liberalism.

Left-liberalism is moderate on economics, favouring a mixed capitalism in which regulation and the welfare state ameliorate the excesses of the market, without strangling economic growth. Its suspicion of communist authoritarianism helped insulate it from the lure of Soviet Moscow.

On culture, however, left-liberalism has no guardrails. When it comes to group inequality and harm protection, its claims are open-ended, with institutions and the nation castigated as too male, pale, and stale. For believers, the only way forward is through an unrestricted increase in minority representation. They will not entertain the idea that the distribution of women and minorities across different occupations could reflect cultural or psychological diversity as opposed to “systemic” discrimination. This is the origin of the letters “D” and “E” in Diversity, Equity, and Inclusion (DEI). Rather than seeking to optimize equity and diversity for maximal human flourishing, these are ends in themselves that brook no limits.

Left-liberals fail to ring-fence the degree of sensitivity that majority groups are supposed to display toward minority groups. Their emphasis on inclusivity through speech suppression rounds out the “I” in DEI. From racial sensitivity training (starting in the 1970s) to the “inclusive” avoidance of words like “Latino” or “mother” that offend and create a so-called hostile environment that silences subaltern groups, majorities are expected to police their speech.

May 21, 2024

Tribalism

Filed under: Africa, Americas, History — Nicholas @ 04:00

Theophilus Chilton pulls up an older essay from the vault, discussing tribalism, how it likely arose, and examples of cultures that relapsed into tribalism for various reasons:

In this post, I’d like to address the phenomenon of tribalism. There can be two general definitions of this term. The first is attitudinal – it refers to the possession by a group of people of a strong ethnic and cultural identity, one which pervades every level and facet of their society, and which serves to separate (often in a hostile sense) the group’s understanding of itself apart from its neighbours. The second definition is more technical and anthropological, referring to a group of people organised along kinship lines and possessing what would generally be referred to as a “primitive” governmental form centered around a chieftain and body of elders who are often thought to be imbued with supernatural authority and prestige (mana or some similar concept). The first definition, of course, is nearly always displayed by the second. It is this second definition which I would like to deal with, however.

Specifically, I’d like to explore the question of how tribalism relates to the collapse of widely spread cultures when they are placed under extreme stresses.

There is always the temptation to view historical and pre-historical (i.e., before written records were available) people-groups which were organised along tribal lines as “primitives” or even “stupid”. This is not necessarily the case, and in many instances is certainly not true. However, tribalism is not a truly optimal or even “natural” form of social organisation, and is, I believe, forced onto people-groups more out of necessity than anything else.

Before exploring the whys of tribalism’s existence, let’s first note what I believe can be stated as a general truism – Mankind is a social creature who naturally desires to organise himself along communal lines. This is why cities, cultures, civilisations even exist in the first place. Early in the history of Western science, Aristotle expressed this sentiment in his oft-quoted statement that “Man is by nature a political animal” (ὁ ἄνθρωπος φύσει πολιτικὸν ζῷον). This aphorism is usually misunderstood, unfortunately, due to the failure of many to take its cultural context into account. Aristotle was not saying that mankind’s nature is to sit around reading about politicians in the newspaper. He was not talking about “politics” in some sort of demotic or operational sense. Rather, “political” means “of the polis”. The polis, in archaic and classical Greece, was more than just a city-state – it was the very sum of Greek communal existence. Foreigners without poleis were not merely barbarians; they were something less than human beings, lacking a crucial element of communal existence that made man – capable of speech and reason – different from the animals and able to govern himself rationally. “Political” did not mean “elections” or “scandals”, as it does with us today. Instead, it meant “capable of living with other human beings as a rational creature”. It meant civilisation itself. Tribalism, while perhaps incorrectly called “primitive”, nevertheless is “underdeveloped”. It is in the nature of man to organise himself socially, and even among early and technologically backwards peoples, this organisation was quite often more complex than tribal forms. While modern cities may be populated by socially atomised shells of men, the classical view of the city was that it was vital to genuine humanity.

My point in all of this is that I don’t believe that tribal organisation is a “natural” endpoint for humanity, socially speaking. The reason tribes are tribes is not because they are all too stupid to be capable of anything else, nor because they have achieved an organisation that truly satisfies the human spirit and nature. As the saying goes, “The only morality is civilisation”. The direction of man’s communal association with man is toward more complex forms of social and governing interactions which satisfy man’s inner desire for sociability.

So why are tribal peoples … tribal? My theory is that tribalism arises neither from stupidity nor satisfaction, but as a result of either environmental factors such as geography, habitability, etc. which inhibit complexification of social organisation, or else as a result of civilisation-destroying catastrophes which corrode and destroy central authority and the institutions necessary to maintain socially complex systems.

The first – environmental factors – would most likely be useful for explaining why cultures existing in more extreme biomes persist in a tribal state. For example, the Arctic regions inhabited by the Inuit would militate against building complexity into their native (i.e. pre-contact with modern Europeans) societies. The first great civilisations of the river valleys – Egypt, Mesopotamia, the Indus valley, and China – all began because of the organisation needed to construct and administer large scale irrigation projects for agriculture. Yet, the weather in the Arctic precludes any sort of agriculture, as well as many other activities associated with high civilisation such as monumental architecture and large scale trade. The Inuit remained tribal hunter-gatherers not because they were inherently incapable of high culture, but because their surroundings inhibited them from it. Likewise, the many tribal groups in the Rub’ al-Khali (the Empty Quarter of the Arabian peninsula) were more or less locked into a semi-nomadic transhumant existence by their environment, even as the racially and linguistically quite similar peoples of Yemen and the Hadramaut were developing complex agricultural and commercial cultures along the wadis.

However, I believe that the more common reason for tribalism in history is that of catastrophes – of various types, some fast-acting and others much slower – which essentially “turned the world upside down” for previous high civilisations which were affected by them. I believe that there are many examples of this which can be seen, or at least inferred, from historical study. I’ll detail five of them below.

The first is an example which would formerly have been considered to fall into the category of tribes remaining tribal because of geographical factors, but which recent archaeological evidence suggests is not the case. This would be the tribes (or at least some of them) of the Amazon jungles, especially the Mato Grosso region of western Brazil. Since the region was long considered to be one of the most primitive on the planet, one could easily make the argument that these tribes were such because of the extreme conditions found in the South American jungles. While lush and verdant, these jungles are really rather inhospitable from the standpoint of human habitability – the jungle itself is extremely dense, is rife with parasites and other disease-carriers, and is full of poisonous plants and animals of all kinds. Yet, archaeologists now know that there was an advanced urban culture in this region which supported large-scale root agriculture, built roads, bridges, and palisades, and dammed rivers for the purpose of fish farming – evidently the rumours told to the early Spanish conquistadores of cities in the jungle were more than just myth. This culture lasted for nearly a millennium until it went into terminal decline around 1550 AD, the jungle reclaiming it thoroughly until satellite imaging recently rediscovered it.

What happened? We’re not sure, but the best theory seems to be that diseases brought by Europeans terminated this Mato Grosso culture, destroying enough of its population that urban existence could no longer be sustained. The result of this was a turn to tribalism, a less complex form more easily sustained by the post-plague population. The descendants of this culture are the Kuikuro people, a Carib-speaking tribe living in the region, and probably also other tribes living in the greater area around the Mato Grosso. In the case of the Mato Grosso city culture, the shock of disease against which they had no immunity destroyed their population, and concomitantly their ability to maintain more complex forms of civilisation.

The conical tower inside the Great Enclosure at Great Zimbabwe.
Photo by Marius Loots via Wikimedia Commons.

The second example would be that of the Kingdom of Zimbabwe, centered around its capital of “Great Zimbabwe,” designated as such so as to distinguish it from the 200 or so smaller “zimbabwes” that have been scattered around present-day Rhodesia and Mozambique. Great Zimbabwe, at its peak, housed almost 20,000 people and was the nucleus of a widespread Iron Age culture in southern Africa, and this Bantu culture flourished from the 11th-16th centuries AD before collapsing. It is thought that the decline of Zimbabwean culture was due to the exhaustion of key natural resources which kept them from sustaining their urban culture. The result, if the later state of the peoples in the area is any indicator, was a conversion to the tribal structures more typically associated with sub-Saharan Africa. The direct descendants of the Zimbabwean culture are thought to be the various tribes in the area speaking Shona, a Bantu language group with over 8 million speakers now (post Western medicine and agriculture, of course). Once again, though, we see that when conditions changed – the loss of key resource supports for the urban culture – the shock to the system led to a radical decomplexification of the society involved.

May 11, 2024

The second time as farce – “we’re living through a performative version of the seventies”

Filed under: History, Media, Politics — Nicholas @ 05:00

Sarah Hoyt posted this a few days back, but I only noticed it now:

A member of the CIA helps evacuees up a ladder onto an Air America helicopter on the roof of 22 Gia Long Street April 29, 1975, shortly before Saigon fell to advancing North Vietnamese troops.
Hubert van Es photo via Wikimedia Commons.

Yesterday a friend told me that it seems like we’re living through a shoddy version of the seventies.

But that’s not QUITE it. It’s more complicated. It’s more like we’re living through a performative version of the seventies.

It’s like all the recasting and re-doing of classic movies and series, at this point even those that weren’t particularly successful: it feels like Hollywood is just redoing these things out of some sort of dinosaur brain memory that they were successful. However, the people in charge no longer have any idea why these things were successful or why they resonated or achieved the results they did.

So the re-casts/re-dos sound hollow and strange, and would even if they didn’t use them to push their weird personal current obsessions. (All heroes must be women and black and increasingly of some odd sexual identity! Only villains can be white!) Because the car is there, but the engine is gone, metaphorically speaking.

A Boeing CH-47 Chinook transport helicopter appears over the U.S. embassy compound in Kabul, 15 Aug 2021. Image from Twitter via libertyunyielding.com

All these redos and recastings and all are just shells of what the original was. And imbuing them with current wokeness doesn’t make them massively popular, because it doesn’t have that kind of purchase with the public.

The left and the current “Cultural gatekeeping elite” don’t seem to be aware of this, or of why they fail. In fact, each failure baffles them.

I could be snide, here, and say that it’s because this entire administration, and in fact, the entire upper-crust/controlling layer of our institutions are profoundly untalented theater kiddies, who have no creativity but love the style, and so are trying to do a performance of what they think should be there, in the hopes it will work. And are forever baffled it doesn’t.

The truth is not quite that mean, but it rhymes. They are people of a certain frame of mind. In most places and most times, this would make them profoundly “conservative.” Frankly they are, because 100 years into the “progressive” project, those who support it are conservatives. But it’s a weird sort of “conservatism” because what they’re conserving is the cult that tells them if they tear Western civ apart paradise ensues. The whole just-so cult of Marx as filtered through their parents, grandparents and great grandparents.

Part of the whole Marxian philosophy is that it’s a self-contained system, congruent within itself, and with no basis in reality. This makes a certain type of mind susceptible to it. In other centuries they’d be religious fanatics, missionaries to the heathens and zeal-burned puritans.

That type of mind tends to think of things in terms of pre-ordained and fixed narrative, not wildly creative and innovative. That THEY think of themselves as creatives is the insanity of the current system and the Marxian corruption of institutions. They are not actually capable of creativity, only of passing on the received word.

And so we get to the other side of the rerun of the seventies: These kids, by and large, grew up with everything from schools, to TV, to even their parents (for the children and grandchildren of boomers) being sold a version of the sixties and seventies in which protesting on the street, behaving badly and destroying property was being passionate and fighting for the voiceless and by itself meant IMPROVING SOCIETY and MAKING THE WORLD A BETTER PLACE.

So the most gullible of this generation are rebels without a clue. They must perform the hit the streets and protest, but they lack the immediacy of the draft to make it personal, and they lack anything like civil rights to make it righteous.

Instead they attach themselves to any stupid cause they can find or which is handed to them by manipulative SOBs. So, you know, it might be saving the endangered Prebles Jumping Mouse, or perhaps saving old buildings, or even, well … lately Occupy Wall Street, BLM, antifidiots and of course pro-Hamass.

May 6, 2024

QotD: Confident cultures … unlike our modern one

Filed under: Britain, History, Law, Quotations, USA — Tags: , , , , , , — Nicholas @ 01:00

A self-confident culture, like the Victorian, can handle ambiguities. It has a healthy respect for hypocrisy, which, as I think Snoop Dogg once said, is the tribute vice pays to virtue. It’s OK with concepts like legal-but-forbidden and illegal-but-tolerated. Prostitution was the former, homosexuality the latter, and so far was it illegal-but-tolerated that feminist icon Naomi Wolf got herself into a spot of bother over it, the kind that only a feminist icon can (i.e. “the kind that even the most basic research would’ve disproven in about five minutes”). The point of the statutes isn’t so much to regulate behavior as it is to express society’s mores.

Only in the modern period do we feel we need black-letter law for everything. And once we’ve got formal law, of course, the very next thing we do is start carving out penumbras and emanations, because we are so far from a self-confident culture that we must constantly prove to ourselves what clever, clever boys we are …

Severian, “Barely Legal”, Rotten Chestnuts, 2021-06-21.

May 1, 2024

QotD: Entitlement politics

Filed under: Government, Quotations, USA — Tags: , , — Nicholas @ 01:00

My grandparents’ generation thought being on the government dole was disgraceful, a blight on the family’s honor. Today’s senior citizens blithely cannibalize their grandchildren because they have a right to get as much “free” stuff as the political system will permit them to extract … Big government is … [t]he drug of choice for multinational corporations and single moms, for regulated industries and rugged Midwestern farmers, and militant senior citizens.

Janice Rogers Brown, Speech at McGeorge School of Law, 1997-11-21.

April 30, 2024

DNA and India’s caste system

Filed under: History, India, Religion, Science — Tags: , , , , — Nicholas @ 03:00

Earlier this month, Palladium published Razib Khan‘s look at the genetic components of India’s Caste System:

Though the caste system dominates much of Indian life, it does not dominate Indian American life. At slightly over 1% of the U.S. population, only about half of Indian Americans identify as Hindu, the religion from which the broader categories of caste, or varna, emerge. While caste endogamy — marrying within one’s caste — in India remains in the range of 90%, in the U.S. only 65% of American-born Indians even marry other people of subcontinental heritage, and of these, a quick inspection of The New York Times weddings pages shows that inter-caste marriages are the norm. While tensions between the upper-caste minority and middle and lower castes dominate Indian social and political life, 85% of Indian Americans are upper-caste, broadly defined, and only about 1% are truly lower-caste. Ultimately, the minor moral panic over caste discrimination among a small minority of Americans is more a function of our nation’s current neuroses than the reality of caste in the United States.

But this does not mean that caste is not an important phenomenon to understand. In various ways, caste impacts the lives of the more than 1.4 billion citizens of India — 18% of humans alive today — whatever their religion. While the American system of racial slavery is four centuries old at most, India’s caste system was recorded by the Greek diplomat Megasthenes in 300 BC, and is likely far more ancient, perhaps as old as the Indus Valley Civilization more than 4,000 years ago. Indian caste has a deep pedigree as a social technology, and it illustrates the outer boundary of our species’ ability to organize itself into interconnected but discrete subcultures. And unlike many social institutions, caste is imprinted in the very genes of Indians today.

Of Memes and Genes

Beginning about twenty-five years ago, geneticists finally began to look at the variation within the Indian subcontinent, and were shocked by what they found. In small villages in India, Dalits, formerly called “outcastes”, were as genetically distinct from their Brahmin neighbors as Swedes were from Sicilians. In fact, a Brahmin from the far southern state of Tamil Nadu was genetically closer to a Brahmin from the northern state of Punjab than to their fellow non-Brahmin Tamils. Dalits from the north were similar to Dalits from the south, while the three upper castes, Brahmins, Kshatriyas, and Vaishyas, tended to cluster together against the Sudras.

Some scholars, like Nicholas Dirks, the former chancellor of UC Berkeley, argued for the mobility and dynamism of the caste system in their scholarship. But the genetic evidence seemed to indicate a level of social stratification that echoes through millennia. Across the subcontinent, Dalit castes engaged in menial and unsanitary labor and were therefore considered ritually impure. Meanwhile, Brahmins were the custodians of the Hindu Vedic tradition that ultimately bound the Indic cultures together with other Indo-European traditions, like those of the ancient Iranians or Greeks. Other castes also had their occupations: Kshatriyas were the warriors, while Vaishyas were merchants and filled other economically productive occupations.

Brahmins, Kshatriyas, and Vaishyas were traditionally the three “twice-born” castes, allowing them to study the Hindu scriptures after an initiatory ritual. The majority of the population were Sudras (or Shudras), India’s peasant and laboring majority. Shudras could not study the scriptures, and might be excluded from some temples and festivals, but they were integrated into the Hindu fold, and served by Brahmin priests. A traditional ethnohistory posits that elite Brahmin priests and Kshatriya rulers, combined with Vaishya commoners, formed the core of early Aryan society in the subcontinent, with Shudras integrated into their tribes as indigenous subalterns. Outcastes were tribes and other assorted latecomers who were assimilated at the very bottom of the social system, performing the most degrading and impure tasks.

The caste system as a layered varna system with five classes and numerous integrated jati communities. (Razib Khan)

This was the theory. Reality is always more complex. In India, the caste system combines two different social categories: varna and jati. Varna derives from the tripartite Indo-European system, in India represented by Brahmins, Kshatriyas, and Vaishyas. It literally translates as “color”, white for Brahmin purity, red for Kshatriya power, and yellow for Vaishya fertility. But India also has Shudras, black for labor. In contrast to the simplicity of varna, with its four classes, jati is fractured into thousands of localized communities. If varna is connected to the deep history of Indo-Aryans and is freighted with religious significance, jati is the concrete expression of Indian communitarianism in local places and times.

April 25, 2024

Jeremy Black reviews Empireworld by Sathnam Sanghera

Filed under: Books, Britain, History — Tags: , , , — Nicholas @ 03:00

The author of a book on the same topic says that Sathnam Sanghera’s work “really should have devoted more attention to the pre-Western history“:

With its pretensions and authorial conceit, Sanghera’s book is actually rather a good laugh. He apparently is the word and the way for Britain, which “cannot hope to have a productive future in the world without acknowledging what it did to the world in the first place”, a process that is to be done on his terms in order to overcome a British allergy to the unattractive aspects of the imperial past.

Stripped to its essentials, this is a book that repeats well-established themes and serves them up in a familiar fashion. Although the book is 461 pages long, only 247 of those pages are text and, with a generous typeface that is a pleasure to read, there is only so much space for his analysis. Unfortunately, that is what is on offer.

It might be thought appropriate to establish what was different or familiar about British imperialism in a Western European context by comparing in detail, say, Britain’s Caribbean empire with those of France, Spain, and the Netherlands. It might be thought useful to assess Britain as an Asian imperial power alongside Russia or the Ottomans, China or the Persians.

It might be appropriate to follow the direction of much of the world history approach over the last half-century and assess empires as shared projects in which there were many stakeholders, British and non-British. To turn to the British empire, it might be useful to discuss the oldest “colony”, Ireland, or to assess policy in (Highland) Scotland. It could be appropriate to consider how the causes, context, course and consequences of British imperialism varied greatly.

Sanghera has not risen to the challenge. His study is conceptually weak, methodologically flawed, historiographically limited and lacking basic skills in source assessment. This is a pity, as his position as a journalist, and his link with Penguin, provide an opportunity for using his abilities as a communicator to expand public understanding of the subject.

Sanghera criticises “an enervating culture war on the theme of British empire”. He rightly draws attention to the flaws of the “balance sheet” view of British empire, but I am less confident than he is about how best to consider what he terms “a culture war”. The promotion of “understanding” for which he calls is scarcely value-free, nor does he adequately address the degree to which there have always been “culture wars” in both Britain and its colonies and former colonies. Unsurprisingly so, as there were substantive issues at stake, and questions of goal and identity were very much part of the equation.

From reading journalists’ comment pieces, it is hard to avoid the sense that they feel there is a correct view (theirs, what a surprise) and that other views are variously culture-war, populist, ignorant, etc. This is the standard approach to history, notably national history, and, particularly in the case of Britain, to empire and slavery. Yet such a stance scarcely captures the complexities of the issue, a problem very much seen in Sanghera’s work, despite his claim to nuance.

April 24, 2024

QotD: The psychological trap for teens in our social media-saturated era

Filed under: Health, Quotations, USA — Tags: , , , , , — Nicholas @ 01:00

I’ve written a lot on how teenagerhood used to be. In my day, when dinosaurs roamed the earth, it was just given that any one person would “be” many different things over the course of his or her adolescence. I myself was briefly a metalhead, a skater, a jock, a nerd, and a preppie, and I think I’m forgetting a few. And I was far from an outlier. That’s just the way it worked back then, because that’s how you figured out who you really are. There comes a point in every metalhead’s life, for instance, when he realizes that metal kinda sucks. Oh, it’s great for pissing off your parents, but once that’s accomplished, there’s really nowhere else to go with it. So you move on, your scratched Ride the Lightning CD being the only relic of your youthful Metallica phase. And since everyone else is doing the same thing, no one is going to call you out as a hypocrite, lest you come back with “Oh yeah, Jessica, you’re so cool in your cheerleader outfit. Weren’t you a Goth last semester?”

These days, though, your “phases” are all over social media. If you pick one, you’d best be prepared to stick with it permanently. And it can be permanent indeed — ask the kids who decided to “transition” when they were fifteen and are now killing themselves in record numbers, because they weren’t really “transgender”, since that doesn’t actually exist. Given all that, there are only a few “safe” identities for kids to pick, and they’re pretty much all just flavors of SJW. Which is why the #wokeness seems to be coming on so strong. There’s really only one or two “safe” ways to express your “individuality” — you can be an SJW berating other SJWs for insufficient #wokeness in the matter of race, or gender, or perhaps health (vegans and covidians), but that’s about it.

Severian, “Mail Bag / Grab Bag”, Rotten Chestnuts, 2021-06-11.
