July 27, 2024
QotD: The academic “grinder”
A “grinder” isn’t merely a guy who studies hard. I knew a dude in college, for instance, who had such phenomenal self-discipline that he’d walk off the basketball court practically in the middle of a game. It’s 7:30, so it’s study time; if we’re still playing at 10, he’ll join back in.
Looking back on it, homeboy was more than a little “on the spectrum”, as the kids say nowadays, but he wasn’t a grinder. Nor do long hours, in themselves, make a grinder — med students, for instance, work in the neighborhood of 60-75 hours a week, but though there are lots of grinders in medical school, not all med students are grinders. Long hours in the lab just go with the territory.
Indeed, actually working hard is almost an exclusion criterion for grinder-ness. Grinders ostentatiously spend many, many hours hitting the books, but it’s almost literally hitting the books. They “work” the Latin way — lots of activity, almost no accomplishment. Put a big honking stack of the largest, mustiest tomes you can find in front of you in your study carrel. Pick one up, flip through it, take one note, then rotate it to the bottom of the stack. Do this for hours on end, always making sure that your stack is flush with the wall, so that everyone in the room can see how many books you have, and how diligently you’re “taking notes”. That’s a grinder.
And cheat your ass off, it goes without saying. In my day, when dinosaurs roamed the earth and “the Internet” was a way for Defense Department nerds to exchange missile schematics with one another, the preferred method was with a graphing calculator. Your dishwasher has more hard disk space than those things, but the mere presence of memory capacity made it ideal for cheating, providing you could come up with some elaborate shorthand to cram all the material in … and providing, of course, you could use it. My friends, you have never seen true comedy until you’ve seen some sweaty Chinese kid begging the teacher to be allowed to use his graphing calculator in Engrish class. Quick, what’s the cosine of MacBeth?
And speaking of begging, that’s the final diagnostic criterion. Have you polished so many apples, your fingers are permanently stained red? Are you so far up the guidance counselor’s ass that you’re banging your skull on her uvula? Have you kissed so much butt, you’ve got a mouth like a lamprey? Would you cheerfully murder your best friend’s dog if it would get you an extra 0.02 on your GPA? Then you, my friend, are a grinder.
Severian, “The Grinder Mindset [expanded]”, Rotten Chestnuts, 2021-06-22.
July 21, 2024
QotD: There’s no recovery mode from being a Basic College Girl
Do you have any examples of BCGs recuperating?
Sadly, very few. Part of this is just in the nature of the biz — I don’t see too many former students out and about, since they all leave College Town for the big wide world — but I do know this: Scratch a Karen, find a BCG. In fact, you could go so far as to say that “Karen” simply IS the BCG after she hits The Wall. The faster the impact, the bigger the Karen (this is a testable hypothesis — given that our gal Taylor Swift is currently impacting The Wall at about Mach 3, if I’m right, she’ll soon unleash the kraken of Karens on an unsuspecting world).
I also strongly suspect that BCGs can’t recover. As any shrink will tell you, Narcissistic and Borderline Personality Disorders are almost impossible to treat. For one thing, treatment requires believing that you have a problem, and believing you don’t have a problem is pretty much diagnostic of those two syndromes. And while I’m not sure the BCG is clinically diagnosable with either of those, what they actually are is close enough that I’m betting whatever therapies “work” on actual clinical cases would “work” on them … but see above.
Finally, I guess I can’t really blame the BCG for not realizing she’s got a problem, because she obviously doesn’t have a problem. Look around — society rewards this shit. AOC, for example, is going to be La Presidenta por Vida de los Estados Unidos here in a decade or so; if that’s a problem, I can’t really blame them for not fixing it. Eventually, of course, reality will intrude, and your BCG will be screaming for a real man to come save her … but, thanks to her BCG antics, there won’t be any real men around. Or, you know, we’ll all be in the OPFOR, so good luck with that, beeyatch.
Severian, “Friday Mailbag / Grab Bag”, Rotten Chestnuts, 2021-06-25.
July 20, 2024
Counting citation numbers in “Chomskys”
The latest anonymous reviewer in Astral Codex Ten‘s “Your Book Review” series considers the work of Noam Chomsky, and notes just how his works dominate the field of linguistics:
You may have heard of a field known as “linguistics”. Linguistics is supposedly the “scientific study of language”, but this is completely wrong. To borrow a phrase from elsewhere, linguists are those who believe Noam Chomsky is the rightful caliph. Linguistics is what linguists study.
I’m only half-joking, because Chomsky’s impact on the study of language is hard to overstate. Consider the number of times his books and papers have been cited, a crude measure of influence that we can use to get a sense of this. At the current time, his Google Scholar page says he’s been cited over 500,000 times. That’s a lot.
It isn’t atypical for a hard-working professor at a top-ranked institution to, after a career’s worth of work and many people helping them do research and write papers, have maybe 20,000 citations (= 0.04 Chomskys). Generational talents do better, but usually not by more than a factor of 5 or so. Consider a few more citation counts:
- Computer scientist Alan Turing (65,000 = 0.13 Chomskys)
- Neuro / cogsci / AI researcher Matthew Botvinick (83,000 = 0.17 Chomskys)
- Mathematician Terence Tao (96,000 = 0.19 Chomskys)
- Cognitive scientist Joshua Tenenbaum (107,000 = 0.21 Chomskys)
- Nobel-Prize-winning physicist Richard Feynman (120,000 = 0.24 Chomskys)
- Psychologist and linguist Steven Pinker (123,000 = 0.25 Chomskys)
- Two-time Nobel Prize winner Linus Pauling (128,000 = 0.26 Chomskys)
- Neuroscientist Karl Deisseroth (143,000 = 0.29 Chomskys)
- Biologist Charles Darwin (182,000 = 0.36 Chomskys)
- Theoretical physicist Ed Witten (250,000 = 0.50 Chomskys)
- AI researcher Yann LeCun (352,000 = 0.70 Chomskys)
- Historian and philosopher Hannah Arendt (359,000 = 0.72 Chomskys)
- Karl Marx (458,000 = 0.92 Chomskys)
Yes, fields vary in ways that make these comparisons not necessarily fair: fields have different numbers of people, citation practices vary, and so on. There is also probably a considerable recency bias; for example, most biologists don’t cite Darwin every time they write a paper whose content relates to evolution. But 500,000 is still a mind-bogglingly huge number.
Not many academics do better than Chomsky citation-wise. But there are a few, and you can probably guess why:
- Human-Genome-Project-associated scientist Eric Lander (685,000 = 1.37 Chomskys)
- AI researcher Yoshua Bengio (780,000 = 1.56 Chomskys)
- AI researcher Geoff Hinton (800,000 = 1.60 Chomskys)
- Philosopher and historian Michel Foucault (1,361,000 = 2.72 Chomskys)
…well, okay, maybe I don’t entirely get Foucault’s number. Every humanities person must have an altar of him by their bedside or something.
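The “Chomsky” unit above is just division by Chomsky’s own citation total, rounded to two places; a minimal sketch in Python (using the figures as quoted in the review):

```python
# One Chomsky = 500,000 citations, Chomsky's Google Scholar
# total at the time the review was written.
CHOMSKY = 500_000

def to_chomskys(citations: int) -> float:
    """Express a raw citation count in Chomskys, rounded to 2 places."""
    return round(citations / CHOMSKY, 2)

# A few of the counts quoted above, for illustration.
counts = {
    "Alan Turing": 65_000,
    "Terence Tao": 96_000,
    "Karl Marx": 458_000,
    "Michel Foucault": 1_361_000,
}

for name, n in counts.items():
    print(f"{name}: {n:,} = {to_chomskys(n)} Chomskys")
```

Running it reproduces the list’s figures (0.13, 0.19, 0.92, 2.72), so the review’s arithmetic checks out.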
Chomsky has been called “arguably the most important intellectual alive today” in a New York Times review of one of his books, and was voted the world’s top public intellectual in a 2005 poll. He’s the kind of guy that gets long and gushing introductions before his talks (this one is nearly twenty minutes long). All of this is just to say: he’s kind of a big deal.
[…]
Since around 1957, Chomsky has dominated linguistics. And this matters because he is kind of a contrarian with weird ideas.
July 5, 2024
“Private property rights? How do they work?” (U of T students, probably)
In The Line, Josh Dehaas rounds up the concept of private property rights for the University of Toronto students (and non-student antisemitic fellow occupiers) who have been squatting for Palestinian terrorists on university property for the last while:
After Justice Koehnen delivered his ruling Tuesday ordering the occupiers to dismantle the People’s Circle for Palestine at the University of Toronto, one of the protesters accused the school of hypocrisy.
“It’s quite interesting that a university that claims to practice decolonization is falling back on this claim of private property,” master’s student Sarah Rasikh told a journalist on the day before the students began taking down their tents.
“U of T and the Court more specifically is quite literally telling Indigenous students to leave and get off of their own land,” she added.
Rasikh has a point, sort of.
As someone who did law school relatively recently, I can attest that many university professors are downright hostile to the concept of private property. They commonly claim that all of Canada belongs to Indigenous people and that Indigenous peoples don’t believe in private property. Rather, they believe in “sharing”. Decolonization therefore requires that land be treated communally, or so the theory goes. University administrators who pay lip service to the concept of decolonization shouldn’t be surprised when students try to turn theory into action.
Thankfully the law still protects private property rights. Students who didn’t get taught how that works by their professors ought to give Justice Koehnen’s decision a read.
As Justice Koehnen explained, “in our society we have decided that the owner of property generally gets to decide what happens on the property”.
“If the protesters can take that power for themselves by seizing Front Campus, there is nothing to stop a stronger group from coming and taking the space over from the current protesters,” he went on. “That leads to chaos. Society needs an orderly way of addressing competing demands on space. The system we have agreed to is that the owner gets to decide how to use the space.”
“If it is not the owner who gets to determine what happens on the property it will become a brutal free-for-all,” Justice Koehnen added.
July 2, 2024
July 1, 2024
The Anglosphere “imported American racial progressivism, and then commenced to import American-style racial problems. Thanks, America.”
At Postcards From Barsoom, John Carter discusses meritocratic racial quotas in employment and higher education as a “Universally Disagreeable Compromise”:
The race question has been a fault line in American society from its inception. In the aftermath of the hypermigration of the early twenty-first century, it has only become more complicated and divisive, not only in America, but throughout the Anglospheric world. The rest of us imported American racial progressivism, and then commenced to import American-style racial problems. Thanks, America.
The question seems to ultimately revolve around who shall receive the economic spoils. The “equity” that is endlessly referenced by diversity commissars is literally the home equity held by the white middle class, which the diverse and their champions openly intend to expropriate and redistribute.
The most contentious battlegrounds are in academic admissions and corporate hiring, in which the imperative is to minimize the number of White men, and maximize everything that isn’t White men. How the everything else is maximized is of no particular account. A team composed entirely of Black men is just as “diverse” as a team which also features Black lesbians, Arab homosexuals, and Thai ladyboys. It is the presence of White men that makes organizations less diverse: a team composed entirely of Black men, with the exception of a solitary White male token, is less diverse than the all-Black team.
For generations now we have suffered under the affirmative action regulations imposed under the banner of Civil Rights. For proponents, Civil Rights are a civic religion, and they guard the advantages won by adherence to their faith jealously. For the victims of affirmative action – which includes both those rejected from employment or university, as well as those subjected to the incompetency of affirmative action admits and hires – affirmative action is a hateful absurdity.
The underlying problem, which to this day only Internet edgelords will openly discuss, is human biodiversity. The various ancestral groups are, in fact, different, in ways that go beyond the merely cosmetic, to include general levels of cognitive aptitude, along with specific behavioural proclivities. To a certain degree this is due to upbringing, but only to a certain degree; upbringing can bring a child as close to his genetic potential as possible, but cannot push him beyond it. The best that nurture can do is to allow nature to flower; it cannot change nature. The natural outcome of this is that, under a purely race-blind, meritocratic dispensation, there will be noticeable and ineradicable differences in the representation of various races within any given profession.
Whether or not one supports a purely meritocratic approach to admissions and hiring then tends to depend a lot on whether one belongs to a group that is likely to do well, or poorly, under such a system. East Asians tend to support a more meritocratic approach, because their high test scores, good study habits, and strong work ethic mean that they will be extremely competitive. Blacks, on the other extreme, are far more skeptical of meritocracy, intuiting that a ruthlessly meritocratic approach would tend to see them pushed out of the professions at the expense [or rather, to the benefit] of Whites, Asians, and Indians.
The current system is practically the worst possible system. The official narrative is built upon the foundational lie that we are all the same under the skin, and that any difference in group-level socioeconomic outcome can only be the result of bigotry, racism, systemic racism, implicit bias, and the historical consequences of slavery or colonialism. This lie has driven our society quite insane, leading in particular to the demonization of Whites – a large fraction of whom buy into the narrative of ethnomasochistic guilt with religious zeal, and another large fraction of whom reject this framing of their racial character as sick and ugly. To a large degree the culture wars are driven by this very division. In the American context, this division maps quite closely to Constitutionalists vs Civil Rights adherents, i.e. it is a holy war between the two dominant civic religions. It is not accidental that this also maps to Republican (i.e. those who wish to preserve the Old Republic built by the Constitution) vs Democrat (i.e. those who wish to complete the transformation of the Republic into something [like] the Our Democracy they’ve been growing in the soil of Civil Rights).
As William M Briggs has pointed out ad nauseam, the prohibition of “disparate impact” and “discrimination” under the Civil Rights regime is an absolute nightmare for corporate America. On the one hand, to discriminate on the basis of race (or any other identity) is plainly illegal; on the other, to not discriminate is invariably to open oneself to charges of discrimination, as the various statistical differences between racial groups work themselves out in aptitude tests, SATs, grade point averages, or job performance. This places employers in the Kafkaesque position of being required to discriminate without being seen to discriminate. They must put their thumbs on the scale to ensure equal outcomes, without being caught doing so.
For Whites especially, this has been a very bad deal. Because no organization will ever be sued for taking on too many officially victimized minorities, there is no upper limit to the number of diversity hires; but if the student body or corporate org chart falls below a given group’s fraction of the population, lawsuits are almost guaranteed. This then produces an inevitable ratchet effect which systematically excludes White people from their own society, with corrosive effects on competence, morale, and confidence in institutions. It doesn’t help that, because we are still officially meritocratic, the leadership classes subject us all to constant gaslighting: we are discriminated against openly by people who brag about discriminating against us while insisting in the same breath that there is no discrimination. It is not surprising that many of us are ready to burn these people at the stake.
Welcome to the “Omnicause” (aka “the Fatberg of Activism”)
Helen Dale first encountered the Omnicause as a university student council member:
For my sins — in 1991 — I spent a year on the University of Queensland Student Union Council. Yes, I was elected, which means I was a volunteer. It ranks up there among the more pointless activities I’ve undertaken. I was 19, that’s my excuse.
Because I’m conscientious, I took it seriously. I turned up to the monthly meetings. I researched the motions to be debated and voted on in advance. I tried to say not-stupid-things when I thought it was worth making a comment. One side benefit: I learnt meeting procedure.
I also had my first encounter with the Omnicause.
Every single student union council meeting had a Palestine motion, sometimes more than one. These were long, detailed, and competently drafted. They routinely dominated more typical student union fare: budgetary allocations to fix the Rec Club roof, say, or complaints about tuition fees. I wondered what the union’s employed secretarial staff thought of typing up and then photocopying pages upon pages of tedious detail about Middle Eastern geopolitics. I remember picking up copies of both minutes and agendas and boggling at the amount of work involved.
There, in miniature — in sleepy meetings in hot rooms where dust particles danced in stray sunbeams as those of us reading law or STEM subjects tried to make sense of it all — was the Omnicause we now see in campuses all over the developed world. My earliest memories of it involve Aboriginal activists describing Australia as a “settler-colonial state” which had been “invaded” — just like Israel. Australia also had no right to exist.
During one meeting, a Palestine-obsessive buttonholed an engineering student known for his commitment to conservation, bending his ear about the Nakba. I misunderstood the exchange, and congratulated my Greens fellow councillor on recruiting a new party member.
“I’m not sure we want her,” he said. “She doesn’t know or care about the environment, just this Israel thing.”
Already, in 1991, the infant Omnicause had learnt to crawl. It was possible to see — albeit dimly — what would happen to genuine conservationists as single-issue lunatics took over their movement and rotted its political party from within. Darren Johnson — whom I’d call a “Green Green” — and his cri de coeur captures the process well:
Terrible haircut I know, but here’s me in the Hull Daily Mail running for the Green Party in 1990. I stood on a platform of male rapists in female prisons, hormone drugs for 10yos and rebranding women as uterus-owners. No, don’t be silly, it was housing, environment & poll tax.
Darren Johnson, recall, was the UK Green Party’s former principal speaker, its first-ever London councillor, twice its London mayoral candidate, and is a former chair of the London Assembly.
The Greens in both Australia and the UK have become a vector for much of the worst nonsense: trans and Gaza and chucking orange paint around an art gallery near you have displaced saving the Fluffy Antechinus1 or improving biodiversity, quite apart from anything else. Trans, in my view, is also part of the Omnicause, albeit a junior partner. Like Palestine, it’s capable of colonising major political movements focussed on something else entirely, as this (justifiably angry) supporter of Scottish independence points out.
1. This animal does not exist, although the Antechinus does.
June 17, 2024
Farm Camp for city slickers
In The Free Press, Larissa Phillips explains the kind of things city children (and their parents) learn when they stay at her family’s small farm for a week or a weekend:
Here are some things I have taught the kids who visit my farm: animals don’t care about your feelings, and sometimes we kill them to eat them. It doesn’t matter how desperately you want to find more eggs, the hens don’t lay on demand. Tomatoes aren’t ripe in June. The stalls aren’t going to clean themselves. Cuts, scrapes, and stings aren’t really a big deal. And there will always be poop.
I’m often struck by what city kids don’t know when they turn up at the education program I run for families on our 15-acre hobby farm — Honey Hollow Farm — in the Upper Hudson Valley. As a longtime urbanite, I get it. I lived in Brooklyn for 15 years before my husband and I moved upstate in 2010 with our two young children and one goal: start a farm. We kept horses and ponies for fun and raised poultry and sheep — and sometimes pigs — for food.
It was hard. Slaughtering animals we’d raised since they were babies was wrenching. Breeding and birthing those babies was dicey, too. But these experiences toughened us up. Working with animals and the land and the seasons was grounding — and the best antidote to anxiety I’d ever found. And most of it was fun.
I wanted to share this outlook with other families, even if it was just for a weekend. So at the start of the pandemic, I opened our guest cottage — and set up an informal curriculum to teach escaping urbanites what I’d learned.
I called it farm camp.
We host one family at a time, all through the year, in a renovated barn apartment overlooking the pony pasture. Most come for a week, some for a weekend. Every morning I’ll take a handful of kids, sometimes as young as three, through a two-hour, hands-on class on animal care, life, death, poop. All of them have to do some real farmwork.
There is a lot to learn. I don’t expect a child to know how long it takes for a chick to hatch, or why the roosters are always jumping on top of the hens. But I am often surprised by some of the straightforward things they don’t know how to do. Like how to pull a wagon around a corner, hold a shovel, climb over a gate, make a braid, or tie a knot.
Don’t get me wrong — I love offering explicit instructions on the most mundane tasks, then standing back and cheering when a kid does it independently. But two generations ago, these skills would have been common knowledge. For most of human history, the proportion of the world’s population living in cities was below 5 percent. It’s at 56 percent now. By the time today’s toddlers reach adulthood, it is expected that 80 percent of humans will live in urban areas.
Overprotected as they are, a lot of city kids are missing out on so many important encounters with material reality: with death or danger or manual labor. These encounters can be unpleasant, even painful. It’s understandable that we want to save our children from them. But they lose something essential when we do.
Most urbanites have a very sanitized — in fact, Disneyfied — view of rural life and especially life on a farm. It can be traumatic to discover that all the animals aren’t like you saw in the cutesy cartoons as a preschooler …
June 14, 2024
When propaganda wins over historical facts, Ontario public schools edition
To someone of my generation (late boomer/early GenX), the history of the Residential School system was taught, at least superficially, in middle school. Along with the early settlement of what is now Canada by the French and later the English (with a very brief nod to the Vikings, of course), we got a cursory introduction to the relationships among the European settlers and explorers and the various First Nations groups they encountered. It wasn’t in great depth — what is taught in great depth in middle school? — but we got a rough outline. In my case, details about the Residential School system came more from a “young adult” novel about a young First Nations student running away from his school and trying to find his way back to his home and family. My best friend in school had First Nations ancestry, so I felt a strong desire to understand the book and the system and culture portrayed in it.
If, in the early 1970s, the Ontario school system taught at least a bit about the history of the First Nations peoples, how is it possible that they stopped doing so and my son’s generation were utterly blindsided by the sensationalist treatment of the students at a particular Residential School in British Columbia? As a result, they were far more credulous and willing to believe the worst that the “anticolonialist” propagandists could come up with.
“Igor Stravinsky” is a teacher in the Ontario school system who writes under a pseudonym for fairly obvious reasons, as he’s not a believer in the modern narrative about the history of First Nations children in the Residential School system:
This will be my last instalment of this series. I have attempted to shed light on the poor quality of information students are receiving in Ontario schools with regard to Indigenous history and current issues. It is important to note that this is being done intentionally. It is to the advantage of the leaders of the Indigenous Grievance Industry to characterise Canada and the pre-Canadian colonies of this land as genocidal oppressors, and our politicians have exploited this situation for crass political gain. This was perhaps epitomised by Prime Minister Justin Trudeau’s photo op of himself holding a teddy bear in the proximity of a soil disturbance in a field at the site of a former residential school in Cowessess First Nation, Saskatchewan on Tuesday, July 6, 2021:
Are there actually human remains there? If so, of whom? Is this evidence of any kind of foul play? These are questions he was not about to bother to ask. Why would he, when such a golden opportunity to score political points presented itself?
We now know all this murdered Indigenous children stuff was a big hoax but don’t hold your breath waiting for Trudeau to issue an apology for staining the international reputation of Canada and triggering a knee-jerk vote by our Parliament declaring Canada a genocidal state and adopting the United Nations Declaration on the Rights of Indigenous People (more on that below). Undoing all this damage will be a herculean task.
Just as students are fed simplistic, misleading, and false information about the past with regard to Indigenous people (the focus being the Indian Residential Schools) they are being presented with the point of view that human rights violations against the Indigenous people are ongoing, and are the reason for the poor quality of life in which such a disproportionate number of Indigenous people find themselves.
The claim of generational trauma
On Apr. 27, 2010, speaking as chair of the Truth and Reconciliation Commission and for the people of Canada, Murray Sinclair told the Ninth Session of the United Nations Permanent Forum on Indigenous Issues: “For roughly seven generations nearly every Indigenous child in Canada was sent to a residential school. They were taken from their families, tribes and communities, and forced to live in those institutions of assimilation.”
This lie is promoted in the schools. It is the foundation of the generational trauma claim but in fact, during the IRS era, perhaps 30% of Status Indians (you can cut that figure in half if you include all people who identify as Indigenous) ever attended, and for an average of 4.5 years.
Even if it were true that most Indigenous people who attended the IRS suffered trauma, there is no evidence or logical reason to believe that trauma could be transferred down the generations. If generational trauma is a thing, why have the descendants of the victims of the holocaust been doing so well?
If there is generational trauma, the culprit is alcohol. Alcohol abuse has been a major problem in Indigenous communities since first contact but rarely comes up these days, certainly not in schools. Fetal Alcohol Syndrome (FAS), which occurs when a mother consumes alcohol during pregnancy, is also a major problem and the children born with it suffer from mental and emotional challenges throughout their lives. It impacts their social life, education and work. Girls who suffer from the condition all too often end up drinking during pregnancy themselves and the cycle continues.
June 8, 2024
QotD: Teaching military history
In addition to the low regard in which military history is sometimes held outside of the field, there is also an odd tension in being a life-long civilian who studies and teaches on military history. It often means teaching military topics to students (or readers) who have personal military experience. I have, of course, heard it suggested that military history ought not be studied by non-veterans, or that a civilian academic simply cannot provide any useful perspective on military activity without military experience (though I should note, I have never heard that opinion expressed by someone I knew to be a combat veteran themselves). And while obviously I do not find this argument persuasive, or I wouldn’t do the job I do, I also have to admit that on a fundamental level I will always be on the civilian side of the “civilians do not understand” gap that is discussed so frequently, particularly in the experience of veterans coming home.
At the same time, in the context of the discipline of history, this complaint is patently absurd. No Roman historian has ever bought garum at the market with sestertii, nor voted in the Roman comitia centuriata, nor experienced any of a nearly infinite number of the daily activities of life in ancient Rome. The same is obviously fundamentally true of literally any history that takes place before living memory. The closest we can ever come is something like experimental archaeology, trying out historical methods and objects, and while that method is an important tool, especially for the pre-modern period, it is far from the only way to do history and not necessarily the best. So of course historians study things they have no personal experience of. That’s what history is.
Teaching military history to students either bound for the military or who have military experience is actually one of the most rewarding things I have gotten to do as an academic. In this sense I have been remarkably fortunate in a lot of my teaching, which has been at large state universities in North Carolina and Florida. Both states are well above the United States population-adjusted average for the percentage of veterans in the state and I get the sense that – though I have no hard data on this (so I may be wrong) – veterans tend to matriculate through public universities at higher rates than at smaller private liberal arts colleges. Moreover, every university I have taught at thus far has a significant ROTC program.
Consequently, I am pretty accustomed to having both veterans back from abroad in my class, as well as students who expect to commission at the end of their college experience, along with some students who are active-duty military personnel while they are taking my classes. This is especially true (no surprise) in military history classes. It was not uncommon, in a 45 or 55 student section of a Global Military History survey to have the complete military-career-cycle present (though of course the ROTC students would be commissioning as officers, while the active-duty and veteran students were enlisted personnel and that is a meaningful difference). Of course those students were then side-by-side with students who have no plans to ever be in the military.
It is true that there is sometimes a higher bar of “proving” yourself to the students in those situations before they begin to trust you (as anyone who so much as looks at me knows I have never served in a military), though I would note that the hardest students to reach in this regard have always been the ROTC students (rather than active duty or veteran students), who ironically have no more experience of combat than I do. At the same time, those students are choosing to be in your class because they think you have something to say on the topic and clearing the bar of “this guy knows what he’s talking about” has never been a real problem for me. If you know your business and show that you take the subject seriously, the matter resolves itself.
Bret Devereaux, “Collections: Why Military History?”, A Collection of Unmitigated Pedantry, 2020-11-13.
June 5, 2024
May 31, 2024
“You only support that because it’s in your self-interest to do so”
Helen Dale considers the painful notion that political ideas that work for the “elite” (defined in various ways) may not work at all for people unlike members of any given “elite”:
When I reviewed Rob Henderson’s Troubled for Law & Liberty at Liberty Fund, I made this observation:
The reality that classical liberalism — the closest to my own political views, I admit — has at least a whiff of the luxury belief around it stings. It’s discomforting to acknowledge that what goes by the name of paternalism has its own intellectual pedigree, while liberalism can be a system developed by the clever, for the clever. “Highly educated and affluent people are more economically conservative and socially liberal,” Henderson says. “This doesn’t make sense. The position is roughly that people shouldn’t have to adhere to norms and if/when they inevitably hurt themselves or others, then there should be no safety net available. It’s a luxury belief.”
[…]
Joseph Heath […] uses the phrase “self-control aristocracy” to describe those who really do benefit from maximal freedom. These are people who can make better choices for themselves than any authority could make on their behalf. When the state or large corporates boss them (us) around, they (we) get really bloody annoyed. They (we) know better!
Heath’s phrase is simply a layman’s term for the personality trait various formal tests measure, and which overlaps with executive function to a considerable but as yet unknown degree.
Because I am self-conscious about my membership in the self-control aristocracy, I am acutely aware of the fact that, when I think about questions of “individual liberty” in society, I come to it with a particular set of class interests. That is because I stand to benefit much more from an expansion of the space of individual liberty than the average person does – because I have greater self-control. So I recognize that, while a 24-hour beer store would be great for me, it would be a mixed blessing for others […]
What does this have to do with libertarianism? It is important because every academic proponent of libertarianism – understood loosely, as any doctrine that assigns individual liberty priority over other political values – is a member of the self-control aristocracy. As a result, they are advancing a political ideal that benefits themselves to a much greater extent than it benefits other people. In most cases, however, they do so naively, because they do not recognize themselves as members of an elite, socially-dominant group, that stands to benefit disproportionately. They think of liberty as something that creates an equal benefit for all.
My response to reading Professor Heath’s piece was simplicity itself: I feel seen. I’ve even done the night school thing while working full-time. I’ve written books and chosen to play sports that require a long time and lots of skill to master. I retired at 45.
Politically, I’m not a libertarian. Libertarianism is a distinctive and largely American ideology (as the recent and bonkers fracas at its US Convention indicates) with philosophically unusual deontological roots. I am, however, within the British and French tradition of classical liberalism (which does assign individual liberty priority over other political values). And like many classical liberals I’ve been blind to problems of laws and governance for people unlike me.
I disclose this because I’ve worked in policy development in both devolved and national parliaments. I’ve probably given politicians and civil servants alike dud advice. There is almost certainly a shit policy out there (in either Scotland or Australia) with my name on it. However, this mind-blindness doesn’t only apply to people who advocate libertarian politics. I think it applies to a significant number of political ideologies just as strongly as it does to libertarianism.
That is, the ideology serves the inherited personality traits of those who promote it. “You only support that because it’s in your self-interest to do so” always struck me as a genuinely mean criticism of people who were involved in politics and policy (I may have been one of those people, natch). The problem — as I’ve been forced to accept — is that it’s true.
May 25, 2024
“Education” versus “learning”
At Astral Codex Ten, Scott Alexander discusses some of the ideas from Bryan Caplan’s book The Case Against Education:
Education isn’t just about facts. But it’s partly about facts. Facts are easy to measure, and they’re a useful signpost for deeper understanding. If someone has never heard of Chaucer, Dickens, Melville, Twain, or Joyce, they probably haven’t learned to appreciate great literature. If someone can’t identify Washington, Lincoln, or either Roosevelt, they probably don’t understand the ebb and flow of American history. So what facts does the average American know?
In a 1999 poll, only 66% of Americans age 18-29 knew that the US won independence from Britain (as opposed to some other country). About 47% of Americans can name all three branches of government (executive, legislative, and judicial). 37% know the closest planet to the sun (Mercury). 58% know which gas causes most global warming (carbon dioxide). 44% know Auschwitz was the site of a concentration camp. Fewer than 50% (ie worse than chance) can correctly answer a true-false question about whether electrons are bigger than atoms.
These results are scattered across many polls, which makes them vulnerable to publication bias; I can’t find a good unified general knowledge survey of the whole population. But there’s a great survey of university students. Keeping in mind that this is a highly selected, extra-smart population, here are some data points:
- 85% know who wrote Romeo and Juliet (Shakespeare)
- 56% know the biggest planet (Jupiter)
- 44% know who rode on horseback in 1775 to warn that the British were coming (Paul Revere)
- 33% know what organ produces insulin (pancreas)
- 31% know the capital of Russia (Moscow)
- 30% know who discovered the Theory of Relativity (Einstein)
- 19% know what mountain range contains Mt. Everest (Himalayas)
- 19% know who wrote 1984 (George Orwell)
- 16% know what word the raven says in Poe’s “The Raven” (“Nevermore!”)
- 10% know the captain’s name in Moby Dick (Ahab)
- 7% know who discovered, in 1543, that the Earth orbits the sun (Copernicus)
- 4% know what Chinese religion was founded by Lao Tse (Taoism)
- <1% know what city the general Hannibal was from (Carthage)
Remember, these are university students, so the average person’s performance is worse.
Most of these are the kinds of facts that I would expect school to teach people. Some of them (eg the branches of government) are the foundations of whole subjects, facts that I would expect to get reviewed and built upon many times during a student’s career. If most people don’t remember them, there seems to be little hope that they remember basically anything from school. So what’s school even doing?
Maybe school is why at least a majority of people know the very basics – like that the US won independence from Britain, or that Shakespeare wrote Romeo and Juliet? I’m not sure this is true. Here are some other questions that got approximately the same level of correct answers as “Shakespeare wrote Romeo and Juliet”:
- What is the name of the rubber object hit by hockey players? (Puck, 89% correct)
- What is the name of the comic strip character who eats spinach to increase his strength? (Popeye, 82% correct)
- What is the name of Dorothy’s dog in The Wizard of Oz? (Toto, 80% correct)
I don’t think any of these are taught in school. They’re absorbed by cultural osmosis. It seems equally likely that Romeo and Juliet could be absorbed the same way. Wasn’t there an Academy-Award-winning movie about Shakespeare writing Romeo and Juliet just a decade or so before this study came out? Sure, 19% of people know that Orwell wrote 1984 – but how many people know the 1984 Calendar Meme, or the “1984 was not an instruction manual!” joke, or have heard of the reality show Big Brother? Nobody learned those in school, so maybe they learned Orwell’s name the same place they learned about the other 1984-related stuff.
Okay, so school probably doesn’t do a great job teaching facts. But maybe it could still teach skills, right?
According to tests, fewer than 10% of Americans are “proficient” at PIAAC-defined numeracy skills, even though in theory you need to know algebra to graduate from most public schools.
I took a year of Spanish in middle school, and I cannot speak Spanish today to save my life; that year was completely wasted. Sure, I know things like “Hola!” and “Adios!”, but I also know things like “gringo” and “Yo quiero Taco Bell” – this is just cultural osmosis again.
So it seems most people forget almost all of what they learn in school, whether we’re talking about facts or skills. The remaining pro-school argument would be that even if they forget every specific thing, they retain some kind of scaffolding that makes it easier for them to learn and understand new things in the future; ie they keep some sort of overall concept of learning. This is a pretty god-of-the-gaps-ish hypothesis, and it is counterbalanced by all the kids who said school made them hate learning, or made them unable to learn in a non-fake/rote way, or that they can’t read books now because they’re too traumatized from years of being forced to read books that they hate.
It’s common-but-trite to encounter people who say things like “I love learning, but I hated school” — I’ve undoubtedly said that myself many times. A weird experience was having to study a book in school that I’d already read on my own: it was like an early form of aversion therapy … here’s something you loved once, let’s make you hate it now.