Quotulatiousness

November 4, 2024

Violence and sex differences

Filed under: Health — Nicholas @ 04:00

Lorenzo Warby discusses some basic biological differences between men and women and how those differences account for much of the variance in violent behaviour:

Human anatomy fundamentals: advanced body proportions
design.tutsplus.com

(Note on usage: Sex is biological — i.e., which gametes a body is structured to produce. Sex roles are the behavioural manifestation of sex. Gender is the cultural manifestation of sex.)

Adult human males have, on average, about twice the lean upper body mass of adult human females. This means that adult human females have, on average, 52 per cent of the upper body strength of adult human males.

The consequence of this is that men dominate violence between adults. They dominate victims — another male is far more likely to be a physical threat or obstacle than a woman. Men even more strongly dominate perpetrators.

A Swedish study found that one per cent of the population committed almost two-thirds of all violent crimes. That one per cent was almost entirely male. Four per cent of the population committed all the violent crimes. That four per cent was almost 90 per cent male1 and constituted just over seven per cent of the male population.
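Those figures are mutually consistent. A quick back-of-the-envelope check, using the rounded numbers above and assuming men are roughly half the population:

```python
# Rounded figures from the Swedish study cited above.
population_share_offenders = 0.04  # 4% of the population committed the violent crimes
male_share_of_offenders = 0.90     # that group was about 90% male
male_share_of_population = 0.50    # assume men are roughly half the population

# Male offenders as a share of the whole population: 0.04 * 0.90 = 0.036
male_offenders_in_population = population_share_offenders * male_share_of_offenders

# ... and as a share of all men: 0.036 / 0.50 = 0.072
male_offenders_among_men = male_offenders_in_population / male_share_of_population

print(f"{male_offenders_among_men:.1%}")  # 7.2% -- "just over seven per cent"
```

This is only a consistency sketch with rounded inputs, not figures taken from the paper itself.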

These patterns of behaviour do not require any “hard wired” differences by sex in human brains. They merely require that men have about twice the upper body strength of women. They represent strategic behaviour within that context.

Indeed, these results are not compatible with sex-differentiated behaviour being strongly “hard-wired” into brains. The overwhelming majority of men do not commit any violent crimes, while some of the perpetrators — almost eleven per cent — were female.

What makes it even clearer that these patterns represent strategic behaviour — that is, responses grounded in (biological) constraints and capacities — is that men and women each make up about half the perpetrators of violence against children.

When women are dealing with the physically stronger sex, they are much less likely to use violence than is the physically stronger sex. When they are dealing with a systematically weaker group of Homo sapiens — children — they are as likely to be perpetrators of violence as men.

These patterns represent strategic behaviour. They represent actions responding to constraints and capacities. You get sex-differentiated patterns when the constraints are different between men and women. Our sex-differentiated biology is enough, on its own, to produce sex-differentiated patterns of behaviour.

So, even in (then) peaceful Sweden, one in 14 men is violent. That a significant proportion of men are violent predators informs the behaviour of women, the systematically physically weaker sex.

Men dominate sexual violence because they are physically stronger, have penises and cannot get pregnant. That is enough to have men dominate sexual violence without any sex differentiation in the “hard-wiring” of brains at all.

We are embodied agents. How we are embodied makes a difference for our behaviour.

Women have, on average, half the lean upper body mass of men not so much because they are smaller — the average differences in height and weight are nowhere near as large. A much more significant factor is that women have a higher fat content in their bodies, especially their upper bodies.

They have a higher fat content because human brains are energy hogs, and women are structured to be able to support not just one, but two or more, energy-hog brains — i.e. babies and toddlers. More fat means more readily-available stored energy. That extra female fat enables us Homo sapiens to be the most body-shape dimorphic of the primates: far more so than any of our ape cousins.

This goes to the other biological constraint that produces sex-differentiated behaviour. Women can get pregnant, men cannot. The risk profile differs for men and women, and not just for the risks of pregnancy and childbirth but also for child-rearing.


    1. The text of the paper and its summary table give different figures for female offenders. The text elsewhere states that 10.9 per cent of offenders were female, which agrees with the table, so I have corrected the discrepant figures accordingly. Fortunately, it does not affect the logic being presented.

October 8, 2024

QotD: The competitive instinct

Filed under: Gaming, Quotations, Soccer, Sports, USA — Nicholas @ 01:00

I once saw an interview with basketball player Charles Barkley, in which he discussed his retirement. Barkley was a Hall of Fame player, and like most of those guys, he hung on a few seasons too long. Even having lost a step or three, Sir Charles was still a decent player, but that’s all he was — a decent player, but getting paid like a superstar and with a superstar’s reputation. A few seasons after retiring, he admitted as much. He said something like (from memory) “I’d guard a guy and think, ‘this is going to be easy, this guy is terrible’. And then he’d beat me, and I’d realize I just got beat by some guy who’s terrible, and then I knew it was time to hang it up.”

One thing chicks of both sexes and all however-many-we’re-up-to genders don’t realize these days is how competitive men — actual biological males — are hardwired to be. Things like World of Warcraft and fantasy football only exist because the genius who invented those figured out a way to tap into that heretofore-unexpressed male competitiveness. And indeed, it’s the guy who’d never even dream of putting on shoulder pads who’s the most insanely competitive guy in a fantasy football league or (I’m certain) a whatever-they’re-called in World of Warcraft. Even the uber-dorks in the Math Club and the Speech and Debate Society went after each other like Mickey Ward and Arturo Gatti. It’s just how guys are … or, at least, how guys used to be.

[…]

When it comes right down to it, that’s why men of a certain age simply don’t get “women’s sports”. Few will be as crustily chauvinistic as yer ‘umble narrator, and come right out and say it, but here goes: Women’s “sports” are just a shoddy knockoff of the real thing, because women just aren’t wired that way. That’s not to say that there aren’t competitive women, or athletic women — obviously there are, some very athletic and very competitive — but the female of the species just isn’t wired to put in the work the way males are. When faced with the prospect of three straight hours in the batting cage, swinging at curve after curve until your blisters have blisters and your shoulders feel like they’re falling out of their sockets, most women will quite sensibly ask “why bother?” Competition-for-competition’s-sake, even when it’s only against yourself in those long, long, looooong hours in the cage, just doesn’t motivate them the way it does us.

Which is why a person’s reaction to Simone Biles, or the USA Women’s soccer team, or the WNBA, or what have you is an almost perfect predictor of their age, not just their “gender”. I judge sports as sports. I don’t care about soccer, but if I did, I’d care about it as soccer — meaning, I’d want to see the best possible players, playing at the highest possible level. Women’s Olympic teams — that is to say, all star teams, the very best players — routinely get smoked by teams of 15 year old boys. Sir Charles is pushing sixty, but he could dominate the WNBA right now, in street clothes. Obviously this doesn’t apply to Pee Wee or rec leagues, but if you’re going to take a paycheck for doing it, then I want to see exactly what I paid for.

In estrogen-drenched, synchronized-ovulation Clown World, it’s all about appearances. Sure, she let her team down and wussed out (while still talking up how great she is), but can’t you see that it gave her the sadz? Sure, Megan Rapinoe et al keep getting smoked by 14 year old boys, then choking in international competition, but can’t you see her out there, with her pink hair and her tats and her Strong, Confident Empowerment? The “competition”, such as it is, is an excuse for the display. Michael Jordan ought to give baseball another shot. We know he can cry. These days, that’d get him a first-class ticket to Cooperstown.

Severian, “On Competition”, Rotten Chestnuts, 2021-08-02.

June 9, 2024

QotD: The biological importance of salt to humans

Filed under: Britain, Food, History, Quotations — Nicholas @ 01:00

… regardless of whether it was used in agriculture, for preservation, or for cooking, salt was also essential. The human body is constantly losing salt through sweat, and to a certain extent urine, but it tries to keep the blood’s salt concentrations maintained at a certain level. So as the blood loses salt, the body also ejects water to adjust. Ironically, as you lose salt your body responds by drying you out. Without constantly replacing the salt in your body — which is only ever stored for a couple of days at a time — you will at first feel fatigued and a little breathless, but increasingly weak and debilitated, as though sapped of all energy. The slightest exertion would start to bring on cramps, then problems with your heart and lungs, as your body continually shed water. If these did not kill you — and they probably would — you would essentially die through desiccation. The process would be all the faster if you became ill, rendering even the slightest dehydrating fever or bout of diarrhoea utterly lethal.1

A population deprived of salt was thus one that was weaker and more prone to disease — and at a time when the vast majority of the economy’s energy supply came from the straining of muscle, both human and animal, that weakness in effect meant a severe energy shortage. Although the main fuels for muscle power were carb-heavy grains like wheat, rye, oats, and rice, the indispensable ingredient to getting the most out of these grains was salt — just as how nuclear power uses uranium as its fuel, but also requires a suitable neutron moderator. A population deprived of salt would quite literally be more lethargic and sluggish, making it less productive and poorer too.

Salt’s unique properties made it a serious tool of state. In 1633, King Charles I’s newly-appointed Lord Deputy for Ireland, Baron Wentworth, advised controlling Ireland’s salt supply as a way to make the Irish utterly economically dependent on England. Given salt was “that which preserves and gives value to all their native staple commodities” — herrings, butter and beef — then “how can they depart from us without nakedness and beggary?” Salt would be a method of control, and a profitable one too, being “of so absolute necessity” that it could be sold to the Irish at inflated prices without much dampening demand: salt “must be had whether they will or no, and may at all times be raised in price”.2 Much like economists today, Wentworth saw revenue-raising potential in taxing goods with such unresponsive or “inelastic” demand.

Wentworth’s scheme to control the Irish never came to be. But a great many other countries did choose to tax salt. Given a minimum amount of salt had to be consumed by absolutely everyone, monopolising its sale — and levying what was effectively a tax by inflating the price well above the costs of importing or producing it — could function as a kind of indirect poll tax, levied more or less per head of both people and livestock, but without any of the administrative hassle of taking and maintaining an accurate census in order to impose such a tax directly.

When compared to other necessities like grain, salt did not need to be traded in especially large quantities either, meaning that its supply could be monopolised with relative ease. And it could not be produced everywhere. Salt tended to be lacking the further you got from the sea coast, unless there happened to be some relatively rare inland sources like salt lakes, brine springs, or rock salt mines. And it could even be lacking on the sea coast where it was either too humid or too cold to get salt cheaply by evaporating seawater using the sun, or where there was insufficient fuel for boiling the brine. These places were thus prone to being charged inflated prices, while the states that controlled places where the costs of production were low — in warmer and drier climes where the salty water of coastal marshes could cheaply be evaporated using only the heat of the summer sun — could extract especially large monopoly profits from the difference. The revenue from controlling solar salt thus became the basis of many kingdoms, some unusually powerful republics, and even empires.

Anton Howes, “The Second Soul”, Age of Invention, 2024-03-08.


    1. Roy Moxham, “Salt Starvation in British India: Consequences of High Salt Taxation in Bengal Presidency, 1765 to 1878”, Economic and Political Weekly 36, no. 25 (2001): p.2270–74.

    2. George O’Brien, The Economic History of Ireland in the Seventeenth Century (Maunsel and Company Limited, 1919), p.244, which has the transcription of Wentworth’s proposal

January 27, 2024

Modern academics “were perfectly happy to accept that evolution explains the behaviour of every other species on earth, with the exception of humans”

Filed under: Cancon, Education, Politics — Nicholas @ 03:00

In the National Post, Gad Saad offers an action plan to bring our universities back to a slightly more reality-based view of the world and prevent further postmodernist deterioration:

University College, University of Toronto, 31 July, 2008.
Photo by “SurlyDuff” via Wikimedia Commons.

This year, I am celebrating my 30th year as a professor. During those three decades, I have witnessed the proliferation of several parasitic ideas that are fully decoupled from reality, common sense, reason, logic and science, which led to my 2020 book, The Parasitic Mind: How Infectious Ideas Are Killing Common Sense. As George Orwell famously noted, “There are some ideas so absurd that only an intellectual could believe them”. Each of these ideas was spawned on university campuses, originally in the humanities and the social sciences, but as I predicted long ago, they have infiltrated the natural sciences, and now can be found in all areas of our culture.

These destructive ideas include, but are not limited to, postmodernism (there are no objective truths, which is a fundamental attack on the epistemology of science); cultural relativism (who are we to judge the cultural mores of another society, such as performing female genital mutilation on little girls?); the rejection of meritocracy in favour of identity politics (diversity, inclusion and equity (DIE) as the basis for admitting, hiring and promoting individuals); and victimhood as the means by which one adjudicates between competing ideas (I am a greater victim therefore my truth is veridical).

I was first exposed to this pervasive academic lunacy via my scientific work at the intersection of evolutionary psychology and consumer behaviour. Central to this endeavour is the fact that the human mind has evolved via the dual processes of natural and sexual selection. Nothing could be clearer, and yet I was astonished early in my career to witness the extraordinary resistance that I faced from my colleagues, many of whom were perfectly happy to accept that evolution explains the behaviour of every other species on earth, with the exception of humans.

Apparently, human beings transcend their biological imperatives, as they are strictly cultural beings. This biophobia (fear of using biology to explain human phenomena) is the means by which transgender activists can argue with a straight face that “men too can menstruate and bear children”. Biology is apparently the means by which the patriarchy implements its nefarious misogyny, making us all “wrongly” believe that men can on average lift heavier weights and run faster than women, notwithstanding a litany of evolutionary-based anatomical, physiological, hormonal and morphological sex differences.

According to radical feminists, these differences are largely due to social construction. Hence, a man who stands 6-4 and weighs 285 pounds can wake up one day and declare himself to be a transgender woman. Anyone who disagrees with this notion is clearly a transphobe.

October 3, 2023

“Just play safe” is difficult when the definition of “safe” is uncertain

Filed under: Food, Health — Nicholas @ 04:00

David Friedman on the difficulty of “playing safe”:

It’s a no-brainer. Just play safe

It is a common argument in many different contexts. In its strongest form, the claim is that the choice being argued for is unambiguously right, eliminating the possibility of a bad outcome at no cost. More plausibly, the claim is that one can trade the risk of something very bad for a certainty of something only a little bad. By agreeing to pay the insurance company a hundred dollars a year now you can make sure that if your house burns down you will have the money to replace it.

Doing that is sometimes possible but, in an uncertain world, often not; you do not, cannot, know all the consequences of what you are doing. You may be exchanging the known risk of one bad outcome for the unknown risk of another.

Some examples:

Erythritol

Erythritol was the best of the sugar alcohols: it substitutes tolerably well for sugar in cooking and has almost zero calories or glycemic load. For anyone worried about diabetes or obesity, using it instead of sugar is an obvious win. Diabetes and obesity are dangerous, sometimes life-threatening.

Just play safe.

I did. Until research came out offering evidence that it was not the best sugar alcohol but the worst:

    People with the highest erythritol levels (top 25%) were about twice as likely to have cardiovascular events over three years of follow-up as those with the lowest (bottom 25%). (Erythritol and cardiovascular events, NIH)

A single article might turn out to be wrong, of course; to be confident that erythritol is dangerous requires more research. But a single article was enough to tell me that using erythritol was not playing safe. I threw out the erythritol I had, then discovered that all the brands of “keto ice cream” — I was on a low glycemic diet and foods low in carbohydrates are also low in glycemic load — used erythritol as their sugar substitute.

Frozen bananas, put through a food processor or super blender along with a couple of ice cubes and some milk, cream, or yogurt, make a pretty good ice cream substitute.1 Or eat ice cream and keep down your weight or glycemic load by eating less of something else.

It’s safer.

Lethal Caution: The Butter/Margarine Story

For quite a long time the standard nutritional advice was to replace butter with margarine, eliminating the saturated fat that caused high cholesterol and hence heart attacks. It turned out to be very bad advice. Saturated fats may be bad for you — the jury is still out on that, with one recent survey of the evidence concluding that they have no effect on overall mortality — but transfats are much worse. The margarine we were told to switch to was largely transfats.2

“Consumption of trans unsaturated fatty acids, however, was associated with a 34% increase in all cause mortality”3

If that figure is correct, the nutritional advice we were given for decades killed several million people.


    1. Bananas get sweeter as they get riper so for either a keto or low glycemic diet, freeze them before they get too ripe.

    2. Some more recent margarines contain neither saturated fats nor transfats.

    3. “Intake of saturated and trans unsaturated fatty acids and risk of all cause mortality, cardiovascular disease, and type 2 diabetes: systematic review and meta-analysis of observational studies”, BMJ 2015; 351 doi: https://doi.org/10.1136/bmj.h3978 (Published 12 August 2015)

September 27, 2023

The fascinating world of trees

Filed under: Books, Environment — Nicholas @ 03:00

The latest book review from Mr. and Mrs. Psmith’s Bookshelf examines Tristan Gooley’s How to Read a Tree:

Okay, I admit it: I read this book because I wanted to know more about the trees in my yard.

I’m afraid that’s not how Tristan Gooley means it to be used. He’s an expert in what he terms “natural navigation”, which means finding your way wherever you’re going using the sun, moon, stars, weather, land, sea, plants and animals. He teaches classes in it. He tested Viking navigation methods in a small boat in the north Atlantic and wrote a scholarly paper about it. He traveled the desert with the Tuareg. He’s the only living person to have crossed the Atlantic solo in both a plane and a sailboat.1 Meanwhile, I consistently walk a block in the wrong direction when I come out of the subway. But I am interested in trees!

Do you think much about trees? Could you draw one from memory and come up with something besides a fat green lollipop? Can you describe a tree you walk past every day with something more than its species and “leaves turn a pretty color in the fall” or “had its whole middle chopped out because planting trees directly under power lines is a terrible idea”? (Or if you live somewhere urban enough to have buried power lines, “they really, really should have made sure all these ginkgos were male”.)2 My guess is that you can’t, because most of us couldn’t, but trees deserve some real thought. They are actually fabulously, unintuitively weird, and learning just a little bit about how they work will dramatically enhance your ability to understand why the world around you is the way it is. I don’t expect I’ll use a tree to find my way any time soon, but since reading the book I’ve started spotting things in my yard and my neighborhood that I’d never noticed before — and noticing things is halfway to understanding them. (Which is, of course, why you must not be permitted to notice that which you are not supposed to understand.)

The most fundamental insight here is that trees are not like animals. This sounds breathtakingly obvious (and indeed, when I shared this pearl of wisdom at the dinner table everyone laughed at me), but it’s hard to internalize. Our increasingly urbanized and domesticated lives have so impoverished our natural imaginary — the available stock of symbols, metaphors, and archetypes through which we understand the natural world — that we’re more or less limited to commensals and charismatic megafauna, and are therefore vaguely surprised when we encounter organisms that work differently.3 And trees really do work differently, in a wide variety of ways that make perfect sense when Gooley points them out.

What are these differences? Well, for one thing, where animals have their physical architecture written into their genes, trees — like all plants — have potential. Sure, they have general growth habits4 (you’d never mistake a willow for a maple), but compare two trees of the same species — even two genetically identical trees cloned from grafts or cuttings of the same parent — and you’ll find dramatic structural differences depending on how the individual tree grew. This isn’t true for animals: one lion might be smaller than another, or bear the scars of an old injury, but all lions have four legs with the same joint anatomy. A lion will never grow a new leg, drop an old one, or add new tendons to support a particularly overworked limb. Trees, on the other hand, do all of those and more, following general rules dictated by species but growing in response to the conditions they encounter. And because only the top of the tree continues to grow up — a branch five feet off the ground will still be five feet off the ground in a decade, though quite a lot thicker — you can read a tree’s whole history in its structure. As with looking at a genome, looking at a tree is a way of looking into the past.

Trees seek the light. Just down the street, my neighbor’s entire front yard is shaded by three enormous oak trees planted in a rough triangle and each arching gently away from the others (with a surprising similarity to the Air Force Memorial) as they try to escape each others’ shade. A few blocks away is a survivor of a similar situation, an old pine tree that’s branchless most of the way up its trunk so you can really see the alarming 15° lean with which it grew. Some long-gone giant cast the shade that sculpted this tree into its present funny shape, and if we were in the woods we might be able to see its stump — Gooley encourages the reader to greet a woodland stump by looking for the “footprint” of the missing tree in its surroundings — but I suspect this one was probably removed to make way for the foundation of the nearby house. (Given the apparent age of the pine and the house, its old neighbor probably met its end around the time the new streetcars turned this farming village on a railroad into a proper suburb.)


    1. The late Steve Fossett did it first, but since he holds about a billion other records it feels churlish to take this from Gooley.

    2. Only female ginkgos drop those awful berries. There are entire all-male cultivars that make fabulous trees, and somehow, inexplicably, I spent every autumn of my childhood scraping horrible stinky mush off the bottoms of my shoes. Why.

    3. Also on this front, I recommend Merlin Sheldrake’s Entangled Life, which is exactly the sort of book about fungi you would expect someone named Merlin Sheldrake to write.

    4. In fact “tree” is really just a growth habit, evolved independently by thousands of unrelated species of plants, because trees are the crabs of the plant kingdom. [NR: Do read that thread, it’s quite amusing]

August 10, 2023

QotD: The variable pace of evolution

Filed under: Books, Environment, History, Quotations, Science — Nicholas @ 01:00

The central argument of Gelernter’s essay is that random chance is not good enough, even at geologic timescales, to produce the ratchet of escalating complexity we see when we look at living organisms and the fossil record. Most mutations are deleterious and degrade the functioning of the organism; few are useful enough to build on. There hasn’t been enough time for the results we see.

Before getting to that one I want to deal with a subsidiary argument in the essay, that Darwinism is somehow falsified because we don’t observe the slow and uniform evolution that Darwin posited. But we have actually observed evolution (all the way up to speciation) in bacteria and other organisms with rapid life cycles, and we know the answer to this one.

The rate of evolutionary change varies; it increases when environmental changes increase selective pressures on a species and decreases when their environment is stable. You can watch this happen in a Petri dish, even trigger episodes of rapid evolution in bacteria by introducing novel environmental stressors.

Rate of evolution can also increase when a species enters a new, unexploited environment and promptly radiates into subspecies all expressing slightly different modes of exploitation. Darwin himself spotted this happening among Galapagos finches. An excellent recent book, The 10,000 Year Explosion, observes the same acceleration in humans since the invention of agriculture.

Thus, when we observe punctuated equilibrium (long stretches of stable morphology in species punctuated by rapid changes that are hard to spot in the fossil record) we shouldn’t see this as the kind of ineffable mystery that Gelernter and other opponents of Darwinism want to make of it. Rather, it is a signal about the shape of variability in the adaptive environment – also punctuated.

Even huge punctuation marks like the Cambrian explosion, which Gelernter spends a lot of rhetorical energy trying to make into an insuperable puzzle, fall to this analysis. The fossil record is telling us that something happened at the dawn of the Cambrian that let loose a huge fan of possibilities; adaptive radiation, a period of rapid evolution, promptly followed just as it did for the Galapagos finches.

We don’t know what happened, exactly. It could have been something as simple as the oxygen level in seawater going up. Or maybe there was some key biological invention — better structural material for forming hard body parts would be one obvious one. Both these things, or several other things, might have happened near enough together in time that the effects can’t be disentangled in the fossil record.

The real point here is that there is nothing special about the Cambrian explosion that demands mechanisms we haven’t observed (not just theorized about, but observed) on much faster timescales. It takes an ignotum per æque ignotum kind of mistake to erect a mystery here, and it’s difficult to imagine a thinker as bright as Dr. Gelernter falling into such a trap … unless he wants to.

But Dr. Gelernter makes an even more basic error when he says “The engine that powers Neo-Darwinian evolution is pure chance and lots of time.” That is wrong, or at any rate leaves out an important co-factor and leads to badly wrong intuitions about the scope of the problem and the timescale required to get the results we see. Down that road one ends up doing silly thought experiments like “How often would a hurricane assemble a 747 from a pile of parts?”

Eric S. Raymond, “Contra Gelernter on Darwin”, Armed and Dangerous, 2019-08-14.

July 5, 2023

The “orgasm gap”, yet another problematic front in the war of the sexes

Filed under: Health, Media, USA — Nicholas @ 03:00

Janice Fiamengo discusses the faked orgasm scene in When Harry Met Sally and its role in the ongoing arguments over the “orgasm gap”:

An iconic moment in modern movie history is the diner scene from When Harry Met Sally (1989), when Sally stuns an incredulous Harry with her rendition of a convincing orgasm. Her bravura performance causes shocked silence in the restaurant until one woman, sitting nearby, says admiringly (or enviously), “I’ll have what she’s having.”

The scene and the woman’s amused reaction told of a simple reality with wit and without judgement: some portion of women — perhaps many — are convincing fakers, and even a sexually experienced man will find it hard to be sure.

Many women in the movie audience laughed in recognition, and many men likely scratched their heads, wondering why anyone would need to fake sexual enjoyment. Some men may have remembered times when they faked it too. In the romantic-comedic world of the movie, the scene symbolized one of the differences between the average woman and the average man that only a generous and committed love could bridge.

A few years ago, When Harry Met Sally turned 30 years old, and its anniversary prompted a number of reflection pieces, some turning a harsh feminist lens on the film’s gender politics. In “‘I’ll have what she’s having’: How that scene from When Harry Met Sally changed the way we talk about sex,” Lisa Bonos at The Washington Post found in the fake climax scene a salutary revelation of male sexual arrogance. For Bonos, Harry is a typical macho man, someone who doesn’t care about a woman’s pleasure. The fact that the whole point of his conversation with Sally had been his confidence that he was giving women pleasure simply confirmed his emetic masculinity.

According to Bonos, the fact that some women fake orgasm supposedly reveals that women’s sexual pleasure is “not prioritized” in heterosexual relationships, and Sally’s performance gave sobering evidence of a gendered pleasure gap. It was implicitly the man’s fault that his partner felt the need to lie to him about her sexual satisfaction, and his desire for her to orgasm proved his typically male ego. Bonos’s analysis was an egregious violation of the spirit of the movie but was eminently faithful to the feminist perspective. The politics of grievance had come a long way in three decades.

Right on cue, studies in human psycho-sexuality are now taking up the same theme, alleging a culturally imposed “orgasm gap” between men and women in which men outpace women in the frequency with which they report orgasm during sexual intercourse (86% for men vs. 62% for women, according to one national survey).

Remembering how consistently feminist pundits have expressed outrage at male incels’ (alleged) sense of “entitlement” to sex, I cannot help but find it ironic how unapologetically researchers assume a female entitlement to orgasm. Apparently, the whole of society is to be concerned if women fail to climax every time they have sex, while no one has compassion for young men who face a lifetime of sexlessness. The prime exhibit is “Orgasm Equality: Scientific Findings and Societal Implications”, a paper published in 2020 by three female researchers at the University of Florida. The paper not only surveys the literature on the subject but also makes recommendations for “a world of orgasm equality”.

April 16, 2023

QotD: Homo electronicus and the permanent caloric surplus

Filed under: Britain, Food, Health, History, Quotations, USA, WW1 — Tags: , , , , — Nicholas @ 01:00

Finally, I suggest that the permanent caloric surplus that has obtained in the West since about 1950 has done more than anything to speciate us Postmoderns. It would take someone who Fucking Loves Science™ way more than I do to assert that the vast, obvious changes in the human race in the 20th century were merely physical. Consider the oft-remarked fact (at the time, at least) that British officers on the Western Front were a full head taller than their men. Then consider (ditto) the more-or-less open secret that a lot of those tall subalterns were gay. Correlation is not causation — growing up in the infamous English public schools probably had a lot to do with it, as Robert Graves himself says — but … there’s a pretty strong correlation.

Excess fat cranks up estrogen levels. You don’t need to be House MD to interpret this finding:

    In males with increasing obesity there is increased aromatase activity, which irreversibly converts testosterone to estradiol resulting in decreased testosterone and elevated estrogen levels.

Or this one:

    A study supports the link between excess weight and higher hormone levels. The study found that estrogen and testosterone levels dropped quite a bit when overweight and obese women lost weight.

This is not to say those swishy subalterns were fat — indeed, they were comically scrawny compared to Postmodern people. But a little goes a long way when it comes to hormones, especially in a world where “intermittent fasting” wasn’t a fad diet, but a way of life. Any one of us would keel over from hunger if we were forced to eat the kind of diet George Orwell described as his public school’s standard fare.

Follow that trend out to the Current Year, when pretty much everyone is grossly obese compared to even the Silent Generation. Heartiste and other “game” bloggers loved pointing out that the average modern woman weighs as much as the average man did in the 1960s. And while I think that’s overblown — we’re also several inches taller, on average, than 1960s people — there’s definitely something to it, especially when you consider how far the bell curve has shifted to the fat end. Not only do people weigh a lot more on average, the people who weigh more than average now weigh a hell of a lot more than heavier-than-average people did back when. See, for example, the ballooning weight of offensive linemen, who are professionally fat — in 2011 a quarterback, Cam Newton, weighed more than the average offensive lineman in the 1960s.

Put the two trends together and you have, on average, a hormone cocktail way, way different than even 50 years ago … and that’s before you add in things like all-but-universal hormonal contraception, lots of which ends up in municipal drinking water.

Severian, “Recent Evolution”, Rotten Chestnuts, 2020-09-28.

April 12, 2023

Institutional Review Boards … trying to balance harm vs health, allegedly

Filed under: Books, Bureaucracy, Health, USA — Tags: , , , , , — Nicholas @ 06:00

At Astral Codex Ten, Scott Alexander reviews From Oversight to Overkill by Simon N. Whitney, in light of his own experience with an Institutional Review Board’s demands:

Dr. Rob Knight studies how skin bacteria jump from person to person. In one 2009 study, meant to simulate human contact, he used a cotton swab first on one subject’s mouth (or skin), then on another’s, to see how many bacteria traveled over. On the consent forms, he said risks were near zero — it was the equivalent of kissing another person’s hand.

His IRB — i.e., Institutional Review Board, the committee charged with keeping experiments ethical — disagreed. They worried the study would give patients AIDS. Dr. Knight tried to explain that you can’t get AIDS from skin contact. The IRB refused to listen. Finally Dr. Knight found some kind of diversity coordinator person who offered to explain that claiming you can get AIDS from skin contact is offensive. The IRB backed down, and Dr. Knight completed his study successfully.

Just kidding! The IRB demanded that he give his patients consent forms warning that they could get smallpox. Dr. Knight tried to explain that smallpox had been extinct in the wild since the 1970s, the only remaining samples in US and Russian biosecurity labs. Here there was no diversity coordinator to swoop in and save him, although after months of delay and argument he did eventually get his study approved.

Most IRB experiences aren’t this bad, right? Mine was worse. When I worked in a psych ward, we used to use a short questionnaire to screen for bipolar disorder. I suspected the questionnaire didn’t work, and wanted to record how often the questionnaire’s opinion matched that of expert doctors. This didn’t require doing anything different — it just required keeping records of what we were already doing. “Of people who the questionnaire said had bipolar, 25%/50%/whatever later got full bipolar diagnoses” — that kind of thing. But because we were recording data, it qualified as a study; because it qualified as a study, we needed to go through the IRB. After about fifty hours of training, paperwork, and back and forth arguments — including one where the IRB demanded patients sign consent forms in pen (not pencil) but the psychiatric ward would only allow patients to have pencils (not pens) — what had originally been intended as a quick record-keeping exercise had expanded into an additional part-time job for a team of ~4 doctors. We made a tiny bit of progress over a few months before the IRB decided to re-evaluate all projects including ours and told us to change twenty-seven things, including re-litigating the pen vs. pencil issue (they also told us that our project was unusually good; most got >27 demands). Our team of four doctors considered the hundreds of hours it would take to document compliance and agreed to give up. As far as I know that hospital is still using the same bipolar questionnaire. They still don’t know if it works.

Most IRB experiences can’t be that bad, right? Maybe not, but a lot of people have horror stories. A survey of how researchers feel about IRBs did include one person who said “I hope all those at OHRP [the bureaucracy in charge of IRBs] and the ethicists die of diseases that we could have made significant progress on if we had [the research materials IRBs are banning us from using]”.

Dr. Simon Whitney, author of From Oversight To Overkill, doesn’t wish death upon IRBs. He’s a former Stanford IRB member himself, with impeccable research-ethicist credentials — MD + JD, bioethics fellowship, served on the Stanford IRB for two years. He thought he was doing good work at Stanford; he did do good work. Still, his worldview gradually started to crack:

    In 1999, I moved to Houston and joined the faculty at Baylor College of Medicine, where my new colleagues were scientists. I began going to medical conferences, where people in the hallways told stories about IRBs they considered arrogant that were abusing scientists who were powerless. As I listened, I knew the defenses the IRBs themselves would offer: Scientists cannot judge their own research objectively, and there is no better second opinion than a thoughtful committee of their peers. But these rationales began to feel flimsy as I gradually discovered how often IRB review hobbles low-risk research. I saw how IRBs inflate the hazards of research in bizarre ways, and how they insist on consent processes that appear designed to help the institution dodge liability or litigation. The committees’ admirable goals, in short, have become disconnected from their actual operations. A system that began as a noble defense of the vulnerable is now an ignoble defense of the powerful.

So Oversight is a mix of attacking and defending IRBs. It attacks them insofar as it admits they do a bad job; the stricter IRB system in place since the ‘90s probably only prevents a single-digit number of deaths per decade, but causes tens of thousands more by preventing life-saving studies. It defends them insofar as it argues this isn’t the fault of the board members themselves. They’re caught up in a network of lawyers, regulators, cynical Congressmen, sensationalist reporters, and hospital administrators gone out of control. Oversight is Whitney’s attempt to demystify this network, explain how we got here, and plan our escape.

April 3, 2023

The poison garden of Alnwick

Filed under: Britain, Environment, History — Tags: , , , — Nicholas @ 02:00

Tom Scott
Published 29 May 2017

Inside the beautiful Alnwick Garden, behind a locked gate, there’s the Poison Garden: it contains only poisonous plants. Trevor Jones, head gardener, was kind enough to give a guided tour!

For more information about visiting the Castle, Garden, and poison garden: https://alnwickgarden.com/

(And yes, it’s pronounced “Annick”.)

March 24, 2023

A very different take on the Wuhan Coronavirus pandemic

At The Conservative Woman, Dr. Mike Yeadon lays out his case for doubting that there ever actually was a novel coronavirus in the first place:

Wuhan Institute of Virology.
Wikimedia Commons.

I’ve grown increasingly frustrated about the way debate is controlled around the topic of origins of the alleged novel virus, SARS-CoV-2, and I have come to disbelieve it’s ever been in circulation, causing massive-scale illness and death. Concerningly, almost no one will entertain this possibility, despite the fact that molecular biology is the easiest discipline in which to cheat. That’s because you really cannot do it without computers, and sequencing requires complex algorithms and, importantly, assumptions. Tweaking algorithms and assumptions, you can hugely alter the conclusions.

This raises the question of why there is such an emphasis on the media storm around Fauci, Wuhan and a possible lab escape. After all, the “perpetrators” have significant control over the media. There’s no independent journalism at present. It is not as though they need to embarrass the establishment. I put it to readers that they’ve chosen to do so.

So who do I mean by “they” and “the perpetrators”? There are a number of candidates competing for this position, with their drug company accomplices, several of whom are named in Paula Jardine’s excellent five-part series for TCW, Anatomy of the sinister Covid project. High on the list is the “enabling” World Economic Forum and their many political acolytes including Justin Trudeau and Jacinda Ardern.

But that doesn’t answer the question of why they are focusing on the genesis of the virus. In my view, they are doing their darnedest to make sure you regard this event exactly as they want you to. Specifically, that there was a novel virus.

I’m not alone in believing that myself at the beginning of the “pandemic”, but over time I’ve seen sufficient evidence to cast strong doubt on that idea. Additionally, when considered as part of a global coup d’état, I have put myself in the position of the most senior, hidden perpetrators. In a Q&A, they would learn that the effect of a released novel pathogen couldn’t be predicted accurately. It might burn out rapidly. Or it might turn out to be quite a lot more lethal than they’d expected, demolishing advanced civilisations. Those top decision-makers would, I submit, conclude that this natural risk is intolerable to them. They crave total control, and the wide range of possible outcomes from a deliberate release militates against this plan of action: “No, we’re not going to do this. Come back with a plan with very much reduced uncertainty on outcomes.”

The alternative I think they’ve used is to add one more lie to the tall stack of lies which has surrounded this entire affair. This lie is that there has ever been in circulation a novel respiratory virus which, crucially, caused massive-scale illness and deaths. In fact, there hasn’t.

Instead, we were told there was this frightening novel pathogen, and the stress-inducing fear porn was ramped up to 11 and held there. This fits with cheating about genetic sequences and PCR test protocols (probes, primers, amplification and annealing conditions, cycles), and with ignoring contaminating genetic materials not only from human and claimed viral sources, but also from bacterial and fungal sources. Why, for example, did they need to insert the sampling sticks right into our sinuses? Was it to maximise non-human genetic sequences?

Notice the soft evidence that our political and cultural leaders, including the late Queen, were happy to meet and greet one another without testing, masking or social distancing. They had no fear. In the scenario above, a few people would have known there was no new hazard in their environment. If there really was a lethal pathogen stalking the land, I don’t believe they’d have had the courage or the need to act nonchalantly and risk exposure to the virus.

Most convincingly for me is the US all-cause mortality (ACM) data by state, sex, age and date of occurrence, as analysed by Denis Rancourt and colleagues. The pattern of increased ACM is inconsistent with the presence of a novel respiratory virus as the main cause.

If I’m correct that there was no novel virus, what a genius move it was to pretend there was! Now they want you only to consider how this “killer virus” got into the human population. Was it a natural emergence (you know, a wild bat bit a pangolin and this ended up being sold at a wet market in Wuhan) or was it hubristically created by a Chinese researcher, enabled along the way by a researcher at the University of North Carolina funded by Fauci, together making an end run around a presidential pause on such work? Then there’s the question as to whether the arrival of the virus in the general public was down to carelessness and a lab leak, or did someone deliberately spread it?

February 20, 2023

Monocultures are risky in agriculture … and even more so in politics

Filed under: Government, Politics, USA — Tags: , , , , — Nicholas @ 05:00

Unlike his usual bite-sized quips-with-links at Instapundit, Glenn Reynolds occasionally writes at length for his Substack page:

The western front of the United States Capitol. The Neoclassical style building is located in Washington, D.C., on top of Capitol Hill at the east end of the National Mall. The Capitol was designated a National Historic Landmark in 1960.
Photo via Wikimedia Commons.

Our modern ruling class is peculiar. One of its many peculiarities is its penchant for fads, and what can only be called mass hysteria. Repeatedly, we see waves in which something that nobody much cared about suddenly comes to dominate ruling class discourse. Almost in synchrony, a wide range of institutions begin to talk about it, and to be preoccupied by it, even as every leading figure virtue-signals regarding this subject which, only a month or two previously, hardly any of them even knew about, much less cared about.

There are several factors behind this, but one of the most important, I think, is that our ruling class is a monoculture.

In agriculture, a monoculture exists when just a single variety dominates a crop. “Monoculture has its benefits. The entire system is standard, so there are rarely new production and maintenance processes, and everything is compatible and familiar to users. On the other hand, as banana farmers learned, in a monoculture, all instances are prone to the same set of attacks. If someone or something figures out how to affect just one, the entire system is put at risk.”

In a monoculture, if one plant is vulnerable to a disease or an insect, they all are. Thus diseases or pests can rip through it like nobody’s business. (As John Scalzi observes in one of his books, it’s also why clone armies, popular in science fiction, are a bad idea in reality, as they would be highly vulnerable to engineered diseases.) A uniform population is a high-value target.

[…]

Codevilla wrote the essay [here] over a decade ago, and it has only grown more true in the interim. Despite its constant invocation of “diversity”, in many important ways our ruling class is much less diverse than it has ever been. And, as a monoculture, it is vulnerable to viruses of a sort. Including what amount to viruses of the mind.

When Elon Musk referred to the dangers of the “woke mind virus”, he knew exactly what he was talking about. Ideas can be contagious, and can be viewed as analogous to viruses, entities that reproduce by infecting individuals and coopting those individuals into spreading them to others. Richard Dawkins, in his The Selfish Gene, coined the term “meme” to describe these infectious ideas, though the term has since acquired a more popular meaning involving photos of cats, etc., with captions. And yet those pictures are themselves memes, to the extent they “go viral” and persuade others to copy and spread them.

Our ruling class is particularly vulnerable to mind viruses for several reasons. First, it is a monoculture, so that what is persuasive to one member is likely to be persuasive to many.

Second, it suffers from deep and widespread status anxiety – not least because most of its members have status, but few real accomplishments to rely on – and thus requires constant reassurance in the form of peer acceptance, reassurance that is generally achieved by repeating whatever the popular people are saying already. And third, it has few real deeply held values, which might otherwise provide guard rails of a sort against believing crazy things.

In a more diverse ruling class, ideas would not spread so swiftly or be received so uncritically. People with different worldviews would respond differently to ideas as they entered the world of discourse. There would be criticism and there would be debate. (Indeed, this is how things generally worked during the earlier, more diverse era described by Codevilla, though intellectual fads – lobotomy, say, or eugenics – spread then too, mostly through the Gentry/Academic stratum of society that now dominates the ruling class.)

September 7, 2022

The “self-domestication” hypothesis in human evolution

Filed under: Books, Science — Tags: , , — Nicholas @ 03:00

A review of The Goodness Paradox: The Strange Relationship Between Virtue and Violence in Human Evolution by Richard Wrangham in the latest edition of Rob Henderson’s Newsletter:

The “self-domestication hypothesis” is the idea that in the ancestral environment, early human communities collectively killed individuals prone to certain forms of aggression: arrogance, bullying, random violence, and monopolizing food and sexual partners.

Over time, our ancestors eliminated humans — typically males — who were exceedingly aggressive toward members of their own group.

If there was a troublemaker, then other less domineering males conspired to organize and commit collective murder against them.

Women too were involved in such capital-punishment decisions, but men typically carried out the killing.

Humans tamed one another by taking out particularly aggressive individuals. This led us to become relatively peaceful apes.

But if humans are “self-domesticated”, then why are there so many violent people among us today?

The fact is, humans are not nearly as violent as our nearest evolutionary relatives.

Comparing the level of within-group physical aggression among chimpanzees with human hunter-gatherer communities, chimps are 150 to 550 times more likely than humans to inflict violence against their peers.

We humans are far nicer to members of our own group than chimps are, thanks to our ancestors and their ability to plan organized murder and tear overly dominant males to shreds.

Many people are familiar with the findings that bonobos are more peaceful than chimpanzees.

This is true.

Male bonobos are about half as aggressive as male chimpanzees, while female bonobos are more aggressive than female chimpanzees.

Bonobos are “peaceful”, relative to chimps. But bonobos are extremely aggressive compared to humans.

The eminent Harvard biological anthropologist Richard Wrangham explores these findings at length in his fascinating 2019 book The Goodness Paradox: The Strange Relationship Between Virtue and Violence in Human Evolution.

This is a review and discussion of Wrangham’s book.

August 30, 2022

NYT op-ed – “Maternal instinct is a social construct devised by men to keep women subordinate”

Filed under: Health, Media, Science — Tags: , , , , , — Nicholas @ 03:00

Jerry Coyne responds to a New York Times op-ed by Chelsea Conaboy (author of a forthcoming book from which the op-ed was adapted):

The recent article […] from the New York Times (of course), is one of the worst of the lot. It bespeaks a lack of judgment on the part of the author — who ignores biology because of her ideology — as well as on the part of the newspaper, which failed to hold the author’s feet to the scientific fire. Let this post be my rebuttal.

Author Conaboy, who apparently hasn’t done enough scientific research, maintains that “maternal instinct” doesn’t exist, but is a social construct devised by men to keep women subordinate.

The immediate problem is that Conaboy never defines “maternal instinct”. It could mean any number of things, including a greater desire of women than men to have children, a greater desire of women than of men to care for those offspring, the fact that in animals mothers spend more time caring for offspring than do fathers, a greater emotional affinity of women than of men towards children (including offspring), or the demonstration of such a mental difference by observing a difference in caring behavior.

I will define “maternal instinct” as not only the greater average tendency of females than males to care for offspring, but also a greater behavioral affinity towards offspring in females than in males. The term involves behavioral response, not “feelings”, which are demonstrable only in humans. Thus one can look for difference in “parental instincts” across various species of animals.

But even in this sense, Conaboy is partly (but far from wholly) correct when she discusses humans. It’s undoubtedly true that women were socialized into the sex role as offspring breeders and caretakers, with men assuming the “breadwinning” role. It’s also true that women were often denied access to work or education because their vocation was seen as “reproducer”, or out of fear that they would spend less time working and more on children, or even that they’d get pregnant and would leave jobs. Further, it’s also true that this role difference was justified by being seen as “hard-wired” (i.e., largely the result of genes, which, I argue below, is true), and that “hard-wired” was conceived as “unable to be changed”. The latter construal, however, is wrong, and that is what really held back women. The socialization of sex roles, which still occurs, goes on from early ages, with girls given dolls and boys toy cars, though, as society has matured, we’re increasingly allowing girls to choose their own toys and their own path through life. I of course applaud such “equal opportunity”.

But to claim that women don’t have a greater desire than men to care for offspring, or have a greater emotional affinity towards offspring, is to deny biology, and evolution in particular. (I freely admit that many men love their kids deeply, and that some men care for them as much as, or more than, mothers do, but I’m talking about averages here, not anecdotes.)

There are two reasons why Conaboy is wrong, and both involve evolution.

The first is theoretical, but derived from empirical observations. It thus explains the second, which is wholly empirical and predictive. How do we explain the fact that, across the animal kingdom, when members of only one sex do most of the childrearing, it’s almost invariably the females? (Yes, in many species males share the duties, and in a very few, like seahorses, males provide more parental care; and there are evolutionary reasons for that.)

The reason for the statement in bold above involves the biology of reproduction. It is the female who must lay the eggs or give birth, and there is no way she can leave her genes behind unless she does that. It’s easier for males to take off after insemination and let the females care for offspring. Given that females are constrained to stick with the fertilized eggs, their best strategy is to take care of the gestation and resultant offspring, which of course allows males to seek other mates. Not only must females carry the fetuses, lay the eggs, and so on, but they are also constrained to see out the pregnancy until offspring are produced and then suckle or tend them in other ways. In some cases it’s the best evolutionary strategy for a male to stick around and share the child-rearing, but often it’s not.

This disparity in behavior holds not just in humans, of course, but in many animals: it’s a prediction — largely verified — of evolutionary psychology.
