Quotulatiousness

January 27, 2024

Modern academics “were perfectly happy to accept that evolution explains the behaviour of every other species on earth, with the exception of humans”

Filed under: Cancon, Education, Politics — Nicholas @ 03:00

In the National Post, Gad Saad offers an action plan to bring our universities back to a slightly more reality-based view of the world and prevent further postmodernist deterioration:

University College, University of Toronto, 31 July, 2008.
Photo by “SurlyDuff” via Wikimedia Commons.

This year, I am celebrating my 30th year as a professor. During those three decades, I have witnessed the proliferation of several parasitic ideas that are fully decoupled from reality, common sense, reason, logic and science, which led to my 2020 book, The Parasitic Mind: How Infectious Ideas Are Killing Common Sense. As George Orwell famously noted, “There are some ideas so absurd that only an intellectual could believe them”. Each of these ideas was spawned on university campuses, originally in the humanities and the social sciences, but as I predicted long ago, they have infiltrated the natural sciences, and now can be found in all areas of our culture.

These destructive ideas include, but are not limited to, postmodernism (there are no objective truths, which is a fundamental attack on the epistemology of science); cultural relativism (who are we to judge the cultural mores of another society, such as performing female genital mutilation on little girls?); the rejection of meritocracy in favour of identity politics (diversity, inclusion and equity (DIE) as the basis for admitting, hiring and promoting individuals); and victimhood as the means by which one adjudicates between competing ideas (I am a greater victim therefore my truth is veridical).

I was first exposed to this pervasive academic lunacy via my scientific work at the intersection of evolutionary psychology and consumer behaviour. Central to this endeavour is the fact that the human mind has evolved via the dual processes of natural and sexual selection. Nothing could be clearer, and yet I was astonished early in my career to witness the extraordinary resistance that I faced from my colleagues, many of whom were perfectly happy to accept that evolution explains the behaviour of every other species on earth, with the exception of humans.

Apparently, human beings transcend their biological imperatives, as they are strictly cultural beings. This biophobia (fear of using biology to explain human phenomena) is the means by which transgender activists can argue with a straight face that “men too can menstruate and bear children”. Biology is apparently the means by which the patriarchy implements its nefarious misogyny, making us all “wrongly” believe that men can on average lift heavier weights and run faster than women, notwithstanding a litany of evolutionary-based anatomical, physiological, hormonal and morphological sex differences.

According to radical feminists, these differences are largely due to social construction. Hence, a man who stands 6-4 and weighs 285 pounds can wake up one day and declare himself to be a transgender woman. Anyone who disagrees with this notion is clearly a transphobe.

October 3, 2023

“Just play safe” is difficult when the definition of “safe” is uncertain

Filed under: Food, Health — Nicholas @ 04:00

David Friedman on the difficulty of “playing safe”:

It’s a no-brainer. Just play safe

It is a common argument in many different contexts. In its strongest form, the claim is that the choice being argued for is unambiguously right, eliminating the possibility of a bad outcome at no cost. More plausibly, the claim is that one can trade the risk of something very bad for a certainty of something only a little bad. By agreeing to pay the insurance company a hundred dollars a year now you can make sure that if your house burns down you will have the money to replace it.

Doing that is sometimes possible but, in an uncertain world, often not; you do not, cannot, know all the consequences of what you are doing. You may be exchanging the known risk of one bad outcome for the unknown risk of another.
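Friedman's trade-off can be put in expected-value terms. A minimal sketch, with entirely made-up numbers (the premium, probabilities and losses below are illustrative assumptions, not anything from the post):

```python
def expected_cost(premium, p_loss, loss):
    """Expected annual cost of insuring (pay the premium) vs. going bare
    (bear the loss with probability p_loss)."""
    insured = premium
    uninsured = p_loss * loss
    return insured, uninsured

# Known risk: house fire. Pay $100/year to cover a $300,000 loss
# with an assumed 1-in-4,000 annual probability.
insured, uninsured = expected_cost(100, 1 / 4000, 300_000)
print(insured, uninsured)  # 100 vs. 75.0: insurance costs more in expectation,
                           # but removes the catastrophic tail.

# Friedman's point: "playing safe" can swap a known risk for an unknown one.
# If the "safe" choice carries its own hidden risk (as erythritol may),
# its true expected cost is higher than it looks.
hidden_p, hidden_loss = 0.001, 200_000   # made-up numbers for illustration
safe_choice_cost = insured + hidden_p * hidden_loss
print(safe_choice_cost)  # 300.0: the "safe" option is no longer obviously safe
```

The arithmetic is trivial; the point is structural — the comparison only looks like a no-brainer when the hidden-risk term is assumed to be zero.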

Some examples:

Erythritol

Erythritol was the best of the sugar alcohols: it substitutes tolerably well for sugar in cooking and has almost zero calories or glycemic load. For anyone worried about diabetes or obesity, using it instead of sugar is an obvious win. Diabetes and obesity are dangerous, sometimes life threatening.

Just play safe.

I did. Until an article came out offering evidence that it was not the best sugar alcohol but the worst:

    People with the highest erythritol levels (top 25%) were about twice as likely to have cardiovascular events over three years of follow-up as those with the lowest (bottom 25%). (Erythritol and cardiovascular events, NIH)

A single article might turn out to be wrong, of course; to be confident that erythritol is dangerous requires more research. But a single article was enough to tell me that using erythritol was not playing safe. I threw out the erythritol I had, then discovered that all the brands of “keto ice cream” — I was on a low glycemic diet and foods low in carbohydrates are also low in glycemic load — used erythritol as their sugar substitute.

Frozen bananas, put through a food processor or super blender along with a couple of ice cubes and some milk, cream, or yogurt, make a pretty good ice cream substitute.1 Or eat ice cream and keep down your weight or glycemic load by eating less of something else.

It’s safer.

Lethal Caution: The Butter/Margarine Story

For quite a long time the standard nutritional advice was to replace butter with margarine, eliminating the saturated fat that caused high cholesterol and hence heart attacks. It turned out to be very bad advice. Saturated fats may be bad for you — the jury is still out on that, with one recent survey of the evidence concluding that they have no effect on overall mortality — but transfats are much worse. The margarine we were told to switch to was largely transfats.2

“Consumption of trans unsaturated fatty acids, however, was associated with a 34% increase in all cause mortality”3

If that figure is correct, the nutritional advice we were given for decades killed several million people.


    1. Bananas get sweeter as they get riper so for either a keto or low glycemic diet, freeze them before they get too ripe.

    2. Some more recent margarines contain neither saturated fats nor transfats.

    3. “Intake of saturated and trans unsaturated fatty acids and risk of all cause mortality, cardiovascular disease, and type 2 diabetes: systematic review and meta-analysis of observational studies”, BMJ 2015; 351 doi: https://doi.org/10.1136/bmj.h3978 (Published 12 August 2015)

September 27, 2023

The fascinating world of trees

Filed under: Books, Environment — Nicholas @ 03:00

The latest book review from Mr. and Mrs. Psmith’s Bookshelf examines Tristan Gooley’s How to Read a Tree:

Okay, I admit it: I read this book because I wanted to know more about the trees in my yard.

I’m afraid that’s not how Tristan Gooley means it to be used. He’s an expert in what he terms “natural navigation”, which means finding your way wherever you’re going using the sun, moon, stars, weather, land, sea, plants and animals. He teaches classes in it. He tested Viking navigation methods in a small boat in the North Atlantic and wrote a scholarly paper about it. He traveled the desert with the Tuareg. He’s the only living person to have crossed the Atlantic solo in both a plane and a sailboat.1 Meanwhile, I consistently walk a block in the wrong direction when I come out of the subway. But I am interested in trees!

Do you think much about trees? Could you draw one from memory and come up with something besides a fat green lollipop? Can you describe a tree you walk past every day with something more than its species and “leaves turn a pretty color in the fall” or “had its whole middle chopped out because planting trees directly under power lines is a terrible idea”? (Or if you live somewhere urban enough to have buried power lines, “they really, really should have made sure all these ginkgos were male”.)2 My guess is that you can’t, because most of us couldn’t, but trees deserve some real thought. They are actually fabulously, unintuitively weird, and learning just a little bit about how they work will dramatically enhance your ability to understand why the world around you is the way it is. I don’t expect I’ll use a tree to find my way any time soon, but since reading the book I’ve started spotting things in my yard and my neighborhood that I’d never noticed before — and noticing things is halfway to understanding them. (Which is, of course, why you must not be permitted to notice that which you are not supposed to understand.)

The most fundamental insight here is that trees are not like animals. This sounds breathtakingly obvious (and indeed, when I shared this pearl of wisdom at the dinner table everyone laughed at me), but it’s hard to internalize. Our increasingly urbanized and domesticated lives have so impoverished our natural imaginary — the available stock of symbols, metaphors, and archetypes through which we understand the natural world — that we’re more or less limited to commensals and charismatic megafauna, and are therefore vaguely surprised when we encounter organisms that work differently.3 And trees really do work differently, in a wide variety of ways that make perfect sense when Gooley points them out.

What are these differences? Well, for one thing, where animals have their physical architecture written into their genes, trees — like all plants — have potential. Sure, they have general growth habits4 (you’d never mistake a willow for a maple), but compare two trees of the same species — even two genetically identical trees cloned from grafts or cuttings of the same parent — and you’ll find dramatic structural differences depending on how the individual tree grew. This isn’t true for animals: one lion might be smaller than another, or bear the scars of an old injury, but all lions have four legs with the same joint anatomy. A lion will never grow a new leg, drop an old one, or add new tendons to support a particularly overworked limb. Trees, on the other hand, do all of those and more, following general rules dictated by species but growing in response to the conditions they encounter. And because only the top of the tree continues to grow up — a branch five feet off the ground will still be five feet off the ground in a decade, though quite a lot thicker — you can read a tree’s whole history in its structure. As with looking at a genome, looking at a tree is a way of looking into the past.

Trees seek the light. Just down the street, my neighbor’s entire front yard is shaded by three enormous oak trees planted in a rough triangle and each arching gently away from the others (with a surprising similarity to the Air Force Memorial) as they try to escape each other’s shade. A few blocks away is a survivor of a similar situation, an old pine tree that’s branchless most of the way up its trunk so you can really see the alarming 15° lean with which it grew. Some long-gone giant cast the shade that sculpted this tree into its present funny shape, and if we were in the woods we might be able to see its stump — Gooley encourages the reader to greet a woodland stump by looking for the “footprint” of the missing tree in its surroundings — but I suspect this one was probably removed to make way for the foundation of the nearby house. (Given the apparent age of the pine and the house, its old neighbor probably met its end around the time the new streetcars turned this farming village on a railroad into a proper suburb.)


    1. The late Steve Fossett did it first, but since he holds about a billion other records it feels churlish to take this from Gooley.

    2. Only female ginkgos drop those awful berries. There are entire all-male cultivars that make fabulous trees, and somehow, inexplicably, I spent every autumn of my childhood scraping horrible stinky mush off the bottoms of my shoes. Why.

    3. Also on this front, I recommend Merlin Sheldrake’s Entangled Life, which is exactly the sort of book about fungi you would expect someone named Merlin Sheldrake to write.

    4. In fact “tree” is really just a growth habit, evolved independently by thousands of unrelated species of plants, because trees are the crabs of the plant kingdom. [NR: Do read that thread, it’s quite amusing]

August 10, 2023

QotD: The variable pace of evolution

Filed under: Books, Environment, History, Quotations, Science — Nicholas @ 01:00

The central argument of Gelernter’s essay is that random chance is not good enough, even at geologic timescales, to produce the ratchet of escalating complexity we see when we look at living organisms and the fossil record. Most mutations are deleterious and degrade the functioning of the organism; few are useful enough to build on. There hasn’t been enough time for the results we see.

Before getting to that one I want to deal with a subsidiary argument in the essay, that Darwinism is somehow falsified because we don’t observe the slow and uniform evolution that Darwin posited. But we have actually observed evolution (all the way up to speciation) in bacteria and other organisms with short lifespans, and we know the answer to this one.

The rate of evolutionary change varies; it increases when environmental changes increase selective pressures on a species and decreases when their environment is stable. You can watch this happen in a Petri dish, even trigger episodes of rapid evolution in bacteria by introducing novel environmental stressors.

Rate of evolution can also increase when a species enters a new, unexploited environment and promptly radiates into subspecies all expressing slightly different modes of exploitation. Darwin himself spotted this happening among Galapagos finches. An excellent recent book, The 10,000 Year Explosion, observes the same acceleration in humans since the invention of agriculture.

Thus, when we observe punctuated equilibrium (long stretches of stable morphology in species punctuated by rapid changes that are hard to spot in the fossil record) we shouldn’t see this as the kind of ineffable mystery that Gelernter and other opponents of Darwinism want to make of it. Rather, it is a signal about the shape of variability in the adaptive environment – also punctuated.

Even huge punctuation marks like the Cambrian explosion, which Gelernter spends a lot of rhetorical energy trying to make into an insuperable puzzle, fall to this analysis. The fossil record is telling us that something happened at the dawn of the Cambrian that let loose a huge fan of possibilities; adaptive radiation, a period of rapid evolution, promptly followed just as it did for the Galapagos finches.

We don’t know what happened, exactly. It could have been something as simple as the oxygen level in seawater going up. Or maybe there was some key biological invention – better structural material for forming hard body parts with would be one obvious one. Both these things, or several other things, might have happened near enough together in time that the effects can’t be disentangled in the fossil record.

The real point here is that there is nothing special about the Cambrian explosion that demands mechanisms we haven’t observed (not just theorized about, but observed) on much faster timescales. It takes an ignotum per æque ignotum kind of mistake to erect a mystery here, and it’s difficult to imagine a thinker as bright as Dr. Gelernter falling into such a trap … unless he wants to.

But Dr. Gelernter makes an even more basic error when he says “The engine that powers Neo-Darwinian evolution is pure chance and lots of time.” That is wrong, or at any rate leaves out an important co-factor and leads to badly wrong intuitions about the scope of the problem and the timescale required to get the results we see. Down that road one ends up doing silly thought experiments like “How often would a hurricane assemble a 747 from a pile of parts?”

Eric S. Raymond, “Contra Gelernter on Darwin”, Armed and Dangerous, 2019-08-14.
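Raymond's claim that the rate of evolutionary change scales with the strength of selection can be illustrated with a toy model. A hedged sketch (the selection coefficients and frequencies below are arbitrary assumptions, not from the essay): under simple haploid selection, a beneficial allele's frequency grows each generation in proportion to its selective advantage, so stronger pressure means far faster change.

```python
def generations_to_fix(s, p0=0.01, target=0.99):
    """Generations for a beneficial allele with selection coefficient s
    to rise from frequency p0 to target under simple haploid selection."""
    p, gens = p0, 0
    while p < target:
        # Standard deterministic update: the favored allele's share is
        # reweighted by its fitness advantage (1 + s) each generation.
        p = p * (1 + s) / (p * (1 + s) + (1 - p))
        gens += 1
    return gens

weak = generations_to_fix(0.01)    # weak selective pressure
strong = generations_to_fix(0.10)  # tenfold stronger pressure
print(weak, strong)  # stronger selection fixes the allele roughly 10x faster
```

The same logic runs in reverse in a stable environment: with s near zero, frequencies barely move, which is the long flat stretch of punctuated equilibrium.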

July 5, 2023

The “orgasm gap”, yet another problematic front in the war of the sexes

Filed under: Health, Media, USA — Nicholas @ 03:00

Janice Fiamengo discusses the faked orgasm scene in When Harry Met Sally and its role in the ongoing arguments over the “orgasm gap”:

An iconic moment in modern movie history is the diner scene from When Harry Met Sally (1989), when Sally stuns an incredulous Harry with her rendition of a convincing orgasm. Her bravura performance causes shocked silence in the restaurant until one woman, sitting nearby, says admiringly (or enviously), “I’ll have what she’s having.”

The scene and the woman’s amused reaction told of a simple reality with wit and without judgement: some portion of women — perhaps many — are convincing fakers, and even a sexually experienced man will find it hard to be sure.

Many women in the movie audience laughed in recognition, and many men likely scratched their heads, wondering why anyone would need to fake sexual enjoyment. Some men may have remembered times when they faked it too. In the romantic-comedic world of the movie, the scene symbolized one of the differences between the average woman and the average man that only a generous and committed love could bridge.

A few years ago, When Harry Met Sally turned 30 years old, and its anniversary prompted a number of reflection pieces, some turning a harsh feminist lens on the film’s gender politics. In “‘I’ll have what she’s having’: How that scene from When Harry Met Sally changed the way we talk about sex,” Lisa Bonos at The Washington Post found in the fake climax scene a salutary revelation of male sexual arrogance. For Bonos, Harry is a typical macho man, someone who doesn’t care about a woman’s pleasure. The fact that the whole point of his conversation with Sally had been his confidence that he was giving women pleasure simply confirmed his emetic masculinity.

According to Bonos, the fact that some women fake orgasm supposedly reveals that women’s sexual pleasure is “not prioritized” in heterosexual relationships, and Sally’s performance gave sobering evidence of a gendered pleasure gap. It was implicitly the man’s fault that his partner felt the need to lie to him about her sexual satisfaction, and his desire for her to orgasm proved his typically male ego. Bonos’s analysis was an egregious violation of the spirit of the movie but was eminently faithful to the feminist perspective. The politics of grievance had come a long way in three decades.

Right on cue, studies in human psycho-sexuality are now taking up the same theme, alleging a culturally imposed “orgasm gap” between men and women in which men outpace women in the frequency with which they report orgasm during sexual intercourse (86% for men vs. 62% for women, according to one national survey).

Remembering how consistently feminist pundits have expressed outrage at male incels’ (alleged) sense of “entitlement” to sex, I cannot help but find it ironic how unapologetically researchers assume a female entitlement to orgasm. Apparently, the whole society is to be concerned if women fail to climax every time they have sex, while no one has compassion for young men who face a lifetime of sexlessness. The prime exhibit is “Orgasm Equality: Scientific Findings and Societal Implications”, a paper published in 2020 by three female researchers at the University of Florida. The paper not only surveys the literature on the subject but also makes recommendations for “a world of orgasm equality”.

April 16, 2023

QotD: Homo electronicus and the permanent caloric surplus

Filed under: Britain, Food, Health, History, Quotations, USA, WW1 — Nicholas @ 01:00

Finally, I suggest that the permanent caloric surplus that has obtained in the West since about 1950 has done more than anything to speciate us Postmoderns. It would take someone who Fucking Loves Science™ way more than I do to assert that the vast, obvious changes in the human race in the 20th century were merely physical. Consider the oft-remarked fact (at the time, at least) that British officers on the Western Front were a full head taller than their men. Then consider (ditto) the more-or-less open secret that a lot of those tall subalterns were gay. Correlation is not causation — growing up in the infamous English public schools probably had a lot to do with it, as Robert Graves himself says — but … there’s a pretty strong correlation.

Excess fat cranks up estrogen levels. You don’t need to be House MD to interpret this finding:

    In males with increasing obesity there is increased aromatase activity, which irreversibly converts testosterone to estradiol resulting in decreased testosterone and elevated estrogen levels.

Or this one:

    A study supports the link between excess weight and higher hormone levels. The study found that estrogen and testosterone levels dropped quite a bit when overweight and obese women lost weight.

This is not to say those swishy subalterns were fat — indeed, they were comically scrawny compared to Postmodern people. But a little goes a long way when it comes to hormones, especially in a world where “intermittent fasting” wasn’t a fad diet, but a way of life. Any one of us would keel over from hunger if we were forced to eat the kind of diet George Orwell described as his public school’s standard fare.

Follow that trend out to the Current Year, when pretty much everyone is grossly obese compared to even the Silent Generation. Heartiste and other “game” bloggers loved pointing out that the average modern woman weighs as much as the average man did in the 1960s. And while I think that’s overblown — we’re also several inches taller, on average, than 1960s people — there’s definitely something to it, especially when you consider how far the bell curve has shifted to the fat end. Not only do people weigh a lot more on average, the people who weigh more than average now weigh a hell of a lot more than heavier-than-average people did back when. See, for example, the ballooning weight of offensive linemen, who are professionally fat — in 2011 a quarterback, Cam Newton, weighed more than the average offensive lineman in the 1960s.

Put the two trends together and you have, on average, a hormone cocktail way, way different than even 50 years ago … and that’s before you add in things like all-but-universal hormonal contraception, lots of which ends up in municipal drinking water.

Severian, “Recent Evolution”, Rotten Chestnuts, 2020-09-28.

April 12, 2023

Institutional Review Boards … trying to balance harm vs health, allegedly

Filed under: Books, Bureaucracy, Health, USA — Nicholas @ 06:00

At Astral Codex Ten Scott Alexander reviews From Oversight to Overkill by Simon N. Whitney, in light of his own experience with an Institutional Review Board’s demands:

Dr. Rob Knight studies how skin bacteria jump from person to person. In one 2009 study, meant to simulate human contact, he used a Q-tip to cotton swab first one subject’s mouth (or skin), then another’s, to see how many bacteria traveled over. On the consent forms, he said risks were near zero — it was the equivalent of kissing another person’s hand.

His IRB — ie Institutional Review Board, the committee charged with keeping experiments ethical — disagreed. They worried the study would give patients AIDS. Dr. Knight tried to explain that you can’t get AIDS from skin contact. The IRB refused to listen. Finally Dr. Knight found some kind of diversity coordinator person who offered to explain that claiming you can get AIDS from skin contact is offensive. The IRB backed down, and Dr. Knight completed his study successfully.

Just kidding! The IRB demanded that he give his patients consent forms warning that they could get smallpox. Dr. Knight tried to explain that smallpox had been extinct in the wild since the 1970s, the only remaining samples in US and Russian biosecurity labs. Here there was no diversity coordinator to swoop in and save him, although after months of delay and argument he did eventually get his study approved.

Most IRB experiences aren’t this bad, right? Mine was worse. When I worked in a psych ward, we used to use a short questionnaire to screen for bipolar disorder. I suspected the questionnaire didn’t work, and wanted to record how often the questionnaire’s opinion matched that of expert doctors. This didn’t require doing anything different — it just required keeping records of what we were already doing. “Of people who the questionnaire said had bipolar, 25%/50%/whatever later got full bipolar diagnoses” — that kind of thing. But because we were recording data, it qualified as a study; because it qualified as a study, we needed to go through the IRB. After about fifty hours of training, paperwork, and back-and-forth arguments — including one where the IRB demanded patients sign consent forms in pen (not pencil) but the psychiatric ward would only allow patients to have pencils (not pens) — what had originally been intended as a quick record-keeping exercise had expanded into an additional part-time job for a team of ~4 doctors. We made a tiny bit of progress over a few months before the IRB decided to re-evaluate all projects including ours and told us to change twenty-seven things, including re-litigating the pen vs. pencil issue (they also told us that our project was unusually good; most got >27 demands). Our team of four doctors considered the hundreds of hours it would take to document compliance and agreed to give up. As far as I know that hospital is still using the same bipolar questionnaire. They still don’t know if it works.

Most IRB experiences can’t be that bad, right? Maybe not, but a lot of people have horror stories. A survey of how researchers feel about IRBs did include one person who said “I hope all those at OHRP [the bureaucracy in charge of IRBs] and the ethicists die of diseases that we could have made significant progress on if we had [the research materials IRBs are banning us from using]”.

Dr. Simon Whitney, author of From Oversight To Overkill, doesn’t wish death upon IRBs. He’s a former Stanford IRB member himself, with impeccable research-ethicist credentials — MD + JD, bioethics fellowship, served on the Stanford IRB for two years. He thought he was doing good work at Stanford; he did do good work. Still, his worldview gradually started to crack:

    In 1999, I moved to Houston and joined the faculty at Baylor College of Medicine, where my new colleagues were scientists. I began going to medical conferences, where people in the hallways told stories about IRBs they considered arrogant that were abusing scientists who were powerless. As I listened, I knew the defenses the IRBs themselves would offer: Scientists cannot judge their own research objectively, and there is no better second opinion than a thoughtful committee of their peers. But these rationales began to feel flimsy as I gradually discovered how often IRB review hobbles low-risk research. I saw how IRBs inflate the hazards of research in bizarre ways, and how they insist on consent processes that appear designed to help the institution dodge liability or litigation. The committees’ admirable goals, in short, have become disconnected from their actual operations. A system that began as a noble defense of the vulnerable is now an ignoble defense of the powerful.

So Oversight is a mix of attacking and defending IRBs. It attacks them insofar as it admits they do a bad job; the stricter IRB system in place since the ‘90s probably only prevents a single-digit number of deaths per decade, but causes tens of thousands more by preventing life-saving studies. It defends them insofar as it argues this isn’t the fault of the board members themselves. They’re caught up in a network of lawyers, regulators, cynical Congressmen, sensationalist reporters, and hospital administrators gone out of control. Oversight is Whitney’s attempt to demystify this network, explain how we got here, and plan our escape.

April 3, 2023

The poison garden of Alnwick

Filed under: Britain, Environment, History — Nicholas @ 02:00

Tom Scott
Published 29 May 2017

Inside the beautiful Alnwick Garden, behind a locked gate, there’s the Poison Garden: it contains only poisonous plants. Trevor Jones, head gardener, was kind enough to give a guided tour!

For more information about visiting the Castle, Garden, and poison garden: https://alnwickgarden.com/

(And yes, it’s pronounced “Annick”.)

March 24, 2023

A very different take on the Wuhan Coronavirus pandemic

At The Conservative Woman, Dr. Mike Yeadon lays out his case for doubting that there ever actually was a novel coronavirus in the first place:

Wuhan Institute of Virology.
Wikimedia Commons.

I’ve grown increasingly frustrated about the way debate is controlled around the topic of origins of the alleged novel virus, SARS-CoV-2, and I have come to disbelieve it’s ever been in circulation, causing massive-scale illness and death. Concerningly, almost no one will entertain this possibility, despite the fact that molecular biology is the easiest discipline in which to cheat. That’s because you really cannot do it without computers, and sequencing requires complex algorithms and, importantly, assumptions. Tweaking algorithms and assumptions, you can hugely alter the conclusions.

This raises the question of why there is such an emphasis on the media storm around Fauci, Wuhan and a possible lab escape. After all, the “perpetrators” have significant control over the media. There’s no independent journalism at present. It is not as though they need to embarrass the establishment. I put it to readers that they’ve chosen to do so.

So who do I mean by “they” and “the perpetrators”? There are a number of candidates competing for this position, with their drug company accomplices, several of whom are named in Paula Jardine’s excellent five-part series for TCW, Anatomy of the sinister Covid project. High on the list is the “enabling” World Economic Forum and their many political acolytes including Justin Trudeau and Jacinda Ardern.

But that doesn’t answer the question why are they focusing on the genesis of the virus. In my view, they are doing their darnedest to make sure you regard this event exactly as they want you to. Specifically, that there was a novel virus.

I’m not alone in believing that myself at the beginning of the “pandemic”, but over time I’ve seen sufficient evidence to cast strong doubt on that idea. Additionally, when considered as part of a global coup d’état, I have put myself in the position of the most senior, hidden perpetrators. In a Q&A, they would learn that the effect of a released novel pathogen couldn’t be predicted accurately. It might burn out rapidly. Or it might turn out to be quite a lot more lethal than they’d expected, demolishing advanced civilisations. Those top decision-makers would, I submit, conclude that this natural risk is intolerable to them. They crave total control, and the wide range of possible outcomes from a deliberate release militates against this plan of action: “No, we’re not going to do this. Come back with a plan with very much reduced uncertainty on outcomes.”

The alternative I think they’ve used is to add one more lie to the tall stack of lies which has surrounded this entire affair. This lie is that there has ever been in circulation a novel respiratory virus which, crucially, caused massive-scale illness and deaths. In fact, there hasn’t.

Instead, we were told there was this frightening, novel pathogen, and the stress-inducing fear porn was ramped up to 11 and held there. This fits with the cheating about genetic sequences and PCR test protocols (probes, primers, amplification and annealing conditions, cycles), and with ignoring contaminating genetic material not only from human and claimed viral sources, but also from bacterial and fungal sources. Why, for example, did they need to insert the sampling sticks right into our sinuses? Was it to maximise non-human genetic sequences?

Notice the soft evidence that our political and cultural leaders, including the late Queen, were happy to meet and greet one another without testing, masking or social distancing. They had no fear. In the scenario above, a few people would have known there was no new hazard in their environment. If there really was a lethal pathogen stalking the land, I don’t believe they’d have had the courage to act so nonchalantly and risk exposure to the virus.

Most convincing for me is the US all-cause mortality (ACM) data by state, sex, age and date of occurrence, as analysed by Denis Rancourt and colleagues. The pattern of increased ACM is inconsistent with the presence of a novel respiratory virus as the main cause.

If I’m correct that there was no novel virus, what a genius move it was to pretend there was! Now they want you only to consider how this “killer virus” got into the human population. Was it a natural emergence (you know, a wild bat bit a pangolin and this ended up being sold at a wet market in Wuhan) or was it hubristically created by a Chinese researcher, enabled along the way by a researcher at the University of North Carolina funded by Fauci, together making an end run around a presidential pause on such work? Then there’s the question as to whether the arrival of the virus in the general public was down to carelessness and a lab leak, or did someone deliberately spread it?

February 20, 2023

Monocultures are risky in agriculture … and even more so in politics

Filed under: Government, Politics, USA — Tags: , , , , — Nicholas @ 05:00

Unlike his usual bite-sized quips-with-links at Instapundit, Glenn Reynolds occasionally writes at length for his Substack page:

The western front of the United States Capitol. The Neoclassical style building is located in Washington, D.C., on top of Capitol Hill at the east end of the National Mall. The Capitol was designated a National Historic Landmark in 1960.
Photo via Wikimedia Commons.

Our modern ruling class is peculiar. One of its many peculiarities is its penchant for fads, and what can only be called mass hysteria. Repeatedly, we see waves in which something that nobody much cared about suddenly comes to dominate ruling class discourse. Almost in synchrony, a wide range of institutions begin to talk about it, and to be preoccupied by it, even as every leading figure virtue-signals regarding this subject which, only a month or two previously, hardly any of them even knew about, much less cared about.

There are several factors behind this, but one of the most important, I think, is that our ruling class is a monoculture.

In agriculture, a monoculture exists when just a single variety dominates a crop. “Monoculture has its benefits. The entire system is standard, so there are rarely new production and maintenance processes, and everything is compatible and familiar to users. On the other hand, as banana farmers learned, in a monoculture, all instances are prone to the same set of attacks. If someone or something figures out how to affect just one, the entire system is put at risk.”

In a monoculture, if one plant is vulnerable to a disease or an insect, they all are. Thus diseases or pests can rip through it like nobody’s business. (As John Scalzi observes in one of his books, it’s also why clone armies, popular in science fiction, are a bad idea in reality, as they would be highly vulnerable to engineered diseases.) A uniform population is a high-value target.

[…]

Codevilla wrote the essay [here] over a decade ago, and it has only grown more true in the interim. Despite its constant invocation of “diversity”, in many important ways our ruling class is much less diverse than it has ever been. And, as a monoculture, it is vulnerable to viruses of a sort. Including what amount to viruses of the mind.

When Elon Musk referred to the dangers of the “woke mind virus“, he knew exactly what he was talking about. Ideas can be contagious, and can be viewed as analogous to viruses, entities that reproduce by infecting individuals and coopting those individuals into spreading them to others. Richard Dawkins, in his The Selfish Gene, coined the term “meme” to describe these infectious ideas, though the term has since acquired a more popular meaning involving photos of cats, etc. with captions. And yet those pictures are themselves memes, to the extent they “go viral” and persuade others to copy and spread them.

Our ruling class is particularly vulnerable to mind viruses for several reasons. First, it is a monoculture, so that what is persuasive to one member is likely to be persuasive to many.

Second, it suffers from deep and widespread status anxiety – not least because most of its members have status, but few real accomplishments to rely on – and thus requires constant reassurance in the form of peer acceptance, reassurance that is generally achieved by repeating whatever the popular people are saying already. And third, it has few real deeply held values, which might otherwise provide guard rails of a sort against believing crazy things.

In a more diverse ruling class, ideas would not spread so swiftly or be received so uncritically. People with different worldviews would respond differently to ideas as they entered the world of discourse. There would be criticism and there would be debate. (Indeed, this is how things generally worked during the earlier, more diverse era described by Codevilla, although intellectual fads – lobotomy, say, or eugenics – spread then too, mostly through the Gentry/Academic stratum of society that now dominates the ruling class.)

September 7, 2022

The “self-domestication” hypothesis in human evolution

Filed under: Books, Science — Tags: , , — Nicholas @ 03:00

A review of The Goodness Paradox: The Strange Relationship Between Virtue and Violence in Human Evolution by Richard Wrangham in the latest edition of Rob Henderson’s Newsletter:

The “self-domestication hypothesis” is the idea that in the ancestral environment, early human communities collectively killed individuals prone to certain forms of aggression: arrogance, bullying, random violence, and monopolizing food and sexual partners.

Over time, our ancestors eliminated humans — typically males — who were exceedingly aggressive toward members of their own group.

If there was a troublemaker, then other, less domineering males conspired to organize and commit collective murder against him.

Women too were involved in such decisions involving capital punishment, but men typically carried out the killing.

Humans tamed one another by taking out particularly aggressive individuals. This led us to become relatively peaceful apes.

But if humans are “self-domesticated”, then why are there so many violent people among us today?

The fact is, humans are not nearly as violent as our nearest evolutionary relatives.

Comparing the level of within-group physical aggression among chimpanzees with human hunter-gatherer communities, chimps are 150 to 550 times more likely than humans to inflict violence against their peers.

We humans are far nicer to members of our own group than chimps are, thanks to our ancestors and their ability to plan organized murder and to tear overly dominant males to shreds.

Many people are familiar with the findings that bonobos are more peaceful than chimpanzees.

This is true.

Male bonobos are about half as aggressive as male chimpanzees, while female bonobos are more aggressive than female chimpanzees.

Bonobos are “peaceful”, relative to chimps. But bonobos are extremely aggressive compared to humans.

The eminent Harvard biological anthropologist Richard Wrangham explores these findings at length in his fascinating 2019 book The Goodness Paradox: The Strange Relationship Between Virtue and Violence in Human Evolution.

This is a review and discussion of Wrangham’s book.

August 30, 2022

NYT op-ed – “Maternal instinct is a social construct devised by men to keep women subordinate”

Filed under: Health, Media, Science — Tags: , , , , , — Nicholas @ 03:00

Jerry Coyne responds to a New York Times op-ed by Chelsea Conaboy (author of a forthcoming book from which the op-ed was adapted):

The recent article […] from the New York Times (of course), is one of the worst of the lot. It bespeaks a lack of judgment on the part of the author — who ignores biology because of her ideology — as well as on the part of the newspaper, which failed to hold the author’s feet to the scientific fire. Let this post be my rebuttal.

Author Conaboy, who apparently hasn’t done enough scientific research, maintains that “maternal instinct” doesn’t exist, but is a social construct devised by men to keep women subordinate.

The immediate problem is that Conaboy never defines “maternal instinct”. It could mean any number of things, including a greater desire of women than men to have children, a greater desire of women than of men to care for those offspring, the fact that in animals mothers spend more time caring for offspring than do fathers, a greater emotional affinity of women than of men towards children (including offspring), or the demonstration of such a mental difference by observing a difference in caring behavior.

I will define “maternal instinct” as not only the greater average tendency of females than males to care for offspring, but also a greater behavioral affinity towards offspring in females than in males. The term involves behavioral response, not “feelings”, which are demonstrable only in humans. Thus one can look for differences in “parental instincts” across various species of animals.

But even in this sense, Conaboy is partly (but far from wholly) correct when she discusses humans. It’s undoubtedly true that women were socialized into the sex role of offspring breeders and caretakers, with men assuming the “breadwinning” role. It’s also true that women were often denied access to work or education because their vocation was seen as “reproducer”, or out of fear that they would spend less time working and more on children, or even that they’d get pregnant and would leave jobs. Further, it’s also true that this role difference was justified by being seen as “hard-wired” (i.e., largely the result of genes, which, I argue below, is true), and that “hard-wired” was conceived as “unable to be changed”. The latter construal, however, is wrong, and that is what really held back women. The socialization of sex roles, which still occurs, goes on from early ages, with girls given dolls and boys toy cars, though, as society has matured, we’re increasingly allowing girls to choose their own toys and their own path through life. I of course applaud such “equal opportunity”.

But to claim that women don’t have a greater desire than men to care for offspring, or have a greater emotional affinity towards offspring, is to deny biology, and evolution in particular. (I freely admit that many men love their kids deeply, and that some men care for them as much or more as do mothers, but I’m talking about averages here, not anecdotes.)

There are two reasons why Conaboy is wrong, and both involve evolution.

The first is theoretical, but derived from empirical observations. It thus explains the second, which is wholly empirical and predictive. How do we explain the fact that, across the animal kingdom, when members of only one sex do most of the childrearing, it’s almost invariably the females? (Yes, in many species males share the duties, and in a very few, like seahorses, males provide more parental care; and there are evolutionary reasons for that.)

The reason for the statement in bold above involves the biology of reproduction. It is the female who must lay the eggs or give birth, and there is no way she can leave her genes behind unless she does that. It’s easier for males to take off after insemination and let the females care for offspring. Given that females are constrained to stick with the fertilized eggs, their best strategy is to take care of the gestation and resultant offspring, which of course allows males to seek other mates. Not only must females carry the fetuses, lay the eggs, and so on, but they are also constrained to see out the pregnancy until offspring are produced and then suckle or tend them in other ways. In some cases it’s the best evolutionary strategy for a male to stick around and share the child-rearing, but often it’s not.

This disparity in behavior holds not just in humans, of course, but in many animals: it’s a prediction — largely verified — of evolutionary psychology.

January 8, 2022

“We are a sexually dimorphic species, and men and women are different”

Filed under: Health, Science — Tags: , , , , — Nicholas @ 05:00

A statement like that on Twitter or other social media platforms might run you the risk of denunciation, cancellation, and a plethora of accusations of transphobia, but it isn’t the intent of Robert King to troll the hypersensitive online:

No, this carving isn’t directly related to the article … but it is eyecatching.

We are a sexually dimorphic species, and men and women are different. Evolution has designed us to be different. Realising that we evolved through slow steps, rather than just popping into being in an act of creation, has implications. For one thing, it means that men and women have their own separate evolutionary histories, as a result of differing (although not wholly different, of course) selection pressures. Resisting this truth — pretending that men and women are a sort of silly putty, totally moulded by social forces — has already had serious consequences in medical science, and it also has implications for my field of study.

I study the nature and function of the female orgasm. It might surprise people that there is even a set of questions about this phenomenon, but it is one of the most vexed fields in evolutionary biology. I do not claim that we have solved the puzzle of it. However, I do claim that we know a lot more about female orgasm than we used to. For example, female orgasm is multi-faceted in nature (unlike male orgasm) and is associated with a host of complex, fertility-related, functions. Male orgasm has but one (and a pretty-well understood one at that) fertility related function: reinforcing sexual behaviour. How is it that these stark differences between the sexes have been missed?

A major reason is that sex researchers, in some cases even self-described feminists, have often persisted in treating female orgasm as a mere adjunct to male orgasm. On this view — the by-product view — only male orgasms have a function. Female ones exist as a sort of afterthought of nature. Thus, clitorises have been routinely compared to (functionless) male nipples by, among others, the influential palaeontologist, Stephen Jay Gould. However, this comparison does not stand up to scrutiny. Clitorises are not substandard penises. For starters, they are large, four inches in length, on average. They are highly complex, but their structure — including muscular, erectile, and sensitive tissue — is mostly internal.

The external part — the glans — is highly sensitive, but so is the rest of it, when appropriately aroused. Clitorises connect to their own dedicated area of the brain (the somatosensory cortex), utterly distinct from the male version. To see some of this for yourself, you could read any number of excellent works by, for example, the brilliant anatomist Helen O’Connell.

If the structure that generates female orgasm is at least as, if not more, complex than the male counterpart, then it makes little sense to assume that the female version depends on the male one. This is doubly true of the event of orgasm itself, prompting the eminent biologist Robert Trivers to quip of female orgasms that “One has to wonder how often Steve [Gould] has been near to that blessed event to regard it as a by-product.” That may be a tad unkind — but it raises a rather important point. If we restrict ourselves to studying female orgasm, or human sexual behaviour generally, in the laboratory alone, then we run a very real risk of missing out on crucial aspects.

Let me make this point more concrete. Over the last couple of years, zoos and wildlife parks across the planet have seen a huge upswing in births among species previously thought to be sexually frigid — like pandas. Why? Simple. No humans were about. The animals had some privacy from prying eyes. Does it really stretch imagination to appreciate that the full range of human sexual responses might also be muted under laboratory conditions? Inefficiency is a hallmark of good sex, and humans use the privacy of the boudoir to do more than reach each orgasm as rapidly as possible. We use this space to find out about one another.

January 2, 2022

In 1978, E.O. Wilson was “the only scientist in modern times to be physically attacked for an idea”

Filed under: History, Politics, Science — Tags: , , , , — Nicholas @ 03:00

In the current year, I suspect many, many scientists have been physically attacked for advocating unpopular ideas. In Quillette, Alice Dreger publishes an interview she had with Wilson in 2009:

Edward O. Wilson in February 2003.
PLoS image by Jim Harrison via Wikimedia Commons.

Alice Dreger: I know you’ve spoken about it many times before, but I would like to begin by asking you about the session at the 1978 AAAS [American Association for the Advancement of Science] conference during which you were rushed on the stage and a protester emptied a pitcher of water onto your head. By all accounts, the talk you then gave was very measured. How on Earth were you able to remain so calm after being physically assaulted?

Edward O. Wilson: I think I may have been the only scientist in modern times to be physically attacked for an idea. The idea of a biological human nature was abhorrent to the demonstrators and was, in fact, too radical at the time for a lot of people — probably most social scientists and certainly many on the far-Left. They just accepted as dogma the blank-slate view of the human mind — that everything we do and think is due to contingency, rather than based upon instinct like bodily functions and the urge to keep reproducing. These people believe that everything we do is the result of historical accidents, the events of history, the development of personality through experience.

That was firmly believed in 1978 by a wide part of the population, but particularly by the political Left. And it was thought at the time that raising the specter of a biological basis for human behavior was not only wrong, but a justification for war, sexism, and racism. Biological gender differences could justify sexism, and any imputation that we evolved a human nature, or that human qualities might differ from one race to another, was dangerously racist.

So, furious ideologically based opposition had built up in 1978. That opposition had been fanned by a small number of academics including [paleontologist] Stephen Jay Gould and [evolutionary biologist] Richard Lewontin and two or three others on the Harvard faculty who thought this was a very dangerous idea and said so. These people helped organize the so-called “Science for the People” movement, or the branch of it called the “Sociobiology Study Group”. Their purpose was to discredit me personally for having brought up such a dangerous and destructive idea.

In fact, at that meeting, InCAR — the International Committee Against Racism — held up signs condemning me and sociobiology and racism in general. Of course, racism never even entered my thinking in developing these ideas. Anyway, after they dumped the water on me, amazingly, they returned to their seats while I was drying myself off. A couple of people then made short speeches — most notably Stephen Gould, of all people, the guy whose agitation and inflammatory essays had been partly responsible for all this. He addressed the demonstrators and said, in effect, that while he fully understood their motivation, violence was not the right way to achieve their goals.

As for me, I don’t know why, but I just get calm under a lot of stress. I’ve been in that sort of stressful situation many times, especially in the field. I started thinking to myself, this is probably going to be an historical moment, and it is very interesting. I wasn’t in the least doubt that my science was correct. I knew this was a kind of aberration. I understood the source because I knew the people who had been the chief thinkers, the ideological leaders. An astonishingly good percentage of them were on the faculty at Harvard. I wasn’t concerned this would come to anything in the long term.

So, someone found a paper towel and I dried my head. As soon as things settled down, I just read my talk. I knew things were going to work out — there was so much evidence accumulated already for a somewhat programmed human brain. By then, it was already coming from many directions, including genetics and neuroscience. There was no doubt about where things would go. There may be hold-outs, but the conclusion from neuroscience and anthropology and genetics inevitably points to this way of thinking. [American anthropologist] Nap[oleon] Chagnon was present and he was certainly a leader in thinking about human nature and how valuable it is, and what its motivations are, by studying groups like the Yanomamö.

I knew history was on my side. I was young enough that I thought I would live through a good part of it. I was annoyed! But I wasn’t under stress in an extreme way. Before going home, I went to the next session, at which an anthropologist made the mistake of stating that I believe every cultural difference has a genetic basis, so that I am a racist. Of course, I rebutted that, but that was the kind of thing being exchanged at that meeting.

November 20, 2021

DicKtionary – M is for Mathematics – Newton and Hooke

Filed under: Britain, History, Science — Tags: , , , , — Nicholas @ 04:00

TimeGhost History
Published 19 Nov 2021

Today we turn away from killers and sociopathic rulers and look at two men from the world of science. Isaac Newton and Robert Hooke were certainly very intelligent and creative, but were they dicks as well?
