Quotulatiousness

April 9, 2024

Checking in on the front lines of paleontology research

Filed under: History, Politics, Science — Nicholas @ 04:00

Sarah Hoyt is amazed and surprised at some recent revelations about the latest paleontological discoveries:

Yesterday I did something I used to do more often, and looked into the latest discoveries in paleontology and archaeology and such.

And shortly after remembered why I no longer do it every month.

Apparently the latest, greatest news is that — yo, this is amazing — they’re finally not studying pre-history through the eyes of racism and sexism, and as such have determined that pre-historic hunting parties were as much as 80% female.

This is a discovery that of course makes perfect sense. No doubt the males were all back at base camp, chest-feeding the babies. Thereby freeing the pre-historic girl bosses to go and hunt them some mammoth.

If you’re staring at the screen with dropped jaws, I haven’t gone insane. I know this is utter and complete nonsense. It’s not my fault that the people running all our intellectual institutions, including research, are morons studying to be idiots.

Applying the Heinlein filter for the actual reason they have to believe that up to 80% — 80%! — of pre-historic hunters were female, i.e. “Again and again, what are the facts?”, we get that they found one grave — one — where the DNA of the remains is female (and here we’ll keep quiet about the strange idea that 6,000-year-old (I think) DNA is easily extracted, non-contaminated, etc. We’ll pretend we don’t know all the times they walked back “new species” because the DNA amplification techniques CREATED those discoveries.)

Let’s assume they’re correct and this was a female, buried with hunting implements. Sure, maybe she was a hunter. There will always be one or two in a large enough band, for the reason that in primitive societies some women are brought up as male: lack of a son, need to support the family, etc. (It is rarely a sexual thing, or because the person WANTED this. In fact it’s often decided for them before they are weaned. In fact, in primitive/ancient societies including the ones of our ancestors that we know about in detail, there was remarkably little room for self expression, self-conception or self interest. When you live close to the bone, such things are subjugated to the needs of the family, the clan, the tribe, more or less in that order. Because survival is hard.) We’re also informed, in BREATHLESS tones, that it’s now thought that spear throwers were used to make sure women could throw spears fast enough! That’s why they exist.

But spear throwers are made and used by males, the world over. Go look at the tubes of you and you’ll see videos of people making them and using them, and they’re all male.

Further, there is no society today, among the ones still surviving in a more or less stone age way, that has that kind of distribution for hunters. More importantly, there is no record of one, going as far back as we can.

Maybe this is because yes indeed, the past (being much closer to the bone, and therefore less willing to indulge in storytelling) viewed things through a racist and sexist lens. Why not? After all I grew up in an intensely patriarchal society that still hadn’t adapted to the idea of women taking any hand in intellectual pursuits. And yes, that was unwarranted sexism. And every society is racist against every other (actually culturist, but it’s often couched in terms of race).

But still … You’d think that here and there, there would be a race of valiant Amazons, whose men stay home and pound the taro while they go out and hunt, right?

But the truth is that if you put this notion to the remaining stone age people, they will laugh till they pee themselves. Yes, I know, I know, they internalized sexism from the evil white colonizers whom they’ve met three times in the last 100 years. That’s how powerful and evil whiteness is.

Or, listen, okay? I know this is just crazy talk, but maybe males and females are different and have evolved to fulfill different reproductive functions. And the reproductive function of females is more onerous than that of males. Women in our natural state, and unless something has gone seriously wrong — which of course makes us of less use to the tribe — spend most of our lives pregnant, nursing or caring for children too young to care for themselves.

Historical examples of social contagion

Filed under: Health, Media — Nicholas @ 03:00

Andrew Doyle discusses how social contagions of the past resemble the current gender identity boom among western young people:

… social contagions are especially common among teenage girls, and that there are numerous historical precedents for this. I have written elsewhere about the Salem witch trials of 1692-93, in which a group of girls began seeing demons in the shadows and accusing members of their own community of being in league with the Devil. Then there were the various “dancing plagues” of the Middle Ages which seemed to affect young women in particular. In 1892, girls at a school in Germany began to involuntarily shake their hands whenever they performed writing exercises. And when I visited Sweden last year, I was told about a local village where, during the medieval period, the girls all inexplicably began to limp.

It’s perfectly clear that the latest social contagion to take hold in the western world is that of girls identifying out of their femaleness, through claims that they are either trans or non-binary. Whereas in 2012, there were only 250 referrals (mostly boys) to the NHS’s Gender Identity Development Service (GIDS), by 2021 the figure had risen to more than 5,000 (mostly female) patients. Gender activists like to claim that this is simply the consequence of more people “coming out” as society becomes more tolerant, and at the same time insist that it has never been a worse time to be trans. Consistency is not their strong suit.

Of course there are no easy answers as to why this latest fad has exploded, but surely the proliferation of social media has something to do with it. Platforms such as TikTok are replete with activists explaining to teenagers that their feelings of confusion are probably evidence that they have been “born in the wrong body”. For pubescent girls who are uncomfortable with their physiological changes, as well as sudden unwanted male sexual attention, the prospect of identifying out of womanhood makes complete sense. These online pedlars have some snake-oil to sell. And while a limping epidemic in a medieval village would be unlikely to spread very far, social contagions cannot be so confined in the digital age.

Much of this is reminiscent of the recovered memory hysteria of the late twentieth century, when therapist cranks promoted the idea that most victims of sexual abuse had repressed their traumatic memories from childhood. It led to numerous cases of people imagining that they had been abused by parents and other family members, and many lives were ruined as a result. One of the key texts in this movement was The Courage to Heal (1988) by Ellen Bass and Laura Davis, which made the astonishing and unevidenced claim that “if you are unable to remember any specific instances … but still have a feeling that something abusive happened to you, it probably did”.

A common feature of social contagions is that they depend upon the elevation of intuition over material reality. Just as innocent family members were accused of sexual abuse because of “feelings” teased out by unscrupulous therapists, many girls are now being urged by online influencers to trust the evidence of their emotions and accept a misalignment between their body and their gendered soul. We are not talking here about the handful of children who suffer from gender dysphoria, but rather healthy children who have been swept up in a temporary craze.

April 8, 2024

“The carbon rebate seems to be one of those rare examples of people getting mad at receiving government money rather than being grateful”

In The Line, Jen Gerson makes a strong argument that the vaunted (by Justin Trudeau and the Liberal Party) carbon tax rebate is actually the big problem with the carbon tax, not the “Conservative misinformation” constantly being pointed at by the government’s paid accomplices in the mainstream media:

Is the purpose of the Liberals’ carbon tax to materially reduce carbon emissions — or is it a wealth redistribution program? I ask because every time the Liberals defend the carbon tax by resorting to the awesomeness of the rebate, what they cease to talk about is how effective it is at actually reducing carbon emissions.

Instead, we fall into an endless series of counterproductive debates about whether what individuals are getting from the rebate equals what they’re paying out in tax. And that debate is repeated every quarter, and each time the carbon tax rises. In other words, our entire political discourse about the tax is centred on wealth redistribution — not emissions.

That makes people suspicious of the government’s actual goals, and skeptical about its claims. This, again, is a problem of message dilution. If you cannot clearly express your intentions, then you’re not going to get political buy-in to your aims. This problem is particularly acute on a policy that is — by definition — demanding a sacrifice of cash and/or quality of life by Canadians. People can get on board with sacrifice, but only if it’s tied to a clear, obtainable, and material objective.

[…]

And here’s where we get into the real dark heart of the problem.

It’s the rebate itself.

I understand why the Canada Carbon Rebate happened. The government wanted to introduce a carbon tax without disproportionately penalizing the poor — the demographic least able to make the investments and lifestyle changes necessary to respond to the tax. But did that relief have to come in the form of a rebate?

Well, no.

There are lots of methods a government can use to ease poverty. But governments love themselves a rebate. Why? Because rebates are normalized vote buying. One that all political parties are guilty of using. The Liberals implemented the rebate thinking Canadians would hit their mailboxes every quarter, see a few hundred bucks, and get warm fuzzy feelings for Papa Trudeau and the natural governing party. “Government’s looking out for me!”

Getting government cheques is popular, and the Liberals were no doubt trying to replicate the appeal of the Canada Child Benefit.

But that didn’t happen here. The carbon rebate seems to be one of those rare examples of people getting mad at receiving government money rather than being grateful. Why?

Well, may I suggest that it’s because every time people open up those cheques, instead of processing the dopamine hit of “free” money, they’re reminded of how much they had to pay in to get it. They do the math in their head, think about their rising grocery and gas bills, and come away thinking “not worth it”. Every single quarter, millions of Canadian households are feeling as if they are paying dollars to get dimes — and it’s pissing them right off. Further, demanding they acknowledge they’re better off in the exchange is only adding salt to the wound. Throwing Parliamentary Budget Officer (PBO) reports at them doesn’t change their minds. It just pisses them off more.

To put it more pithily — a benefit is a gift. A rebate is a value proposition. And a hell of a lot of Canadians are looking at this rebate and determining that its value is wanting — all the more so as the goals of that purchase haven’t been clearly articulated.

April 5, 2024

“[T]oo many charlatans of this species have already been allowed to make vast fortunes at the expense of a gullible public”

Colby Cosh on his “emerging love-Haidt relationship” as Jonathan Haidt’s new book is generating a lot of buzz:

If Haidt has special expertise that wouldn’t pertain to any well-educated person, I wonder a little in what precise realm it lies. Read the second sentence of this article again: he’s a psychologist … who teaches ethics … at a business school? Note that he seems to have abandoned a prior career as an evolutionary biology pedlar, and the COVID pandemic wasn’t kind to his influential ideas about political conservatives being specially motivated by disgust and purity. Much of The Anxious Generation is instead devoted to trendy findings from “neuroscience” that it might be too kind to describe as “speculative”. (I’ll say it again until it’s conventional wisdom: a “neuroscientist” is somebody in a newly invented pseudofield who couldn’t get three inches into the previously established “-ology” for “neuro-“.)

These are my overwhelming prejudices against Haidt; and, in spite of all of them, I suspect somebody had to do what he is now doing, which is to make the strongest available case for social media as a historical impactor on social arrangements and child development. Today the economist/podcaster Tyler Cowen has published a delightfully adversarial interview with Haidt that provides a relatively fast way of boning up on the Haidt Crusade. Cowen belongs to my pro-innovation, techno-optimist, libertarian tribe: we both feel positive panic at the prospect of conservative-flavoured state restrictions on media, which are at the heart of the Haidt agenda.

But reading the interview makes me somewhat more pro-Haidt than I would otherwise be (i.e., not one tiny little bit). On a basic level, Cowen doesn’t, by any means, win the impromptu debate by a knockout — even though he is one of the most formidable debaters alive. Haidt has four key reforms he would like to see implemented politically: “No smartphones before high school; no social media before age 16; phone-free schools; far more unsupervised play and childhood independence.”

This is a fairly limited, gentle agenda for school design and other policies, and although I believe Haidt’s talk of “rewiring brains” is mostly ignorable BS, none of his age-limitation rules are incompatible with a free society, and none bear on adults, except in their capacity as teachers and parents.

The “rewiring” talk isn’t BS because it’s necessarily untrue, mind you. Haidt, like Jordan Peterson, is another latter-day Marshall McLuhan — a boundary-defying celebrity intellectual who strategically turns speculation into assertion, and forces us, for better or worse, to re-examine our beliefs. McLuhan preached that new forms of media like movable type or radio do drive neurological change, that they cause genuine warp-speed human evolution — but his attitude, unlike Haidt’s, was that these changes are certain to happen, and that arguing against them was like arguing with the clouds in favour of a sunny day. The children who seem “addicted” to social media are implicitly preparing to live in a world that has social media. They are natives of the future, and we adults are just observers of it.

April 4, 2024

QotD: What we mean by the term “indigenous”

Well, if by indigenous we mean “the minimally admixed descendants of the first humans to live in a place”, we can be pretty confident about the Polynesians, the Icelanders, and the British in Bermuda. Beyond that, probably also those Amazonian populations with substantial Population Y ancestry and some of the speakers of non-Pama–Nyungan languages in northern Australia? The African pygmies and Khoisan speakers of click languages who escaped the Bantu expansion have a decent claim, but given the wealth of hominin fossils in Africa it seems pretty likely that most of their ancestors displaced someone. Certainly many North American groups did; the “skraelings” whom the Norse encountered in Newfoundland were probably the Dorset, who within a few hundred years were completely replaced by the Thule culture, ancestors of the modern Inuit. (Ironically, the people who drove the Norse out of Vinland might have been better off if they’d stayed; they could hardly have done worse.)

But of course this is pedantic nitpicking (my speciality), because legally “indigenous” means “descended from the people who were there before European colonialism”: the Inuit are “indigenous” because they were in Newfoundland and Greenland when Martin Frobisher showed up, regardless of the fact that they had only arrived from western Alaska about five hundred years earlier. Indigeneity in practice is not a factual claim, it’s a political one, based on the idea that the movements, mixtures, and wholesale destructions of populations since 1500 are qualitatively different from earlier ones. But the only real difference I see, aside from them being more recent, is that they were often less thorough — in large part because they were more recent. In many parts of the world, the Europeans were encountering dense populations of agriculturalists who had already moved into the area, killed or displaced the hunter-gatherers who lived there, and settled down. For instance, there’s a lot of French and English spoken in sub-Saharan Africa, but it hasn’t displaced the Bantu languages like they displaced the click languages. Spanish has made greater inroads in Central and South America, but there’s still a lot more pre-colonial ancestry among people there than there is pre-Bantu ancestry in Africa. I think these analogies work, because as far as I can tell the colonization of North America and Australia looks a lot like the Early European Farmer and Bantu expansions (technologically advanced agriculturalists show up and replace pretty much everyone, genetically and culturally), while the colonization of Central and South America looks more like the Yamnaya expansion into Europe (a bunch of men show up, introduce an exciting new disease that destabilizes an agricultural civilization,1 replace the language and heavily influence the culture, but mix with rather than replacing the population).

Some people argue that it makes sense to talk about European colonialism differently than other population expansions because it’s had a unique role in shaping the modern world, but I think that’s historically myopic: the spread of agriculture did far more to change people’s lives, the Yamnaya expansion also had a tremendous impact on the world, and I could go on. And of course the way it’s deployed is pretty disingenuous, because the trendier land acknowledgements become, the more the people being acknowledged start saying, “Well, are you going to give it back?” (Of course they’re not going to give it back.) It comes off as a sort of woke white man’s burden: of course they showed up and killed the people who were already here and took their stuff, but we’re civilized and ought to know better, so only we are blameworthy.

More reasonable, I think, is the idea that (some of) the direct descendants of the winners and losers in this episode of the Way Of The World are still around and still in positions of advantage or disadvantage based on its outcome, so it’s more salient than previous episodes. Even if, a thousand years ago, your ancestors rolled in and destroyed someone else’s culture, it still sucks when some third group shows up and destroys yours. It’s just, you know, a little embarrassing when you’ve spent a few decades couching your post-colonial objections in terms of how mean and unfair it is to do that, and then the aDNA reveals your own population’s past …

Reich gets into this a bit in his chapter on India, where it’s pretty clear that the archaeological and genetic evidence all point to a bunch of Indo-Iranian bros with steppe ancestry and chariots rolling down into the Indus Valley and replacing basically all the Y chromosomes, but his Indian coauthors (who had provided the DNA samples) didn’t want to imply that substantial Indian ancestry came from outside India. (In the end, the paper got written without speculating on the origins of the Ancestral North Indians and merely describing their similarity to other groups with steppe ancestry.) Being autochthonous is clearly very important to many peoples’ identities, in a way that’s hard to wrap your head around as an American or northern European: Americans because blah blah nation of immigrants blah, obviously, but a lot of northern European stories about ethnogenesis (particularly from the French, Germans, and English) draw heavily on historical Germanic tribal migrations and the notion of descent (at least in part) from invading conquerors.

One underlying theme in the book — a theme Reich doesn’t explicitly draw out but which really intrigued me — is the tension between theory and data in our attempts to understand the world. You wrote above about those two paradigms to explain the spread of prehistoric cultures, which the lingo terms “migrationism” (people moved into their neighbors’ territory and took their pots with them) and “diffusionism”2 (people had cool pots and their neighbors copied them), and which archaeologists tended to adopt for reasons that had as much to do with politics and ideology as with the actual facts on (in!) the ground. And you’re right that in most cases where we now have aDNA evidence, the migrationists were correct — in the case of the Yamnaya, most modern migrationists didn’t go nearly far enough — but it’s worth pointing out that all those 19th century Germans who got so excited about looking for the Proto-Indo-European Urheimat were just as driven by ideology as the 21st century Germans who resigned as Reich’s coauthors on a 2015 article where they thought the conclusions were too close to the work of Gustaf Kossinna (d. 1931), whose ideas had been popular under the Nazis. (They didn’t think the conclusions were incorrect, mind you, they just didn’t want to be associated with them.) But on the other hand, you need a theory to tell you where and how to look; you can’t just be a phenomenological petri dish waiting for some datum to hit you. This is sort of the Popperian story of How Science Works, but it’s more complex because there are all kinds of extra-scientific implications to the theories we construct around our data.

The migrationist/diffusionist debate is mostly settled, but it turns out there’s another issue looming where data and theory collide: the more we know about the structure and history of various populations, the more we realize that we should expect to find what Reich calls “substantial average biological differences” between them. A lot of these differences aren’t going to be along axes we think have moral implications — “people with Northern European ancestry are more likely to be tall” or “people with Tibetan ancestry tend to be better at functioning at high altitudes” isn’t a fraught claim. (Plus, it’s not clear that all the differences we’ve observed so far are because one population is uniformly better: many could be explained by greater variation within one population. Are people with West African ancestry overrepresented among sprinters because they’re 0.8 SD better at sprinting, or because the 33% higher genetic diversity among West Africans compared to people without recent African ancestry means you get more really good sprinters and more really bad ones?) But there are a lot of behavioral and cognitive traits where genes obviously play some role, but which we also feel are morally weighty — intelligence is the most obvious example, but impulsivity and the ability to delay gratification are also heritable, and there are probably lots of others. Reich is adorably optimistic about all this, especially for a book written in 2018, and suggests that it shouldn’t be a problem to simultaneously (1) recognize that members of Population A are statistically likely to be better at some thing than members of Population B, and (2) treat members of all populations as individuals and give them opportunities to succeed in all walks of life to the best of their personal abilities, whether the result of genetic predisposition or hard work. And I agree that this is a laudable goal! But for inspiration on how our society can both recognize average differences and enable individual achievement, Reich suggests we turn to our successes in doing this for … sex differences! Womp womp.

Jane Psmith and John Psmith, “JOINT REVIEW: Who We Are and How We Got Here, by David Reich”, Mr. and Mrs. Psmith’s Bookshelf, 2023-05-29.


    1. aDNA works for microbes too, and it looks like Y. pestis, the plague, came from the steppe with the Yamnaya. It didn’t yet have the mutation that causes buboes, but the pneumonic version of the disease is plenty deadly, especially to the Early European Farmers who didn’t have any protection against it. In fact, as far as we can tell, in all of human history there have only been four unique introductions of plague from its natural reservoirs in the Central Asian steppe: the one that came with or slightly preceded the Yamnaya expansion around 5kya, the Plague of Justinian, the Black Death, and an outbreak that began in Yunnan in 1855. The waves of plague that wracked Europe throughout the medieval and early modern periods were just new pulses of the strain that had caused the Black Death. Johannes Krause gets into this a bit in his A Short History of Humanity, which I didn’t actually care for because his treatment of historic pandemics and migrations is so heavily inflected with Current Year concerns, but I haven’t found a better treatment in a book so it’s worth checking out from the library if you’re interested.

    2. I cheated with that “pots not people” line in my earlier email; it usually gets (got?) trotted out not as a bit of epistemological modesty about what the archaeological record is capable of showing, but as a claim that the only movements involved were those of pots, not of people.
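
A quick numerical aside on the mean-versus-variance point in the sprinting passage above: a toy simulation makes the distinction vivid. This is a sketch only (Python; the one million draws, the extra 0.15 SD of spread, and the +3 SD cutoff are all made-up illustrative numbers, not figures from Reich’s book):

    # Illustrative simulation only: none of these numbers come from Reich.
    # A mean shift raises one tail; extra variance raises both tails.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    cutoff = 3.0  # "really good sprinter" threshold, in baseline SD units

    populations = {
        "baseline":        rng.normal(0.0, 1.00, n),
        "+0.8 SD mean":    rng.normal(0.8, 1.00, n),  # uniformly better
        "higher variance": rng.normal(0.0, 1.15, n),  # same mean, wider spread
    }

    for name, pop in populations.items():
        top = (pop > cutoff).mean()
        bottom = (pop < -cutoff).mean()
        print(f"{name:15}  above +3 SD: {top:.5f}   below -3 SD: {bottom:.5f}")

The mean-shifted population is overrepresented only at the top; the higher-variance population is overrepresented at both extremes. That is why looking at elite sprinters alone cannot distinguish the two explanations.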

March 28, 2024

Why European farmers are revolting

Filed under: Bureaucracy, Economics, Environment, Europe, Government, Liberty, Politics — Nicholas @ 04:00

spiked
Published Mar 27, 2024

Europe’s farmers are rising up – and the elites are terrified. From the Netherlands to Germany to Ireland, farmers are taking to the streets, parking their tractors on the establishment’s lawn, spraying buildings with manure and bringing life to a standstill. The reason? Because unhinged green regulations, dreamt up by European Union bureaucrats, are immiserating them. In this spiked video polemic, Fraser Myers explores the roots of the farmers’ revolt across the continent – and explains why it must succeed. Watch, share and let us know what you think in the comments.

Support spiked:
https://www.spiked-online.com/support/
Sign up to spiked‘s newsletters: https://www.spiked-online.com/newslet…
Check out spiked‘s shop: https://www.spiked-online.com/shop/

March 25, 2024

One major change in sexual behaviour since the mid-20th century

Filed under: Books, Health, History — Nicholas @ 03:00

David Friedman usually blogs about economics, medieval cooking, or politics. His latest post carefully avoids (almost) all of that:

I didn’t have a convenient graphic to use for this post … but I know not to Google something like this.

My picture of sexual behavior now and in the past is based on a variety of readily observable sources — free online porn for the present, writing, both pornographic and non-pornographic but explicit, for the past. On that imperfect and perhaps misleading evidence the pattern of when oral sex was or was not common in our society in recent centuries is the opposite of what one would, on straightforward economic grounds, expect.

Casanova’s memoirs provide a fascinating picture of eighteenth century Europe, including its sexual behavior. He mentions incest, male homosexuality, and lesbianism, which he regards as normal for unmarried girls:

    Marton told Nanette that I could not possibly be ignorant of what takes place between young girls sleeping together.

    “There is no doubt,” I said, “that everybody knows those trifles …

I do not believe he ever mentions either fellatio or cunnilingus. Neither does Fanny Hill, published in London in 1748, when Casanova was twenty-three.

Frank Harris, writing in the early 20th century, is familiar with cunnilingus, uses it as a routine part of his seduction tactics, but treats it as something sufficiently exotic that he had to be talked into trying it by a woman unwilling to risk pregnancy. I do not think he ever mentions fellatio.

Modern online porn in contrast treats both fellatio and cunnilingus as normal parts of foreplay, what routinely comes between erotic kissing and vaginal intercourse.

One online article on the history of fellatio that I found dated the change in attitudes to after the 1976 Hite Report, which found a strongly negative attitude among women to performing it. In contrast:

And from another, present behavior:

    Oral sex precedes and often replaces sexual intercourse because it’s perceived to be noncommittal, quick and safe. For some kids it’s a cool thing to do; for others it’s a cheap thrill. Raised in a culture in which speed is valued, kids, not surprisingly, seek instant gratification through oral sex (the girl by instantly pleasing the boy, the boy by sitting back and enjoying the ride). A seemingly facile command over the sexual landscape of one’s partner is achieved without the encumbrances of clothes, coitus and the rest of the messy business. The blow job is, in essence, the new joystick of teen sexuality. (Salon)

Contrasted with:

    When I was a teenager, in the bad-taste, disco-fangled ’70s, fellatio was something you graduated into. Rooted in the great American sport of baseball, the sexual metaphors of my generation put fellatio somewhere after home base, way off in the distant plains of the outfield. In fact, skipping all the bases and going directly to fellatio was the sort of home run reserved only for racy, borderline delinquents, who enjoyed a host of licentious and forbidden activities that made them stars in the firmament of teen recklessness.

March 23, 2024

QotD: The SCIENCE was SETTLED in the 1970s

When it comes to Leftie, it’s really hard to sort out what’s intentional from what’s merely wrong, or outdated, or stupid, or some combination of the above. So while there really does seem to be some kind of coordinated push to get us to eat grass and bugs, the red meat thing is, I think, just old misinformation that Leftie can’t admit has been overtaken by events (because, of course, Leftists can never be wrong about anything). And I’ll even kinda sorta give them a pass on that, because I know a lot of medical people who learned the “red meat is bad for you” mantra back in the days and still haven’t gotten over it …

For younger readers, back in the late 70s the nutritional Powers That Be got in bed with the corn lobby. It sounds funny, but they were and are huge, the corn lobby — why do you think we’re still getting barraged with shit about ethanol, even though it’s actually much worse for the environment than plain ol’ dinosaur juice, when you factor in all the “greenhouse emissions” from growing and harvesting it? Anyway, ethanol wasn’t a thing back then … but corn syrup was, and so suddenly, for no reason whatsoever, the PTB decided that fat was bad and carbohydrates were good.

Teh Science (TM) for this was as bogus and politicized as all the other Teh Science (TM) these days, but since we still had a high degree of social and institutional trust back then — living in a country that’s still 85% White will do that — nobody questioned it, and so suddenly everything had to be “fat free”, lest you get high blood pressure and colon cancer and every other damn thing (ever notice how, with Teh Science (TM), everything they decide is bad suddenly correlates with everything that has ever been bad? Funny, that). But since fat is what makes food taste good, they had to find a tasty substitute … and whaddya know, huge vats full of corn syrup just kinda happened to be there. Obesity rates immediately skyrocketed; who’d have thunk it?

… but again, this isn’t a deliberate thing with your average Leftie. You know how they are about Teh Science (TM), even Teh Science (TM) produced by people who thought polyester bellbottoms were a great look, which alone should tell you everything you need to know. They just learned “red meat is bad”, and so, being the helpful sorts they are, decided to boss you around about it. You know, for your own good.

Severian, “Friday Mailbag / Grab Bag”, Rotten Chestnuts, 2021-06-25.

March 22, 2024

Four years later

Kulak hits the highlights of the last four years in government overstretch, civil liberties shrinkage, the rise of tyrants local and national, and the palpably still-growing anger of the victims:

4 years ago, at this exact moment, we were in the “two weeks” that were supposed to flatten the Curve of Covid.

4 years ago you were still a “conspiracy theorist” if you thought it would be anything more than a minor inconvenience that would last less than a month.

Of course if you predicted that this would not last 2 weeks, but over 2 years; that within 2 months anti-lockdown protests would end in the storming of state houses and false-flag, FBI-manufactured kidnapping attempts on Governors; that within 3 months riots would burn dozens of American cities; that the election would be inconclusive; that matters would go before the US Supreme Court, again; that a riot/mass entrapment would take place within the halls of Congress … And then that this was just the Beginning …

That Big-Pharma would rush a vaccine which may well have been more dangerous than the virus; that Australia and various countries would build concentration camps for the unvaccinated; that nearly all employers would be pressured or mandated to FORCE this vaccine on their employees; that vaccine passports would be implemented to track your biological status; that Canada and several other countries would implement travel restrictions on the unvaccinated and collude with their neighbors to prevent their population escaping; and then that, nearly 2 years on from 2 weeks to slow the spread, Canadians!? would mount one of the most logistically complex protests in human history, in the dead of winter, besieging Ottawa and blockading the US border to all trade in an apocalyptic showdown to break free of lockdowns …

Well … not even Alex Jones predicted all of that, though he got a remarkable amount of it.

Indeed, the reverence with which Jones is now treated, as a Cassandra-like oracle who predicts the future with seemingly (and memeably) 100% clairvoyance only to be doomed to disbelief: that alone would have been unpredictable, or unbelievable, in those waning days of the long 2019, those first 2-3 months when you could imagine 2020 would MERELY be a Trumpianly heated election cycle like 2016, and not a moment when Fukuyama’s veil threatened to tear and History pour back into the world.

Oh, and also the bloodiest European war since the death of Stalin broke out.

March 14, 2024

The insane pursuit of a “zero waste economy”

Filed under: Britain, Economics, Environment, Government, Politics, Technology — Nicholas @ 04:00

Tim Worstall explains why it does not make economic sense to pursue a truly “zero waste” solution in the vast majority of cases:

It’s entirely possible to think that waste minimisation is a good idea. It’s also possible to think that waste minimisation is insane. The difference is in what definition of the word “waste” we’re using here. If by waste we mean things we save money by using instead of not using then it’s great. If by waste we mean just detritus then it’s insane.

Modern green politics has — to be very polite about it indeed — got itself confused in this definitional battle. Which is why we get nonsense like this being propounded as potential political policy:

    A Labour government would aim for a zero-waste economy by 2050, the shadow environment secretary has said.

    Steve Reed said the measure would save billions of pounds and also protect the environment from mining and other negative actions. He was speaking at the Restitch conference in Coventry, held by the thinktank Create Streets.

    Labour is finalising its agenda for green renewal and Reed indicated a zero-waste economy would be part of this.

    This would mean the amount of waste going to landfill would be drastically reduced and valuable raw materials including plastic, glass and minerals reused, which would save money for businesses who would not have to buy, import or create raw materials.

The horror here does depend upon that definition of waste. Or, if we want to delve deeper, the definition of resource that is being saved.

[…]

OK. So, we’ve two possible models here. One is homes sort into 17 bins or whatever the latest demand is. Or, alternatively, we have big factories that all unsorted rubbish goes to. To be mechanically sorted. Right — so our choice between the two should be based upon total resource use. But when we make those comparisons we do not include that household time. 25 million households, 30 minutes a week, 450 million hours a year. At, what, minimum wage? £10 an hour (just to keep my maths simple) is £4.5 billion a year. That household sorting is cheaper — sorry, less resource using — than the factory model is it?
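
As a quick check on that back-of-the-envelope sum, here it is spelled out (a sketch only; the 25 million households, 30 minutes a week, and £10/hour wage are Worstall’s own simplifying assumptions):

    # Worstall's inputs: 25 million households, 30 minutes of sorting a week,
    # valued at roughly minimum wage.
    households = 25_000_000
    hours_per_week = 0.5
    weeks_per_year = 52
    wage_gbp_per_hour = 10

    total_hours = households * hours_per_week * weeks_per_year
    implied_cost = total_hours * wage_gbp_per_hour

    print(f"{total_hours / 1e6:.0f} million hours a year")  # ~650 million
    print(f"£{implied_cost / 1e9:.1f} billion a year")      # ~£6.5 billion

Run straight through, the inputs give roughly 650 million hours and £6.5 billion a year, somewhat above the 450 million hours and £4.5 billion quoted; either way, the order of magnitude is the point: unpriced household labour on this scale has to be counted before calling home sorting the less resource-hungry option.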

And that little slip — cheaper, less resource using — is not really a slip. For we are in a market economic system. Resources have prices attached to them. So, we can measure resource use — imperfectly to be sure but usefully — by the price of different ways of doing things. Cool!

At which point, recycling everything, moving to a zero waste economy, is more expensive than the current system. Therefore it uses more resources. We know this because we always do have to provide a subsidy to these recycling systems. None of them do make a profit. Or, rather, when they do make a profit we don’t even call them recycling, we call them scrap processing.

Which all does lead us to a very interesting even if countercultural conclusion. The usual support for recycling is taken to be an anti-price, anti-market, even anti-capitalist idea. Supported by the usual soap-dodging hippies. But, as actually happens out in the real world, recycling is one of those things that should be — even if it isn’t — entirely dominated by the price system and markets. Even, dread thought, capitalism. We should only recycle those things we can make a profit by recycling. Because that’s how prices inform us about which systems actually save resources.

“The dark world of pediatric gender ‘medicine’ in Canada”

Filed under: Bureaucracy, Cancon, Health, Media, Politics — Nicholas @ 03:00

The release of internal documents from the World Professional Association for Transgender Health (WPATH) revealed just how little science went into many or most juvenile gender transitions and how much the process was being driven politically rather than scientifically. Shannon Douglas Boschy digs into how the WPATH’s methods are implemented in Canada:

An undercover investigation at a Quebec gender clinic recently documented that a fourteen-year-old girl was prescribed testosterone for the purpose of medical gender transition within ten minutes of seeing a doctor. She received no other medical or mental health assessment and no information on side-effects. This is status quo in the dark world of pediatric gender “medicine” in Canada.

On March 5th Michael Shellenberger, one of the journalists who broke the Twitter Files in 2022, along with local Ottawa journalist Mia Hughes, released shocking leaks from inside WPATH, the organization that proclaims itself the global scientific and medical authority on gender affirming care. The World Professional Association for Transgender Health is the same organization that the Quebec gender clinic, and Ottawa’s CHEO, cite as their authority for the provision of sex-change interventions for children.

These leaks expose WPATH as nothing more than a self-appointed activist body overseeing and encouraging experimental hormonal and surgical sex-change interventions on children and vulnerable adults. Shellenberger and Hughes reveal that members fully understand that children cannot consent to loss of fertility and of sexual function, nor can they understand the lifetime risks that will result from gender-affirming medicalization, and they ignore these breaches of medical ethics.

The report reveals communication from an “Internal messaging forum, as well as a leaked internal panel discussion, demonstrat(ing) that the world-leading transgender healthcare group is neither scientific nor advocating for ethical medical care. These internal communications reveal that WPATH advocates for many arbitrary medical practices, including hormonal and surgical experimentation on minors and vulnerable adults. Its approach to medicine is consumer-driven and pseudoscientific, and its members appear to be engaged in political activism, not science.”

These findings have profound implications for medical and public education policies in Canada and raise serious concerns about the practices of secret affirmations and social transitions of children in local schools.

These leaks follow on the recent publication of a British Medical Journal study (BMJ Mental Health), covering 25 years of data, dispelling the myth that, without gender-affirmation, children will kill themselves. The study, comparing over 2,000 patients to a control population, found that after factoring for other mental health issues, there was no convincing evidence that children and youth who are not gender-affirmed were at higher risk of suicide than the general population.

In the last week, a second study was released, this one from the American Urology Association, showing that post-surgical transgender-identified men who underwent vaginoplasty have twice the rate of suicide attempts as before affirmation surgery, and that trans-identified women who underwent phalloplasty showed no change in suicide rates from pre-operative to post-operative.

These and other studies are now thoroughly debunking the emotional blackmail myth promoted by WPATH: that, in the absence of sex-change interventions, gender-distressed children are at high risk of taking their own lives.

March 13, 2024

QotD: Filthy coal

… coal smoke had dramatic implications for daily life even beyond the ways it reshaped domestic architecture, because in addition to being acrid it’s filthy. Here, once again, [Ruth] Goodman’s time running a household with these technologies pays off, because she can speak from experience:

    So, standing in my coal-fired kitchen for the first time, I was feeling confident. Surely, I thought, the Victorian regime would be somewhere halfway between the Tudor and the modern. Dirt was just dirt, after all, and sweeping was just sweeping, even if the style of brushes had changed a little in the course of five hundred years. Washing-up with soap was not so very different from washing-up with liquid detergent, and adding soap and hot water to the old laundry method of bashing the living daylights out of clothes must, I imagined, make it a little easier, dissolving dirt and stains all the more quickly. How wrong could I have been.

    Well, it turned out that the methods and technologies necessary for cleaning a coal-burning home were fundamentally different from those for a wood-burning one. Foremost, the volume of work — and the intensity of that work — were much, much greater.

The fundamental problem is that coal soot is greasy. Unlike wood soot, which is easily swept away, it sticks: industrial cities of the Victorian era were famously covered in the residue of coal fires, and with anything but the most efficient of chimney designs (not perfected until the early twentieth century), the same thing also happens to your interior. Imagine the sort of sticky film that settles on everything if you fry on the stove without a sufficient vent hood, then make it black and use it to heat not just your food but your entire house; I’m shuddering just thinking about it. A 1661 pamphlet lamented coal smoke’s “superinducing a sooty Crust or Furr upon all that it lights, spoyling the moveables, tarnishing the Plate, Gildings and Furniture, and corroding the very Iron-bars and hardest Stones with those piercing and acrimonious Spirits which accompany its Sulphure.” To clean up from coal smoke, you need soap.

“Coal needs soap?” you may say, suspiciously. “Did they … not use soap before?” But no, they (mostly) didn’t, a fact that (like the famous “Queen Elizabeth bathed once a month whether she needed it or not” line) has led to the medieval and early modern eras’ entirely undeserved reputation for dirtiness. They didn’t use soap, but that doesn’t mean they didn’t clean; instead, they mostly swept ash, dust, and dirt from their houses with a variety of brushes and brooms (often made of broom) and scoured their dishes with sand. Sand-scouring is very simple: you simply dampen a cloth, dip it in a little sand, and use it to scrub your dish before rinsing the dirty sand away. The process does an excellent job of removing any burnt-on residue, and has the added advantage of removing a micro-layer of your material to reveal a new sterile surface. It’s probably better than soap at cleaning the grain of wood, which is what most serving and eating dishes were made of at the time, and it’s also very effective at removing the poisonous verdigris that can build up on pots made from copper alloys like brass or bronze when they’re exposed to acids like vinegar. Perhaps more importantly, in an era where every joule of energy is labor-intensive to obtain, it works very well with cold water.

The sand can also absorb grease, though a bit of grease can actually be good for wood or iron (I wash my wooden cutting boards and my cast-iron skillet with soap and water,1 but I also regularly oil them). Still, too much grease is unsanitary and, frankly, gross, which premodern people recognized as much as we do, and particularly greasy dishes, like dirty clothes, might also be cleaned with wood ash. Depending on the kind of wood you’ve been burning, your ashes will contain up to 10% potassium hydroxide (KOH), better known as lye, which reacts with your grease to create a soap. (The word potassium actually derives from “pot ash,” the ash from under your pot.) Literally all you have to do to clean this way is dump a handful of ashes and some water into your greasy pot and swoosh it around a bit with a cloth; the conversion to soap is very inefficient (though if you warm it a little over the fire it works better), but if your household runs on wood you’ll never be short of ashes. As wood-burning vanished, though, it made more sense to buy soap produced industrially through essentially the same process (though with slightly more refined ingredients for greater efficiency) and to use it for everything.

Washing greasy dishes with soap rather than ash was a matter of what supplies were available; cleaning your house with soap rather than a brush was an unavoidable fact of coal smoke. Goodman explains that “wood ash also flies up and out into the room, but it is not sticky and tends to fall out of the air and settle quickly. It is easy to dust and sweep away. A brush or broom can deal with the dirt of a wood fire in a fairly quick and simple operation. If you try the same method with coal smuts, you will do little more than smear the stuff about.” This simple fact changed interior decoration for good: gone were the untreated wood trims and elaborate wall-hangings — “[a] tapestry that might have been expected to last generations with a simple routine of brushing could be utterly ruined in just a decade around coal fires” — and anything else that couldn’t withstand regular scrubbing with soap and water. In their place were oil-based paints and wallpaper, both of which persist in our model of “traditional” home decor, as indeed do the blue and white Chinese-inspired glazed ceramics that became popular in the 17th century and are still going strong (at least in my house). They’re beautiful, but they would never have taken off in the era of scouring with sand; it would destroy the finish.

But more important than what and how you were cleaning was the sheer volume of the cleaning. “I believe,” Goodman writes towards the end of the book, “there is vastly more domestic work involved in running a coal home in comparison to running a wood one.” The example of laundry is particularly dramatic, and her account is extensive enough that I’ll just tell you to read the book, but it goes well beyond that:

    It is not merely that the smuts and dust of coal are dirty in themselves. Coal smuts weld themselves to all other forms of dirt. Flies and other insects get entrapped in it, as does fluff from clothing and hair from people and animals. To thoroughly clear a room of cobwebs, fluff, dust, hair and mud in a simply furnished wood-burning home is the work of half an hour; to do so in a coal-burning home — and achieve a similar standard of cleanliness — takes twice as long, even when armed with soap, flannels and mops.

And here, really, is why Ruth Goodman is the only person who could have written this book: she may be the only person who has done any substantial amount of domestic labor under both systems who could write. Like, at all. Not that there weren’t intelligent and educated women (and it was women doing all this) in early modern London, but female literacy was typically confined to classes where the women weren’t doing their own housework, and by the time writing about keeping house was commonplace, the labor-intensive regime of coal and soap was so thoroughly established that no one had a basis for comparison.

Jane Psmith, “REVIEW: The Domestic Revolution by Ruth Goodman”, Mr. and Mrs. Psmith’s Bookshelf, 2023-05-22.


    1. Yeah, I know they tell you not to do this because it will destroy the seasoning. They’re wrong. Don’t use oven cleaner; anything you’d use to wash your hands in a pinch isn’t going to hurt long-chain polymers chemically bonded to cast iron.

March 11, 2024

“Is it possible that the new therapy culture and the emphasis on introspection is actually making things worse?”

Filed under: Health — Nicholas @ 05:00

In Quillette, Brandon McMurtrie asks us to consider why, with more people in therapy than ever before, the overall mental health of the population is declining:

Why has mental health got worse given the prevailing emphasis on self-care and accurately knowing and expressing oneself? And why do people and groups most inclined to focus on their identity appear to be the most distressed, confused, and mentally unwell? Is it possible that the new therapy culture and the emphasis on introspection is actually making things worse?

I am not the first to notice these developments — Abigail Shrier’s new book Bad Therapy has carefully delineated a similar argument. Her arguments are elsewhere supported by research on semantic satiation and ironic uncertainty, the effects of mirror gazing, the effects of meditation, and how all this relates to the constant introspection encouraged by therapy culture and concept creep.

Satiation and Its Effects

Semantic satiation is the uncanny sensation that occurs when a word or sentence is repeated again and again, until it appears to become foreign and nonsensical to the speaker. You may have done this as a child, repeating a word in quick succession until it no longer seems to be recognizable. It’s a highly reliable effect — you can try it now. Repeat a word to yourself quickly, out loud, for an extended period, and really focus on the word and its meaning. Under these circumstances, most people experience semantic satiation.

This well-studied phenomenon — sometimes called “inhibition”, “fatigue”, “lapse of meaning”, “adaptation”, or “stimulus satiation” — applies to objects as well as language. Studies have found that compulsive staring at something can result in dissociation and derealization. Likewise, repeatedly visually checking something can make us uncertain of our perception, which results, paradoxically, in uncertainty and poor memory of the object. This may also occur with facial recognition.

Interestingly, a similar phenomenon can occur in the realm of self-perception. Mirror gazing (staring into one’s own eyes in the mirror) may induce feelings of depersonalization and derealization, causing distortions of self-perception and bodily sensation. This persistent self-inspection can result in a person feeling that they don’t recognize their own face, that they no longer feel real, that their body no longer feels the same as it once did, or that it is not their body at all. Mirror-gazing so reliably produces depersonalization and derealization (and a wide range of other anomalous effects) that it can be used in experimental manipulations to trigger these symptoms for research purposes.

[…]

The Satiation of Gender Identity

The number of people identifying as non-binary or trans has skyrocketed in recent years, and a growing number of schools are now teaching gender theory and discussing it with children — sometimes in kindergarten, more often in primary school, but especially in middle- and high-school (though in other schools it is entirely banned). While this may be beneficial for those already struggling with gender confusion, it may also present an avenue for other children to ruminate and become confused via “identity satiation”.

The kind of gender theory increasingly taught in schools encourages children to spend extended periods of time ruminating on self-concepts that most would not otherwise have struggled with. They are given exercises that encourage them to doubt their own unconscious intuitions about themselves, and to ruminate on questions like “Do I feel like a boy?” and “What does it mean to feel like a boy?” and “I thought I was a boy but what if I am not?”

Such questions are often confusing to answer and difficult to express, even for adults unaffected by gender dysphoria. But asking children to ruminate in this way may lead to confusion and depersonalization-derealization via the mechanisms described above. “Identity satiation” may then lead them to decide they are non-binary or trans, especially when identifying as such is rewarded with social recognition and social support. Many people who subsequently de-transitioned have described this process: “I never thought about my gender or had a problem with being a girl before”.

QotD: The profound asshole-ishness of the “best of the best”

Filed under: Health, Quotations, Sports — Nicholas @ 01:00

Ever met a pro athlete? How about a fighter pilot, or a trauma surgeon? I’ve met a fair amount of all of them, and unless they’re on their very best behavior they all tend to come off as raging assholes. And they get worse the higher up the success ladder they go — the pro athletes I’ve met were mostly in the minors, and though they were big-league assholes they were nothing compared to the few genuine “you see them every night on Sports Center” guys I met. Same way with fighter pilots — I never met an astronaut, but I had buddies at NASA back in the days who met lots, and they told me that even other fighter jocks consider astronauts to be world-class assholes …

The truth is, they’re not — or, at least, they’re no more so than the rest of the population. It’s just that they have jobs where total, utter, profoundly narcissistic self-confidence is a must. It’s what keeps them alive, in the pilots’ case at least, and it’s what keeps you alive if, God forbid, you should ever need the trauma surgeon. Same way with the sportsballers. I can say with 100% metaphysical certainty that there are better basketball players than Michael Jordan, better hitters than Mike Trout, better passers than Tom Brady, out there. There are undoubtedly lots of them, if by “better” you mean “possessed of more raw physical talent at the neuronal level”. What those guys don’t have, but Jordan, Brady, Trout et al do have, is the mental wherewithal to handle failure.

Everyone knows of someone like Billy Beane, the Moneyball guy. So good at football that he was recruited to replace John Elway (!!) at Stanford, but who chose to play baseball instead … and became one of the all-time busts. He had all the talent in the world, but his head wasn’t on straight. Not to put too fine a point on it, he doubted himself. He got to Double A (or wherever) and faced a pitcher who mystified him. Which made him think “Maybe I’m not as good as I think I am?” … and from that moment, he was toast as a professional athlete. Contrast this to the case of Mike Piazza, the consensus greatest offensive catcher of all time. A 27th round draftee, only picked up as a favor to a family friend, etc. Beane was a “better” athlete, but Piazza had a better head. Striking out didn’t make him doubt himself; it made him angry, and that’s why Piazza’s in the Hall of Fame and Beane is a legendary bust.

The problem though, for us normal folks, is that the affect in all cases is pretty much the same … and it’s really hard to turn off, which is why so many pro athletes (fighter jocks, surgeons, etc.) who are actually nice guys come off as assholes. It’s hard to turn off … but as it turns out, it’s pretty easy to turn ON, and that’s in effect what Game teaches.

Severian, “Mental Middlemen II: Sex and the City and Self-Confidence”, Rotten Chestnuts, 2021-05-06.

March 10, 2024

The rapid transition from the amazing smartphone to the “pocket moloch”

Filed under: Health, Media, Technology — Nicholas @ 03:01

Magdalene J. Taylor follows up her New York Times article from last year with more evidence that many of the social problems identified today are caused, or at least made worse, by the almost universal addiction to smartphones:

A year ago, I published an opinion essay for the New York Times that changed the trajectory of my career. It was about how fewer Americans are having sex, across nearly every demographic. For any of the usual caveats — wealth, age, orientation — the data almost always highlighted that previous generations in the same circumstances were having more sex than we are today. My purpose in writing the essay was mainly to try to emphasize the role that sex plays in our cultural wellbeing and its connection to the loneliness epidemic. Many of us have developed a blasé attitude toward sex, and I wanted people to care. It wasn’t really about intercourse, and I said as much. It was about wanting to live in a lively, energetic society.

Since writing, I have been continuously asked what I think the cause of all this is. Obviously, there isn’t one universal answer. After publishing, I went on radio shows and podcasts and was asked to share what I thought some of them could be. Economic despair, political unrest, even climate fears were among the reasons I’d heard cited. But all of that, honestly, feels pointlessly abstract. It puts the problem entirely out of our hands, when in fact I believe it may quite literally be in them.

The problem is obviously our phones.

In February, The Atlantic published a feature about the decline of hanging out. Within it was a particularly damning graph showing the percentage of teens who report hanging out with friends two or more times per week since 1976. Rates were steady around 80 percent up until the mid-90s, when a subtle decrease began to occur. Then, in 2008 — one year after the release of the first iPhone — the decrease became much more dramatic. It has continued falling sharply since, hovering now at just under 60 percent of teens who spend ample time with friends each week.

Some of us really don’t like our screen time habits criticized. Others may think they appear smarter by highlighting other issues, that they can see above the fray and observe the macro trends that are really shaping our lives, not that stupid anti-phone rhetoric we hear from the Boomers. And some of these other trends do indeed apply. Correlation does not equal causation. Lots of things happened in 2008. Namely, a financial crisis the effects of which many argue we are still experiencing. When I shared the graph on Twitter/X saying phones are the obvious cause, this was one of the most common rebuttals. Another was the decline in third spaces. There are indeed few places for teenagers to hang out outside of the home. Skate parks are being turned into pickleball courts with “no loitering” signs, malls are shuttering and you can no longer spend $1 on a McChicken to justify hanging out in the McDonald’s dining area for hours. But as the Atlantic piece explains, the dwindling of places to be and experience community is a problem we’ve been lamenting since the 90s. And it’s not just teens — everyone is spending less time together than they used to. “In short, there is no statistical record of any other period in U.S. history when people have spent more time on their own,” the article states.
