Quotulatiousness

February 7, 2026

Food hang-ups by generation

Filed under: Food, Health, Media, USA — Nicholas @ 05:00

Around the early to mid-80s, I started to notice trends in the kind of health information being pushed by the mainstream media. One of the big topics of the day was the dangers of … eggs. Eggs were so dangerous that “experts” were warning adults to avoid eating more than one or two per week. Three was the absolute limit and you were dicing with death if you went over that “healthy” limit. Then, a few years later, eggs were “the perfect food” and we weren’t eating enough of these formerly abominated death pills. A few years after that, OMG! Apples, people, apples! Danger, danger, danger! That was around the time I stopped putting any credence into health reporting in the media. However, as Lisa De Pasquale points out, food issues have been an ongoing struggle for each succeeding generation:

In the ’80s, the ultimate healthy Boomer breakfast was a bran muffin. There were also various cereals like Grape-Nuts and Raisin Bran. There definitely wasn’t room for their parents’ bacon and eggs unless you had a death wish. Boomers settled on eggs as the devil’s snack when the American Heart Association warned in the 1960s that people shouldn’t consume more than three eggs per week. Like social distancing six feet from others during COVID and eight 8oz glasses of water per day, the recommendation wasn’t based on science, but on being a simple number Americans could remember.

Thanks, but this Gen Xer will stick to getting my 8,675,309 steps per year as my guiding fitness principle.

[…]

The Millennial Food Pyramid

Level One: Genetically Modified Organism and Nonorganic Foods — Use Sparingly

While Gen X was at ground zero in doubting Big Food’s pyramid, our Millennial colleagues and kids really continued the battle. Like luxury logos, they seek out the organic and non-GMO labels. It’s a virtue signal of both their values and what they can afford. Erewhon smoothies, anyone?

Level Two: Various Overpriced Coffee Drinks — Two to Three Servings

Gen Xers link coffee to work and responsibility; caffeine is a tool to get through the morning. Millennials view coffee drinks as self-care. It’s about treating themselves to dessert any time of day — a major win for marketing executives.

Level Three: Charcuterie Boards, Wine, Hard Seltzers, Craft Beers — Three to Five Servings

Millennials love to entertain. Nothing shows sophistication and “adulting” in your 30s and early 40s like a charcuterie board. Lunchables upgraded! They came of drinking age at the same time as small-batch beers, American boutique wineries, and hard seltzers.

Level Four: Instagram-Worthy Food — Six to Eleven Servings

Camera phones leveled up the entertainment value of food consumption. Like organic labels, what Millennials eat signals their open-mindedness. As they get older, they straddle the line of wanting to be in on the trends (avocado toast and açai bowls) and the dive you haven’t heard of with authentic phở.

The Generation Z and Generation Alpha Food Pyramid

Level One: Real Meat, Dairy, and Peanuts — Use Sparingly

The Gen X and Millennial generations dabbled in veggie burgers, but Gen Z and Gen Alpha went whole lab-created hog into plant-based meats and milks, to the point that meat and milk no longer have a meaning until a company gets sued for using the words. To be fair, they are also embracing biohacking trends and ditching seed oils. Due to the growing prevalence of allergies, peanuts are a universal no-no food in public spaces.

Level Two: TikTok Recipes — Two to Three Servings

The term “recipe” is used loosely. I’ve come across a TikTok video for making a cream sauce from a block of cream cheese, water, and dried pasta. There is a positive aspect to trying these TikTok recipes, though: it prepares young cooks for trying new things and for failure when a recipe doesn’t come out right.

Level Three: Food Delivery Service Meals — Three to Five Servings

Postmates, Uber Eats, Grubhub, and DoorDash are staples at mealtime. Following these services’ surge during the COVID era, as restaurants struggled to stay in business, accounts linked to parents’ bank accounts became as common as sharing a cell phone plan.

Level Four: Gamer Food and Drinks — Six to Eleven Servings

Living next to a park has taught me one thing about Gen Z and Gen Alpha — they’re all inside. I mostly see neighborhood kids on Halloween, and every year, I recognize fewer and fewer costumes because they’re dressed as video game characters. Their snacks are manufactured for their attention span: quick hits of spicy, sour, or sweet while on pause. The gamer culture and H Mart remove barriers as Japanese snacks dominate.

So, where does this leave Gen X? We’re not immune to the powers of Big Food. In fact, recent research shows that ultra-processed food addiction began with us, thanks to the explosion in availability of ultra-palatable foods with added refined carbs and fats. StudyFinds reported that researchers from the University of Michigan said, “Individuals who are now older adults were in developmentally sensitive stages during the 1970s and 1980s, precisely when tobacco-owned food manufacturers were shaping the market with addictive ultra-processed foods”.

January 7, 2026

More anti-anti-boomer discussion from Scott Alexander

Filed under: Economics, History, Media, Politics, USA — Nicholas @ 03:00

I linked to Scott’s original article last month and thanks to the interest it generated (and perhaps my clickbait-y headline) it got linked at Instapundit thanks to Sarah Hoyt. Scott got a lot of feedback on his post and shares some of that here:

“… Millennials and Generation Z have more money (adjusted for inflation ie cost-of-living, and compared at the same age) than their Boomer parents, to about the same degree that the Boomers exceeded their own parents. This is good and how it should be. The Boomers have successfully passed on a better life to their children”

First, I wish I’d been more careful to differentiate the following claims:

  1. Boomers had it much easier than later generations.
  2. The political system unfairly prioritizes Boomers over other generations.
  3. Boomers are uniquely bad on some axis like narcissism, selfishness, short-termism, or willingness to defect on the social contract.

Anti-Boomerism conflates all three of these positions, and in arguing against it, I tried to argue against all three of these positions — I think with varying degrees of success. But these are separate claims that could stand or fall separately, and I think a true argument against anti-Boomerists would demand they declare explicitly which ones they support — rather than letting them switch among them as convenient — then arguing against whichever ones they say are key to their position.

Second, I wish I’d highlighted how much of this discussion centers around disagreements over which policies are natural/unmarked vs. unnatural/marked.

Nobody is passing laws that literally say “confiscate wealth from Generation A and give it to Generation B”. We’re mostly discussing tax policy, where Tax Policy 1 is more favorable to old people, and Tax Policy 2 is more favorable to young people. If you’re young, you might feel like Tax Policy 1 is a declaration of intergenerational warfare where the old are enriching themselves at young people’s expense. But if you’re old, you might feel like reversing Tax Policy 1 and switching to Tax Policy 2 would be intergenerational warfare confiscating your stuff. But in fact, they’re just two different tax policies and it’s not obvious which one a fair society with no “intergenerational warfare” would have, even assuming there was such a thing. We’ll see this most clearly in the section on housing, but I’ll try to highlight it whenever it comes up.

I’m in a fighty frame of mind here and probably defend the Boomers (and myself) in these responses more than I would in an ideal world.

[…]

1: Top Comments I Especially Want To Highlight

Sokow writes:

Many Europeans chimed in to say this, including people whose opinions I trust.

I find this pretty interesting. We all know stories of American opinions infecting Europeans, like how they’re obsessed about anti-black racism, but rarely worry about anti-Roma racism which is much more prevalent there. I’d never heard anyone argue the opposite — that the European discourse is infecting Americans with ideas that don’t apply to our context — but it makes sense that this should happen. I might write a post on this.

Kevin Munger (Never Met A Science) writes:

    Hating Boomers (and talking about hating Boomers) is uninteresting and I agree morally dubious.

    But it is *emphatically* false that “Boomers were a perfectly normal American generation”. They have served far more terms in Congress than any generation before or since (and we currently have the oldest average age of elected officials in a legislative body IN THE WORLD other than apparently Cambodia), they have dominated the presidency (look up the birthdate of every major party candidate since the 2000 presidential election…), and they have controlled the commanding heights of major companies and cultural institutions (especially academia).

    They are a historically *unique* generation, for three intersecting reasons: 1. they are a uniquely large generation; 2. they came of age as the country and its institutions were maturing; 3. they are sticking around because of increased longevity. These are analytical facts, and they produce what I call “Boomer Ballast” — a concentration of our society’s resources in one, older generation that increases the tension we are experiencing from technological innovation. Our demography is pulling us towards the past, the internet is pulling us into the future, and this, I think, is the major source of the anti-Boomer frustration.

    On the specifics of social security and why we might think Boomers have played things to their advantage (not bc they’re specifically evil but bc they have the political power to do so) — the key thing is that they have prevented forward-thinking politicians from fixing the inevitable hole in social security that comes from our demographic pyramid. It would have been relatively painless to increase the rate or incidence of the social security payroll tax at any point in the past 25 years, the looming demographic cliff was obvious and the increased burden could’ve been shared more equally. Instead, they prevented reforms and all of the fiscal pain from demographic shifts will be borne by younger generations.

I agree this is a strong argument, and part of why I think it’s helpful to separate the three points I mentioned at the beginning.

RH writes:

    We [Boomers] did [vote for ourselves to pay higher taxes and get fewer benefits]. My lifetime SS benefits will be 20-25 percent less than they would have been under previous law, and I voted for that. My SS tax rate went up itself, and has been well over 15% since the changes took effect, and the cap on earned income subject to that went up a lot. And I voted to accept all that because it was projected to be sufficient.

    Then the immigrant haters decided we needed fewer workers in the country, or at least fewer paying SS taxes, so they slowed legal immigration and pushed illegals into the underground economy, so they don’t pay taxes to support social security. And social security is going to get whacked again, plus the evils the SS system was intended to alleviate — people too old to work and too poor to live — will return.

I think this says something profound about politics. The problem is less that there’s some group of people who don’t believe in fairness than that fairness is very hard to calculate.

Suppose RH is right (I haven’t checked), and that Social Security would be sustainable with lots of immigration. Then whether Boomers are paying “their fair share” or not depends on whether immigration is good or bad (a hard question!), and on whether we think of high vs. low immigration as the natural unmarked state of the universe (such that immigration opponents must “own” closed borders and compensate the losers), and on what kind of compensation the losers from closed borders deserve.

Someone else commented by saying we could solve all of these problems without inconveniencing either the Boomers or the young by just increasing taxes on a few ultra-rich people. The ultra-rich could reasonably say they didn’t create this problem and it’s unfair to tax them for it. But so could the Boomers and the young! So whose “fair share” is it?

December 29, 2025

The war against white men didn’t start in 2015

Filed under: Business, Economics, Education, Government, Media, Politics — Nicholas @ 05:00

Janice Fiamengo responds to Jacob Savage’s essay on the “lost generation” of young white men who have been subject to open and explicit discrimination in education and employment, and who have been loudly denounced for noticing it:

Most people who have discussed Savage’s essay accept his time frame: that the exclusion of white men took place mainly over the past ten to fifteen years. But this is not true. It has been going on for much longer than that, as Nathan Glazer made clear in his comprehensive Affirmative Discrimination: Ethnic Inequality and Public Policy, first published in 1975 and updated in 1987. Government initiatives to provide jobs for women and racial minorities, particularly blacks, were rooted in the equal rights legislation of the 1960s, implemented later that decade and aggressively expanded in the 1970s and 1980s. The National Organization for Women under the leadership of Betty Friedan, for example, brought a lawsuit against the U.S. Equal Employment Opportunity Commission to force it to comply with federal legislation, and sued the country’s 1300 largest corporations for alleged sex discrimination.

Anyone wishing to read a detailed prehistory of what Savage has chronicled can also consult Martin Loney’s extensively documented The Pursuit of Division: Race, Gender, and Preferential Hiring in Canada (1998), which shows how what was called equity hiring in Canada spread across areas such as the police force, firefighting, the civil service, crown corporations, law, teaching, academia, and elsewhere, beginning in the 1980s. What Glazer’s and Loney’s research shows is that discrimination against white men in employment is far more deeply embedded than most people realize and has affected many more men than is currently recognized.

It is ridiculous to castigate Boomer white men, as it seems popular now to do, for allegedly implementing and benefiting from diversity policies. The last thing that should be encouraged is for younger white men to turn their anger on older white men. Many of these older men themselves faced active discrimination, psychological warfare, divorce-rape, and immiseration. Every organ of the culture told them it was time to change, get with it, stop being Archie Bunker, recognize the superior merits of the women and racial minorities their people had allegedly oppressed for so long. White women were by far the majority and most enthusiastic architects and proponents of equity hiring, bullied in turn by the black and brown women with whom they originally formed their alliance against white men (and all men, with a few exceptions).

Older white men may have secured (tenuous) positions of power, but they had no power in themselves as white men. Most of them knew they could find themselves disgraced, friendless, and jobless as the result of an unpopular decision or an unguarded statement. Accusations of sexual misconduct to take such men out of their positions were not confined to millennial males.

I was in the academic job market in 1997, and diversity hiring was already commonplace then. Everyone knew it was going on, and it was signaled both explicitly and implicitly in the advertisements that encouraged applications from women and visible minorities. My friend Steve Brule remembers when affirmative action was brought in at the large chemical company where he worked in 1984. At the beginning, it was said that these programs would be time-limited, lasting only for a short season. Instead, they lasted for well over 40 years and are still going strong.

It is foolish to imagine that such discrimination is now going to lie down and die. There have been a number of occasions over the last few years in which that was confidently predicted (remember Claudine Gay?) and did not occur. Already the diversity advocates, who are legion, are marshalling their counter-arguments and nit-picking the evidence, finding (or lying about) the ways in which what Savage described hasn’t really happened, recalibrating numbers, rationalizing and justifying them. Thousands of academics will spend years joining forces to discredit claims about discrimination, recasting them as a MAGA or Groyper lament and a dangerous attack on the legitimate (but still inadequate!) gains made by valiant women and long-oppressed racial minorities. Recently, in an ostensibly critical article for The Washington Post, Megan McArdle was still playing with false justifications and outlandish untruths, saying the following about the rationale for equity hiring:

    … One could say of course it’s unfair, but repairing the legacy of slavery and sexism is a hard problem, and sometimes hard problems have unfair solutions. It wasn’t fair to round up huge numbers of men born between 1914 and 1927 and send them off to fight the Nazis, but that was the only way to win.

    One might argue that, but I haven’t seen anyone do so. No one seems brave enough to state baldly that we should penalize White men born in 1988 for hiring decisions that were made in 1985 by another White guy who was born in 1930. Instead what I’ve seen is a lot of deflection.

What bizarre nonsense and what spurious claims, even if her point is that such logic is ugly. Discrimination in favor of white men has been illegal since 1964, and affirmative action/equity hiring was already fully in place by the mid-1980s, when the “white guy who was born in 1930” was allegedly discriminating in his hiring practices. As McArdle inadvertently shows, we’ve been operating on the basis of deliberately perpetrated false beliefs for years, beliefs that the intelligentsia adhered to and promulgated.

On the City Journal Substack, Renu Mukherjee argues that Supreme Court Chief Justice John Roberts is correct that “The best way to stop discrimination on the basis of race is to stop discriminating on the basis of race”:

First, public opinion is clear: Americans of all racial and ethnic backgrounds have long opposed the use of racial and identity-based preferences. While this trend extends to employment, I’ve studied it extensively in the context of college admissions. The data underscore Americans’ strong support for colorblind meritocracy.

One year before the Supreme Court struck down the use of racial preferences in college admissions in Students for Fair Admissions v. Harvard, the Pew Research Center asked Americans whether an applicant’s race or ethnicity should be a factor in the college admissions process. 74 percent of respondents said that it should not, including 79 percent of whites, 59 percent of blacks, 68 percent of Hispanics, and 63 percent of Asian Americans. By way of comparison, 93 percent of Americans said that high-school grades should be a factor in college admissions, and 85 percent said the same about standardized test scores. Several surveys since then have produced similar results.

A May 2023 study that I co-authored with my Manhattan Institute colleague Michael Hartney reinforces this point. Through an original survey experiment on the 2022 Cooperative Election Study (CES), we asked Americans to play the role of an admissions officer and decide between two competing medical-school applicants. While the applicants’ accomplishments were randomly varied, the specific pair of applicants that respondents saw always consisted of an Asian American male and a black male.

Our objective was to determine whether, and when, Americans believe diversity should take precedence over merit in medical-school admissions. We found that even when respondents were informed that the medical school lacked diversity, the vast majority made their admissions decisions based on merit — in this case, college grades and MCAT scores — not race.

A few months prior to the publication of that paper, for a separate report, I reviewed hundreds of survey questions on affirmative action stored on the Roper Center for Public Opinion Research’s online database. I found that Americans are most likely to say that they oppose “affirmative action” when survey language explicitly describes the policy as providing “preferential treatment” or “preferences” for a given group. This suggests a deep American aversion to racial and gender-based favoritism — which is why Democrats, when pushing policies rooted in such ideology, tend to rely on euphemisms. Republicans should not do what even Democrats know doesn’t work.

Unfortunately, over the last few weeks, they have sounded like they might. Several prominent Republicans have taken to the social media platform X to argue that “Heritage Americans” — those who can trace their lineage to the Founding era — are inherently superior to more recent arrivals. In doing so, they suggest that the former are entitled to preferential treatment on the basis of ancestry. Here, the logic is that “all animals are equal, but some animals are more equal than others”.

Republican leaders, such as Vice President JD Vance, should reject such grievance-based politics. These ideas were unpopular when Democrats pushed them, and they will be unpopular when Republicans try them, too.

December 27, 2025

Diversity is not our strength, no matter how many times they say it is

On the social media site formerly known as Twitter, John Carter responds to a post from Martin Sellner on the visible results of institutionalized “diversity”:

These are the consequences of anti-white policies!

“DEI” has robbed an entire white generation of their careers and thus the realization of their life plans.

The infographics show the impact of the “DEI” policies on a whole generation of white male millennials.

John Carter:

The young white men whose lives were derailed by this psychosis amount to millions of quiet personal tragedies — careers that didn’t launch, marriages that never happened, children who were never born.

But the civilizational fallout is even worse.

The diversity shoved into the places that should have gone to talented young white men has proven itself unequal to the task, to put it mildly. They weren’t smart enough to be mentored for the positions they occupied. As the boomers shuffle away into retirement, they’ll take their knowledge and skills with them — knowledge and skills that weren’t passed on to the diversity (which was incapable of learning it), but also weren’t passed on to talented young white men (who could have mastered it, but were prevented from doing so). Since the diversity is too dumb to master that material, it’s certainly too dumb to pass it on. The chain of knowledge transmission is broken.

Autodidacticism only goes so far. There’s only so much you can learn from books and YouTube videos. There’s ultimately no replacement for hands-on professional training. Those talented young white men have gotten very good at podcasting, trading crypto, growing their presence in the attention economy … But by and large they haven’t been allowed to become doctors, lawyers, engineers, etc. Maybe that won’t matter in the end because of AI, but in the meantime, if you think the quality of everything has nosedived throughout the Cancelled Years, you really haven’t seen anything yet. The dwindling old guard of white male boomers is the only force keeping the lights on. When they leave, the real darkness closes in.

December 21, 2025

Women are walking away from the corporate world

Filed under: Business, Media, Politics, USA — Nicholas @ 05:00

On her Substack, Elizabeth Nickson starts her most recent post with the shocking headline that “400,000 women left the workforce this year”:

Digging into these reports, it seems the problem is that no one wants to mentor young women, as seniors traditionally have done for young men. No one seems to want to promote women as equally as they do men. Also women don’t want to “work as hard”. They aren’t “as ambitious” as men.

Also women do twice as much uncompensated labor as men, taking on the great majority of household chores, and, as well, are expected to organize the Christmas party. Not me, I might add — on a personal note. I cook. He does everything else. (Editor’s note.)

This means they are over-burdened and resentful and they are quitting. Four hundred thousand women left the workforce in 2025, putting down their tools and refusing to spend their lives working for “the man”.

The reports and accompanying “analyses” in the mainstream cry that government and corporations should do more! More of other people’s money chasing a fruitless dream that goes against human nature and sets sex against sex, turns family dynamics into a conflict zone, and takes away yet another chunk of private life to be traded on the market.

Quitting is the right choice.

    Rather than leaving a job they love, they are quitting for a better life. As one creator said, “Women, during the pandemic, got a sense of what it felt like to not be tied to a desk five days a week in an office. Women started to expand their dreams, expand what was inside of them, and they started to really tune into what was in their gut and in their heart. And a lot of that was ‘I don’t want to work for somebody else’s dreams. I want to spend more time with my kids, I want to spend more time in community, I want to launch a business, I want a robust side hustle. I want to be an author, I want to be a content creator.’ I’m excited to see what women build when they are untethered to a corporate job. For a lot of millennial women, it’s I’m going to do something better, I’m gonna do something different.”

This, in fact, is enormously exciting to me. Because our towns and cities are bereft of female genius — which is not moving widgets around for McKinsey. Our main streets are mostly barren wastes of utility, and the only town center in most places is the parking lot of a big box store. Unless you live in a tourist town, and then it’s commercial cosplaying of an earlier, better time.

Charitable work is equally utilitarian, and the assignment of care of the weak to government is brutal and failing. There are more homeless, more lost and broken people every single year. It’s as if the vast, resplendently-funded homeless bureaucracies think that filing quarterly and annual reports filled with noble-sounding “initiatives” is the same as actually solving the problem. I had one middle-class woman warrior in my house say that they were trying to get more hookers on the streets of good neighborhoods. These people are literally insane.

Women individuating and returning to a private life indicates that they are yearning after a more traditional and based occupation for women, and I’m not talking about submission, early child-bearing, and a boss daddy. My pioneer family women, all ten thousand of them, ran small businesses, a home farm, the general store, did bookkeeping, ran a workshop, and/or (usually and) some kind of charitable business in town, before that was taken over by corporatism and the ravenous maw of the public service, which never saw an innovation it didn’t want to ruin by systematizing it and ripping out its heart and purpose.

That and only that is the history of women in America, not this cobbled-together whining, mewling, weak, oppressed, screeching, “stressed”, “exhausted” victim. Women from 1600 to 1950 had real problems to solve. They were fully adult.

The generations since tried corporate life. It sucked. And they’re not going back. I think this is a forerunner of the life pattern of women into the future. In fact, in millennial-world, one person with a W-2 job and one person with an entrepreneurial spirit is touted as how you game the system to perfection. Taxes are limited, security is up-levelled, and you can actually build something together, rather than both partners slaving away in the globalist maw.

I expect this to take flight almost immediately.

Because women in corporate life?

Nightmare.

This is what these reports are ignoring. Senior officers do not want to mentor or promote women because they are nightmares to work with. They have been trained by their universities and culture to be ideological freaks, demanding and whining and surreptitiously tearing each other down. There was a study done in the ’80s, before ideology took over social research, that found women in corporate life practiced Power Dead Even, which meant crabs in a bucket, baby. If someone was perceived as too powerful, tear them down.

Introduce that into corporate “culture” and nothing gets done. No wonder senior executives don’t mentor or promote women.

Update, 22 December: Welcome, Instapundit readers! Please do have a look around at some of my other posts you may find of interest. I send out a daily summary of posts here through my Substack (https://substack.com/@nicholasrusson) that you can subscribe to if you’d like to be informed of new posts in the future.

Boomers – A vampiric generation battening on the blood of the young

Filed under: Economics, History, Media, Politics, USA — Nicholas @ 03:00

As a member of the recently identified “Generation Jones”, I could take part in the widespread boomer hate with a clear conscience … but as Scott Alexander points out, the hate may be more than a little over-done:

“… Millennials and Generation Z have more money (adjusted for inflation ie cost-of-living, and compared at the same age) than their Boomer parents, to about the same degree that the Boomers exceeded their own parents. This is good and how it should be. The Boomers have successfully passed on a better life to their children”

There’s a more developed theory of Boomer-hating. The more developed theory goes: Boomers are plundering the young. We know this, because their share of resources is high and keeps increasing. They use their large population share and good voter turnout to vote themselves ever-higher pensions at the expense of working taxpayers.

How might we investigate this theory? We can’t use total social security spending, because the number of elderly has gone up. Can we use social security spending per elderly person? No; the amount of social security paid out depends on the amount paid in. If each year’s retirees earned more during their career than the previous year’s did (this is true), then each year’s will get a higher Social Security payment, even if the system’s “generosity” stays the same.

We might start by looking at change in social security payment divided by change in median income. Over the past fifty years, average Social Security payment in inflation-adjusted dollars increased 60%. If we expect these payments to reflect earnings twenty years before disbursement, we can look at real median personal income from 1953 to 2003; this also increased 60%. There is no increase in generosity.
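Scott’s back-of-the-envelope comparison can be sketched in a few lines of Python. The 60% figures and the twenty-year lag come from the paragraph above; the index values (base 100) and the helper names are illustrative assumptions of mine, not his actual data:

```python
# Compare growth in real (inflation-adjusted) Social Security payments
# against growth in the real median income of the cohort that paid in,
# roughly twenty years before disbursement.

def growth(old, new):
    """Fractional growth from old to new, e.g. 100 -> 160 gives 0.60."""
    return (new - old) / old

# Index values reflecting the ~60% increases quoted above (base = 100).
real_benefit_1973, real_benefit_2023 = 100, 160   # average payment, real dollars
real_income_1953, real_income_2003 = 100, 160     # median personal income, lagged ~20 yrs

benefit_growth = growth(real_benefit_1973, real_benefit_2023)
income_growth = growth(real_income_1953, real_income_2003)

# Ratio of benefit growth to the earnings growth that funded it.
# A value near 1.0 means the program's "generosity" did not change.
generosity_ratio = (1 + benefit_growth) / (1 + income_growth)
print(f"generosity ratio: {generosity_ratio:.2f}")
```

A ratio above 1.0 would indicate benefits growing faster than the earnings that funded them, which is the signature of a program becoming more generous; on the quoted numbers the ratio is exactly 1.0.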

Or we can just look at the history. The Social Security Administration’s own website says that its generosity peaked in 1972, when the program primarily served the Greatest Generation; since then, it’s been one contraction after another. In 1983, the government increased the full retirement age from 65 to 67; in 1993, they made Social Security more taxable. Since then, most of the changes have been cost-of-living increases, which are indexed to inflation and not the result of active lobbying on old people’s behalf.

Why do so many believe that old people have discovered a vote-themselves-infinite-benefits hack? Since old people represent an increasing fraction of the population, are living longer, and face a secular trend of rising healthcare costs, even when their benefits per capita per year are stable or declining, the government will spend more money on them as a group. This spending is indeed rapidly becoming unsustainable, the elderly will need to accept big benefit cuts to make it sustainable again, and they are resisting those cuts.

So have we finally discovered the fabled Boomer selfishness? Call it what you want. But remember that the Boomers did pay money into Social Security to support their own parents, believing that they would be supported in turn. Learning that yours is the generation where the pyramid collapses is a hard pill to swallow. Maybe they should suck it up and take the sacrifice. You’d do this, right? Voluntarily give up money which is yours by right, in order to help other generations? Oh, sorry, you didn’t hear the question, you were too busy writing your 500th “You don’t hate Boomers enough, why won’t they hurry up and die, we need to declare intergenerational warfare and seize our rightful inheritance” post.

Update, 22 December: Welcome, Instapundit readers! Please do have a look around at some of my other posts you may find of interest. I send out a daily summary of posts here through my Substack (https://substack.com/@nicholasrusson) that you can subscribe to if you’d like to be informed of new posts in the future.

December 20, 2025

Ours is a culture that actively conspires against and sabotages its own children

Filed under: Business, Economics, Education, Government, Media, Politics — Tags: , , , , , — Nicholas @ 05:00

Following up on yesterday’s post (here) on the viral essay about the Millennial “lost generation”, John Carter enumerates the extent of the damage done to Millennials in general and Millennial men in particular:

A Bloomberg report from 2023 tracked hiring reported by 88 of the Standard & Poor’s 100 companies: of 323,094 reported hires from 2018–2021, only 6% were white.

The response to the essay has been an outpouring of suppressed rage that has been simmering for years in an emotional pressure cooker of silenced frustration. The author, Jacob Savage, provides a ground-level view of the DEI revolution’s human cost, beginning with his personal experiences as an aspiring screenwriter, and then widening the reader’s perspective via interviews with would-be journalists and academics. Every subject described a similar pattern of frustrated ambitions in which, starting around the middle of the 2010s, their careers stalled out for no other reason than their melanin-deficiency and y-chromosome superfluity. Young white men were systematically excluded from every institutional avenue of prestige and prosperity. Doors were closed in academia, in journalism, in entertainment, in the performing arts, in publishing, in tech, in the civil service, in the corporate world. It didn’t matter if you wanted to be a journalist, a novelist, a scientist, an engineer, a software developer, a musician, a comedian, a lawyer, a doctor, an investment banker, or an actor. In every direction, Diversity Is Our Strength and The Future Is Female; every job posting particularly encourages applications from traditionally underrepresented and equity-seeking groups including women, Black and Indigenous People Of Colour, LGBTQ+, and the disabled … a litany of identities in which “white men” was always conspicuous by its absence.

The Lost Generation does not rely only on the pathos of anecdote. Savage includes endless reams of data, demonstrating how white men virtually disappeared from Hollywood writing rooms, editorial staff, university admissions, tenure-track positions, new media journalism, legacy media, and internships. He shows how, after 2020, they even stopped bothering to apply, because what was the point? The comprehensive push to exclude young white men from employment wasn’t limited to prestigious creative industries, of course. The corporate sector has also adopted a practice of hiring anyone but white men, as revealed two years ago by a Bloomberg article which gloated that well over 90% of new hires at America’s largest corporations weren’t white.

The Bloomberg article was criticized for methodological flaws, but judging by the outpouring of stories it elicited (just see the several hundred comments my own essay got, the best of which I summarized here) it was certainly directionally accurate.

The real strength of Savage’s article isn’t the cold statistics, though, but the heartrending poignancy with which it highlights the emotional wreckage left in the wake of this cultural revolution.

Hiring processes are opaque. If an employer doesn’t extend an offer, they rarely explain why; at best one receives a formulaic “thank you for your interest in the position, but we have decided to move forward with another applicant. We wish you the best of luck in your endeavours.” They certainly never come out and say that you didn’t get hired because you’re a white man, which is generally technically illegal, for whatever that is worth in an atmosphere in which the unspoken de facto trumps the written de jure. Candidates are not privy to the internal deliberations of hiring committees, which will always publicly claim that they hired the best candidate. Officially, a facade of meritocracy is maintained, even as meritocracy is systematically dismantled from within.

The power-suit-clad feminists who body-checked their padded shoulders into C-suites and academic departments in the 1970s flattered themselves that they were subduing sexist male chauvinism by outdoing the boys at their own game and forcing the patriarchy to acknowledge their natural female excellence. Growing up I would often hear professional women say things like “as a woman, to get half as far as a man, you have to be twice as good and work twice as hard”. [NR: usually with a smug “fortunately, that’s not difficult” tacked on] The implication was that women were just better than men overall, because the old boys’ club held the fairer sex to a higher standard than it did the good old boys. Of course this was almost never true: these women were overwhelmingly the beneficiaries of affirmative action programs motivated by anti-discrimination legislation that opened up any corporation that didn’t put a sufficient number of females on the payroll to ruinous lawsuits. Moreover, a fair fraction of them were really being recruited as decorative additions to the secretarial harems of upper management. Nevertheless it helped lay the foundation for the Future Is Female boosterism that stole the future from a generation of young men.

There was a time, not so long ago, when I naively assumed that my own situation was simply the inverse of the one women had faced in the 70s and 80s. I was aware that I was being rather openly discriminated against, but imagined that this simply meant that I had to perform to a higher standard, that if I was good enough, the excellence of my work would shatter the institutional barriers and force someone to employ me. It took me several long and agonizing years to realize that this just wasn’t true. The crotchety patriarchs of the declining West may have been principled men capable of putting stereotypes aside to recognize merit; in fact, the historical evidence suggests that they overwhelmingly prized merit above any other consideration (just as the evidence suggests that their stereotypes were overwhelmingly correct). The priestesses of the present gynocracy hold themselves to no such standard. They don’t care about your promise or your performance, at all. If anything, performing well is a strike against you, because it threatens them. Nothing makes them seethe more than being outperformed by men. They champion mediocrity as much to punish as to promote.

Young white men had been raised to expect meritocracy. They’d also been raised to be racial and sexual egalitarians. People in the past, they believed, had been bigoted, believing superstitious stereotypes about differences of ability and temperament between the sexes and races that had no foundation in reality, pernicious falsehoods that were developed and propagated as intersectional systems of oppression with the purpose of justifying slavery, colonialism, imperialism, and genocide. Naturally they were appalled to have such charges laid at their feet, and so they agreed that we were all going to try and correct this injustice, and we’d do it by carefully eliminating every potential source of racial or sexual bias, eliminating all the unfair barriers to advancement within society, in particular although certainly not exclusively via university admissions and institutional hiring. That was the original official line on DEI: that it wasn’t about excluding white men, heaven forbid, no, it was simply about including everyone else, widening the talent pool so that we could ensure both the fairest possible system of advancement, and that the best possible candidates were given access to opportunity. In practice, we were told, this wouldn’t be a quota system: everything would still be meritocratic, but if it came down to a coin flip between two equally qualified candidates, one of whom was a white man and the other of whom was not, the not would win. Fair enough, the young white men thought at first: we’ll all compete on a level playing field, in fact we’ll even accept a bit of a handicap in the interests of correcting historical injustices, and may the best human win.

But the DEI commissars had absolutely no interest in a level playing field. That the playing field wasn’t already as level as it could be was, in fact, one of their most infamous lies. The arena has always been level: physics plays no favourites in the eternal struggle for survival and mastery. If some always end up on top – certain individuals, certain families, certain nations, certain races – this is invariably due to their own innate advantages over their competitors. An interesting example of this was provided by the Russian revolution. The Bolsheviks cast down the old Czarist aristocracy, stripping them of land, wealth, and status, and then discriminated against them in every way possible; a century later, their descendants had clawed their way back to power and prominence. The only possible conclusion from this is that the Russian aristocrats were, at least to some degree, aristos – the best, the noblest – in some sense that went beyond inherited estates.

The young white men did not think of themselves as aristocrats with a blood right to a certain position in life, but as contestants in a fair competition, who would rise or fall on their own merits and by their own efforts. They then abruptly found themselves competing in a system in which it was simply impossible for them to rise, but which also lied to them about the impassable barrier that had been placed in their way. If you noticed the unfairness, you were told that this was ridiculous, that as a white man you were automatically and massively privileged, that it was impossible to discriminate against you because of this, and that in addition to being a bigoted racist you were also quite clearly mediocre, a bitter little man filled with envy for the winners in life, the brilliant beautiful black women who had obviously outcompeted you because they were just so much smarter, so much more dedicated, and so much better because after all they had succeeded in spite of the deck being stacked against them whereas you had failed despite having been born with every unearned advantage in the world.

An entire generation had their future ripped from their hands, and were then told that it was their fault, their inadequacy. They were gaslit that there was no systemic discrimination against them, that their failure to launch was purely due to their individual failings … while at the same time being told that those who were so clearly the beneficiaries of a heavy thumb on the scale were the victims of discrimination, that the oppressors were the oppressed, and that to cry “oppression” yourself was therefore itself a form of oppression.

Do you see how cruel that is? How sadistic? It is more psychologically vicious by far than anything the Bolsheviks did to the Russian aristocracy. At least the Bolsheviks were honest. Although, it must be said, the psychological sadism of the gay race commissars is part of a tradition: communists have often been noted for their demonic cruelty.

December 19, 2025

“2014 was the hinge, the year DEI became institutionalized across American life”

Filed under: Business, Education, Government, Media, Politics, USA — Tags: , , , , , — Nicholas @ 05:00

In Compact, Jacob Savage talks about the “Lost Generation” … not a reference to the group before the “Greatest Generation” who fought and died in their millions in the trenches of World War One … but a much more recent group who are becoming the living casualties of a war fought without weapons or uniforms, yet just as bitter and unnecessary:

In retrospect, 2014 was the hinge, the year DEI became institutionalized across American life.

In industry after industry, gatekeepers promised extra consideration to anyone who wasn’t a white man — and then provided just that. “With every announcement of promotions, there was a desire to put extra emphasis on gender [or race],” a former management consultant recalled. “And when you don’t fall into those groups, that message gets louder and louder, and gains more and more emphasis. On the one hand, you want to celebrate people who have been at a disadvantage. On the other hand, you look and you say, wow, the world is not rooting for you — in fact, it’s deliberately rooting against you.”

As the Trump Administration takes a chainsaw to the diversity, equity, and inclusion apparatus, there’s a tendency to portray DEI as a series of well-meaning but ineffectual HR modules. “Undoubtedly, there has been ham-fisted DEI programming that is intrusive or even alienating,” explained Keeanga-Yamahtta Taylor in The New Yorker. “But, for the most part, it is a relatively benign practice meant to increase diversity, while also sending a message that workplaces should be fair and open to everyone.”

This may be how Boomer and Gen-X white men experienced DEI. But for white male millennials, DEI wasn’t a gentle rebalancing — it was a profound shift in how power and prestige were distributed. Yet practically none of the thousands of articles and think-pieces about diversity have considered the issue by cohort.

This isn’t a story about all white men. It’s a story about white male millennials in professional America, about those who stayed, and who (mostly) stayed quiet. The same identity, a decade apart, meant entirely different professional fates. If you were forty in 2014 — born in 1974, beginning your career in the late-90s — you were already established. If you were thirty in 2014, you hit the wall.

Because the mandates to diversify didn’t fall on older white men, who in many cases still wield enormous power: They landed on us.

[…]

Institutions pursuing diversity decided that there would be no backsliding. If a position was vacated by a woman or person of color, the expectation was it would be filled by another woman or person of color. “The hope was always that you were going to hire a diverse candidate,” a senior hiring editor at a major outlet told me. “If there was a black woman at the beginning of her career you wanted to hire, you could find someone … but if she was any good you knew she would get accelerated to The New York Times or The Washington Post in short order.”

The truth is, after years of concerted effort, most news outlets had already reached and quietly surpassed gender parity. By 2019, the newsrooms of ProPublica, The Washington Post, and The New York Times were majority female, as were New Media upstarts Vice, Vox, Buzzfeed, and The Huffington Post.

And then 2020 happened, and the wheels came off.

[…]

There are many stories we tell ourselves about race and gender, especially in academia. But the one thing everyone I spoke to seemed to agree on is it’s best not to talk about it, at least not in public, at least not with your name attached. “The humanities are so small,” a millennial professor nervously explained. “There’s a difference between thinking something and making common knowledge that you think it,” said another.

So it came as a bit of a shock when David Austin Walsh, a Yale postdoc and left-wing Twitter personality, decided to detonate any chance he had at a career with a single tweet.

“I’m 35 years old, I’m 4+ years post-Ph.D, and — quite frankly — I’m also a white dude,” he wrote on X. “Combine those factors together and I’m for all intents and purposes unemployable as a 20th-century American historian.”

The pile-on was swift and vicious. “You are all just laughable,” wrote The New York Times‘ Nikole Hannah-Jones. “Have you seen the data on professorships?” “White males are 30 percent of the US population but nearly 40 percent of faculty,” tweeted a tenured professor at GWU. “Hard to make the case for systemic discrimination.”

It didn’t matter that as far back as 2012 women were more likely to be tenure-track across the humanities than men, or that a 2015 peer-reviewed study suggested that STEM hiring favored women, or even that CUPA-HR, an association of academic DEI professionals, found that “assistant professors of color (35 percent) and female assistant professors (52 percent) are overrepresented in comparison to US doctoral degree recipients (32 percent and 44 percent respectively).”

As in other industries, what mattered were the optics. When people looked at academia, they still saw old white men. Lots of them.

“A big part of why it’s hard to diversify is the turnover is really slow,” a tenured millennial professor explained. “And that’s become worse now, because Boomers live a long time.” Many elite universities once had mandatory retirement at 70. But in 1994, Congress sunsetted the academic exemption for age discrimination, locking in the demographics of the largely white male professoriate for a generation.

White men may still be 55 percent of Harvard’s Arts & Sciences faculty (down from 63 percent a decade ago), but this is a legacy of Boomer and Gen-X employment patterns. For tenure-track positions — the pipeline for future faculty — white men have gone from 49 percent in 2014 to 27 percent in 2024 (in the humanities, they’ve gone from 39 percent to 21 percent).

November 23, 2025

QotD: “Operation Atlas Shrugged”

Filed under: Books, Economics, Politics, Quotations — Tags: , , — Nicholas @ 01:00

It’s increasingly clear that Millennials are like the Eloi in The Time Machine by H.G. Wells. Ignorant, pampered, incompetent, lazy, short attention span and incapable of productive work. They long for the continuance of the protective arm of government originally provided by their misguided parents.

Ayn Rand foretold such a circumstance in Atlas Shrugged. It’s time for us all to go away into the mountains and let the Millennials and their boosters face life without a productive economy. It won’t take long for it to all collapse, but we should wait another generation before returning to rebuild civilisation. Certainly there will be no Millennial worthy of a statue – it will be a reprise of the dark ages following the collapse of Mycenae.

Lucius Quinctius Cincinnatus, “Operation Atlas Shrugged”, Catallaxy Files, 2020-06-12.

November 13, 2025

Blue Hairster Cult: (Do Fear) The Zoomers

Filed under: Economics, Media, Politics — Tags: , , , , , , — Nicholas @ 03:00

In the free-to-cheapskates portion of this post, Ed West explains why the Boomers and even the Millennials should fear the Zoomers:

“To understand the man, you have to know what was happening in the world when he was twenty”. I’ve thought about that quote, sometimes attributed to Napoleon, a fair bit recently. I suppose for my generation, 9/11 was the formative event, which signalled the end of the triumphalist Nineties — although the extent to which it affected us is questionable. Perhaps of far greater importance was the financial crisis which unfolded towards the very end of the Bush-Blair era.

What about those born around the turn of the millennium, the so-called “Zoomers”? I suppose it would be the experience of being locked down for a year in order to protect an older generation whose wealth they can never hope to emulate. An already bitter and disillusioned cohort, denied their patrimony by house price inflation, came to adulthood during a period of deliberate social isolation with only the internet at hand — a lockdown punctuated by weeks of millennial hysteria over racism.

The intelligent ones would have seen the hysteria for what it was — a wild distortion — and realised that the media regularly distorts all sorts of things, and it’s the intelligent ones I worry about. Indeed, when I read the thoughts and worldview of that generation, I feel a sense of dread about what’s coming; perhaps even more so when it comes from the Right.

I only watched a Nick Fuentes video for the first time this summer, an amusingly edited version of a talk in which he rails against Israeli military success. It had been sent by a Jewish friend with strong Zionist sympathies, and it’s very funny — Fuentes is very funny. If I were 20 years old, I might have watched his show, one of many aspects of life in 2025 which I thank God wasn’t around in my adolescence.

After all, most of the things I watched on television – five channels, kids, in fact more like four and a half, as the Channel 5 reception wasn’t very good – liked to poke fun at the prevailing morality of the older generation. My favourite comic, Viz, would laugh at the old people whose fault it was that Eddie Murphy’s swearing had to be dubbed over with “freak you, monkeyfeather”. Today it’s only natural that young men should wish to offend woke scolds.

But then, of course, something darker might also be happening. Rod Dreher recalls a fascinating, and disturbing, account of his conversations with young Republican activists this week, writing that: “Not every DC Zoomercon who identifies with Fuentes agrees with everything he says, or the way he says it. What they like most of all is his rage, and willingness to violate taboos. I asked one astute Zoomer what the Groypers actually wanted (meaning, what were their demands). He said, ‘They don’t have any. They just want to tear everything down’.”

There is certainly polling to suggest that younger voters in the US are moving to extremes, if you believe polls. One found that “explicit antisemitic attitudes are now much more common among young voters”, who are five times more likely to have an unfavourable view of the Jewish people than 65-year-olds. Since 2018, the percentage of American boys who believe in gender equality has shrunk. Far more worrying is that younger Americans are also much more likely to support political violence, and this is more of a problem on the left.

August 16, 2025

This is just crazy enough to work …

Filed under: Bureaucracy, Business, Government, USA — Tags: , , , , , — Nicholas @ 05:00

Disclaimer: I’m not an American and I don’t know the details of the US immigration system, but from what I’ve read elsewhere, Copernican‘s suggestion has a lot of merit:

I can’t be the only one sick of H1Bs destroying the western labor market, particularly in tech, but across the board. Out-of-work tech workers further compress the labor market in other areas. This problem is not unique to the United States, but I understand the laws of the US better, so I’ll be arguing from that perspective.

I know I’m not. Walt Bismarck has a whole organization dedicated to trying to find reasonable employment by job-stacking. A few new and interesting resources have appeared, dedicated to screwing with these companies that open the floodgates to a horde of foreign software engineers — Seven-Eleven clerks and “SAAR YOU MUST REDEEM”s who can crash our software, our ships, and our interstate semi-trucks for us.

Fortunately, there’s something we can do to fight back.

[…]

Well, while the government doesn’t seem intent on doing anything about it, the Millennials and Zoomers who have been fucked over appear to finally have enough cultural weight to start pushing back. Here’s the thing about hiring H1B workers: doing so requires that the company demonstrate that no American citizens can fulfill the role. That demonstration usually takes the form of a listing in a newspaper with 500 readers, the back end of a website with black text on a black background, or something similar. They don’t want Americans to apply for these jobs; they want to successfully demonstrate that no Americans even applied.

So they make the application process nearly impossible.

Usually, the way this is done is that when an H1B is hired, they are permitted to remain in the country for up to 6 years (2 renewals of 2 years). Once that’s completed, either the H1B worker is forced to return to where they came from, or the job must be re-posted for 2 weeks for a potential American worker. If no American worker applies (because they didn’t see it, it having been posted in a hidden corner of the website or a newspaper with no readers), then the H1B may be sponsored for permanent US residency.

What was clearly once a method for gaining the Best and Brightest as potential employees in the United States has become a system of exploitation. H1Bs are underpaid, undervalued, and often booted from the country, so there’s no impetus for them to assimilate. It’s a mess all the way around, and the only ones who benefit are stockholders for billion-dollar tech companies.

For the most part, we all know the story.

But … what if during that 2-week posting, a qualified American candidate does apply for the job? Well, then everything goes to shit. The company is legally not allowed to deny an American candidate that job without opening themselves up to a massive lawsuit, fines, and penalties. If only one American candidate has applied, then the company has to hire that individual … and if they don’t hire the American candidate and then apply for another H1B to fill that slot, the company is in deep shit in a legal sense.

August 11, 2025

Smug Canadian boomer autohagiography rightly antagonizes the under-35s

Fortissax had an argument with one of his readers over a smug, self-congratulating meme about how wonderful Canada was in the 1990s and early 2000s:

What we lived through long before Trudeau was the Shattering, the breakdown of Canada’s social cohesion, driven by left-liberalism with communist characteristics applied to race, ethnicity, sex, and gender, and punitive almost exclusively toward visibly White men. My generation, those millennials born on the cusp of Gen Z, saw post-national Canada take shape not in the comfortable suburban rings of the GTA or the posh boroughs of Outremont and Westmount, but in self-segregated, ghettoised enclaves of immigrants whose parents never integrated and were never required to.

Memes like that are dishonest because they feed a false memory. The 2000s were not normal. Wages were stagnant, housing was already an asset bubble, and immigration was still flooding in under a policy that explicitly forbade assimilation. Brian Mulroney had enshrined multiculturalism into law in 1988. Quebec alone resisted, carving out the right to limit immigration under the 1991 Canada–Quebec Accord. After Chrétien, Stephen Harper brought in three million immigrants, primarily from China, India, and the Philippines in that order.

The Don Cherry conservatives of that era were Bush lite. They were rootless, cut off from their history, their identities manufactured from the top down since the days of Lester B. Pearson. They conserved nothing. For Canadian youth, it was the dawn of a civic religion of wokeness, totalitarian self-policing by striver peers, and the quiet coercion of every institution. My memories of that decade are of constant assault — mental, physical, spiritual — from leftists in power, from encroaching foreigners, and from the cowardice of conservatives.

Your 2000s might have been great. For us, they were communist struggle sessions. In 2009 we were pulled from class to watch the inauguration of Barack Obama, a foreign president, as a historic moment for civil rights. Our schools excluded us while granting space to every group under the sun: LGBT safe spaces and cultural clubs for Italians, Jamaicans, Jews, Indians, Indigenous, Balkaners, Greeks, Slavs, Portuguese, Quebecois, Iroquois, Pakistanis — every culture celebrated except our own. Anglo-Quebecers and Anglo-Canadians got nothing but an Irish club, closely monitored for “white supremacy” and “racism” by the HR grandmas of the gyno-gerontocracy of English Montreal. Students self-segregated, sitting at different cafeteria tables and smoking at different bus shelters. At Vanier, Dawson, and John Abbott College, these divisions were institutionalised. I remember walking into the atrium of Dawson, my first post-secondary experience, greeted by a wigger rolling a joint while a Jamaican beatboxed to Soulja Boy.

We became amateur anthropologists out of necessity, forced to navigate a nationwide cosmopolitan experiment from birth. We learned the distinctions between squabbling southeastern Europeans of the former Yugoslavia, and we did not care if Kosovo was Serbia or whether Romanians and Albanians were Slavic; they all acted the same way. We learned the divides within South Asia, the rivalries between Hindutva and Khalistani, the differences between a Punjabi, a Gujarati, a Telugu, a Pakistani, a Hong Konger, a mainlander, and a Taiwanese. We know the shades of Caribbean identity, the factions of the Middle East, and the intricacies of North African identity. We should never have needed to know these things, but we do.

For us, childhood in this cesspit was the seedbed of radicalism. We never knew an era when contact with foreigners was limited to sampling food at Loblaws. All we know is being surrounded by those who hate us, governed by a state that wants to erase us, with no healthcare, no homes, no jobs that are not contested by foreigners, and no money to start families.

August 6, 2025

QotD: Modern English night life

Filed under: Britain, Quotations — Tags: , , , , , — Nicholas @ 01:00

There are few sounds more frightening than that of the English young enjoying themselves. The English, it was once said, take their pleasures sadly; but now they take them loudly, which is far, far worse. Their pleasures are brutish, and the sounds the men emit while experiencing them are indistinguishable from those of a mob indignantly beating someone to death. As for the women, they never speak but they scream, as if being chronically raped. Of course, they all have to raise the level of their vocalizations because there is the perpetual background throb and thump of background music, or para-music, turned up to maximum volume, so that the ground vibrates beneath you like a ripple bed in an intensive care unit.

Recently I stayed overnight in a charming small cathedral city in England, genteel by day and Gomorrah by night. It is a little like H.G. Wells’ story The Time Machine, set hundreds of thousands of years hence, when humanity has divided into two: the effete, gentle, vegetarian diurnal Eloi, and the ugly, vicious, carnivorous nocturnal Morlocks, who emerge from underground once the sun goes down and prey on the Eloi.

I had booked no place to stay until the last minute, and found only a room above a cavernous, darkened bar, for me an antechamber of Hell, where the Morlock youth of the cathedral city gathered to enjoy themselves — or at least to pretend to do so, for I have long thought that those who cannot enjoy themselves without shouting and screaming are really hysterics, trying to convince themselves that they are enjoying themselves when actually they do not really know how to do so.

Theodore Dalrymple, “Evening Above the Hell-Bar”, Taki’s Magazine, 2019-12-16.

July 26, 2025

The desperate narcissism of the “Cool Professor”

Filed under: Education — Nicholas @ 05:00

Freddie deBoer on the pathetic academic specimen sometimes known as “Bob” or “Biff” or “Lizzie” — the dreaded self-imagined “cool professor”:

“heh, probably never expected to have a professor with full sleeve tattoos, huh? well, that’s not the last time your mind’s gonna be blown this semester …”
Image and caption from Freddie deBoer’s Substack

Let me tell you about the saddest figures in the American university. They wear black jeans and Chuck Taylors to class, except maybe on the first day, when they stroll in wearing semi-ironic suits designed to contrast with their ample tattoos. Their syllabuses are printed in Helvetica. They mention Chappell Roan in the first fifteen minutes of the first day of class. They tell their students, with a wink, that they don’t believe in grades — why, who are they to judge their students! They encourage everyone in class to call them by their first names, or perhaps a contrived nickname. They hope to blow everyone’s minds when they theatrically announce that in their classes, students pick the readings, because the students are the ones who really know what’s worthy of their time. They describe themselves as “friends” or “guides” or “partners”, not as teachers or professors. They disdainfully invoke the words “rigor” and “standards” only with ironic scare quotes and want you to know that they don’t believe in deadlines. They subtweet the provost on BlueSky. They are the Cool Professors. And they are frauds.

The Cool Professor fundamentally does not want to teach, as teaching requires the teacher to sometimes be the bad guy. The Cool Professor can’t stand to be the bad guy, chafes at the very idea. That’s the core of all of this. The posture, the cultivated aesthetic of rejection, the performance of cool — none of it’s about students, even though Cool Professors will not shut the fuck up about how they run a “student-centered classroom”. Their affect isn’t about pedagogy. It’s about insecurity and narcissism, their desperate need to be perceived as the rare exception, the rogue academic, the anti-institutional rebel. Cool Professors aren’t trying to liberate students. They’re trying to be loved, and in being loved by students stave off their horror about growing old. And if that means letting students drift intellectually, if that means mistaking chaos for creativity, if that means failing to ever give anyone a hard but necessary lesson, then so be it. Because the thing the Cool Professor wants to avoid at all costs is being perceived as an authority figure, and that is precisely what students most need them to be.

It’s a common misunderstanding, particularly among faculty who feel alienated from the bureaucracy of the university or who fancy themselves transgressive thinkers, that teaching should never be hierarchical. The idea is that it’s somehow oppressive to know more than your students or to presume to evaluate their performance; that knowing more than your students and evaluating their performance are publicly understood to be core parts of being a teacher typically goes ignored. Many who consider themselves modern or progressive in the academy insist that education should be horizontal, an equal exchange between learner and guide, that the classroom is a site of resistance or liberation. But these ideas, while maybe flattering to the professor’s ego and superficially appealing to a certain kind of idealist, are incoherent. They’re built on a fundamental category error: mistaking the classroom for a club meeting, or a dinner party, or a DSA breakout session. The classroom is none of those things. It’s a site of instruction, and in a site of instruction one party knows more than the other; one party evaluates the other; one party is, necessarily, in charge.

(And, for the record, the fundamental dictate of critical pedagogy is always and forever self-defeating: if you inspire your students to rebel against your authority in your own classroom, they’re still following your lead and thus not rebelling at all. The ubiquitous goal of prompting students to resist top-down education, whatever that means, is unachievable, because if you do prompt them to resist, they’re actually complying with your desires, not resisting them. It’s a good old fashioned paradox and not one you can bluff your way out of with abstruse academic vocabulary.)

The plain fact that a teacher must necessarily have some sort of control over the classroom space that the students do not makes people uncomfortable. Authority always does. But then, the job of a teacher is not to minimize discomfort; indeed, a good teacher will necessarily make their students uncomfortable, on occasion, as it’s often only in the space of genuine discomfort that we’re inspired to achieve our deepest growth. The professor’s job is to be responsible for the intellectual development of students, which inevitably involves making judgments: what is true, what is false, what is well argued, what is sloppy, what is insightful, what is clichéd. If you aren’t willing to say those things, if you shrink from judgment, you’re abandoning the role you signed up for, you’re copping out. You’re indulging yourself, and your own flattering self-mythology, at the expense of the people you’re supposed to be teaching.

July 21, 2025

“Normal”? Dude, that’s extremist right-wing hate speech!

Filed under: Education, Health, Media — Nicholas @ 05:00

The Bone Writer on the huge increase in young people “identifying” as something other than what unreconstructed cavemen used to call “normal”:

Walk through any high school, scroll through TikTok, or attend a freshman orientation, and you’ll see the new hierarchy of modern identity:

  • Straight white male? Bottom rung.
  • Bisexual nonbinary neurodivergent? Stunning and brave.
  • Confused, anxious, fluid? You’re seen. You’re valid.
  • Rooted, stable, and clear? YOU must be dangerous.

It’s not just a cultural shift anymore. It’s a cultural mutation. A slow but total dislocation from reality.

We are no longer celebrating the diversity of life. We are celebrating the diversity of escape routes from it.

Identity as a Compass? No … It’s Identity as Camouflage

There was a time when “identity” meant something integrated, a clear expression of who you are, shaped by your values, your upbringing, your nature.

Now, identity is:

  • A product
  • A protest
  • A mask

It’s often less about expressing truth and more about shielding from judgment.

And nowhere is that clearer than in the explosion of LGBT+ self-identification, especially among the young.

The Numbers Don’t Lie, but No One Wants to Look at Them

In 2012, Gallup found ~3.5% of Americans identified as LGBT.

By 2021? Over 20% of Gen Z now identify somewhere on the spectrum.

Among Gen Z women, bisexual identity has grown by over 400%.

Do you really believe this is all “just visibility”? Do you really think the human genome changed this much in 10 years?

Of course not.

What changed was the culture. And culture now rewards deviation and punishes normativity.

Reported by Axios in 2021

