Quotulatiousness

May 1, 2023

Britain’s first embassy to India

Filed under: Britain, Business, History, India — Nicholas @ 05:00

In The Critic, C.C. Corn reviews Courting India: England, Mughal India and the Origins of Empire by Nandini Das, a look at the first, halting steps of the East India Company at the court of the Mughal Emperor Jahangir early in the seventeenth century:

The late Sir Christopher Meyer, the closest thing modern British diplomacy has produced to a public figure, enjoyed comparing his trade to prostitution. Both are ancient trades, and neither enjoys a wholly favourable reputation. Any modern diplomat will discreetly confirm that the profession is far from the anodyne, flag-emoji civility and coyly embarrassed glamour they project on Twitter.

Whilst none of our modern representatives are working in quite the same conditions as their predecessor Sir Thomas Roe, they may well find uncanny parallels with his unfortunate mission.

The fledgling and precarious East India Company, founded in 1600, had sent representatives to the Mughal court before, but they were mere merchants and messengers. The stern rebuff they received called for a formal representative of the King.

After the company persuaded James I of the necessity, Thomas Roe (a well-connected MP, friend to John Donne and Ben Jonson, and already an experienced traveller after an attempt to reach the legendary El Dorado) was dispatched to the court of Mughal Emperor Jahangir in 1615. He remained there until 1619, in an embassy that the cultural historian Nandini Das describes in Courting India as “infuriatingly unproductive”.

The company kept rigorous records, and Roe meticulously kept a daily diary. Professor Das uses these and the reports of other English travellers to narrate Roe’s journey, as well as contemporary literature and, more importantly, their Indian equivalents. It is not so much the diplomatic success of Roe’s embassy that fascinates Das as the mindset of the early modern encounter between England and India.

In a boom time for histories of British colonialism, this is an intelligent and gripping book with a thoughtful awareness of human relationships and frailties, and a model approach to early modern cross-cultural encounters.

The privations suffered by Roe’s embassy are striking. Those who made the voyage to India had only a three-in-ten chance of coming home alive. Das’s recreation of the journey out is as intense and claustrophobic as Das Boot, with rotten medicine, cruel maritime punishments and untrained boys acting as surgeons. Dead bodies onboard would have their toes gnawed off by rats within hours.

In India, the English sailors excelled themselves as uncouth Brits abroad: drinking, fighting and baiting local customs, such as killing a calf. A chaplain was notorious for “drunkenly dodging brothel-keepers and engaging in half-naked brawls”. For most of his time, Roe — seeking to keep costs down — lived with merchants and factors already in India, in a cramped, filthy, dangerous house.

April 28, 2023

Use and misuse of the term “regression to the mean”

Filed under: Books, Business, Football, Sports, USA — Nicholas @ 03:00

I still follow my favourite pro football team, the Minnesota Vikings, and last year they hired a new General Manager who was unlike the previous GM in that not only was he a big believer in analytics, he actually had worked in the analytics area for years before moving into an executive position. The first NFL draft under the new GM and head coach was much more in line with what the public analytics fans wanted — although the result on the field is still undetermined as only one player in that draft class got significant playing time. Freddie deBoer is a fan of analytics, but he wants to help people understand what the frequently misunderstood term “regression to the mean” actually … means:

Kwesi Adofo-Mensah, General Manager of the Minnesota Vikings. Adofo-Mensah was hired in 2022 to replace Rick Spielman.
Photo from the team website – https://www.vikings.com/team/front-office-roster/kwesi-adofo-mensah

The sports analytics movement has proven time and again to help teams win games, across sports and leagues, and so unsurprisingly essentially every team in every major sport employs an analytics department. I in fact find it very annoying that there are still statheads that act like they’re David and not Goliath for this reason. I also think that the impact of analytics on baseball has been a disaster from an entertainment standpoint. There’s a whole lot one could say about the general topic. (I frequently think about the fact that Moneyball helped advance the course of analytics, and analytics is fundamentally correct in its claims, and yet the fundamental narrative of the book was wrong.*) But while the predictive claims of analytics continue to evolve, they’ve been wildly successful.

I want to address one particular bugaboo I have with the way analytical concepts are discussed. It was inevitable that popularizing these concepts was going to lead to some distortion. One topic that I see misused all the time is regression/reversion to the mean, or the tendency of outlier performances to be followed up by performances that are closer to the average (mean) performance for that player or league. (I may use reversion and regression interchangeably here, mostly because I’m too forgetful to keep one in my head at a time.) A guy plays pro baseball for five years, he hits around 10 or 12 homeruns a year, then he has a year where he hits 30, then he goes back to hitting in the low 10s again in following seasons – that’s an example of regression to the mean. After deviation from trends we tend (tend) to see returns to trend. Similarly, if the NFL has a league average of about 4.3 yards per carry for a decade, and then the next year the league average is 4.8 without a rule change or other obvious candidate for differences in underlying conditions, that’s a good candidate for regression to the mean the next year, trending back towards that lower average. It certainly doesn’t have to happen, but it’s likely to happen for reasons we’ll talk about.

Intuitively, the actual tendency isn’t hard to understand. But I find that people talk about it in a way that suggests a misunderstanding of why regression to the mean happens, and I want to work through that here.

So. We have a system, like “major league baseball” or “K-12 public education in Baltimore” or “the world”. Within those systems we have quantitative phenomena (like on-base percentage, test scores, or the price of oil) that are explainable by multiple variables, AKA the conditions in which the observed phenomena occur. Over time, we observe trends in those phenomena, which can be in the system as a whole (leaguewide batting average), in subgroups (team batting average), or individuals (a player’s batting average). Those trends are the result of underlying variables/conditions, which include internal factors like an athlete’s level of ability, as well as elements of chance and unaccounted-for variability. (We could go into a big thing about what “chance” really refers to in a complex system, but … let’s not.) The more time goes on, and the more data is collected, the more confidently we can say that a trend is an accurate representation of some underlying reality, again like an athlete’s level of ability. When we say a baseball player is a good hitter, it’s because we’ve observed over time that he has produced good statistics in hitting, and we feel confident that this consistency is the product of his skill and attributes rather than exogenous factors.

However, we know that good hitters have bad games, just as bad hitters have good games. We know that good hitters have slumps where they have bad three-, five-, or ten-game stretches. We even acknowledge that someone can be a good hitter and have a bad season, or at least a season that’s below their usual standards. However, if a hitter has two or three bad seasons, we’re likely to stop seeing poor performance as an outlier and change our overall perception of the player. The outlier becomes the trend. There is no certain or objective place where that transition happens.

Here’s the really essential point I want to make: outliers tend to revert to the mean because the initial outlier performance was statistically unlikely; a repeat of that outlier performance is statistically unlikely for the same reasons, but not because of the previous outlier. For ease of understanding let’s pretend underlying conditions stay exactly the same, which of course will never happen in a real-world scenario. If that’s true, then the chance of having an equally unlikely outcome is exactly as likely as the first time; repetition of outliers is not made any less likely by the fact that the initial outlier happened. That is, there’s no inherent reason why a repetition of the outlier becomes more unlikely, given consistent underlying conditions. I think it’s really important to avoid the Gambler’s Fallacy here, thinking that a roulette wheel is somehow more likely to come up red because it’s come up black a hundred times in a row. Statistically unlikely outcomes in the past don’t make statistically unlikely outcomes any less likely in the future. The universe doesn’t “remember” that there’s been an outlier before. Reversion to the mean is not a force in the universe. It’s not a matter of results being bent back into the previous trend by the gods. Rather, if underlying conditions are similar (if a player is about as good as he was the previous year and the role of variability and chance remains the same), and he had an unlikely level of success/failure the prior year, he’s unlikely to repeat that performance because reaching that level of performance was unlikely in the first place.
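
To make that concrete, here is a minimal simulation sketch (not from deBoer’s piece: the hitter, the 2% home-run rate, the 550 plate appearances and the 18-homer cutoff are all invented for illustration) in which underlying conditions are held exactly constant from one season to the next:

    # Minimal sketch of regression to the mean under fixed underlying conditions.
    # All parameters are hypothetical; the point is that the model has no memory.
    import numpy as np

    rng = np.random.default_rng(0)
    PA, HR_RATE, N = 550, 0.02, 200_000           # plate appearances, true talent, simulated careers

    season_1 = rng.binomial(PA, HR_RATE, size=N)  # this year's home-run totals
    season_2 = rng.binomial(PA, HR_RATE, size=N)  # next year's, drawn from the identical distribution

    outlier = season_1 >= 18                      # an unusually good year for this hitter
    print(f"average season overall:              {season_2.mean():.1f} HR")
    print(f"average season following an outlier: {season_2[outlier].mean():.1f} HR")
    # Both print roughly 11 HR: the follow-up season is not pulled back toward the
    # mean by the earlier outlier; a second outlier is simply as unlikely as the first.

Conditioning on the outlier year tells you nothing extra about the year that follows it; the apparent pull back toward 11 home runs is just the base rates reasserting themselves.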


    * – the A’s not only were not a uniquely bad franchise, they had won the most games of any team in major league baseball in the ten years prior to the Moneyball season
    – major league baseball had entered an era of unusual parity at that time, belying Michael Lewis’s implication that it was a game of haves and have-nots
    – readers come away from the book convinced that the A’s won so many games because of Scott Hatteberg and Chad Bradford, the players that epitomize the Moneyball ethos, but the numbers tell us they were so successful because of a remarkably effective rotation in Tim Hudson, Barry Zito, and Mark Mulder, and the offensive skill of shortstop Miguel Tejada – all of whom were very highly regarded players according to the old-school scouting approach that the book has such disdain for.
    – Art Howe was not an obstructionist asshole.

April 27, 2023

“… the Department of Defense is rejoicing that Tucker Carlson has been driven off of Fox News”

Filed under: Business, Government, Media, Military, USA — Nicholas @ 04:00

Chris Bray on the odd phenomenon of the US military formally having opinions on who is sitting at the big desk for Fox News these days:

In 2001, I was a nominal infantryman assigned to some exceptionally tedious duty at Fort Benning, Georgia. That spring, the Chief of Staff of the United States Army decided to symbolically make the whole army feel elite by changing the uniform and putting everyone into the black beret that had been unique to the Ranger battalions. See, now you have a special hat, so morale and esprit de corps and stuff.

Because I was in the infantry, surrounded all day every day by infantrymen, I can report the absolutely rock-solid consensus in the combat arms branches with complete confidence: we wondered why we were being led by idiots.* Quietly, but not quietly enough, we said things like, “See, the lethality of a combat force is tied directly to the quality of its fashion design”. A series of impromptu briefings and formal training sessions reminded us that we were not allowed to express open contempt for our senior leaders, so shut up about the dumbassery with the berets.

In retrospect, I think history shows us that new hats really were the most pressing challenge facing the American military as we rolled into the summer of 2001, but whatever.

So Politico, the most reliably wrong publication in the history of the known universe, reports this week that the Department of Defense is rejoicing that Tucker Carlson has been driven off of Fox News.

See, Tucker Carlson was an authoritarian, a Trumpian protofascist. For example, he criticized the leadership of the military, who therefore rejoiced in his departure. Anti-authoritarianism, on the other hand, is when the leaders of the armed forces have a hand in shaping the culture and deciding who’s allowed to speak in the public sphere. Fascism is open discourse, so we need the military to say who should be on television so we can have freedom.

[…]

See, it’s good when the military “smites” civilian critics and expresses “revulsion” for them. In fascist countries, critics of the military are just allowed to speak freely. The culture has gone full Alice In Wonderland, and freedom is compliance.


    * See also the switch from BDUs to ACUs.

April 24, 2023

Unconventional hiring practices

Filed under: Business, Media, USA — Nicholas @ 05:00

In The Honest Broker, Ted Gioia recounts tracking down Jazz saxophonist Jimmy Giuffre to interview him for a chapter in the book he was writing and discovering things about team-building that he hadn’t learned at Stanford Business School:

Back when I interviewed Jimmy Giuffre, I was gigging constantly and the format was obvious. The best option was piano, bass, and drums with at least one horn. If I didn’t have enough money to cover that, I brought just a trio — piano, bass, and drums — to the gig. If I couldn’t afford that, I did just piano and bass. And if cash was really tight, I opted for solo piano.

And which players did I hire?

Back then, I wanted to play with the best of the best. I kept careful tabs on all the jazz musicians in the greater San Francisco area, and wanted to play with all the top cats. Even if I didn’t know the musician, I’d make a cold call and try to hire them, provided I could afford it. If I got turned down, I went to the next name on my list.

Didn’t everybody do it that way?

Not Jimmy Giuffre. He explained that musicians played better when they were happier. Now that was a word I’d never heard in organizational theory class.

Giuffre continued to spell it out for me — surprised that I couldn’t figure this out for myself. Didn’t I know that people are always happier when they’re with their friends? So group productivity is an easy problem to solve.

In other words, if my three best buddies played bongos, kazoo, and bagpipe, that should be my group.

When I heard this, I thought it made no kind of sense. They don’t call it “show friends” — they call it show business. I couldn’t imagine following Giuffre’s advice.

But over the years, I’ve thought a lot about what Jimmy Giuffre said about group formation—which is not only unusual for a music group but also violates everything I was taught back at Stanford Business School.

[…]

Can I turn this into a rule? And, even more to the point, could you apply this to other settings? Could you start a business with this approach?

That seems like a recipe for disaster, at least at first glance.

But I now think even large corporations could benefit from a dose of Jimmy Giuffre’s thinking. One of the biggest mistakes in hiring practices, as handled by HR (Human Resources) professionals in the current day, is an obsession with the “required qualifications” for the job. They won’t even give you an interview unless you mention the right buzzwords on your resume. But the best people take unconventional paths, and this checklist approach will exclude precisely those individuals.

(I’ve even heard of a scam for getting interviews — which involves copying and pasting the job description word-for-word at the bottom of your resume. This apparently rings all the bells in their algorithms and gets you moved to the top of the candidate list.)
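
That trick works because the first screening pass is often little more than keyword overlap. Here is a hypothetical sketch of such a naive filter (not any real applicant-tracking product; the scoring rule and the sample text are invented) showing why pasting the posting verbatim rings every bell:

    # Hypothetical keyword-overlap screen (illustration only; real applicant-tracking
    # systems are proprietary and more elaborate than this).
    import re

    def keyword_score(resume: str, posting: str) -> float:
        """Fraction of the posting's distinct words that also appear in the resume."""
        tokenize = lambda text: set(re.findall(r"[a-z]+", text.lower()))
        required = tokenize(posting)
        return len(required & tokenize(resume)) / len(required)

    posting = "Required qualifications: synergy, stakeholder alignment, agile roadmaps"
    honest = "Twenty years shipping software and leading small teams"
    padded = honest + " " + posting               # the trick: append the posting verbatim

    print(keyword_score(honest, posting))         # 0.0, so the honest resume is screened out
    print(keyword_score(padded, posting))         # 1.0, every keyword "rings the bell"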

Giuffre’s quirky theory gets straight to the heart of the problems with contemporary society outlined by Iain McGilchrist in his book The Master and His Emissary. That book is ostensibly a study of neuroscience, but is actually a deep-thinking critique of institutions and cultural biases. The best decisions, McGilchrist shows, are made by holistic thinkers who can see the big picture, but the system rewards the detail orientation of people who manage with checklists and jump through all the bureaucratic hoops.

Yet I’ve seen — and I’m sure you have too — amazing people whose skill set can’t be conveyed by their resume. Not even close. I’ve worked alongside visionaries whose education ended with high school but who have ten times the insight and ability of their colleagues with graduate degrees and fancy credentials.

That’s why Duke Ellington is such a great role model for running an organization. He hired people because of their musical character, rather than their sheer virtuosity or technical knowledge. And he certainly paid no attention to formal degrees. I wouldn’t be surprised to learn that Duke went through decades of hiring for his band without looking at a single resume.

That piece of paper wouldn’t have told him a single thing he needed to know.

For all those reasons, I no longer dismiss Jimmy Giuffre’s peculiar views on group formation. I’d recommend them myself — maybe even especially in groups where no music is made.

April 23, 2023

There’s a spectre haunting your pantry – the spectre of “Ultra-Processed Food”

Christopher Snowdon responds to some of the claims in Chris van Tulleken’s book Ultra-Processed People: Why Do We All Eat Stuff That Isn’t Food … And Why Can’t We Stop?:

Ultra-processed food (UPF) is the latest bogeyman in diet quackery. The concept was devised a few years ago by the Brazilian academic Carlos Monteiro who also happens to be in favour of draconian and wildly impractical regulation of the food supply. What are the chances?!

Laura Thomas has written some good stuff about UPF. The tldr version is that, aside from raw fruit and veg, the vast majority of what we eat is “processed”. That’s what cooking is all about. Ultra-processed food involves flavourings, sweeteners, emulsifiers etc. that you wouldn’t generally use at home, often combined with cooking processes such as hydrogenation and hydrolysation that are unavailable in an ordinary kitchen. In short, most packaged food sold in shops is UPF.

Does this mean a cake you bake at home (“processed”) is less fattening than a cake you buy from Waitrose (“ultra-processed”)? Probably not, so what is the point of the distinction? This is where the idea breaks down. All the additives used by the food industry are considered safe by regulators. Just because the layman doesn’t know what a certain emulsifier is doesn’t mean it’s bad for you. There is no scientific basis for classifying a vast range of products as unhealthy just because they are made in factories. Indeed, it is positively anti-scientific insofar as it represents an irrational fear of modernity while placing excessive faith in what is considered “natural”. There is also an obvious layer of snobbery to the whole thing.

Taken to an absurd but logical conclusion, you could view wholemeal bread as unhealthy so long as it is made in a factory. When I saw that CVT has a book coming out (of course he does) I was struck by the cover. Surely, I thought, he was not going to have a go at brown bread?

But that is exactly what he does.

    During my month-long UPF diet, I began to notice this softness most starkly with bread — the majority of which is ultra-processed. (Real bread, from craft bakeries, makes up just 5 per cent of the market …

His definition of “real bread” is quite revealing, is it not?

    For years, I’ve bought Hovis Multigrain Seed Sensations. Here are some of its numerous ingredients: salt, granulated sugar, preservative: E282 calcium propionate, emulsifier: E472e (mono- and diacetyltartaric acid esters of mono- and diglycerides of fatty acids), caramelised sugar, ascorbic acid.

Let’s leave aside the question of why he only recently noticed the softness of fake bread if he’s been eating it for years. Instead, let’s look at the ingredients. Like you, I am not familiar with them all, but a quick search shows that E282 calcium propionate is a “naturally occurring organic salt formed by a reaction between calcium hydroxide and propionic acid”. It is a preservative.

E472e is an emulsifier which interacts with the hydrophobic parts of gluten, helping its proteins unfold. It adds texture to the bread.

Ascorbic acid is better known as Vitamin C.

Caramelised sugar is just sugar that’s been heated up and is used sparingly in bread; Jamie Oliver puts more sugar in his homemade bread than Hovis does.

Hovis Multigrain Seed Sensations therefore qualifies as UPF but it is far from obvious why it should be regarded as unhealthy. According to CVT, the problem is that it is too easy to eat.

    The various processes and treatment agents in my Hovis loaf mean I can eat a slice even more quickly, gram for gram, than I can put away a UPF burger. The bread disintegrates into a bolus of slime that’s easily manipulated down the throat.

Does it?? I’ve never tried this brand but it doesn’t ring true to me. It’s just bread. Either you toast it or you use it for sandwiches. Are there people out there stuffing slice after slice of bread down their throats because it’s so soft?

    By contrast, a slice of Dusty Knuckle Potato Sourdough (£5.99) takes well over a minute to eat, and my jaw gets tired.

Far be it from me to tell anyone how to spend their money but, in my opinion, anyone who spends £6 on a loaf of bread is an idiot. Based on his description, the Dusty Knuckle Potato Sourdough is awful anyway. Is that the idea? Is the plan to make eating so jaw-achingly unenjoyable that we do it less? Is the real objection to UPF simply that it tastes nice?

From the Encyclopedia Britannica to Wikipedia

Filed under: Books, Business, Media, Technology — Nicholas @ 03:00

In the latest SHuSH newsletter, Ken Whyte recounts the decline and fall of the greatest of the print encyclopedias:

I remembered all this while reading Simon Garfield’s wonderful new book, All the Knowledge in the World: The Extraordinary History of the Encyclopedia. It’s an entertaining history of efforts to capture all that we know between covers, starting two thousand years ago with Pliny the Elder.

The star of Garfield’s show, naturally, is Encyclopedia Britannica, which dominated the field through the nineteenth and twentieth centuries. By the time of its fifteenth edition in 1989, the continuously revised Britannica was comprehensive, reliable, scholarly, and readable, with 43 million words and 25,000 illustrations on a half million topics published over 32,640 pages in thirty-two beautifully designed Morocco-leather-bound volumes. It was the greatest encyclopedia ever published and probably the greatest reference tool to that time. It was sold door-to-door in the US by a sales force of 5,000.

Just as the glorious fifteenth edition was going to press, Bill Gates tried to buy Encyclopedia Britannica. Not a set — the whole company. He didn’t want to go into the reference book business. He believed that the availability of a CD-ROM encyclopedia would encourage people to adopt Microsoft’s Windows operating system. The Britannica people told Gates to get stuffed. They were revolted by the thought of their masterpiece reduced to an inexpensive plastic bolt-on to a larger piece of software for gimmicky home computers.

Like the executives at Blockbuster, the executives at Britannica eventually recognized the threat of digital technology but couldn’t see their way to abandoning their old business model and their old production standards and the reliable profits that came with large sets of big books. CD-ROMs seemed to them like a child’s toy.

Even as more of life moved online and the company’s prospects for growth dwindled, the Britannica executives could still not get their heads around abandoning the past and favoring a digital marketplace. They figured that their time-honored strategy of guilting parents into buying a shelf of books in service of their kids’ education would survive the digital challenge, not recognizing that parents would soon be assuaging their guilt by buying personal computers for their kids.

By the time Britannica brought out an overly expensive and not-very-good CD-ROM version of its encyclopedia in 1994, Gates had launched Encarta based on the much inferior Funk & Wagnalls. It might not have been the equal of the printed Britannica, but with its ease of use and storage, its much lower price point, and its many photos and videos of the Apollo moon landing and spuming whales, Encarta made a splash. It was selling a million copies a year in its third year of production — a number that no previous encyclopedia had come close to matching.

As it turned out, Britannica’s last profitable year was 1990, when it sold 117,000 bound sets for $650 million and a profit of $40 million. With the launch of Encarta, its annual sales were reduced to 50,000 sets and it was laying off masses of employees.

Encarta’s own life was relatively short. It closed in 2009, at which point it was selling for a mere $22.95. The world now belonged to Wikipedia.

QotD: Developing “multi-disciplinary teams”

Filed under: Bureaucracy, Business, Education — Nicholas @ 01:00

I will mention just one of the courses on offer, the “building” of multi-disciplinary teams, so called. I have some experience of multi-disciplinary teams once they have been “built”, or should I say “assembled”, “agglomerated” or “accumulated”. More often than not, in my experience, they are not so much multi-disciplinary as undisciplined. Lacking a clear structure of overall authority, and therefore of responsibility, they lead to endless disputes as to who is to do what, as well as the grossest neglect of the ostensible aims of the “team”.

The power struggles are interminable and insoluble, for no one is truly in charge and any instructions are regarded as an infringement of or attack upon the idea of equality of disciplines and equality within disciplines. The pretence that the most junior is equal to the most senior means that supervision scarcely happens, or only retrospectively, after a disaster, when the most junior person who can plausibly be blamed is singled out.

The inevitable squabbles that result lead to accusations of bullying, usually defined in purely subjective terms: you are bullied if you feel you are (in the absence of a requirement of objective correlates of feeling, thought is, of course, quite unnecessary and probably best avoided). Such accusations can result in a Kafka-esque procedure lasting months and occupying days, weeks and months of labour-time.

Meanwhile, neglect of the real work is ascribed to a shortage of “resources” and the object of the team’s attentions, that is to say members of the public, are offered perfunctory services, for example never seeing the same member of the team twice. Between holidays, team meetings and courses on how to make the team function better, there is no time left for the elementary compassion of consistency.

Every public enquiry into every disaster that comes within the remit of the services set up to ameliorate the social pathology brought about by years of social engineering finds the same thing: lack of communication between the various parts of the multi-disciplinary teams. Time, surely, for a course on Communication Skills.

Theodore Dalrymple, “Workshops and why you must avoid them”, The Social Affairs Unit, 2009-11-18.

April 22, 2023

The Big Four

Filed under: Britain, Business, Government, History, Railways — Nicholas @ 02:00

Jago Hazzard
Published 1 Jan 2023

It’s 100 years since the Grouping – what happened, why and how?

April 20, 2023

It’s not your imagination, you really did just hear that song again … and again … and again

Filed under: Business, Media — Nicholas @ 05:00

Ted Gioia on some sort of scam-like activity going down on Spotify and other large web platforms:

Adam Faze kept hearing the same song on Spotify over and over again.

Such things aren’t unusual. Hit songs get played repeatedly—although this one seemed more annoying than most.

But in this case, something even more bizarre was happening.

When Adam looked to see the name of the song, it was always different. The titles were a wild assortment—almost as if a random word generator had been used to pick them:

  • “Trumpet Bublefig”
  • “The Proud Dewdrop Amulet”
  • “Thorncutter”
  • “Viper Beelzebub”
  • “Whomping Clover”
  • Etc. etc.

But the music was always the same.

Even stranger, the artist was also different in each instance. And if you clicked on the writer credits, those were all different too.

How could the same track be attributed to dozens of different musicians? How could the same song be written by dozens of different composers?

Adam started compiling a playlist, and adding each new iteration of this song when he found it. When he got to 49 versions, he shared the playlist on social media. “I’ve officially stumbled upon the weirdest thing I’ve ever seen”, he announced.

A Twitter user, alerted by this, quickly discovered another 10 iterations of the same song. This banal tune was everywhere. The use of multiple aliases made it difficult to gauge the full extent of the deception, but Spotify was pushing this track so aggressively that it was impossible to hide the charade they were playing.

When asked how it was possible to find so many examples, Adam replied: “I’m completely serious when I say it was starting to be every other song after a while.”

The song itself is just 53 seconds. And you’re glad when it’s over, because this tune is a loser — almost a Frank-Ocean-at-Coachella level of bad. Even call centers have better taste in their on-hold music.

April 14, 2023

Twists and turns in the “Twitter Files” narrative

Filed under: Business, Government, Media, Politics, Technology, USA — Nicholas @ 03:00

Matt Taibbi recounts how he got involved in the “Twitter Files” in the first place, drawn in by the hysterical and hypocritical responses of so many mainstream media outlets, and follows the story up to its most recent twist, as Twitter owner Elon Musk burns off so much of the credit he earned by exposing the information:

I was amazed at this story’s coverage. From the Guardian last November: “Elon Musk’s Twitter is fast proving that free speech at all costs is a dangerous fantasy.” From the Washington Post: “Musk’s ‘free speech’ agenda dismantles safety work at Twitter, insiders say.” The Post story was about the “troubling” decision to re-instate the Babylon Bee, and numerous stories like it implied the world would end if this “‘free speech’ agenda” was imposed.

I didn’t have to know any of the particulars of the intramural Twitter dispute to think anyone who wanted to censor the Babylon Bee was crazy. To paraphrase Kurt Vonnegut, going to war against a satire site was like dressing up in a suit of armor to attack a hot fudge sundae. This was an obvious moral panic and the very real consternation at papers like the Washington Post and sites like Slate over these issues seemed to offer the new owners of Twitter a huge opening. With critics this obnoxious, even a step in the direction of free speech values would likely win back audiences that saw the platform as a humorless garrison of authoritarian attitudes.

This was the context under which I met Musk and the circle of adjutants who would become the go-betweens delivering the material that came to be known as the Twitter Files. I would have accepted such an invitation from Hannibal Lecter, but I actually liked Musk. His distaste for the blue-check thought police who’d spent more than a half-year working themselves into hysterics at the thought of him buying Twitter — which had become the private playground of entitled mainstream journalists — appeared rooted in more than just personal animus. He talked about wanting to restore transparency, but also seemed to think his purchase was funny, which I also did (spending $44 billion with a laugh as even a partial motive was hard not to admire).

Moreover the decision to release the company’s dirty laundry for the world to see was a potentially historic act. To this day I think he did something incredibly important by opening up these communications for the public.

Taibbi and the other Twitter File journalists were, of course, damned by the majority of the establishment media outlets and accused of every variant of mopery, dopery, and gross malfeasance by the blue check myrmidons. Some of that must have been anticipated, but a lot of it seems to have surprised even Taibbi and company for its blatant hypocrisy and incandescent rage.

But all was not well between the Twitter Files team and the new owner of Twitter:

We were never on the same side as Musk exactly, but there was a clear confluence of interests rooted in the fact that the same institutional villains who wanted to suppress the info in the Files also wanted to bankrupt Musk. That’s what makes the developments of the last week so disappointing. There was a natural opening to push back on the worst actors with significant public support if Musk could hold it together and at least look like he was delivering on the implied promise to return Twitter to its “free speech wing of the free speech party” roots. Instead, he stepped into another optics Punji Trap, censoring the same Twitter Files reports that initially made him a transparency folk hero.

Even more bizarre, the triggering incident revolved around Substack, a relatively small company that’s nonetheless one of the few oases of independent media and free speech left in America. In my wildest imagination I couldn’t have scripted these developments, especially my own very involuntary role.

I first found out there was a problem between Twitter and Substack early last Friday, in the morning hours just after imploding under Mehdi Hasan’s Andrey Vyshinsky Jr. act on MSNBC. As that joyous experience included scenes of me refusing on camera to perform on-demand ritual criticism of Elon Musk, I first thought I was being pranked by news of Substack URLs being suppressed by him. “No way,” I thought, but other Substack writers insisted it was true: their articles were indeed being labeled, and likes and retweets of Substack pages were being prohibited.

April 12, 2023

QotD: Karen

Back in March, I was certain this whole thing [the pandemic] would blow over in a matter of weeks. It’s a Karen-driven phenomenon, I argued, but unlike everything else they do, this time Karen’s going to have to shoulder the burden herself. She’ll have fun berating the manager of the local Starbucks for not closing down … until she realizes there’s no place to get a half-caff, triple-foam, venti soy latte frappuccino. Nor is there any place to dump her self-propelled lifestyle accessories kids while she gets exalted at hot yoga and the nail salon, now that school’s out. Give her a week without Starbucks, I said, locked in her house with Kayden, Brayden, Jayden, and Khaleesi, and she’ll demand we never mention the word “flu” again.

In other words, I misunderstood the essence of Karen. Karen is — first, foremost, and always — a victim. I of all people should’ve known better, because I was surrounded by Karens all the time in my personal and professional life. I’ve mentioned this story before, but bear with a quick repeat: At one of my first teaching gigs, at the big directional tech that makes up a lot of “Flyover State”, the department’s women got it into their vapid little heads that they — women — were being systematically excluded from positions of power. The fact that the department chair was a woman, and in fact the whole department, emeritus through first year grad student, was something like 65% female should’ve been their first clue, but nevertheless, they persisted. They got together a blue-ribbon commission, as one does, and studied the shit out of the problem. The much-ballyhooed report revealed …

… that all the positions of authority in the department, every blessed one, were held by a female. At which point, without missing a single fucking beat, they started complaining that being forced to hold all these positions of authority was keeping them from making adequate career progress.

I shit you not.

That’s Karen, my friends.

Severian, “The Civil War That Wasn’t”, Rotten Chestnuts, 2020-09-09.

April 11, 2023

The end of single-sex spaces began in the 1970s, at least for men

Filed under: Business, Government, Law, Media, Politics, Sports, USA — Nicholas @ 03:00

Janice Fiamengo points out that the initial loss of single-sex spaces began a long time ago and for what — at the time — seemed sensible and egalitarian reasons:

Robin Herman of the New York Times was one of the first two female reporters ever allowed into NHL dressing rooms, starting with the 1975 NHL All-Star Game in Montreal.

There has been a good deal of talk lately about women’s spaces being invaded by biologically male persons identifying as women. Some women’s campaigners claim that the trans phenomenon constitutes an attack on womanhood itself, an attempt to “erase” women and replace them with men who perform womanhood. Some even call it a new form of patriarchy.

But well before women had their single-sex spaces threatened, something similar had already happened to men. Beginning in the 1970s, men’s spaces were usurped, their maleness was denigrated, and policies and laws forced changes in male behavior that turned many workplaces into feminized fiefdoms in which men held their jobs only so long as women allowed them to. The very idea of an exclusively male workspace or club — especially if it was a space for socializing (not so much if it was a sewer, oil field, or shop floor in which men did unpleasant, dangerous work) — came to be seen as dangerous. In light of the recent furor over single-sex spaces for women, it is useful to consider the source of some men’s justifiable apathy and resentment.

At my new academic job in the late 1990s, a woman who had been the first female historian hired into her department used to tell a story she’d had passed on to her from a male colleague. After the decision had been made to hire her, one of the historians said to another somewhat dolefully, “I guess that’s the end of our meetings in the urinal.” The joke ruefully acknowledged, and good-naturedly accepted, the end of their all-male work environment.

Though this woman didn’t have any trouble with her male colleagues, who welcomed her civilly, she told the story with an edge of contempt. Even thoroughly modern men, the story suggested, held a foolish nostalgia for pre-feminist days.

But was it foolish — or did the men recognize something real?

No one thought seriously, then, about the disappearance of men’s single-sex spaces. The idea that men and boys need places where they can be with other men (defended, for example, in Jack Donovan’s The Way of Men) would have been cause, amongst the women I knew, for scornful laughter. In 2018, anti-male assumptions had become so deeply entrenched that the female author of a Guardian article titled “Men-only clubs and menace: how the establishment maintains male power” simply could not believe that any decent man could legitimately seek out male-only company.

Under the circumstances of mixed groups of reporters crowding into team locker rooms after games, it’s rather surprising how few “towel malfunction” incidents have been reported.

April 6, 2023

Japan is weird, example MCMLXIII

Filed under: Books, Bureaucracy, Business, Government, History, Japan — Nicholas @ 03:00

John Psmith reviews MITI and the Japanese Miracle by Chalmers Johnson:

I’ve been interested in East Asian economic planning bureaucracies ever since reading Joe Studwell’s How Asia Works (briefly glossed in my review of Flying Blind). But even among those elite organizations, Japan’s Ministry of International Trade and Industry (MITI) stands out. For starters, Japanese people watch soap operas about the lives of the bureaucrats, and they’re apparently really popular! Not just TV dramas; huge numbers of popular paperback novels are churned out about the men (almost entirely men) who decide what the optimal level of steel production for next year will be. As I understand it, these books are mostly not about economics, and not even about savage interoffice warfare and intraoffice politics, but rather focus on the bureaucrats themselves and their dashing conduct, quick wit, and passionate romances … How did this happen?

It all becomes clearer when you learn that when the Meiji period got rolling, Japan’s rulers had a problem: namely, a vast, unruly army of now-unemployed warrior aristocrats. Samurai demobilization was the hot political problem of the 1870s, and the solution was, well … in many cases it was to give the ex-samurai a sinecure as an economic planning bureaucrat. Since positions in the bureaucracy were often quasi-hereditary, what this means is that in some sense the samurai never really went away, they just hung up their swords — frequently literally hung them up on the walls of their offices — and started attacking the problem of optimal industrial allocation with all the focus and fury that they’d once unleashed on each other. According to Johnson, to this day the internal jargon of many Japanese government agencies is clearly and directly descended from the dialects and battle-codes of the samurai clans that seeded them.

This book is about one such organization, MITI, whose responsibilities originally were limited to wartime rationing and grew to encompass, depending who you ask, the entire functioning of the Japanese government. Because this is the buried lede and the true subject of this book: you thought you were here to read about development economics and a successful implementation of the ideas of Friedrich List, but you’re actually here to read about how the entire modern Japanese political system is a sham. This suggestion is less outrageous than it may sound at first blush. By this point most are familiar with the concept of “managed democracy,” wherein there are notionally competitive popular elections, culminating in the selection of a prime minister or president who’s notionally in charge, but in reality some other locus of power secretly runs things behind the scenes.

There are many flavors of managed democracy. The classic one is the “single-party democracy”, which arises when for whatever reason an electoral constituency becomes uncompetitive and returns the same party to power again and again. Traditional democratic theory holds that in this situation the party will split, or a new party will form which triangulates the electorate in just such a way that the elections are competitive again. But sometimes the dominant party is disciplined enough to prevent schisms and to crush potential rivals before they get started. The key insight is that there’s a natural tipping-point where anybody seeking political change will get a better return from working inside the party than from challenging it. This leads to an interesting situation where political competition remains, but moves up a level in abstraction. Now the only contests that matter are the ones between rival factions of party insiders, or powerful interest groups within the party. The system is still competitive, but it is no longer democratic. This story ought to be familiar to inhabitants of Russia, South Africa, or California.

The trouble with single-party democracies is that it’s pretty clear to everybody what’s going on. Yes, there are still elections happening, there may even be fair elections happening, and inevitably there are journalists who will point to those elections as evidence of the totally-democratic nature of the regime, but nobody is really fooled. The single-party state has a PR problem, and one solution to it is a more postmodern form of managed democracy, the “surface democracy”.

Surface democracies are wildly, raucously competitive. Two or more parties wage an all-out cinematic slugfest over hot-button issues with big, beautiful ratings. There may be a kaleidoscopic cast of quixotic minor parties with unusual obsessions filling the role of comic relief, usually only lasting for a season or two of the hit show Democracy. The spectacle is gripping, everybody is awed by how high the stakes are and agonizes over how to cast their precious vote. Meanwhile, in a bland gray building far away from the action, all of the real decisions are being made by some entirely separate organ of government that rolls onwards largely unaffected by the show.

March 29, 2023

The Grauniad something something glass houses something something throwing stones

Filed under: Britain, Business, History, Media, USA — Nicholas @ 05:00

In UnHerd, Ashley Rindsberg recounts the details we know so far about the Guardian’s embarrassing historical project to find out about the newspaper’s links to the slave trade:

The Guardian prides itself on being one of the most Left-leaning and anti-racist news outlets in the English-speaking world. So imagine its embarrassment when, last month, a number of black podcast producers researching the paper’s historic ties to slavery abruptly resigned, alleging they had been victims of “institutional racism”, “editorial whiteness”, “microaggressions, colourism, bullying, passive-aggressive and obstructive management styles”. All of this might smack of progressive excess, but, in reality, it merely reflects an institution incuriously at odds with itself.

Questions about The Guardian‘s ties to slavery have been circulating since 2020, when, amid the media’s collective spasm of racial conscience following the murder of George Floyd, the Scott Trust announced it would launch an investigation into its history. “We in the UK need to begin a national debate on reparations for slavery, a crime which heralded the age of capitalism and provided the basis for racism that continues to endanger black life globally,” journalist Amandla Thomas-Johnson wrote in a June 2020 Guardian opinion piece about the toppling of a statue of 17th-century British slaver Edward Colston. A month later, the Scott Trust committed to determining whether the founder of the paper, John Edward Taylor, had profited from slavery. “We have seen no evidence that Taylor was a slave owner, nor involved in any direct way in the slave trade,” the chairman of the Scott Trust, Alex Graham, told Guardian staff by email at the time. “But were such evidence to exist, we would want to be open about it.” (Notably, Graham, in using the terms “slave owner” and “direct way”, set a very specific and very high bar for what would be considered information worthy of disclosure.)

The problem is that the results of the investigation, conducted by historian Sheryllynne Haggerty, an “expert in the history of the transatlantic slave trade”, have never been made public. When contacted with questions about what happened to the promised report, Haggerty referred all inquiries to The Guardian‘s PR, which has remained silent on the matter. (The Guardian was asked for comment and we were given the stock PR response The Guardian gave following the podcaster’s letter.) But what we do know is this: according to Guardian lore, a business tycoon named John Edward Taylor was inspired to agitate for change after witnessing the 1819 Peterloo Massacre, when over a dozen people were killed in Manchester by government forces as they protested for parliamentary representation. Two years later, Taylor, a young cotton merchant, with the backing of a group of local reformers known as the Little Circle, founded the paper.

“Since 1821 the mission of The Guardian has been to use clarity and imagination to build hope,” The Guardian‘s current editor, Katharine Viner, proudly proclaims on the “About us” page of the paper’s website. Part of this founding myth concerns one of the defining social and political issues of the day, slavery, which the Little Circle members, including Taylor, vigorously opposed as a moral affront. “The Guardian had always hated slavery,” Martin Kettle, an associate editor, wrote in a 2011 apologia on why during the Civil War the paper had vociferously condemned the North while equivocating on the South.

That may be true, but it also presents an incomplete picture. The Manchester Guardian, as the paper was then known, was founded by cotton merchants, including Taylor, who were able to pool the money needed to launch the paper by drawing on their respective fortunes. While none of these men, many of whom were Unitarian Christians, is likely to have engaged in slavery, they didn’t just benefit from but depended upon the global slave trade that provided virtually all of the cotton that filled their mills. As Sarah Parker Remond, an African American abolitionist, said upon visiting Manchester in 1859: “When I walk through the streets of Manchester and meet load after load of cotton, I think of those 80,000 cotton plantations on which was grown the $125 million worth of cotton which supply your market, and I remember that not one cent of that money ever reached the hands of the labourers.”

March 21, 2023

The musical anomaly that was 2022 – when classical music suddenly became much more popular

Filed under: Britain, Business, Media, USA — Nicholas @ 05:00

Ted Gioia looks at some surprising numbers for the music industry showing that of all genres, classical music suddenly became much more popular in 2022:

Last year, I went viral with an article about the rising popularity of old music. But I focused on old rock songs. Many of these songs are 40 or 50 years old. And in the world of pop culture, that’s like ancient history.

But if you really want old music, you can dig back 200 or 300 years — or even more, if you want. But does anybody really do that?

Conventional wisdom tells us that only around 1% of the public cares about classical music. And it doesn’t change much from year to year.

For proof, just take a look at this chart:

If you love concerts at the philharmonic, you read these figures with much weeping and gnashing of teeth. If classical music were any smaller, it would be a rounding error. Or — even sadder — it would be like jazz.

But that data only covers the period up to 2021. And 2022 was different.

In fact, it was remarkably different.

Over the last 12 months, I’ve started to see surprising signs of a larger audience turning to classical music. Last year, I wrote about the amazing saga of WDAV, the first classical music radio station in US history to take the top spot in its city.

I analyzed the numbers, and tried to get to the bottom of this unexpected success story. At the time, I wrote:

    Women are the key drivers here. The station boasts a double-digit share in the female 35-44 category. But this probably is tilted heavily toward mothers, at least if we factor in the next bit of evidence — which reveals that WDAV has a mind-boggling 38% share among young children.

But then a few weeks later, this research report was issued:

I need to point out that respondents were allowed to mention multiple genres — but even given that loophole, who would expect classical music to rank ahead of country music, hip-hop, or folk?

This can’t be true. The numbers must be wrong. Or, maybe, people are lying to pollsters.

But then a survey of holiday listening trends in the UK revealed the unprecedented popularity of orchestral music — especially among younger listeners.

