Quotulatiousness

June 13, 2024

Debunking the “miraculous” Marshall Plan

If you’ve read anything about the state of Europe in the aftermath of the Second World War, you’ll undoubtedly have heard of the way the Marshall Plan did wonders to get (western) Germany and the other battle-devastated nations back on their feet economically. At FEE, Christian Monson suggests that you’ve been provided with a very rosy scenario that doesn’t actually accord with the facts:

Konrad Adenauer in conversation with Ludwig Erhard.
KAS-ACDP/Peter Bouserath, CC-BY-SA 3.0 DE via Wikimedia Commons.

Unfortunately, the ubiquity of the myth that the Marshall Plan rebuilt Germany is proof that state-controlled education favors propaganda over economic literacy. Despite the fact that most modern historians don’t give the Marshall Plan much credit at all for rebuilding Germany and attribute to it less than 5 percent of Germany’s national income during its implementation, standard history textbooks still place it at the forefront of the discussion about post-war reconstruction.

Consider this section from McDougal Littell’s World History (p. 968), the textbook I was given in high school:

    This assistance program, called the Marshall Plan, would provide food, machinery, and other materials to rebuild Western Europe. As Congress debated the $12.5 billion program in 1948, the Communists seized power in Czechoslovakia. Congress immediately voted approval. The plan was a spectacular success.

Of course, the textbook makes no mention of the actual cause of the Wirtschaftswunder: sound economic policy. That’s because, for the state, the Marshall Plan makes great statist mythology.

Not only is it frequently brought up to justify the United States getting involved in foreign conflicts, but it also lends support to central planning. Just look at the economic miracle the government was able to create with easy credit, they say.

And of course, admitting that the billions of dollars pumped into Germany after WWII accomplished next to nothing, especially when compared to something as simple as sound money, would be tantamount to admitting that the government spends most of its time making itself needed when it isn’t and thereby doing little besides getting in the way.

The Inconvenient Truth of Currency Reform

You are unlikely to find the real cause of the Wirtschaftswunder mentioned in any high school history textbook, but here is what it was. In 1948, the economist and future Chancellor of West Germany Ludwig Erhard was chosen by the occupational Bizonal Economic Council as its Director of Economics. He went on to liberalize the West German economy with a number of good policies, the most important being currency reform.

The currency in Germany immediately after WWII was still the Reichsmark, and both the Nazis and then the occupying Soviet authorities had increased the amount in circulation significantly. As a result, by 1948 the Reichsmark was so worthless that people had turned to using cigarettes and coffee as money.

To give people a true store of value so that they could calculate economic costs accurately, assess risk and invest in the future, Erhard created the Deutsche Mark, West Germany’s new currency. Like ripping off a bandaid, he decreased the money supply by 93 percent overnight.

It’s also worth noting that while Erhard, following his school of Ordoliberalism, did form a central bank, it was at least designed to be independent of the government and followed a hard-money policy (preserving a stable amount of money) throughout the Wirtschaftswunder. In fact, the original Bank deutscher Länder was rather limited in scope until it was reorganized as the considerably more centralized Bundesbank in 1957, incidentally when Germany’s economic miracle began to lose steam.

Other notable liberal policies instituted by Erhard included removing all price controls and lowering taxes from the Nazis’ absurd 85 percent to 18 percent. The American occupational authorities opposed these reforms, but Erhard went through with them anyway. This liberalization had an immediate effect. The black market disappeared almost overnight, and in one year, industrial output almost doubled.

Perhaps most poignantly, unemployment dropped from more than 10 percent to around 1 percent by the end of the 1950s. Normally the government tries to justify currency manipulation as a means to eliminate unemployment, but the Wirtschaftswunder is evidence that sound money does the job far better.

June 12, 2024

England “is a parochial country doomed to nostalgia and irrelevance by its unwavering belief in a series of grandiose historical myths”

Filed under: Books, Britain, History — Nicholas @ 03:00

In The Critic, Fred Skulthorp reviews England: Seven Myths That Changed a Country and How to Set Them Straight by Tom Baldwin and Marc Stears:

Should Keir Starmer find himself in Downing Street after the next election, he will have little to play with in terms of zeitgeist. Unlike Blair, there is no Cool Britannia to tap into. There are few unifying cultural figures and despair seems the only discernible national mood. Starmer has only the recent success of the Lionesses and an oft-quoted anecdote about his dad being a toolmaker to inspire the nation.

But there is one nation-renewing narrative on the centre-left that has emerged since 2016. England, unlike the rest of Europe, is a parochial country doomed to nostalgia and irrelevance by its unwavering belief in a series of grandiose historical myths. The real 21st century England is being held back by people singing “Rule Britannia” at the Last Night of the Proms and the fantasies of Daniel Hannan.

In England: Seven Myths That Changed a Country, Starmer’s biographer Tom Baldwin and former Labour Party speechwriter Marc Stears embark on a journey to set us free from such falsehoods. In Hull we find that William Wilberforce has given the nation an unqualified moral superiority. In Plymouth we discover that Sir Francis Drake is the inspiration for “the aggressively macho nationalist idea” that Brexit can “restore the country’s global reach”. In Runnymede we find that Magna Carta has given rise to the idea of an “Anglo-Saxon birthright sealed with the blood of dead kings”.

Whether anyone actually believes these things is beside the point. These national myths, the authors insist, can account for everything from the popularity of Michael Portillo’s railway documentaries to the 2016 vote to leave the European Union.

Journeys in search of England tend to lend themselves more to projection than discovery. This book presents the worst of that sin. Reading Seven Myths is a bit like being stuck on a very long car journey and regretting having asked the driver: “Whatever happened to the legacy of the London 2012 Olympics?”

Unsurprisingly, much of what follows spouts repackaged Blairite clichés about football, curry and the NHS. Lingering behind their polemic is the tedious psychodrama of the Corbyn years and Labour infighting about how the party should allow itself to feel patriotic. This book is as much about two middle-aged Starmerites trying to work out what it is acceptable to like between their party, the electorate and the limited scope of their inquiry into the England of the 2020s.

And the scope is indeed limited. Reportage and interview, where the book is allowed to breathe away from the grating polemic, are cramped, incomplete and tokenistic. The most memorable soundbite is from Nigel Farage, who tells them — perhaps half-mockingly — that his favourite place in England is London: “It gets faster and more trendy every year that comes”.

Interactions with the public are even more painful. “What do you think of Enoch Powell?” one “brown-skinned man” is asked in Wolverhampton. A refugee from Hong Kong is asked “Does Magna Carta mean anything to you?” Unsurprisingly these conversations don’t return much, but they pave the way for the eye-rollingly mundane conclusion that when it comes to English identity there is “complexity everywhere” (as if anyone’s sense of national identity were ever simple).

Still from the 1964 movie Zulu with Michael Caine as Lieutenant Gonville Bromhead, 24th Regiment of Foot.

For a book that spends nearly 400 pages debunking myths and trying to correct the course of English history, its sources require a lot of reading between the lines. Many can be narrowed down to soundbites from a few politicians and forgotten op-eds in the Telegraph (one quoted is dated as far back as 2004).

All this generates endless false dichotomies, strawmen and reductive statements to account for a grander myth loosely referred to as “English exceptionalism”. At times, attempts to source these myths in the body politic come across as comically desperate. Zulu (1964) becomes a film which kept alive the “British Empire myth” and which “the current generation of politicians would have watched growing up”.

Ironically, the writing itself is laced with the sins of myth-making: boring, trite, incoherent, lazy and unfunny. At times it veers into self-parody. In Runnymede, the “high iron gates” of a housing development near the Magna Carta memorial serve to remind us that national identity myths can “make others feel excluded”. In Plymouth, Greta Thunberg is placed in a pantheon alongside Darwin and Drake who both set sail from the Devon port: “None of these dead Englishmen have as much relevance right now as [the voyage] undertaken from the same city by a Swedish Girl”.

June 2, 2024

QotD: The Spartans do not deserve the admiration of the modern US military

Filed under: Books, Europe, History, Media, Military, Quotations, USA — Nicholas @ 01:00

The Athenian historian Thucydides once remarked that Sparta was so lacking in impressive temples or monuments that future generations who found the place deserted would struggle to believe it had ever been a great power. But even without physical monuments, the memory of Sparta is very much alive in the modern United States. In popular culture, Spartans star in film and feature as the protagonists of several of the largest video game franchises. The Spartan brand is used to promote obstacle races, fitness equipment, and firearms. Sparta has also become a political rallying cry, including by members of the extreme right who stormed the U.S. Capitol on Jan. 6, 2021. Sparta is gone, but the glorification of Sparta — Spartaganda, as it were — is alive and well.

Even more concerning is the U.S. military’s love of all things Spartan. The U.S. Army, of course, has a Spartan Brigade (Motto: “Sparta Lives”) as well as a Task Force Spartan and Spartan Warrior exercises, while the Marine Corps conducts Spartan Trident littoral exercises — an odd choice given that the Spartans were famously very poor at littoral operations. Beyond this sort of official nomenclature, unofficial media regularly invites comparisons between U.S. service personnel and the Spartans as well.

Much of this tendency to imagine U.S. soldiers as Spartan warriors comes from Steven Pressfield’s historical fiction novel Gates of Fire, still regularly assigned in military reading lists. The book presents the Spartans as superior warriors from an ultra-militarized society bravely defending freedom (against an ethnically foreign “other”, a feature drawn out more explicitly in the comic and later film 300). Sparta in this vision is a radically egalitarian society predicated on the cultivation of manly martial virtues. Yet this image of Sparta is almost entirely wrong. Spartan society was singularly unworthy of emulation or praise, especially in a democratic society.

To start with, the Spartan reputation for military excellence turns out to be, on closer inspection, mostly a mirage. Despite Sparta’s reputation for superior fighting, Spartan armies were as likely to lose battles as to win them, especially against peer opponents such as other Greek city-states. Sparta defeated Athens in the Peloponnesian War — but only by accepting Persian money to do it, reopening the door to Persian influence in the Aegean, which Greek victories at Plataea and Salamis nearly a century earlier had closed. Famous Spartan victories at Plataea and Mantinea were matched by consequential defeats at Pylos, Arginusae, and ultimately Leuctra. That last defeat at Leuctra, delivered by Thebes a mere 33 years after Sparta’s triumph over Athens, broke the back of Spartan power permanently, reducing Sparta to the status of a second-class power from which it never recovered.

Bret Devereaux, “Spartans Were Losers”, Foreign Policy, 2023-07-22.

June 1, 2024

So who did write Shakespeare’s plays?

Filed under: Books, Britain, History, Media — Nicholas @ 05:00

Mere mortals might be tempted to answer “Well, Shakespeare, duh!”, but to the dedicated conspiracist, the obvious is never the right answer:

This was long thought to be the only portrait of William Shakespeare that had any claim to have been painted from life, until another possible life portrait, the Cobbe portrait, was revealed in 2009. The portrait is known as the “Chandos portrait” after a previous owner, James Brydges, 1st Duke of Chandos. It was the first portrait to be acquired by the National Portrait Gallery in 1856. It may be by a painter called John Taylor, who was an important member of the Painter-Stainers’ Company.
National Portrait Gallery image via Wikimedia Commons.

Was Shakespeare a fraud? The American writer Jodi Picoult seems to think so. Her latest novel By Any Other Name is based on the premise that William Shakespeare was not the real author of his plays. Specifically, in her story, the poet Emilia Lanier (née Bassano) pays Shakespeare for the use of his name so that she might see her work staged at a time when female playwrights were extremely rare.

The theory that Shakespeare was a woman isn’t original to Picoult. As with all conspiracy theories relating to the bard, the “true” Shakespeare is identified as coming from the upper echelons of society (although not an aristocrat, Lanier was part of the minor gentry thanks to her father’s appointment as court musician to Queen Elizabeth I). Those known as “anti-Stratfordians” – i.e., those who believe that the man from Stratford-upon-Avon called William Shakespeare did not write the plays attributed to him – invariably favour candidates who had direct connections to the court. The general feeling seems to be that a middle-class lad from a remote country town could not possibly have created such compelling depictions of lords, ladies, kings and queens.

[…]

The notion that the actor Shakespeare could have hired out his identity to Lanier, or anyone else for that matter, makes no sense if one considers the collaborative nature of the theatrical medium. Shakespeare was the house playwright for the Lord Chamberlain’s Men (the company that became the King’s Men on the accession of James I). His job was to oversee productions, to write on the hoof, to adapt existing scripts in the process of rehearsal. (This is probably why his later plays such as Henry VIII contain so many stage directions; at this point he was almost certainly residing in Stratford-upon-Avon, and so was not available to provide the necessary detail in person.) It was never simply a matter of Shakespeare dropping off his latest script at The Globe and quickly scarpering. If he was being fed the lines, it is implausible that nobody in the company would have noticed.

[…]

The theory that Shakespeare’s contemporaries – fans and critics alike – would all collude in an elaborate deception requires a full explanation. The burden of proof is very much on the anti-Stratfordians, but proof doesn’t appear to be their priority. They seem to think they know more about Shakespeare than those who actually lived and worked with him. It’s oddly hubristic.

All of this nonsense began with the Baconian theory propounded by James Wilmot in 1785 and has never gone away. The candidates are usually university educated and aristocratic: Francis Bacon, Christopher Marlowe, the Earl of Rutland, the Earl of Oxford – even Queen Elizabeth I has been proposed. The anti-Stratfordian position seems to be based on a combination of class snobbery and presentism. They assume that the middle-class son of a glover who did not attend university could not have developed the range of knowledge needed to inform his plays. They forget, or do not know, that the grammar school education of the time would have provided a firm grounding in the classics. Shakespeare would have been steeped in Ovid, Cicero, Plautus, Terence, and much more besides. Let’s not forget that Ben Jonson, the most scholarly of all his contemporaries, didn’t go to university either.

Moreover, the plays make clear that Shakespeare was a voracious reader. The idea that one must have direct experience in order to write about a subject is very much in keeping with the obsessions of our time, particularly the notion of “lived experience” and how writers ought to “stay in their lane”.

As I’ve joked in the past, I believe the theory that Homer didn’t actually write The Iliad and The Odyssey … it was another Greek chap of the same name.

May 19, 2024

Alexander III of Macedon … usually styled “Alexander the Great”

In the most recent post at A Collection of Unmitigated Pedantry, Bret Devereaux considers whether the most famous king of Macedon deserves his historic title:

Alexander the Great
Detail from the Alexander Mosaic in the House of the Faun in Pompeii, attributed to the first century BC, via Wikimedia Commons.

I want to discuss his reign with that title, “the Great” (magnus in Latin or μέγας in Greek) stripped off, as Alexander III rather than merely assuming his greatness. In particular, I want to open the question of whether Alexander was great and, more to the point, if he was, what does that imply about our definitions of greatness?

It is hardly new for Alexander III to be the subject of as much mythology as fact; Alexander’s life was the subject of mythological treatment within living memory. Plutarch (Alex. 46.4) relates an episode where the Greek historian Onesicritus read aloud in the court of Lysimachus – then king of Thrace, but who had been one of Alexander’s somatophylakes (his personal bodyguards, of which there were just seven at a time) – his history of Alexander and in his fourth book reached the apocryphal story of how Alexander met the Queen of the Amazons, Thalestris, at which Lysimachus smiled and asked, “And where was I at the time?” It must have been strange to Lysimachus, who had known Alexander personally, to see his friend and companion become a myth before his eyes.

Then, of course, there are the modern layers of mythology. Alexander is such a well-known figure that it has been, for centuries, the “done thing” to attribute all manner of profound-sounding quotes, sayings and actions to him, functionally none of which are to be found in the ancient sources and most of which, as we’ll see, run quite directly counter to his actual character as a person.

So, much as we set out to de-mystify Cleopatra last year, this year I want to set out – briefly – to de-mystify Alexander III of Macedon. Only once we’ve stripped away the mythology and found the man can we then ask that key question: was Alexander truly great and if so, what does that say not about Alexander, but about our own conceptions of greatness?

Because this post has turned out to run rather longer than I expected, I’m going to split it into two parts. This week, we’re going to look at some of the history of how Alexander has been viewed – the sources for his life but also the trends in the scholarship from the 1800s to the present – along with assessing Alexander as a military commander. Then we’ll come back next week and look at Alexander as an administrator, leader and king.

[…]

Sources

As always, we are at the mercy of our sources for understanding the reign of Alexander III. As noted above, within Alexander’s own lifetime, the scale of his achievements and impacts prompted the emergence of a mythological telling of his life, a collection of stories we refer to collectively now as the Alexander Romance, which is fascinating as an example of narrative and legend working across a wide range of cultures and languages, but is fundamentally useless as a source of information about Alexander’s life.

That said, we also know that several accounts of Alexander’s life and reign were written during his life and immediately afterwards by people who knew him and had witnessed the events. Alexander, for the first part of his campaign, had a court historian, Callisthenes, who wrote a biography of Alexander which survived his reign (Polybius is aware – and highly critical – of it, Polyb. 12. 17-22), though Callisthenes didn’t: he was implicated (perhaps falsely) in a plot against Alexander and imprisoned, where he died, in 327. Unfortunately, Callisthenes’ history doesn’t survive to the present (and Polybius sure thinks Callisthenes was incompetent in describing military matters in any event).

More promising are histories written by Alexander’s close companions – his hetairoi – who served as Alexander’s guards, elite cavalry striking force, officers and council of war during his campaigns. Three of these wrote significant accounts of Alexander’s campaigns: Aristobulus,1 Alexander’s architect and siege engineer, Nearchus, Alexander’s naval commander, and Ptolemy, one of Alexander’s bodyguards and infantry commanders, who would become Ptolemy I Soter, Pharaoh of Egypt. Of these, Aristobulus and Ptolemy’s works were apparently campaign histories covering the life of Alexander, whereas Nearchus wrote instead of his own voyages by sea down the Indus River, the Indian Ocean and the Persian Gulf, which he called the Indike.

And you are now doubtless thinking, “amazing, three contemporary accounts, that’s awesome!” So I hope you will contain your disappointment when I follow with the inevitable punchline: none of these three works survives. We also know that a whole slew of other, less reliable-sounding histories (Plutarch lists works by Cleitarchus, Polycleitus, Onesicritus, Antigenes, Ister, Chares, Anticleides, Philo, two different Philips, Hecataeus, and Duris) do not survive either.

So what do we have?

Fundamentally, our knowledge of Alexander the Great is premised on four primary later works, whose authors wrote when all of these other sources (particularly Ptolemy and Aristobulus) still survived. These four authors are (in order of date): Diodorus Siculus (writing in the first century BC), Quintus Curtius Rufus (mid-first cent. AD), Plutarch (early second century AD) and Arrian (Lucius Flavius Arrianus, writing in the early second century AD). Of these, Diodorus’ work, the Bibliotheca historica, is a “universal history”, which of course means it is a mile wide and only an inch deep, but Book 17, which covers Alexander’s life, is intact and complete. Curtius Rufus’ work survives only incompletely, with substantial gaps in the text, including all of the first two books.

Plutarch’s Life of Alexander survives intact and is the most substantial of his biographies, but it is, like all of his Parallel Lives, relatively brief and also prone to Plutarch’s instinct to bend a story to fit his moralizing aims in writing. Which leaves, somewhat ironically, the last of these main sources, Arrian. Arrian was a Roman citizen of Anatolian extraction who entered the Senate in the 120s and was consul suffectus under Hadrian, probably in 130. He was then a legatus (provincial governor/military commander) in Cappadocia, where Dio reports (69.15.1) that he checked an invasion by the Alani (a Steppe people). Arrian’s history, the Anabasis Alexandrou (usually rendered “Campaigns of Alexander”)2 comes across as a fairly serious, no-nonsense effort to compile the best available sources, written by an experienced military man. Which is not to say Arrian is perfect, but his account is generally regarded (correctly, I’d argue) as the most reliable of the bunch, though any serious scholarship on Alexander relies on collating all four sources and comparing them together.

Despite that awkward source tradition, what we have generally leaves us fairly well informed about Alexander’s actions as king. While we’d certainly prefer to have Ptolemy or Aristobulus, the fact that we have four writers all working from a similar source-base is an advantage, as they take different perspectives. Moreover, a lot of the things Alexander did – founding cities, toppling the Achaemenid Empire, failing in any way to prepare for succession – leave big historical or archaeological traces that are easy enough to track.


    1. This is as good a place as any to make a note about transliteration. Almost every significant character in Alexander’s narrative has a traditional transliteration into English, typically based on how their name would be spelled in Latin. Thus Aristobulus, instead of the more faithful Aristoboulos (for Ἀριστόβουλος). The trend in Alexander scholarship today is, understandably, to prefer more faithful Greek transliterations, thus rendering Parmenion (rather than Parmenio) or Seleukos (rather than Seleucus). I think, in scholarship, this is a good trend, but since this is a public-facing work, I am going to largely stick to the traditional transliterations, because that’s generally how a reader would subsequently look up these figures.

    2. An ἀνάβασις is a “journey up-country”, but what Arrian is invoking here is Xenophon’s account of his own campaign with the 10,000, the original Anabasis; Arrian seems to have fashioned himself as a “second Xenophon” in a number of ways.

April 13, 2024

The Legend of the Wiener Schnitzel

Filed under: Europe, Food, History — Nicholas @ 02:00

Tasting History with Max Miller
Published Jan 9, 2024

Variations of wienerschnitzel throughout history and its legendary origin stories, and a recipe for a 19th century version.

Fried breaded veal cutlets served with the traditional lemon wedges and parsley

City/Region: Vienna
Time Period: 1824

Breaded and fried meat has been around for a very long time in many places, but it wasn’t until 1893 that we get the first mention of the word wienerschnitzel. Then in the early 20th century, the Austrian culinary scene decided to champion this term to refer to a veal cutlet that is made into a schnitzel, and restaurants in Vienna began specializing in schnitzel.

This recipe predates the term wienerschnitzel, and unlike modern versions it isn’t dredged in flour first. This makes it so that the breading doesn’t puff away from the meat, but the flavor is rich and delicious, just like I remember from my trip to Vienna. If you don’t like veal or don’t want to use it, you can use pork or chicken. It won’t technically be wienerschnitzel, but nobody’s going to judge you. You can also use another fat instead of the clarified butter, but butter gives the best flavor.

April 12, 2024

When it comes to media coverage of environmental issues “bad news sticks around like honey, while good news dries up like water”

In Spiked, Matt Ridley debunks the attitude — universal among climate activists — that humanity’s mere existence is “bad for the planet”:

A 16 foot high sculpture of a polar bear and cub, afloat on a small iceberg on the River Thames, passes in front of Tower Bridge on 26 January 2009 in London, England.
Spiked

Over the past few years, we have been subject to endless media reports on the devastating impact humanity is having on the global bee population. “Climate change is presenting huge challenges to our bees”, claimed the Irish Times last year. “Where has all the honey gone?”, asked the Guardian earlier this year.

The news from last week may come as a shock to some, then. It turns out that America actually has more bees than ever before, having added a million hives in just five years. The Washington Post, which reported these facts, was certainly surprised given what it calls “two decades of relentless colony-collapse coverage”.

Some of us, however, have been pointing out for more than a decade that the mysterious affliction called “colony collapse disorder”, which caused a blip in honey-bee numbers in the mid-2000s, was always only a temporary phenomenon. Globally, bees are doing better than ever. The trouble is that bad news sticks around like honey, while good news dries up like water.

Honey bees are a domesticated species, so their success depends partly on human incentives. In the case of America, the Texas state government’s decision to reduce property taxes on plots containing bee hives has boosted the popularity of beekeeping. When bees were in trouble, they were seen as a measure of the health of the environment generally. So their recovery can be regarded as a sign of good environmental health.

Why do stories of environmental doom, like this one about collapsing bee colonies, linger in the public consciousness, despite being outdated and wrong? The media are partly to blame. For environmental reporters, bad news is always more enticing than good. It’s more likely to catch the attention of editors and more likely to get clicks from readers. Good news is no news.

So I have a simple rule of thumb to work out when an environmental problem is on the mend: it drops out of the news. (The same is true of countries, by the way. When I was young, Angola and Mozambique were often in the news because they were torn by war; not today, because they are at peace.)

Take whales. In the 1960s, they were the (literal) posterboys of environmental alarm. There were just 5,000 humpback whales in the whole world and they seemed headed for extinction. Today, there are 135,000 humpback whales, which represents a 27-fold increase. For the first time in centuries they sometimes gather in groups of over a hundred. I have even seen them several times myself, which I had assumed as a boy I never would.

Most other whale species are doing almost as well: blue, fin, right, bowhead, sperm, grey, minke – all are increasing steadily in numbers (though certain subpopulations, such as North Atlantic right whales, are still struggling). But the story of whales’ resurgence just doesn’t make the news.

Or take polar bears. Just a few years ago, greens were constantly claiming that they were facing imminent extinction. In 2017, National Geographic published a video of a starving polar bear, with the tagline, “This is what climate change looks like”. It was viewed 2.5 billion times. No climate conference or Greenpeace telly advert was complete without a picture of a sad polar bear on an ice floe. Today, that’s a less common sight, because it is harder and harder to deny that polar bears are less and less rare. Despite heroic efforts by environmentalists to claim otherwise, there is now no hiding the fact that polar-bear numbers have not declined and have probably increased, with some populations having doubled over the past few decades. So much so that some environmentalists and researchers no longer think that polar bears are suitable symbols of man’s threat to the planet.

The refusal of polar-bear numbers to conform to the eco-pessimists’ narrative should not be a surprise. In 2009, Al Gore claimed that the Arctic polar ice cap could disappear in as little as five years. A decade on, that is still nowhere near happening. Besides, polar bears have always taken refuge on land in late summer in regions where the ice does melt, such as Hudson Bay.

Another Arctic species, the walrus, is doing so well now that it sometimes turns up on beaches in Britain. It’s the same story for fur seals, elephant seals and king penguins. A few years ago, I visited South Georgia in the Antarctic and saw thousands upon thousands of all three species, when little over a century ago they would have been very rare there.

These whales, seals, penguins and bears are booming for a very simple reason: we stopped killing them. Their meat could not compete with beef. And, above all, their fur and blubber could not compete with petroleum products. Or to put it another way, fossil fuels saved the whale.

QotD: Prepper fantasy versus prepper reality

Filed under: Books, Gaming, History, Quotations, USA — Nicholas @ 01:00

… note that this is also a bit of a rebuke to the dominant strain of prepper fantasies, such as those I began this review with. Prepper fantasies are most fundamentally fantasies of agency, dreams that in the right crisis the actions you take could actually matter, and that in the wake of that crisis you could return to a Rousseauian condition of autonomous activity freed from the internal conflicts engendered by societal oppression (whether that oppression takes the form of stifling social convention or HRified bureaucratic fiat). It’s obvious how the prepper fantasies relate to the great survival stories like Robinson Crusoe, or to the pioneer dramas of the American Westward expansion. It’s a little less obvious, but just as deeply true, that they’re connected to stories of rogues, rascals, and reavers like those by Robert E. Howard or Bronze Age Pervert. All of these stories, fundamentally, are about how a man freed from external restraint and internal conflict can apply himself to better his condition.

The thing is these stories are totally ahistorical — the best that solitary survivors have ever managed was to survive, none of them have rebuilt civilization. As Jane notes in her review of BAP, the sandal-clad barbarians have generally been subjected to a “tyranny of the cousins” even more intrusive and meticulous than the gynocratic safetyism that Bronze Age Lifestyle offers an imaginative escape from. And as for the pioneers, Tanner Greer notes that:

    Many imagine the great American man of the past as a prototypical rugged individual, neither tamed nor tameable, bestriding the wilderness and dealing out justice in lonesome silence. But this is a false myth. It bears little resemblance to the actual behavior of the American pioneer, nor to the kinds of behaviors and norms that an agentic culture would need to cultivate today. Instead, the primary ideal enshrined and ritualized as the mark of manhood was “publick usefulness”, similar, if not quite identical, to the classical concept of virtus. American civilization was built not by rugged individuals but by rugged communities. Manhood was understood as the leadership of and service to these communities.

It would be too easy to end the review here, with the implication that the prepper identity is a fantasy of radical individualism and like all such fantasies, kinda dumb. But the thing is, the prepper world has by and large absorbed this critique and incorporated it into its theorizing. In contrast to the libertarian fantasies of the 1970s, second-wave prepperism (reformed prepperism?) is constantly talking about community, the importance of having friends you can trust, of cultivating deep social bonds with your neighbors, etc.

What Yu Gun reminds us is that this is still totally ahistorical, but this time in a way that indicts not only the preppers, but also a much broader swathe of our society. A man without a community is unnatural, but so is a community without leadership, hierarchy, and order. The prepper version of community is a vision of freely contracting individuals respecting each others’ autonomy while cooperating because it’s in their best interests. This is also the folk version of community that motivates much of our economic and legal regime. Scratch an American “communitarian”, and underneath it’s just another individualist.

If you hang out on prepper forums, a recurrent mantra is to “practice your preps”, that is to start living on the margin as if the apocalypse had already occurred. The purpose of this is to gain experience in the skills you’ll need after the end, and to work out the kinks in your routine now, while it’s still easy to make adjustments. Originally this meant practicing getting lost in the woods, using and maintaining your weapon of choice, eating some of your food stockpile, or whatever. In second-wave prepperism it means all that, plus a bunch of new stuff like hanging out with your neighbors, attending community barbecues, and whatever else it is that freely contracting individuals like to autonomously do while temporarily occupying the same space.

But for we third-wave preppers, it has to take on a very different meaning. Greer’s essay that I quoted above is mainly about how leadership and service in local-scale organizations served as training for leadership and service in much larger groups aimed at problems with much higher stakes. In other words, they were practicing their preps. One of the great secrets of leadership is that following and leading are actually closely related skills, and that practice at one of them transfers well to the other. This is difficult for we Americans to see, because an aversion to hierarchy is built into our national character, and consequently we operate with impoverished models of what it means to be in a position of authority or of subordination.

Long ago I read an article contrasting Western and Korean massively-multiplayer online role-playing games (MMORPGs). Even if you know nothing about computer games, you probably know that in most of them you are the hero, the chosen one, the child of destiny. Talk about fantasies of agency! MMORPGs thus have a tricky needle to thread — somehow all the thousands and thousands of players need to simultaneously be the chosen one, the child of destiny, etc., etc. And they mostly accomplish this by just rolling with it and asking everybody to suspend disbelief. But this article claimed that Korean MMORPGs are different — when players join these games, they’re randomly assigned a role. A tiny fraction might become kings or generals or children of destiny, with the power to decide the fates of peoples and kingdoms, but most are given a role as ordinary soldiers or porters or blacksmiths, and toil away at their in-game mundane tasks, without much ability to affect anything at all.

We like to imagine that after the bombs fall and the smoke clears we will emerge as the new Yu Gun, apportioning merit and assigning tasks. And perhaps you will indeed be called upon to do that, so you should prepare yourself to step up and do it. That preparation will involve some practice commanding others and some practice obeying others’ commands, because the two are inextricably bound together. But in life as in Korean video games, there isn’t very much room at the top. Far more likely, when the stage of history is set, we will be cast in a supporting role, like the Korean gamer assigned to role-play as a peasant or like Yu’s followers standing in orderly ranks. Let us not turn our noses up at this vocation; the poorly-behaved seldom make history.

John Psmith, “REVIEW: Medieval Chinese Warfare, 300-900 by David A. Graff”, Mr. and Mrs. Psmith’s Bookshelf, 2023-06-05.

April 6, 2024

The Fake (and real) History of Potato Chips

Filed under: Britain, Food, History, USA — Nicholas @ 02:00

Tasting History with Max Miller
Published Jan 2, 2024

The fake and true history of the potato chip and an early 19th century recipe for them. Get the recipe at my new website https://www.tastinghistory.com/ and buy Fake History: 101 Things that Never Happened: https://lnk.to/Xkg1CdFB

February 26, 2024

QotD: Lockdown rebuttal

Filed under: Government, Health, Media, Quotations, USA — Nicholas @ 01:00

First, lockdowns were neither prudent nor essential. It’s not as if government officials considered the collateral damage to be inflicted on the economy, society, and health – not all health problems are caused by covid – by the lockdowns and then rationally concluded that the benefits of locking down outweighed these costs. No. The collateral damages were ignored. As the New York Times’s Joe Nocera and Vanity Fair’s Bethany McLean – authors of the just-released The Big Fail – write, “But there was never any science behind lockdowns – not a single study had ever been undertaken to measure their efficacy in stopping a pandemic. When you got right down to it, lockdowns were little more than a giant experiment.” In no universe is such a policy prudent.

Nor were lockdowns “essential”. As Nocera and McLean note,

    … the weight of the evidence seems to be with those who say that lockdowns did not save many lives. By our count, there are at least 50 studies that come to the same conclusion. After The Big Fail went to press, The Lancet published a study comparing the COVID infection rate and death rate in the 50 states. It concluded that “SARS-CoV-2 infections and COVID-19 deaths disproportionately clustered in U.S. states with lower mean years of education, higher poverty rates, limited access to quality health care, and less interpersonal trust – the trust that people report having in one another.” These sociological factors appear to have made a bigger difference than lockdowns (which were “associated with a statistically significant and meaningfully large reduction in the cumulative infection rate, but not the cumulative death rate”.)

Second, the lockdowns were, contra Mr. Orrell’s claim, utterly unprecedented. Isolating individuals known to be infected, such as Typhoid Mary, is a categorically different measure than locking down whole societies. Such lockdowns were never used until China locked Wuhan down in early 2020. Here again are Nocera and McLean: “On April 8, 2020, the Chinese government lifted its lockdown of Wuhan. It had lasted 76 days – two and a half months during which no one was allowed to leave this industrial city of 11 million people, or even leave their homes. Until the Chinese government deployed this tactic, a strict batten-down-the-hatches approach had never been used before to combat a pandemic. Yes, for centuries infected people had been quarantined in their homes, where they would either recover or die. But that was very different from locking down an entire city; the World Health Organization called it ‘unprecedented in public health history’.”

It’s jarring to encounter in an essay that features many excellent arguments – as Mr. Orrell’s does – such irrational and utterly uninformed claims as Mr. Orrell offers about lockdowns.

Donald J. Boudreaux, responding to an article by Brent Orrell in Law & Liberty, 2023-10-31.


February 19, 2024

QotD: Cleopatra VII Philopator

This week on the blog we’re going to talk about Cleopatra or to be more specific, we’re going to talk about Cleopatra VII Philopator, who is the only Cleopatra you’ve likely ever heard of, but that “seven” after her name should signal that she’s not the only Cleopatra.1 One of the trends in scholarship over the years towards larger than life ancient historical figures – Caesar, Alexander, Octavian, etc. – has been attempts to demystify them, stripping away centuries of caked-on reception, assumptions and imitation to ask more directly: who was this person, what did they do and do we value those sorts of things?2

Cleopatra, of course, has all of that reception layered on too. In antiquity and indeed until the modern era, she was one of the great villains of history, the licentious, wicked foreign queen of Octavian’s propaganda. More recently there has been an effort to reinvent her as an icon of modern values, perhaps most visible lately in Netflix’ recent (quite poorly received) documentary series. A lot of both efforts rely on reading into gaps in the source material. What I want to do here instead is to try to strip some of that away, to de-mystify Cleopatra and set out some of what we know and what we don’t know about her, with particular reference to the question I find most interesting: was Cleopatra actually a good or capable ruler?

Now a lot of the debate sparked by that Netflix series focused on what I find the rather uninteresting (but quite complicated) question of Cleopatra’s heritage or parentage or – heaven help us – her “race”. But I want to address this problem too, not because I care about the result but because I am deeply bothered by how confidently the result gets asserted by all sides and how swiftly those confident assertions are mobilized into categories that just aren’t very meaningful for understanding Cleopatra. To be frank, Cleopatra’s heritage should be a niche question debated in the pages of the Journal of Juristic Papyrology by scholars squinting at inscriptions and papyri, looking to make minor alterations in the prosopography of the Ptolemaic dynasty, both because it is highly technical and uncertain and because it isn’t an issue of central importance. So we’ll get that out of the way first in this essay and then get to my main point, which is this:

Cleopatra was, I’d argue, at best a mediocre ruler, whose ambitious and self-interested gambles mostly failed, to the ruin of herself and her kingdom. This is not to say Cleopatra was a weak or ineffective person; she was very obviously highly intelligent, learned, a virtuoso linguist, and a famously effective speaker. But one can be all of those things and not be a wise or skillful ruler, and I tend to view Cleopatra in that light.

Now I want to note the spirit in which I offer this essay. This is not a take-down of the Netflix Queen Cleopatra documentary (though it well deserves one and has received several; it is quite bad) nor a take-down of other scholars’ work on Cleopatra. This is simply my “take” on her reign. There’s enough we don’t know or barely know that another scholar, viewing from another angle, might well come away with a different conclusion, viewing Cleopatra in a more positive light. This is, to a degree, a response to some of the more recent public hagiography on Cleopatra, which I think air-brushes her failures and sometimes tries a bit too hard to read virtues into gaps in the evidence. But they are generally gaps in the evidence and in a situation where we are all to a degree making informed guesses, I am hardly going to trash someone who makes a perfectly plausible but somewhat differently informed guess. In history there are often situations where there is no right answer – meaning no answer we know to be true – but many wrong answers – answers we know to be false. I don’t claim to have the right answer, but I am frustrated by seeing so many very certain wrong answers floating around the public.

Before we dive in briefly to the boring question of Cleopatra’s parentage before the much more interesting question of her conduct as a ruler, we need to be clear about the difficult nature of the sources for Cleopatra and her reign. Fundamentally we may divide these sources into two groups: there are inscriptions, coins and papyrus records from Egypt which mention Cleopatra (and one she wrote on!) but, as such evidence is wont to be, [they] are often incomplete or provide only limited information. And then there are the literary sources, which are uniformly without exception hostile to Cleopatra. And I mean extremely hostile to Cleopatra, filled with wrath and invective. At no point, anywhere in the literary sources does Cleopatra get within a country mile of a fair shake and I am saying that as someone who thinks she wasn’t very good at her job.

The problem here is that Cleopatra was the target of Octavian’s PR campaign, as it were, in the run-up to his war with Marcus Antonius (Marc Antony; I’m going to call him Marcus Antonius here), because as a foreign queen – an intersecting triad of concepts (foreignness, monarchy and women in power) which all offended Roman sensibilities – she was effectively the perfect target for a campaign aimed at winning over the populace of Italy, which was, it turns out, the most valuable military resource in the Mediterranean.3 That picture – the foreign queen corrupting the morals of good Romans with her decadence – rightly or wrongly ends up coloring all of the subsequent accounts. Of course that in turn affects the reliability of all of our literary sources and thus we must tread carefully.

Bret Devereaux, “Collections: On the Reign of Cleopatra”, A Collection of Unmitigated Pedantry, 2023-05-26.


    1. Or even just the seventh!

    2. This is not to diminish the value of reception studies that trace the meaning a figure – or the memory of a figure – had over time. That’s a valuable but different lens of study.

    3. It’s not all Octavian, mind. Cicero’s impression of Cleopatra was also sharply negative, for many of the same reasons: Cicero was hardly likely to be affable to a foreign queen who was an ally of Julius Caesar.

January 25, 2024

The Bathtub Hoax and debunked medieval myths

Filed under: Europe, History, Humour, USA — Nicholas @ 05:00

David Friedman spends a bit of time debunking some bogus but widely believed historical myths:

“Image” by Lauren Knowlton is licensed under CC BY 2.0.

The first is a false story that teaches a true lesson — the U.S. did treat Amerinds unjustly in a variety of contexts, although the massive die off as a result of the spread of Old World diseases was a natural result of contact, not deliberate biological warfare. The second lets moderns feel superior to their ignorant ancestors; most people like feeling superior to someone.

Another example of that, deliberately created by a master, is H.L. Mencken’s bathtub hoax, an entirely fictitious history of the bathtub published in 1917:

    The article claimed that the bathtub had been invented by Lord John Russell of England in 1828, and that Cincinnatian Adam Thompson became acquainted with it during business trips there in the 1830s. Thompson allegedly went back to Cincinnati and took the first bath in the United States on December 20, 1842. The invention purportedly aroused great controversy in Cincinnati, with detractors claiming that its expensive nature was undemocratic and local doctors claiming it was dangerous. This debate was said to have spread across the nation, with an ordinance banning bathing between November and March supposedly narrowly failing in Philadelphia and a similar ordinance allegedly being effective in Boston between 1845 and 1862. … Oliver Wendell Holmes Sr. was claimed to have campaigned for the bathtub against remaining medical opposition in Boston; the American Medical Association supposedly granted sanction to the practice in 1850, followed by practitioners of homeopathy in 1853.

    According to the article, then-Vice President Millard Fillmore visited the Thompson bathtub in March 1850 and having bathed in it became a proponent of bathtubs. Upon his accession to the presidency in July of that year, Fillmore was said to have ordered the construction of a bathtub in the White House, which allegedly refueled the controversy of providing the president with indulgences not enjoyed by George Washington or Thomas Jefferson. Nevertheless, the effect of the bathtub’s installation was said to have obliterated any remaining opposition, such that it was said that every hotel in New York had a bathtub by 1860. (Wikipedia)

Writing more than thirty years later, Mencken claimed to have been unable to kill the story despite multiple retractions. A Google search for [Millard Fillmore bathtub] demonstrates that it is still alive. Among other hits:

    The first bathtub placed in the White House is widely believed to have been installed in 1851 by President Millard Fillmore (1850-53). (The White House Bathrooms & Kitchen)

Medieval

The desire of moderns to feel superior to their ancestors helps explain a variety of false beliefs about the Middle Ages, including the myth, discussed in detail in an earlier post, that medieval cooking was overspiced to hide the taste of spoiled meat.

Other examples:

Medieval witch hunts: Contrary to popular belief, large-scale persecution of witches started well after the end of the Middle Ages. The medieval church viewed the belief that Satan could give magical powers to witches, on which the later prosecutions were largely based, as heretical. The Spanish Inquisition, conventionally blamed for witchcraft prosecutions, treated witchcraft accusations as a distraction from the serious business of identifying secret Jews and Muslims, and dealt with such accusations by applying serious standards of evidence to them.

Chastity Belts: Supposedly worn by the ladies of knights off on crusade. The earliest known evidence for the idea of a chastity belt, a 15th century drawing, comes well after the end of the crusades, and while there is literary evidence for their occasional use after that, no surviving examples are known to be from before the 19th century.

Ius Primae Noctis, aka Droit de Seigneur, was the supposed right of a medieval lord to sleep with a bride on her wedding night. Versions of the institution are asserted in a variety of sources going back to the Epic of Gilgamesh, but while it is hard to prove that it never existed in the European middle ages, it was clearly never the norm.

The Divine Right of Kings: Various rulers through history have claimed divine sanction for their rule but “The Divine Right of Kings” is a doctrine that originated in the sixteenth and seventeenth centuries with the rise of absolute monarchy — Henry VIII in England, Louis XIV in France. Medieval rulers were absolute in neither theory nor practice. The feudal relation was one of mutual obligation, in its simplest form protection by the superior in exchange for set obligations of support by the inferior. In practice the decentralized control of military power under feudalism presented difficulties for a ruler who wished to overrule the desires of his nobility, as King John discovered.

Some fictional history functions in multiple versions designed to support different causes. The destruction of the Library of Alexandria has been variously blamed on Julius Caesar, Christian mobs rioting against pagans, and the Muslim conquerors of Egypt, the Caliph Umar having supposedly said that anything in the library that was true was already in the Koran and anything not in the Koran was false. There is no good evidence for any of the stories. The library existed in classical antiquity and no longer exists today, but it is not known how it was destroyed; it may have just gradually declined.

January 11, 2024

Pushing back against the Colonialism Narrative

Filed under: Africa, Books, Britain, History, India — Nicholas @ 04:00

At Samizdata, Brendan Westbridge praises Nigel Biggar’s 2023 book Colonialism: A Moral Reckoning:

He examines the various claims that the “de-colonisers” make: Amritsar, slavery, Benin, Boer War, Irish famine. In all cases he finds that their claims are either entirely ungrounded or lack vital information that would cast events in a very different light. Amritsar? Dyer was dealing with political violence that had led to murder. Some victims had been set alight. Anyway, he was condemned for his actions by the British authorities and, indeed, his own standing orders. Slavery? Everyone had it and Britain was the first to get rid of it. Benin? They had killed unarmed ambassadors. Irish famine? They tried to relieve it but they were quite unequal to the size of the task. In the case of Benin he comes very close to accusing the leading de-coloniser of knowingly lying. The only one of these where I don’t think he is so convincing is the Boer War. He claims that Britain was concerned about the future of the Cape and especially the Simonstown naval base and also black rights. I think it was the pursuit of gold even if it does mean agreeing with the communist Eric Hobsbawm.

He is far too polite about the “de-colonisers”. They are desperate to hammer the square peg of reality into their round hole of a theory. To this end they claim knowledge they don’t have, gloss over inconvenient facts, erect theories that don’t bear scrutiny and, when all else fails, lie. Biggar tackles all of these offences against objectivity with a calmness and a politeness that you can bet his detractors would never return.

The communists – because they are obsessed with such things and are past masters at projection – like to claim that there was an “ideology” of Empire. Biggar thinks this is nonsense. As he says:

    There was no essential motive or set of motives that drove the British Empire. The reasons why the British built an empire were many and various. They differed between trader, migrant, soldier, missionary, entrepreneur, financier, government official and statesman. They sometimes differed between London, Cairo, Cape Town and Calcutta. And all of the motives I have unearthed in this chapter were, in themselves, innocent: the aversion to poverty and persecution, the yearning for a better life, the desire to make one’s way in the world, the duty to satisfy shareholders, the lure of adventure, cultural curiosity, the need to make peace and keep it, the concomitant need to maintain martial prestige, the imperative of gaining military or political advantage over enemies and rivals, and the vocation to lift oppression and establish stable self-government. There is nothing morally wrong with any of these. Indeed, the last one is morally admirable.

One of the benefits of the British Empire is that it tended to put a stop to local wars. How many people lived because of that? But that leads us on to another aspect. Almost no one ever considers what went on before the Empire arrived. Was it better or worse than what went before it? Given that places like Benin indulged in human sacrifice, I would say that in many cases the British Empire was an improvement. And if we are going to talk about what went before, what about afterwards? He has little to say about what newly-independent countries have done with their independence. The United States, the “white” (for want of a better term) Commonwealth and Singapore have done reasonably well. Ireland is sub-par but OK. Africa, the Caribbean and the Indian sub-continent have very little to show for themselves. This may explain why Britain needed very few people to maintain the Empire. At one point he points out that at the height of the Raj the ratio of Briton to native was 1 to 1000. That implies a lot of consent. Tyrannies need a lot more people.

The truth of the matter is that talk of reparations is rooted in the failure of de-colonisation. If Jamaica were a nicer place to live than the UK, if Jamaica had a small boats crisis rather than the UK then no one would be breathing a word about reparations or colonial guilt. All this talk is pure deflection from the failure of local despots to make the lives of their subjects better.

Biggar has nothing to say about what came after the empire and he also has little to say about how it came about in the first place – so I’ll fill in that gap. Britain acquired an empire because it could. Britain was able to acquire an Empire because it mastered the technologies needed to do it to a higher level and on a greater scale than anyone else. Britain mastered technology because it made it possible to prosper by creating wealth. That in itself was a moral achievement.

November 17, 2023

Prometheus

Filed under: Books, Greece, History, Technology — Nicholas @ 04:00

Virginia Postrel tries to correct the common misinterpretation of the story of the Titan Prometheus:

“The Torture of Prometheus” by Salvator Rosa (1615-1673)
Oil painting in the Galleria Nazionale d’Arte Antica di Palazzo Corsini via Wikimedia Commons.

Listening to Marc Andreessen discuss his Techno-Optimist Manifesto on the Foundation for American Innovation’s Dynamist podcast, I was struck by his repetition of something that is in the manifesto and is completely wrong. “The myth of Prometheus – in various updated forms like Frankenstein, Oppenheimer, and Terminator – haunts our nightmares,” he writes.1 On the podcast, he elaborated by saying that, although fire has many benefits, the Prometheus myth focuses on its use as a weapon. He said something similar in a June post called “Why AI Will Save the World“:

    The fear that technology of our own creation will rise up and destroy us is deeply coded into our culture. The Greeks expressed this fear in the Prometheus Myth – Prometheus brought the destructive power of fire, and more generally technology (“techne”), to man, for which Prometheus was condemned to perpetual torture by the gods.

No. No. No. No.

Prometheus is punished for loving humankind. He stole fire to thwart Zeus’ plans to eliminate humanity and create a new subordinate species. He is a benefactor who sacrifices himself for our good. His punishment is an indicator not of the dangers of fire but of the tyranny of Zeus.

Prometheus is cunning and wise. His name means foresight. He knows what he is doing and what the likely consequences will be.

Eventually his tortures end when he is rescued by the hero Herakles (aka Hercules), who shoots the eagle charged with eating Prometheus’ liver every day, only for it to grow back to be eaten again.

The Greeks honored Prometheus. They celebrated technē. They appreciated the gifts of civilization.

The ancient myth of Prometheus is not a cautionary tale. It is a reminder that technē raises human beings above brutes. It is a myth founded in gratitude.


    1. Frankenstein isn’t The Terminator either. Frankenstein is a creator who won’t take responsibility for his creation, a father who rejects and abandons his child. The Creature is frightening and dangerous but he is also the book’s moral center, a tragic, sympathetic character who is feared and rejected by human beings because of his appearance. Only then does he turn deadly. Frankenstein arouses pity and terror because we empathize with its central figure and understand his rage.

    The novel’s most reasonable political reading is not as a story of the dangers of science but as a parable of slavery and rebellion. “By the eighteen-fifties, Frankenstein’s monster regularly appeared in American political cartoons as a nearly naked black man, signifying slavery itself, seeking his vengeance upon the nation that created him,” writes historian Jill Lepore, who calls the “Frankenstein-is-Oppenheimer model … a weak reading of the novel.” I agree.

    The Romantics tended to identify with Prometheus, and Mary Shelley’s husband, Percy Bysshe Shelley, wrote a play called Prometheus Unbound, further undermining the reading of Frankenstein as an anti-Promethean fable.

November 12, 2023

Who Destroyed The Library of Alexandria? | The Rest is History

The Rest is History
Published 21 Jul 2023

Step back in time with renowned historians Dominic Sandbrook and Tom Holland as they embark on an enthralling journey to explore the enigmatic tale of the Library of Alexandria’s destruction. Join them as they uncover the who, what, and why behind one of history’s greatest losses.

#LibraryOfAlexandria #DominicSandbrook #TomHolland
