Quotulatiousness

October 6, 2024

The rise of coal as a fuel in England

Filed under: Britain, Economics, History — Nicholas @ 03:00

In the latest instalment of Age of Invention, Anton Howes considers the reasons for the rise of coal and refutes the frequently deployed “just so” story that it was driven by mass deforestation in England:

An image of coal pits in the Black Country from Griffiths’ Guide to the iron trade of Great Britain, 1873.
Image digitized by the Robarts Library of the University of Toronto via Wikimedia Commons.

It’s long bothered me as to why coal became so important in Britain. It had sat in the ground for millennia, often near the surface. Near Newcastle and Sunderland it was often even strewn out on the beaches.1 Yet coal had largely only been used for some very specific, small-scale uses. It was fired in layers with limestone to produce lime, largely used in mortar for stone and brick buildings. And it had long been popular among blacksmiths, heating iron or steel in a forge before shaping it into weapons or tools.2

Although a few places burned coal for heating homes, this was generally only done in places where the coal was an especially pure, hard, and rock-like anthracite, such as in southern Wales and in Lowlands Scotland. Anthracite coal could even be something of a luxury fuel. It was burned in the palaces of the Scottish kings.3 But otherwise, the sulphur in the more crumbly and more common coal, like that found near Newcastle, meant that the smoke reeked, reacting with the moisture of people’s eyes to form sulphurous acid, and so making them sting and burn. The very poorest of the poor might resort to it, but the smoke from sulphurous coal fires was heavy and lingering, its soot tarnishing clothes, furnishings, and even skin, whereas a wood fire could be lit in a central open hearth, its smoke simply rising through the rafters and finding its way out through the various crevices and openings of thatched and airy homes. Coal was generally the inferior fuel.

But despite this inferiority, over the course of the late sixteenth century much of the populated eastern coast of England, including the rapidly-expanding city of London, made the switch to burning the stinking, sulphurous, low-grade coal instead of wood.

By far the most common explanation you’ll hear for this dramatic shift, much of which took place over the course of just a few decades c.1570-1600, is that under the pressures of a growing population, with people requiring ever more fuel both for industry and to heat their homes, England saw dramatic deforestation. With firewood in ever shorter supply, its price rose so high as to make coal a more attractive alternative, which despite its problems was at least cheap. This deforestation story is trotted out constantly in books, on museum displays, in conversation, on social media, and often even by experts on coal and iron. I must see or hear it at least once a week, if not more. And there is a mountain of testimonies from contemporaries to back the story up. Again and again, people in the late sixteenth and the seventeenth centuries complained that the woods were disappearing, and that wood fuel prices were on the rise.

And yet the deforestation thesis simply does not work. In fact it makes no sense at all.

Not out of the Woods Yet

This should immediately be obvious from even just a purely theoretical perspective, because wood was almost never exploited for fuel as a one-off resource. It was not like coal or peat or oil, which once dug out of the ground and burned could only be replaced by finding more. It was not a matter of cutting swathes of forest down and burning every branch, stump and root, leaving the land barren and going off in search of more. Our sixteenth-century ancestors were not like Saruman, destroying Fangorn forest for fuel. Instead, acres of forest, and even just the shrubs and trees that made up the hedges separating fields, were carefully maintained to provide a steady yield. The roots of trees were left living and intact, with the wood extracted by cutting away the trunk at the stump, or even just the branches or twigs — a process known as coppicing, and for branches pollarding — so that new trunks or branches would be able to grow back. Although some trees might be left for longer to grow into longer and thicker wood fit for timber, the underwoods were more regularly cropped.4

Given forests were treated as a renewable resource, claiming that they were cut down to cause the price of firewood to rise is like claiming that if energy became more expensive today, then we’d use all the water behind a hydroelectric dam and then immediately fill in the reservoir with rubble. Or it’s like claiming that rising food prices would result in farmers harvesting a crop and then immediately concreting over their fields. What actually happens is the precise opposite: when the things people make become more valuable, they tend to expand production, not destroy it. High prices would have prompted the English to rely on forests more, not to cut them down.

When London’s medieval population peaked — first in the 1290s before a devastating famine, and again in the 1340s on the eve of the Black Death — prices of wood fuel began to rise out of all proportion to other goods. But London had plenty of nearby woodland — wood is extremely bulky compared to its value, so trees typically had to be grown as close as possible to the city, or else along the banks of the Thames running through it, or along the nearby coasts. With the rising price of fuel, however, the city did not even have to look much farther afield for its wood, and nearby coastal counties even continued to export firewood across the Channel to the Low Countries (present-day Belgium and the Netherlands) and to the northern coast of France.5 A few industries did try to shift to coal, with lime-makers and blacksmiths substituting it for wood more than before, and with brewers and dyers seemingly giving it a try. But the stinking smoke rapidly resulted in the brewers and dyers being banned from using it, and there was certainly no shift to coal being burnt in people’s homes.6


    1. Ruth Goodman, The Domestic Revolution (Michael O’Mara Books, 2020), p.91

    2. James A. Galloway, Derek Keene, and Margaret Murphy, “Fuelling the City: Production and Distribution of Firewood and Fuel in London’s Region, 1290-1400”, The Economic History Review 49, no. 3 (1996): pp.447–9

    3. J. U. Nef, The Rise of the British Coal Industry, Vol. 1 (London: George Routledge and Sons, 1932), p.107, pp.115-8

    4. Oliver Rackham, Ancient Woodland: Its History, Vegetation and Uses in England (Edward Arnold, 1980), pp.3-6 is the best and clearest summary I have seen.

    5. Galloway et al.

    6. John Hatcher, The History of the British Coal Industry: Volume 1: Before 1700: Towards the Age of Coal (Oxford University Press, 1993), p.25

June 1, 2024

QotD: When the chimneys rose in London

A coal fire also burns much hotter, and with more acidic fumes, than a wood fire. Pots that worked well enough for wood — typically brass, either thin beaten-brass or thicker cast-brass — degrade rapidly over coal, and people increasingly switched to iron, which takes longer to heat but lasts much better. At the beginning of the shift to coal, the only option for pots was wrought iron — nearly pure elemental iron, wrought (archaic past tense of “worked”, as in “what hath God wrought”) with hammer and anvil, a labor-intensive process. But since the advent of the blast furnace in the late fifteenth century, there was a better, cheaper material available: cast iron.1 It was already being used for firebacks, rollers for crushing malt, and so forth, but English foundries were substantially behind those of the continent when it came to casting techniques in brass and were entirely unprepared to make iron pots with any sort of efficiency. The innovator here was Abraham Darby, who in 1707 filed a patent for a dramatically improved method of casting metal for pots — and also, incidentally, used a coal-fired blast furnace to smelt the iron. This turned out to be the key: a charcoal-fueled blast furnace, which is what people had been using up to then, makes white cast iron, a metal too brittle to be cast into nicely curved shapes like a pot. Smelting with coal produces gray cast iron, which includes silicon in the metal’s structure and works much better for casting complicated shapes like, say, parts for a steam engine. Coal-smelted iron would be the key material of the Industrial Revolution, but the economic incentive for its original development was the early modern market for pots, kettles, and grates suitable for cooking over the heat and fumes of a coal fire.2

In Ruth Goodman’s telling, though, the greatest difference between coal and wood fires is the smoke. Smoke isn’t something we think much about these days: on the rare occasions I’m around a fire at all, I’m either outdoors (where the smoke dissipates rapidly except for a pleasant lingering aroma on my jacket) or in front of a fireplace with a good chimney that draws the smoke up and out of the house. However, a chimney also draws about 70% of the fire’s heat — not a problem if you’re in a centrally-heated modern home and enjoying the fire for ✨ambience✨, but a serious issue if it’s the main thing between your family and the Little Ice Age outdoors. Accordingly, premodern English homes didn’t have chimneys: the fire sat in a central hearth in the middle of the room, radiating heat in all directions, and the smoke slowly dissipated out of the unglazed windows and through the thatch of the roof. Goodman describes practical considerations of living with woodsmoke that never occurred to me:

    In the relatively still milieu of an interior space, wood smoke creates a distinct and visible horizon, below which the air is fairly clear and above which asphyxiation is a real possibility. The height of this horizon line is critical to living without a chimney. The exact dynamics vary from building to building and from hour to hour as the weather outside changes. Winds can cause cross-draughts that stir things up; doors and shutters opening and closing can buffet smoke in various directions. … From my experiences managing fires in a multitude of buildings in many different weather conditions, I can attest to the annoyance of a small change in the angle of a propped-open door, the opening of a shutter or the shifting of a piece of furniture that you had placed just so to quiet the air. And as for people standing in doorways, don’t get me started.

One obvious adaptation was to live life low to the ground. On a warm day the smoke horizon might be relatively high, but on a cold damp one (of which, you may be aware, England has quite a lot) smoke hovers low enough that even sitting in a tall chair might well put your head right up into it. Far better to sit on a low stool, or, better yet, a nice soft insulating layer of rushes on the floor.

Chimneys did exist before the transition to coal, but given the cost of masonry and the additional fuel expenses, they were typically found only in the very wealthiest homes. Everyone else lived with a central hearth and if they could afford it added smoke management systems to their homes piecemeal. Among the available solutions were the reredos (a short half-height wall against which the fire was built and which would counteract drafts from doorways), the smoke hood (rather like our modern cooktop vent hood but without the fan, allowing some of the smoke to rise out of the living space without creating a draw on the heat), or the smoke bay (a method of constructing an upstairs room over only part of the downstairs that still allowed smoke to rise and dissipate through the roof). Wood smoke management was mostly a question of avoiding too great a concentration in places you wanted your face to be. The switch to coal changed this, though, because coal smoke is frankly foul stuff. It hangs lower than wood smoke, in part because it cools faster, and it’s full of sulfur compounds that combine with the water in your eyes and lungs to create a mild sulfuric acid; when your eyes water from the irritation, the stinging only gets worse. Burning coal in an unvented central hearth would have been painful and choking. If you already had one of the interim smoke management techniques of the wood-burning period — especially the smoke hood — you would have found adopting coal more appealing, but really, if you burned coal, you wanted a chimney. You probably already wanted a chimney, though; they had been a status symbol for centuries.

And indeed, chimneys went up all over London; their main disadvantage, aside from the cost of a major home renovation, had been the way they drew away the heat along with the smoke, but a coal fire’s greater energy output made that less of an issue. The other downside of the chimney’s draw, though, is the draft it creates at ground level. Again, this isn’t terribly noticeable today because most of us don’t spend a lot of time sitting in front of the fireplace (or indeed, sitting on the floor at all, unless we have small children), but pay attention next time you’re by an indoor wood fire and you will notice a flow of cold air for the first inch or two off the ground. All of a sudden, instead of putting your mattress directly on the drafty floor, you wanted a bedstead to lift it up — and a nice tall chair to sit on, and a table to pull your chair up to as well. There were further practical differences, too: because a chimney has to be built into a wall, it can’t heat as large an area as a central fire. This incentivized smaller rooms, which were further enabled by the fact that a coal fire can burn much longer without tending than a wood fire. A gentleman doesn’t have much use for a small study where he can retreat to be alone with his books and papers if a servant is popping in every ten minutes to stir up the fire, but if the coals in the grate will burn for an hour or two untended he can have some real privacy. The premodern wood-burning home was a large open space where many members of the household, both masters and servants, went about their daily tasks; the coal-burning home gradually became a collection of smaller, furniture-filled spaces that individuals or small groups used for specific purposes. Nowhere is this shift more evident than in the word “hall”, which transitions from referring to something like Heorot to being a mere corridor between rooms.

Jane Psmith, “REVIEW: The Domestic Revolution by Ruth Goodman”, Mr. and Mrs. Psmith’s Bookshelf, 2023-05-22.


    1. Brief ferrous metallurgy digression: aside from the rare, relatively pure iron found in meteors, all iron found in nature is in the form of ores like haematite, where the iron is bound up with oxygen and other impurities like silicon and phosphorus (“slag”). Getting the iron out of the ore requires adding carbon (for the oxygen to bond with) and heat (to fuel the chemical reaction): 2 Fe2O3 + 3 C + slag → 4 Fe + 3 CO2 + slag. Before the adoption of the blast furnace, European iron came from bloomeries: basically a chimney full of fuel hot enough to cause a reduction reaction when ore is added to the top, removing the oxygen from the ore but leaving behind a mass of mixed iron and slag called a bloom. The bloom would then be heated and beaten and heated and beaten — the hot metal sticks together while the slag crumbles and breaks off — to leave behind a lump of nearly pure iron. (If you managed the temperature of your bloomery just right you could incorporate some of the carbon into the iron itself, producing steel, but this was difficult to manage and carbon was usually added to the iron afterwards to make things like armor and swords.) In a blast furnace, by contrast, the fuel and ore were mixed together and powerful blasts of air were forced through as the material moved down the furnace and the molten iron dripped out the bottom. From there it could be poured directly into molds and cast into the desired shape. This is obviously much faster and easier! But cast iron has much more carbon, which makes it very hard, lowers its melting point, and leaves it extremely brittle — you would never want a cast iron sword. (The behavior of various ferrous metals is determined by the way the non-metal atoms, especially carbon, interrupt the crystal structure of the iron. Wrought iron has less than .08% carbon by weight, modern “low carbon” steel between .05% and .3%, “high carbon” steel about 1.7%, and cast iron more than 3%.)
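As an aside, the carbon bands quoted in that footnote can be sketched as a small lookup table. The band edges below are adapted from the footnote's own approximate figures (the 6.7% ceiling for cast iron is my addition, the theoretical maximum for iron-carbon alloys), and note that the footnote's figures genuinely overlap at the low end:

```python
# Approximate carbon-by-weight bands for ferrous metals, adapted from the
# footnote's figures. The low end overlaps: 0.05-0.08% carbon could be
# labelled either wrought iron or low-carbon steel.
BANDS = [
    ("wrought iron",      0.00, 0.08),
    ("low-carbon steel",  0.05, 0.30),
    ("high-carbon steel", 0.30, 1.70),
    ("cast iron",         3.00, 6.70),  # 6.7% is the theoretical Fe-C maximum
]

def classify(pct_carbon: float) -> list[str]:
    """Return every band whose range contains the given carbon percentage."""
    return [name for name, lo, hi in BANDS if lo <= pct_carbon <= hi]

print(classify(0.06))  # → ['wrought iron', 'low-carbon steel'] (the overlap)
print(classify(4.0))   # → ['cast iron'] (typical grey cast iron)
```

Returning every matching band, rather than forcing a single answer, keeps the overlap visible instead of hiding it behind an arbitrary cutoff.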

    2. The sales of those cooking implements went on to provide the capital for further innovation: Darby’s son and grandson, two more Abrahams, also played important roles in the Industrial Revolution.

March 13, 2024

QotD: Filthy coal

… coal smoke had dramatic implications for daily life even beyond the ways it reshaped domestic architecture, because in addition to being acrid it’s filthy. Here, once again, [Ruth] Goodman’s time running a household with these technologies pays off, because she can speak from experience:

    So, standing in my coal-fired kitchen for the first time, I was feeling confident. Surely, I thought, the Victorian regime would be somewhere halfway between the Tudor and the modern. Dirt was just dirt, after all, and sweeping was just sweeping, even if the style of brushes had changed a little in the course of five hundred years. Washing-up with soap was not so very different from washing-up with liquid detergent, and adding soap and hot water to the old laundry method of bashing the living daylights out of clothes must, I imagined, make it a little easier, dissolving dirt and stains all the more quickly. How wrong could I have been.

    Well, it turned out that the methods and technologies necessary for cleaning a coal-burning home were fundamentally different from those for a wood-burning one. Foremost, the volume of work — and the intensity of that work — were much, much greater.

The fundamental problem is that coal soot is greasy. Unlike wood soot, which is easily swept away, it sticks: industrial cities of the Victorian era were famously covered in the residue of coal fires, and with anything but the most efficient of chimney designs (not perfected until the early twentieth century), the same thing also happens to your interior. Imagine the sort of sticky film that settles on everything if you fry on the stove without a sufficient vent hood, then make it black and use it to heat not just your food but your entire house; I’m shuddering just thinking about it. A 1661 pamphlet lamented coal smoke’s “superinducing a sooty Crust or Furr upon all that it lights, spoyling the moveables, tarnishing the Plate, Gildings and Furniture, and corroding the very Iron-bars and hardest Stones with those piercing and acrimonious Spirits which accompany its Sulphure.” To clean up from coal smoke, you need soap.

“Coal needs soap?” you may say, suspiciously. “Did they … not use soap before?” But no, they (mostly) didn’t, a fact that (like the famous “Queen Elizabeth bathed once a month whether she needed it or not” line) has led to the medieval and early modern eras’ entirely undeserved reputation for dirtiness. They didn’t use soap, but that doesn’t mean they didn’t clean; instead, they mostly swept ash, dust, and dirt from their houses with a variety of brushes and brooms (often made of broom) and scoured their dishes with sand. Sand-scouring is very simple: you simply dampen a cloth, dip it in a little sand, and use it to scrub your dish before rinsing the dirty sand away. The process does an excellent job of removing any burnt-on residue, and has the added advantage of removing a micro-layer of your material to reveal a new sterile surface. It’s probably better than soap at cleaning the grain of wood, which is what most serving and eating dishes were made of at the time, and it’s also very effective at removing the poisonous verdigris that can build up on pots made from copper alloys like brass or bronze when they’re exposed to acids like vinegar. Perhaps more importantly, in an era where every joule of energy is labor-intensive to obtain, it works very well with cold water.

The sand can also absorb grease, though a bit of grease can actually be good for wood or iron (I wash my wooden cutting boards and my cast-iron skillet with soap and water,1 but I also regularly oil them). Still, too much grease is unsanitary and, frankly, gross, which premodern people recognized as much as we do, and particularly greasy dishes, like dirty clothes, might also be cleaned with wood ash. Depending on the kind of wood you’ve been burning, your ashes will contain up to 10% potassium hydroxide (KOH), better known as lye, which reacts with your grease to create a soap. (The word potassium actually derives from “pot ash,” the ash from under your pot.) Literally all you have to do to clean this way is dump a handful of ashes and some water into your greasy pot and swoosh it around a bit with a cloth; the conversion to soap is very inefficient (though if you warm it a little over the fire it works better), but if your household runs on wood you’ll never be short of ashes. As wood-burning vanished, though, it made more sense to buy soap produced industrially through essentially the same process (though with slightly more refined ingredients for greater efficiency) and to use it for everything.

Washing greasy dishes with soap rather than ash was a matter of what supplies were available; cleaning your house with soap rather than a brush was an unavoidable fact of coal smoke. Goodman explains that “wood ash also flies up and out into the room, but it is not sticky and tends to fall out of the air and settle quickly. It is easy to dust and sweep away. A brush or broom can deal with the dirt of a wood fire in a fairly quick and simple operation. If you try the same method with coal smuts, you will do little more than smear the stuff about.” This simple fact changed interior decoration for good: gone were the untreated wood trims and elaborate wall-hangings — “[a] tapestry that might have been expected to last generations with a simple routine of brushing could be utterly ruined in just a decade around coal fires” — and anything else that couldn’t withstand regular scrubbing with soap and water. In their place were oil-based paints and wallpaper, both of which persist in our model of “traditional” home decor, as indeed do the blue and white Chinese-inspired glazed ceramics that became popular in the 17th century and are still going strong (at least in my house). They’re beautiful, but they would never have taken off in the era of scouring with sand; it would destroy the finish.

But more important than what and how you were cleaning was the sheer volume of the cleaning. “I believe,” Goodman writes towards the end of the book, “there is vastly more domestic work involved in running a coal home in comparison to running a wood one.” The example of laundry is particularly dramatic, and her account is extensive enough that I’ll just tell you to read the book, but it goes well beyond that:

    It is not merely that the smuts and dust of coal are dirty in themselves. Coal smuts weld themselves to all other forms of dirt. Flies and other insects get entrapped in it, as does fluff from clothing and hair from people and animals. To thoroughly clear a room of cobwebs, fluff, dust, hair and mud in a simply furnished wood-burning home is the work of half an hour; to do so in a coal-burning home — and achieve a similar standard of cleanliness — takes twice as long, even when armed with soap, flannels and mops.

And here, really, is why Ruth Goodman is the only person who could have written this book: she may be the only person who has done any substantial amount of domestic labor under both systems who could write. Like, at all. Not that there weren’t intelligent and educated women (and it was women doing all this) in early modern London, but female literacy was typically confined to classes where the women weren’t doing their own housework, and by the time writing about keeping house was commonplace, the labor-intensive regime of coal and soap was so thoroughly established that no one had a basis for comparison.

Jane Psmith, “REVIEW: The Domestic Revolution by Ruth Goodman”, Mr. and Mrs. Psmith’s Bookshelf, 2023-05-22.


    1. Yeah, I know they tell you not to do this because it will destroy the seasoning. They’re wrong. Don’t use oven cleaner; anything you’d use to wash your hands in a pinch isn’t going to hurt long-chain polymers chemically bonded to cast iron.

March 10, 2024

Viking longships and textiles

Virginia Postrel reposts an article she originally wrote for the New York Times in 2021, discussing the importance of textiles in history:

The Sea Stallion from Glendalough is the world’s largest reconstruction of a Viking Age longship. The original ship was built at Dublin ca. 1042. It was used as a warship in Irish waters until 1060, when it ended its days as a naval barricade to protect the harbour of Roskilde, Denmark. This image shows Sea Stallion arriving in Dublin on 14 August, 2007.
Photo by William Murphy via Wikimedia Commons.

Popular feminist retellings like the History Channel’s fictional saga Vikings emphasize the role of women as warriors and chieftains. But they barely hint at how crucial women’s work was to the ships that carried these warriors to distant shores.

One of the central characters in Vikings is an ingenious shipbuilder. But his ships apparently get their sails off the rack. The fabric is just there, like the textiles we take for granted in our 21st-century lives. The women who prepared the wool, spun it into thread, wove the fabric and sewed the sails have vanished.

In reality, from start to finish, it took longer to make a Viking sail than to build a Viking ship. So precious was a sail that one of the Icelandic sagas records how a hero wept when his was stolen. Simply spinning wool into enough thread to weave a single sail required more than a year’s work, the equivalent of about 385 eight-hour days. King Canute, who ruled a North Sea empire in the 11th century, had a fleet comprising about a million square meters of sailcloth. For the spinning alone, those sails represented the equivalent of 10,000 work years.
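A quick back-of-the-envelope check shows these figures hang together, assuming a sail of roughly 100 square metres (my inference from the quoted totals, not a figure stated in the text):

```python
# Back-of-the-envelope check of the spinning figures quoted above.
# Assumption (not in the text): a longship sail of roughly 100 square metres.
days_per_sail = 385          # eight-hour days of spinning per sail (quoted)
sail_area_m2 = 100           # assumed sail size, m^2
fleet_area_m2 = 1_000_000    # Canute's fleet sailcloth (quoted)
work_days_per_year = 365     # spinning every single day, the most generous case

sails_in_fleet = fleet_area_m2 / sail_area_m2
work_years = sails_in_fleet * days_per_sail / work_days_per_year

print(f"{sails_in_fleet:.0f} sails, ~{work_years:,.0f} work-years of spinning")
# → 10000 sails, ~10,548 work-years of spinning
```

Even on the generous assumption of spinning 365 days a year, the result lands in the same ballpark as the quoted "10,000 work years", so the two figures are mutually consistent.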

Ignoring textiles writes women’s work out of history. And as the British archaeologist and historian Mary Harlow has warned, it blinds scholars to some of the most important economic, political and organizational challenges facing premodern societies. Textiles are vital to both private and public life. They’re clothes and home furnishings, tents and bandages, sacks and sails. Textiles were among the earliest goods traded over long distances. The Roman Army consumed tons of cloth. To keep their soldiers clothed, Chinese emperors required textiles as taxes.

“Building a fleet required longterm planning as woven sails required large amounts of raw material and time to produce,” Dr. Harlow wrote in a 2016 article. “The raw materials needed to be bred, pastured, shorn or grown, harvested and processed before they reached the spinners. Textile production for both domestic and wider needs demanded time and planning.” Spinning and weaving the wool for a single toga, she calculates, would have taken a Roman matron 1,000 to 1,200 hours.

Picturing historical women as producers requires a change of attitude. Even today, after decades of feminist influence, we too often assume that making important things is a male domain. Women stereotypically decorate and consume. They engage with people. They don’t manufacture essential goods.

Yet from the Renaissance until the 19th century, European art represented the idea of “industry” not with smokestacks but with spinning women. Everyone understood that their never-ending labor was essential. It took at least 20 spinners to keep a single loom supplied. “The spinners never stand still for want of work; they always have it if they please; but weavers are sometimes idle for want of yarn,” the agronomist and travel writer Arthur Young, who toured northern England in 1768, wrote.

Shortly thereafter, the spinning machines of the Industrial Revolution liberated women from their spindles and distaffs, beginning the centuries-long process that raised even the world’s poorest people to living standards our ancestors could not have imagined. But that “great enrichment” had an unfortunate side effect. Textile abundance erased our memories of women’s historic contributions to one of humanity’s most important endeavors. It turned industry into entertainment. “In the West,” Dr. Harlow wrote, “the production of textiles has moved from being a fundamental, indeed essential, part of the industrial economy to a predominantly female craft activity.”

February 16, 2024

What remains of the “first” steam powered passenger railway line?

Bee Here Now
Published 23 Oct 2023

The Stockton-Darlington Railway wasn’t the first time steam locomotives had been used to pull people, but it was the first time they had been used to pull passengers over any distance worth talking about. That day came in 1825, when a line running all the way from the coal pits in the hills around County Durham to the River Tees at Stockton was officially opened. This was an experiment, a practice, a great endeavour by local businessmen and engineers, such as the famous George Stephenson, who astounded crowds of onlookers with the introduction of Locomotion 1 halfway along the line, which began pulling people towards Darlington and then the docks at Stockton.

This was a day that would not only transform human transportation forever, but accelerate the industrial revolution to a blistering pace.

In this video I want to look at what remains of that line — not the bit still in use between the two towns, but the bit out in the coalfields. And I want to see how those early trailblazers tackled the rolling hills, with horses and stationary steam engines to create a true amalgamation of old-world and new-world technologies.

November 26, 2023

QotD: From Industrial Revolution labour surplus to modern era academic surplus

Back in Early Modern England, enclosure led to a massive over-supply of labor. The urge to explore and colonize was driven, to an unknowable but certainly large extent, by the effort to find something for all those excess people to DO. The fact that they’d take on the brutal terms of indenture in the New World tells you all you need to know about how bad that labor over-supply made life back home. The same with “industrial innovation”. The first Industrial Revolution never lacked for workers, and indeed, Marxism appealed back in its day because the so-called “Iron Law of Wages” seemed to apply — given that there were always more workers than jobs …

The great thing about industrial work, though, is that you don’t have to be particularly bright to do it. There’s always going to be a fraction of the population that fails the IQ test, no matter how low you set the bar, but in the early Industrial Revolution that bar was pretty low indeed. So much so, in fact, that pretty soon places like America were experiencing drastic labor shortages, and there’s your history of 19th century immigration. The problem, though, isn’t the low IQ guys. It’s the high-IQ guys whose high IQs don’t line up with remunerative skills.

My academic colleagues were a great example, which is why they were all Marxists. I make fun of their stupidity all the time, but the truth is, they’re most of them bright enough, IQ-wise. Not geniuses by any means, but let’s say 120 IQ on average. Alas, as we all know, 120-with-verbal-dexterity is a very different thing from 120-and-good-with-a-slide-rule. Academics are the former, and any society that wants to remain stable HAS to find something for those people to do. Trust me on this: You do not want to be the obviously smartest guy in the room when everyone else in the room is, say, a plumber. This is no knock on plumbers, who by and large are cool guys, but it IS a knock on the high-IQ guy’s ego. Yeah, maybe I can write you a mean sonnet, or a nifty essay on the problems of labor over-supply in 16th century England, but those guys build stuff. And they get paid.

Those guys — the non-STEM smart guys — used to go into academia, and that used to be enough. Alas, soon enough we had an oversupply of them, too, which is why academia soon became the academic-industrial complex. 90% of what goes on at a modern university is just make-work. It’s either bullshit nobody needs, like “education” majors, or it’s basically just degrees in “activism”. It’s like Say’s Law for retards — supply creates its own demand, in this case subsidized by a trillion-dollar student loan industry. Better, much better, that it should all be plowed under, and the fields salted.

Any society digging itself out of the rubble of the future should always remember: No overproduction of elites!

Severian, “The Academic-Industrial Complex”, Rotten Chestnuts, 2021-05-30.

November 25, 2023

Crystal Palace Station is Needlessly Magnificent

Jago Hazzard
Published 9 Aug 2023

Crystal Palace and the station built for it. Well, one of them.

November 19, 2023

Ted Gioia wonders if we need a “new Romanticism”

Filed under: Books, Europe, History, Media — Tags: , , , , , — Nicholas @ 05:00

He raised the question earlier this year, and it’s sticking with him to the point he’s gathering notes on the original Romantic movement and what it was reacting against:

The issues that enraged the original Luddites certainly have many modern echoes.

I realized that, the more I looked at what happened circa 1800, the more it reminded me of our current malaise.

  • Rationalist and algorithmic models were dominating every sphere of life at that midpoint in the Industrial Revolution — and people started resisting the forces of progress.
  • Companies grew more powerful, promising productivity and prosperity. But Blake called them “dark Satanic mills” and Luddites started burning down factories — a drastic and futile step, almost the equivalent of throwing away your smartphone.
  • Even as science and technology produced amazing results, dysfunctional behaviors sprang up everywhere. The pathbreaking literary works from the late 1700s reveal the dark side of the pervasive techno-optimism — Goethe’s novel about Werther’s suicide [Wiki], the Marquis de Sade’s nasty stories [Wiki], and all those gloomy Gothic novels [Wiki]. What happened to the Enlightenment?
  • As the new century dawned, the creative class (as we would call it today) increasingly attacked rationalist currents that had somehow morphed into violent, intrusive forces in their lives — a 180-degree shift in the culture. For Blake and others, the name Newton became a term of abuse.
  • Artists, especially poets and musicians, took the lead in this revolt. They celebrated human feeling and emotional attachments — embracing them as more trustworthy, more flexible, more desirable than technology, profits, and cold calculation.

That’s the world, circa 1800.

The new paradigm shocked Europe when it started to spread. Cultural elites had just assumed that science and reason would control everything in the future. But that wasn’t how it played out.

Resemblances with the current moment are not hard to see.

    “Imagine a growing sense that algorithmic and mechanistic thinking has become too oppressive. Imagine if people started resisting technology. Imagine a revolt against STEM’s dominance. Imagine people deciding that the good life starts with NOT learning how to code.”

These considerations led me, about nine months ago, to conduct a deep dive into the history of the Romanticist movement. I wanted to see what the historical evidence told me.

I’ve devoted hours every day to this — reading stacks of books, both primary and secondary sources, on the subject. I’ve supplemented it with a music listening program and a study of visual art from the era.

What’s my goal? I’m still not entirely sure.

November 15, 2023

“If you cannot make your own pig iron, you are just LARP’n as a real power”

Filed under: Britain, History, Technology — Tags: , , , , — Nicholas @ 04:00

CDR Salamander talks about the importance of an old industry to a modern industrial economy:

We probably need to start this out by explaining exactly what a blast furnace is and why it is important if you want to be a sovereign nation.

First of all, what it does:

    The purpose of a blast furnace is to chemically reduce and physically convert iron oxide into liquid iron called “hot metal”. The blast furnace is a huge, steel stack lined with refractory brick where iron ore, coke and limestone are charged into the top and preheated air is blown into the bottom. The raw materials require 6 to 8 hours to descend to the bottom of the furnace where they become the final products of liquid slag and liquid iron. These liquid products are drained from the furnace at regular intervals. The hot air that was blown into the bottom of the furnace ascends to the top in 6 to 8 seconds after going through numerous chemical reactions. Once the blast furnace is started it continuously runs for four to ten years with only short stops to perform planned maintenance.

Why are blast furnaces so important? Remember the middle part of Billy Joel’s “Iron, coke, chromium steel?”

“Coke” is in essence purified coal, almost pure carbon. It is about the only thing that can at scale make “new” or raw iron, aka “pig iron”. Only coke in a blast furnace can make enough heat to turn iron ore into iron. You can’t get that heat with an electric furnace.

Pig iron is the foundation of everything that follows that makes an industrial power. If you cannot make your own pig iron, you are just LARP’n as a real power.

It takes a semester at least to understand this, but here is all you really need to know:

    Primary differences

    While the end product from each of these is comparable, there are clearly differences between their capabilities and process. Comparing each type of furnace, the major distinctions are:

    Material source – blast furnaces can melt raw iron ore as well as recycled metal, while electric arc furnaces only melt recycled or scrap metal.

    Power supply – blast furnaces primarily use coke to supply the energy needed to heat up the metal, while EAFs use electricity to accomplish this.

    Environmental impact – because of the fuels used for each, EAFs can produce up to 85% less carbon dioxide than blast furnaces.

    Cost – EAFs cost less than blast furnaces and take up less space in a factory.

    Efficiency – EAFs also reach higher temperatures much faster and can melt and produce products more quickly, as well as having more precise control over the temperature compared to blast furnaces.

We’ll get to that environmental impact later, but the “Material source” section is your money quote.

Without a blast furnace, all you can do is recycle scrap iron.

You cannot fight wars at scale if all you have is scrap iron. You cannot be an industrial hub off of just scrap iron. If you are a nation of any size, you then become vulnerable at an existential level, both economically and in terms of security. I don’t care how much science fiction you get naked and roll in; wars are won by steel, ungodly amounts of steel.

Where do you get the steel to build your warships? Your tanks? Your factories? Your buildings?

If you can only use scrap, then you are simply a scavenger living off the hard work of previous generations. Eventually you run out. You will wind up like the cypress mills of old Florida where, once they ran out of cypress trees, they simply sold off the cypress lumber their mills were constructed of … and then went bankrupt.

July 29, 2023

The brief – but vastly profitable – heyday of Parys Mountain

Filed under: Britain, Business, History — Tags: , , , — Nicholas @ 04:00

In the latest Age of Invention newsletter, Anton Howes discusses the engine behind the meteoric rise of Britain’s “Copper King”, Thomas Williams:

Parys Mine Shaft. View down a shaft at Parys Mine.
Photo by Stephen Elwyn Roddick – CC BY-SA 2.0

At the time More visited, Thomas Williams had only just begun his rapid rise to power. He was already a major industrialist and had grown stupendously wealthy. When More asked about his stables, Williams apparently could not even estimate how many he possessed to the nearest ten. But Williams was not yet even master of the mountain.

Nonetheless, the mining was well underway. The closest port, Amlwch, was already connected to the mountain by a new road that had been built for the Parys Mine Company’s sole use. Having not long ago been a village of just six houses, Amlwch had turned into a bustling port.

The mine itself was a source of fascination. “This differs from any mine I had ever seen or perhaps is anywhere else to be found, for the ore here instead of being met with in veins is collected into one great mass, so that it is dug in quarries and brought out in carts without any shafts being sunk”. Instead, the miners hollowed out the mountain itself, forming vast caverns that they supported by simply leaving vast columns of the ore untouched. He noted at least four or five of these caverns with ceilings forty feet high, with columns of yellow ore: “the whole seemed like the ruins of some magnificent building whose pillars had been of massy brass.”

It’s a fascinating insight into what Parys would have very briefly looked like, because today there is so little of the mountain left. Indeed, some of the caverns More got to see were already collapsing, with the rubble then needing to be sorted. He describes how one such piece of rubble — a two-ton chunk of ore — had to be bored, the cavity rammed with gunpowder and sealed with stones, and then exploded. “They are continually blowing up parts of the mine”, he noted, and was informed that the part of the mine he was visiting alone got through 10-12 tons of gunpowder per year. The mountain was disintegrating, punctuated by the occasional boom.

And as though that were not dramatic enough, the whole place smelled like hell. When More visited there were some seventy vast kilns upon the mountain for calcining the ore, burning off its sulphur. Each kiln held some 2,000 tons of ore, and when ignited with a little dried vegetation or coal it was so sulphurous that it took four months of furious burning for the ore to be sufficiently calcined. He noted that one had to keep to the windward side of the kilns, as “the fumes arising from them are very disagreeable and destroy all vegetables for a considerable distance around them.”

May 27, 2023

The true purpose of the Great Exhibition of 1851

Filed under: Britain, History, Technology — Tags: , , , , — Nicholas @ 05:00

In the latest Age of Invention newsletter, Anton Howes considers the “why” of the 1851 Great Exhibition:

The Crystal Palace from the northeast during the Great Exhibition of 1851, image from the 1852 book Dickinsons’ comprehensive pictures of the Great Exhibition of 1851
Wikimedia Commons.

Ever since researching my book on the history of the Royal Society of Arts, I’ve been fascinated by the Great Exhibition of 1851, which they initiated. Like most people, I had once assumed that the exhibition was just a big celebration of Victorian technological superiority — a brash excuse to rub the British Industrial Revolution in the rest of the world’s faces. But my research into the origins of the event revealed that it was almost the opposite. Far from being a jingoistic expression of superiority, it was actually motivated by a worry that Britain was rapidly losing its place. It was an attempt to prevent decline by learning from other countries. It was largely about not falling behind.

Industrial exhibitions already had a long history in 1851, as a crucial weapon in other countries’ innovation policy arsenals. They were used by countries like France in particular — which held an exhibition every few years from 1798 — as a means of catching up with Britain’s technology. This sounds strange nowadays, when the closest apparent parallels are vanity projects like the Millennium Experience, the recent controversial “Festival of Brexit” that ended up just being a bunch of temporary visitor attractions all over the country, and glitzy mega-events like the World’s Fairs. But the World’s Fairs, albeit notional successors to the Great Exhibition, have strayed very far from the original vision and purpose. They’re now more about celebration, infotainment and national branding, whereas the original industrial exhibitions had concrete economic aims.

Industrial exhibitions were originally much more akin to specialist industry fairs, with producers showing off their latest products, sort of combined with academic conferences, with scientists demonstrating their latest advances. Unlike modern industry fairs and conferences, however, which tend to be highly specialised, appealing to just a few people with niche interests, industrial exhibitions showed everything, altogether, all at once. They achieved a more widespread appeal to the public by being a gigantic event that was so much more than the sum of its parts — often helped along by the impressive edifices that housed them. The closest parallel is perhaps the Consumer Electronics Show, held since 1967 in the United States. But even this only focuses on particular categories of industry, and is largely catered towards attendees already interested in “tech”. Industrial exhibitions were like the CES, but for everything.

The point of all this, rather than just being an event for its own sake, was to actually improve the things on display. This happened in a number of ways, each of them complementing the other.

Concentration generated serendipity. By having such a vast variety of industries and discoveries presented at the same event, exhibitions greatly raised the chances of serendipitous discovery. A manufacturer exhibiting textiles might come across a new material from an unfamiliar region, prompting them to import it for the first time. An inventor working on a niche problem might see the scientific demonstration of a concept that had not occurred to them, providing a solution.

Comparison bred emulation. Producers, by seeing their competitors’ products physically alongside their own, would see how things could be done better. They could learn from their competitors, with the laggards being embarrassed into improving their products for next time. And this could take place at a much broader, country-wide level, revealing the places that were outperforming others and giving would-be reformers the evidence they needed to discover and adopt policies from elsewhere.

Exposure shattered complacency. The visiting public, as users and buyers of the things on display, would be exposed to superior products. This was especially effective for international exhibitions of industry, of which the Great Exhibition was the first, and simulated an effect that had only ever really been achieved through expensive foreign travel — by being exposed to things they hadn’t realised could already be so much better than what they were accustomed to, consumers raised their standards. They forced the usual suppliers of their products to either raise their game or lose out to foreign ones.

May 20, 2023

QotD: Alienation

One of Marx’s most famous concepts, “alienation” initially meant “the systemic separation of a worker from the product of his labor”. The result of a craftsman’s labor is directly visible beneath his hands, growing by the day; when he’s done, the shirt (or whatever) sits there before him, fully finished. The factory worker, by contrast, is little more than a machine-tender; he pulls the lever, and the finished article is squirted out somewhere far down the line, automatically, by machine. His “labor” consists of lever-pulling and jam-clearing.

It was a real enough insight into the psychology of factory work, and Marx deserves all the credit he got for it, but “alienation” was even more useful in a broad social context — the separation of man from the cultural products of his society. After all, if capitalism is the mode of production around which society organizes itself, and the products of capitalism are by definition alienated from their producers, then by extension capitalist society must be alienated from itself. Indeed, what could “society” even mean, in a world of lever-pullers and bearing-lubers and jam-clearers?

Again, a profound and important insight into the social conditions of the Industrial Age. Ours is a mechanical, transactional world, one not well-suited to the kind of organism we are. That’s why Marxism and its spacey little brother Nazism are both what Jeffrey Herf calls “reactionary modernism.” The Communists thought they were the endpoint of the Enlightenment; the Nazis rejected it entirely; but both of them were curdled Romantics, in love with Enlightenment science while terrified of that science’s society. Lenin said that Communism was “Soviet power plus electrification”. Goebbels wasn’t that pithy, but “the feudal system plus autobahns” is pretty much what he meant by Nazism, and both boil down to “medieval peasant villages with air conditioning”.

That the one excludes the other — necessarily, comrade, necessarily, in the full Hegelian sense of the word — never occurred to either of them, but that shouldn’t really be held against them, since both of them were determined to freeze the world exactly as it was. Both were so terrified of individuality that they were determined to stamp it out, not realizing that individuality was the only thing that made their fantasy worlds possible. Medieval peasants who were happy being medieval peasants never would’ve invented air conditioning in the first place, nicht wahr?

Severian, “Alienation”, Rotten Chestnuts, 2020-10-29.

April 4, 2023

When the steam engine itself was an “intangible”

Filed under: Britain, History, Technology — Tags: , , , , — Nicholas @ 05:00

In the latest Age of Invention newsletter, Anton Howes explains why the steam engine patent of James Watt didn’t immediately lead to Watt and his partner Matthew Boulton building a factory to create physical engines:

Diagram of a Watt steam engine from Practical physics for secondary schools (1913).
Wikimedia Commons.

… one of the most famous business partnerships of the British Industrial Revolution — that between Matthew Boulton and James Watt from 1775 — was originally almost entirely based on intangibles.

That probably sounds surprising. James Watt — a Scottish scientific instrument-maker, chemist and civil engineer — became most famous for his improvements to the steam engine, an almost archetypal example of physical capital. In the late 1760s he radically improved the fuel efficiency of the older Newcomen engine, and then developed ways to regulate the motions of its piston — traditionally applied only to pumping water — so that it could be suitable for directly driving machinery (I’ll write more on the invention itself soon). His partnership with Matthew Boulton, a Birmingham manufacturer of buttons, candlesticks, metal buckles and the like — then called “toys” — also operated from a large, physical site full of specialised machinery: the Soho Manufactory. On the face of it, these machines and factories all sound very traditionally tangible.

But the Soho Manufactory was largely devoted to Boulton’s other, older, and ongoing businesses, and it was only much later — over twenty years after Boulton and Watt formally became partners — that they established the Soho Foundry to manufacture the improved engines themselves. The establishment of the Soho Foundry heralded James Watt’s effective retirement, with the management of this more tangible concern largely passing to his and Boulton’s sons. And when Watt retired formally, in 1800, this coincided with the full depreciation of the intangible asset upon which he and Boulton had built their business: his patent.

Watt had first patented his improvements to the steam engine in 1769, giving him a 14-year window in which to exploit them without any legal competition. But his financial backer, John Roebuck, who had a two-thirds share in the patent, was bankrupted by his other business interests and struggled to support the engine’s development. Watt thus spent the first few years of his patent monopoly as a consultant on various civil engineering projects — canals, docks, harbours, and town water supplies — in order to make ends meet. The situation gave him little time, capital, or opportunity to exploit his steam engine patent until Roebuck was eventually persuaded to sell his two-thirds share to Matthew Boulton. With just eight years left on the patent, and having already wasted six, Boulton and Watt lobbied Parliament to grant them an extension that would allow them to bring their improvements into full use. In 1775 Watt’s patent was extended by Parliament for a further twenty-five years, to last until 1800. It was upon this unusually extended patent that they then built their unusually and explicitly intangible business.

How was it intangible? As Boulton and Watt put it themselves, “we only sell the licence for erecting our engines, and the purchaser of such licence erects his engine at his own expence”. This was their standard response to potential customers asking how much they would charge for an engine with a piston cylinder of particular dimensions. The answer was, essentially, that they didn’t actually sell physical steam engines at all, so there was no way of estimating a comparable figure. Instead, they sold licences to the improvements on a case-by-case basis — “we make an agreement for each engine distinctly” — by first working out how much fuel a standard, old-style Newcomen engine would require when put to use in that place and context, and then charging only a third of the saving in fuel that Watt’s improvements would provide. “The sum therefore to be paid during the working of any engine is not to be determined by the diameter of the cylinder, but by the quantity of coals saved and by the price of coals at the place where the engine is erected.” They fitted the licensed engines with meters to see how many times they had been used, sending agents to read the meters and collect their royalties every month or year, depending on the location.

This method of charging worked well for refitting existing Newcomen engines with Watt’s improvements — in those cases the savings would be obvious. It also meant that Boulton and Watt incentivised themselves to expand the total market for steam engines. The older Newcomen engines were mainly used for pumping water out of coal mines, where the coal to run them was at its cheapest. It was one of the few places where Newcomen engines were cost-effective. But for Watt and Boulton it was at the places where coals were most expensive, and where their improvements could thus make the largest fuel savings, that they could charge the highest royalties. As Boulton wrote to Watt in 1776, the licensing of an engine for the coal mine of one Sir Archibald Hope “will not be worth your attention as his coals are so very cheap”. It was instead at the copper and tin mines of Cornwall, where coal was often expensive, having to be transported from Wales, that the royalties would be the most profitable. As Watt put it to an old mentor of his in 1778, “our affairs in other parts of England go on very well but no part can or will pay us so well as Cornwall”.
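Boulton and Watt's pricing rule (a third of the coal saved relative to a Newcomen engine, valued at the local price of coal) can be sketched in a few lines. All figures below are hypothetical, purely to illustrate why the same engine earned very different royalties in different places:

```python
def annual_royalty(newcomen_coal_tons, watt_coal_tons, coal_price_per_ton):
    """Boulton & Watt's rule: charge one third of the value of coal saved.

    The royalty depends not on the engine's dimensions but on the fuel
    saving and the local price of coal, as the partners told customers.
    """
    saving = (newcomen_coal_tons - watt_coal_tons) * coal_price_per_ton
    return saving / 3

# Hypothetical Cornish copper mine: coal shipped in from Wales is dear.
cornwall = annual_royalty(newcomen_coal_tons=500, watt_coal_tons=125,
                          coal_price_per_ton=30)
# Hypothetical colliery: coal at the pithead is nearly free.
colliery = annual_royalty(newcomen_coal_tons=500, watt_coal_tons=125,
                          coal_price_per_ton=2)

print(cornwall)  # 3750.0
print(colliery)  # 250.0
```

The identical fuel saving is worth fifteen times more where coal is expensive, which is exactly why the partners judged a colliery licence "not worth your attention" but chased business in Cornwall.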

June 15, 2022

QotD: The gobsmacking magnitude of “The Great Enrichment”

Filed under: Economics, History, Quotations — Tags: , , — Nicholas @ 01:00

Serious growth happened only after 1800, at first in northwestern Europe, 2% per capita in PPP [purchasing power parity] conventionally adjusted for inflation, as in the USA 1800–present, and now the world. Its magnitude is enormous, the Great Enrichment. It was a rise from $2 or $3 a day to over $100, a factor of 30. (I recently had to explain to a justly famous anthropologist that [(30–1) / 1] x 100 is 2,900%, or about 3,000%. He said that he could believe a factor of 30 … but not 3,000%.)

The exactitude, of course, is inessential. In Japan and Finland it was roughly the factor of 30. But it could be the worldwide factor since 1800 of 10 only, about $2 or $3 to $30 a day (to $10,000 a year, the level of Brazil now, to fix ideas), and still be utterly novel. As a Brit might say, the Great Enrichment was gobsmacking.

The enrichment was actually much greater than the factor of 30, because price indices, especially recently, do not adequately reflect improvements in quality, as was determined in the early 1990s by the Boskin Commission … Consider your cell phone, your auto tires, your medical treatment — all greatly better, recently. Even economic facts and analyses are better. (Well, sometimes.) The downward bias from inadequately deflating money prices for improved quality is not far from 2% per year, which would double recent growth rates in the rich countries.

Its magnitude, novelty, recency, and location are all crucial to explaining the Great Enrichment, because together they strongly suggest that there was something deeply peculiar about Britain in the 18th century, and that afterwards the peculiarity spread to the rest of the world. Such facts make “run-up” theories such as in Stephen Broadberry et alii look implausible, because they depend on a metaphor of an airplane taking off, with little else by way of explanation for why the Industrial Revolution (a factor of 2) happened or, especially, its follow-on the Great Enrichment (a factor of 20 or 30). Likewise, it is dubious to attach the Great Enrichment to remote causes within Europe, such as the Black Death — which originated in China, with similar terrors, and yet yielded no Great Enrichment there. Also dubious is the Eurocentric belief, prominent in conservative circles, of some ancient superiority of melanin-challenged Volk back in the Black Forest. (Did you know, for example, that all European countries had common law in the Middle Ages, that is, judge-found-and-made, not legislated or codified?)

The Great Enrichment is the second most important secular event in human history, second only to the domestication of plants and animals making for cities and literacy.

Deirdre McCloskey, “How Growth Happens: Liberalism, Innovism, and the Great Enrichment (Preliminary version)” [PDF], 2018-11-29.
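The factor-to-percentage conversion that McCloskey spells out for her anthropologist can be checked mechanically:

```python
def growth_factor_to_pct(factor):
    """Convert a multiplicative growth factor into a percentage increase:
    [(factor - 1) / 1] * 100, as in McCloskey's worked example."""
    return (factor - 1) / 1 * 100

print(growth_factor_to_pct(30))  # 2900.0, i.e. "about 3,000%"
print(growth_factor_to_pct(10))  # 900.0, the conservative worldwide figure
print(growth_factor_to_pct(2))   # 100.0, the Industrial Revolution's "factor of 2"
```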

May 1, 2022

The Victorian-era “guarantee fund” model for risky enterprises

In the latest Age of Invention newsletter, Anton Howes wonders why we don’t have a modern equivalent to the funding mechanism that helped create the Great Exhibition of 1851 and other events that provided benefits to the public without government backing:

The Crystal Palace from the northeast during the Great Exhibition of 1851.
Image from the 1852 book Dickinsons’ comprehensive pictures of the Great Exhibition of 1851 via Wikimedia Commons.

As I’ve mentioned before, exhibitions of industry were not just celebrations of technological progress, but could become engines for progress as well. For the inventors, artists, and engineers who exhibited, the events were a direct inducement to improvement. And for the public who visited, the events exposed them to what was possible, encouraging them to raise their demands as both consumers and citizens, ideally inspiring them to become future innovators too.

But how was it all paid for? Unlike its national-level precursors in France, the Great Exhibition was not a state-run event. Even more remarkably, its organisers also failed to raise anywhere near enough private subscriptions to cover their costs. Instead, they used something called a “guarantee fund”.

Instead of asking for donations from supporters up-front, the organisers asked them to commit to covering the exhibition’s potential losses up to certain amounts — to be paid only if the money was required. Based on the security provided by this crowdsourced guarantee fund, the organisers then raised an ordinary bank loan in order to get the cash they needed to actually hold the event. Crucially, the guarantors didn’t actually have to spend anything unless the event made a loss, and if the event broke even or even made a surplus thanks to ticket fees, then they would never spend a penny at all. (Luckily for them, that’s exactly what happened in 1851, and for many later exhibitions too.)

What’s interesting to me about the guarantee fund is that I can’t quite think of anything quite like it today. There are perhaps more individualised versions of it, like when a neighbour or friend acts as a guarantor for a mortgage. And governments sometimes provide guarantees for certain sectors or industries too. There have also been a few profit-making versions of it in certain industries, where the guarantors potentially get some share of the upside too (“Names” at the Lloyd’s of London insurance and reinsurance market sounds similar, though even these are disappearing). But I’ve not seen anything like what the Victorians did, essentially using a guarantee fund to leverage philanthropy.

This is surprising to me. It seems like it has a lot of major advantages, especially for those who might want to replicate the exhibitions of industry today, or indeed for any kind of capital-intensive philanthropic endeavour that could eventually be expected in some measure to pay for itself. (I can’t help but think it would be useful in efforts to speed up the de-carbonisation of the economy, for example — a potential application that I’ve been exploring in my conversations with the people at Carbon Upcycling.)

Consider that with a guarantee fund anyone able to afford the risk could considerably increase the philanthropic value of their assets. Say that you could afford to donate £100 right away, but could donate three times that amount at a pinch (e.g. by having to liquidate some funds in shares). You could thus guarantee £100 each to three different causes, potentially without ever actually having to donate it, and knowing that in the worst case scenario you would never have to spend more than the £300 you can afford.

After all, those signing up to the guarantee fund essentially chose what their maximum liability would be if the event were to make a loss. If they were confident in the event’s success, then they probably believed that they would not have to pay anything at all. And if not, they had at least named the maximum donation they might eventually be asked to give.
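The mechanics Howes describes (each guarantor names a maximum liability, and pays only if the event makes a loss) can be sketched as follows. The pro-rata split and all names and figures are assumptions for illustration; the excerpt doesn't say how a loss would actually have been apportioned among guarantors:

```python
def call_guarantors(pledges, loss):
    """Split a loss across guarantors in proportion to their pledges.

    Each guarantor pays at most what they pledged; if the event breaks
    even or turns a surplus (loss <= 0), nobody pays anything at all.
    """
    if loss <= 0:
        return {name: 0.0 for name in pledges}
    total = sum(pledges.values())
    # A loss beyond the fund's total is capped at each guarantor's pledge.
    share = min(loss / total, 1.0)
    return {name: amount * share for name, amount in pledges.items()}

pledges = {"A": 100, "B": 100, "C": 300}

print(call_guarantors(pledges, loss=-50))  # surplus: everyone pays 0.0
print(call_guarantors(pledges, loss=250))  # {'A': 50.0, 'B': 50.0, 'C': 150.0}
```

In 1851 the happy case held: ticket receipts covered costs, so the pledges were never drawn on.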
