Writing about those rioters who in the summer of 2011 smashed, burned and looted shops across Britain, [Russell] Brand writes that their actions were no worse than the consumerism which he describes as having been “imposed” upon them. And this, I cannot help thinking, is an especially revealing phrase — entirely at one with a popular world view. That view sees “us” as poor victims of forces and temptations which are not only pushed upon us, but to which, when they are pushed upon us long enough, we will inevitably and necessarily succumb. If you are in a “consumerist” society long enough how could you be expected to just not buy crap you can’t afford when you don’t need it? No — the answer must be that of course you will succumb. And from there any bad behaviour — even looting and burning — will be excused because it will be someone else’s fault.
This is the world view of an addict. And the answer the addict Brand gives to all our society’s problems is one answer which some addicts seek for their addiction — which is that everyone is to be blamed for their failings except themselves. Grand conspiracy theories and establishment plots offer great promise and comfort to such people. They suggest that when we fail or when we fall we do so never because of any conceivable failing or inability of our own, but because some bastard — any bastard — made us do it, has been planning to do it and perhaps always intended to do so. Of course the one thing missing in all this — the one thing that doesn’t appear in either of these books or in any of their conspiratorial and confused demagogic world view — is the only thing which has saved anyone in the past and the only thing which will save anybody in the future: not perfect societies, perfectly engineered economies and perfectly equal, flattened-out collective-based societies, but human agency alone.
Douglas Murray, “Don’t Listen to Britain’s Designer Demagogues”, Standpoint, 2015-01.
February 6, 2016
February 2, 2016
Published on 1 Feb 2016
The execution of British nurse Edith Cavell by German soldiers in 1915 was instrumental to British propaganda at that time, and the story became legend. But who was Edith Cavell really? Find out more about the humble nurse in Brussels and whether she was really a spy after all.
February 1, 2016
“I’ll find friends to wear my bleeding roses,” cries Edmund Beaufort, Duke of Somerset, in Harey the vjth. Standing in a rose garden, he has plucked a red flower from a great bush that stands between him and his nemesis, Richard Plantagenet, Duke of York. York has selected a white rose – “with this maiden blossom in my hand/I scorn thee,” he spits – and the noblemen standing by have followed suit, choosing the colour of their rose to advertise their allegiance.
In 1592, this image made perfect sense. This was how the Wars of the Roses were generally understood. Against the backdrop of weak kingship and disastrous military defeat in France, two rival branches of the Plantagenet dynasty – Lancaster and York – had gone to war for the throne, using red and white roses as emblems of their causes. The war had shattered the country, causing tens of thousands of deaths and incalculable misery.
Only after decades of chaos had the family rift been healed by the victory of a Lancastrian, Henry Tudor, over a Yorkist, Richard III, at Bosworth in 1485. Henry’s victory, and his subsequent marriage to Elizabeth of York, reconciled the warring factions. Thus had been created the red-and-white ‘Tudor rose’ that seemed to be painted everywhere, reminding the populace that the Tudors stood for unity, reconciliation, peace and the incontestable right to rule.
It was a powerful and easily grasped story that, by Shakespeare’s day, had already been in circulation for 100 years. And, in part thanks to the success of Shakespeare’s brilliant cycle of history plays, this vision of the Wars of the Roses remains in circulation – on television, in film and in popular historical fiction. Lancaster versus York, red versus white: it is a story as easy to grasp as a football match at the end of which everyone swaps shirts. Yet it is misleading, distorted, oversimplified and – in parts – deliberately false.
January 30, 2016
Published on 17 Apr 2014
Tanks were invented by the British during the First World War. Historian Dan Snow traces their development, from prototype to battlefield fixture.
January 24, 2016
Last month, Save the Royal Navy looked at the aircraft that will fly off the decks of HMS Queen Elizabeth and HMS Prince of Wales:
The SDSR stated that 42 F35-B Lightning aircraft will be delivered by 2023. These 42 aircraft form the carrier’s main armament. A foolish political fudge has given the RAF control of the Lightnings, to be jointly manned and operated with the RN. For the Government, this conveniently boosts the RAF’s ORBAT while allowing the same aircraft to be counted again as part of the carrier’s equipment. Although the RAF may not like it, the needs of the carriers will have to dictate their operation. There is simply no place for the “part time carrier aviator”. The aircrew need as much time at sea as possible to develop their own skills, the skills of aircraft handlers, the ship’s company and the fleet as a whole. Like all RN vessels the carriers will operate at a demanding operational tempo and need aircraft embarked for much of the time. Any RAF inclination to use the aircraft in the land-based deep strike role will have to be second priority.
The initial 42 Lightnings will be split between 2 frontline squadrons: 809 Naval Air Squadron and RAF 617 Squadron, with around 15-20 aircraft each, building up to the full strength of 24 per squadron. There will also be a requirement for at least 5 aircraft to form an OCU (Operational Conversion Unit, for training). An OEU (Operational Evaluation Unit, for testing and trials) will also require a few aircraft. Allowing for a sustainment fleet of aircraft in deep maintenance etc, it is clear that many more than 42 aircraft are needed to form just 2 full-strength squadrons. Between 2010 and 2014 the received wisdom was that the UK would only ever purchase a maximum of 48 F35-B, but the SDSR announced a planned eventual purchase of as many as 138. This is good news which should give some strength-in-depth, potentially providing 2 more squadrons. Both the RN and the RAF should be able to fulfil their ambitions for the Lightning. Whether the RAF will push for a purchase of the conventional F35-A, which would not be compatible with the carrier but has slightly better range and performance than the VSTOL variant, is a discussion for the future.
Of course the caveat to all this good news is the actual performance of the F35. There are armies of armchair F-35 critics and many of their concerns are valid. Although it may prove to be a poor “within visual range” fighter, its networking, sensors, stealth and strike capabilities will be a giant advance over any previous UK military aircraft. Furthermore, the RN has a fine track record of taking equipment with many apparent deficiencies and turning them into a great success. (Fairey Swordfish, anyone?)
On the other hand, Ben Ho Wan Beng argues that the carriers will not actually be able to project much power:
A tactical combat aircraft complement of 12, or even 15-20, is rather small for traditional carrier operations, especially force-projection ones that are likely to predominate considering the SDSR’s expeditionary-warfare slant. Indeed, it is worth considering the fact that the two British small-deck carriers involved in the Falklands War carried 20-odd Harrier jump jets each, and they were only about a third the size of the Queen Elizabeth-class ships.
In fact, each new carrier might even be operating with a fighter complement much smaller than 15-20 in the years leading up to 2023, giving the lie to the phrase “in force” used by George Osborne when he spoke of equipping the carriers with significant airpower.
In any case, the small fighter constituent means that if the Queen Elizabeth carrier were to get involved in a conflict with an adversary with credible anti-access/area-denial (A2/AD) capabilities, the vessel would be hard-pressed to protect itself, let alone project power. With a displacement of over 70,000 tons and costing over three billion pounds each, the new British carriers will be the crown jewels of the Royal Navy; indeed, HMS Queen Elizabeth is slated to be the RN’s flagship when she comes into service. The protection of the ship would hence be of paramount importance in an era that has witnessed the proliferation of A2/AD capabilities even to developing nations. Hence for a Queen Elizabeth carrying 20 or fewer Lightnings in such circumstances, it remains to be seen just how many of the aircraft will be earmarked for different duties.
Should a F-35B air group of that size put to sea, at least half of them will be assigned to the Combat Air Patrol (CAP), leaving barely 10 for offensive duties. It is worth noting that of the 42 Harrier VSTOL jets deployed on HMS Hermes and HMS Invincible during the Falklands War, 28 of them – a substantial two-thirds – had CAP as their primary duty. It is also telling that of the 1,300-odd sorties flown in all by the Harriers, about 83 per cent of them were for CAP.
Faced with modern A2/AD systems such as stand-off anti-ship missiles, how likely is it that the carrier task force commander would devote more resources to offense and risk having a vessel named after British royalty attacked and hit? That said, devoting too many planes to defense strengthens the argument made by various carrier critics that the ship is a “self-licking ice cream cone,” in other words, an entity that exists solely to sustain itself.
The task force commander would thus be caught between a rock and a hard place. Allocate more F-35Bs to strike missions and the susceptibility of the task force to aerial threats increases. Conversely, set aside more aircraft for the CAP and the mother ship’s ability to project power decreases. All in all, with a significantly understrength F-35B air wing, the Queen Elizabeth flat-top would be operating under severe constraints, making it incapable of the traditional carrier operations it could have carried out with a larger tactical aircraft complement. Indeed, one naval commentator is right on the mark when he argues that two squadrons with a total of 24 aircraft should be a “sensible minimum standard” for each carrier.
January 22, 2016
Published on 21 Jan 2016
The Russians try to take Czernowitz, the capital of Austrian Bukovina, but thousands upon thousands of Russians are killed in action. Meanwhile, Austro-Hungarian troops under commander in chief Franz Conrad von Hötzendorf take control of the Balkan state of Montenegro. A relief force led by Lieutenant-General Fenton Aylmer has to return to base after a heavy defeat against the Turks, while in South Cameroon the Germans retire into Spanish territory.
January 19, 2016
Matt Ridley on the legacy of Capability Brown:
Next year marks the 300th anniversary of the birth of Lancelot Brown at Kirkharle, in Northumberland — the man who saw “capability” in every landscape and indefatigably transformed England. In his 280 commissions, Capability Brown stamped his mark on some 120,000 acres, tearing out walls, canals, avenues, topiary and terraces to bring open parkland, with grassy tree-topped hills and glimpses of sinuous, serpentine lakes, right up to the ha-has of country houses.
Brown was not the first to design informal and semi-naturalistic landscapes: he followed Charles Bridgeman and William Kent. But he was by far the most prolific and influential. His is a type of landscape that is now imitated in parks all round the world, from Dubai to Sydney to Europe: it’s known as “jardin anglais” and was admired by Catherine the Great and Thomas Jefferson.
Frederick Law Olmsted laid out Central Park in New York in conscious emulation of Brown — as John Nash did with St James’s Park (Hyde Park is by Bridgeman). Golf courses nearly always pay unconscious homage to Brown. There is something deeply pleasing about a view of rolling grassland punctuated with clumps of low-branching trees and glimpses of distant water.
Mountains may have more majesty, forests more fear, deserts more danger, townscapes more detail, fields more fruitfulness, formal gardens more symmetry — but it is the informal English parkland of Capability Brown that you would choose for a picnic, or for a visit with a potential lover. It feels natural.
And yet of course it is wholly contrived. One of Tom Stoppard’s characters explains to another in his play Arcadia, as they contemplate the view of a park from a country house:
BERNARD: Lovely. The real England.
HANNAH: You can stop being silly now, Bernard. English landscape was invented by gardeners imitating foreign painters who were evoking classical authors. The whole thing was brought home in the luggage from the grand tour. Here, look — Capability Brown doing Claude, who was doing Virgil. Arcadia!
Hannah’s right. Claude Lorrain’s paintings of scenes from Virgil were all the rage in the 1730s. By the 1740s, when Brown started work at Stowe under William Kent, prints of 44 of Claude’s landscapes were on sale in London. The landscape at Stourhead (not by Brown), with its Grecian temples seen across lakes, is little more than a copy of Claude’s Aeneas at Delos. Kent’s genius, inspired by Lord Burlington and Alexander Pope, was to supply this craving for classical rural Arcadia.
The satirist Richard Cambridge joked that he wanted to die before Capability Brown so that he could see heaven before it was “improved”. In 2016 — the date of Brown’s birth is unknown; we have only the date of his baptism, August 30 — I shall raise a glass to a humbly born county boy, who mixed Northumberland with the Serengeti to produce Arcadia and gave us the archetypical English landscape.
January 17, 2016
During the Second World War, the British government constructed offshore anti-aircraft installations to extend protection beyond the range of land-based guns and radar. One of these forts is reportedly going to be converted into a luxury resort:
The Maunsell Forts built to defend the British Isles from Nazi invasion are in line for an upmarket make-over.
Billed as the next top-notch luxury resort, the forts will come complete with executive apartments, a helipad and even a spa at sea.
The Second World War era forts off the coast of Kent were constructed in 1943, and operated by both Army and Navy personnel.
January 16, 2016
Brendan O’Neill on the attempt to portray David Bowie’s career as something other than music, showbiz, and a set of unevenly brilliant self-marketing abilities:
Poor David Bowie. Barely 72 hours dead and he’s already being misremembered. Turn on the TV and you’ll see cultural talking heads telling the world he was the granddaddy of transgenderism. Open a newspaper and you’ll come across 800-word PhD theses masquerading as op-eds, informing us Bowie paved the way for the “gender fluidity” of the 21st century, the fashion for declaring oneself neither male nor female, but rather non-binary, or genderqueer, and whatever the other post-gender labels are. (It’s easy to lose track. Last year Facebook increased its gender options from 50 to 71, overnight. Presumably some professor suddenly discovered 21 hitherto unknown genders.)
It is a blot on Bowie’s good name to link him with the politics of transgenderism. Just because in the early Seventies he rocked the cultural world by coating himself in makeup and donning dayglo jumpsuits with vertigo-inducing platform shoes, that doesn’t mean he was transgender, far less that he facilitated modern transgenderism. On the contrary, there’s a stark difference between Bowie’s cross-dressing antics and today’s seemingly catching gender dysphoria: Where Bowie and other queens and freaks in the Sixties and Seventies were flipping a beautifully manicured finger at authority, modern transgenderism seeks to become its own form of authority, chastising and censoring those who dare dissent from its theology. The glam crowd broke boundaries; the trans elite enforces new ones.
Bowie’s death had barely been tweeted before people were hailing him as the trans messiah. A British newspaper said that 40 years ago Bowie had flown “the flag for the non-binary movement.” Which is patent nonsense, since nobody — certainly not this contrarian lad from Brixton in South London — was using the turgid phrase “non-binary” in the early 1970s.
January 15, 2016
Matt Ridley on how horrible implementations of the ideas of Thomas Malthus have made the world an even more cruel place:
For more than 200 years, a disturbingly vicious thread has run through Western history, based on biology and justifying cruelty on an almost unimaginable scale. It centres on the question of how to control human population growth and it answers that question by saying we must be cruel to be kind, that ends justify means. It is still around today; and it could not be more wrong. It is the continuing misuse of Malthus.
According to his epitaph in Bath Abbey, the Rev Thomas Robert Malthus, author of An Essay on the Principle of Population (1798), was noted for “his sweetness of temper, urbanity of manners and tenderness of heart, his benevolence and his piety”. Yet his ideas have justified some of the greatest crimes in history. By saying that, if people could not be persuaded to delay marriage, we would have to encourage famine and “reprobate specific remedies for ravaging diseases”, he inadvertently gave birth to a series of heartless policies — the poor laws, the British government’s approach to famine in Ireland and India, social Darwinism, eugenics, the Holocaust, India’s forced sterilisations and China’s one-child policy. All derived their logic more or less directly from a partial reading of Malthus.
To this day if you write or speak about falling child mortality in Africa, you can be sure of getting the following Malthusian response: but surely it’s a bad thing if you stop poor people’s babies dying? Better to be cruel to be kind. Yet actually we now know, this argument is wrong. The way to get population growth to slow, it turns out, is to keep babies alive so people plan smaller families: to bring health, prosperity and education to all.
Britain’s Poor Law of 1834, which attempted to ensure that the very poor were not helped except in workhouses, and that conditions in workhouses were not better than the worst in the outside world, was based explicitly on Malthusian ideas — that too much charity only encouraged breeding, especially illegitimacy, or “bastardy”. The Irish potato famine of the 1840s was made infinitely worse by Malthusian prejudice shared by the British politicians in positions of power. The Prime Minister, Lord John Russell, was motivated by “a Malthusian fear about the long-term effect of relief”, according to a biographer. The Assistant Secretary to the Treasury, Charles Trevelyan, had been a pupil of Malthus at the East India Company College: famine, he thought, was an “effective mechanism for reducing surplus population” and a “direct stroke of an all-wise and all-merciful Providence” sent to teach the “selfish, perverse and turbulent” Irish a lesson. Trevelyan added: “Supreme Wisdom has educed permanent good out of transient evil.”
In India in 1877, a famine killed ten million people. The viceroy, Lord Lytton, quoted almost directly from Malthus in explaining why he had halted several private attempts to bring relief to the starving: “The Indian population has a tendency to increase more rapidly than the food it raises from the soil.” His policy was to herd the hungry into camps where they were fed on — literally — starvation rations. Lytton thought he was being cruel to be kind.
January 14, 2016
Published on 17 Apr 2014
An army is as good as the kit its soldiers use. In 1914, which army was the best equipped? Historian Dan Snow finds out.
They were not, as The Times correspondent claims, there to protect the wearer from rifle or machine-gun bullets. Indeed, as I understand it, even modern helmets are not always proof against high-velocity rounds. What they were there to do was to protect soldiers from shrapnel. Shrapnel, in case you didn’t already know, is the collective noun for steel balls being expelled from an air-bursting (or Shrapnel) shell. It was a huge killer in the First World War and the steel helmet did a great deal to save lives.
One of the good things about the Brodie helmet – as it is sometimes known – is that it had an internal harness. This meant that if the helmet was dented, the dent was not necessarily reproduced in the wearer’s skull.
On the shape, however, with a wide brim and no neck protection, I have always been in two minds. On the one hand, if the threat is from above you would have thought the shape was a good thing as it covers a large part of the wearer’s body. It is also easy to make. On the other hand, British helmets over the last 100 years have progressively given more neck protection which sounds like the British Army’s way of saying they got it wrong.
By the way, in my limited experience both steel and more modern Kevlar helmets are a pain in the arse to wear. You either can’t see anything from a prone position or you can’t see anything from a prone position and get a headache.
Patrick Crozier, “The British army gets steel helmets”, Samizdata, 2015-12-02.
January 13, 2016
ESR has a theory on the rapid decline of the duelling culture that had lasted hundreds of years until the mid-19th century:
I’ve read all the scholarship on the history of dueling I can find in English. There isn’t much, and what there is mostly doesn’t seem to me to be very good. I’ve also read primary sources like dueling codes, and paid a historian’s attention to period literature.
I’m bringing this up now because I want to put a stake in the ground. I have a personal theory about why Europo-American dueling largely (though not entirely) died out between 1850 and 1900 that I think is at least as well justified as the conventional account, and I want to put it on record.
First, the undisputed facts: dueling began a steep decline in the early 1840s and was effectively extinct in English-speaking countries by 1870, with a partial exception for American frontier regions where it lasted two decades longer. Elsewhere in Europe the code duello retained some social force until World War I.
This was actually a rather swift end for a body of custom that had emerged in its modern form around 1500 but had roots in the judicial duels of the Dark Ages a thousand years before. The conventional accounts attribute it to a mix of two causes: (a) a broad change in moral sentiments about violence and civilized behavior, and (b) increasing assertion of a state monopoly on legal violence.
I don’t think these factors were entirely negligible, but I think there was something else going on that was at least as important, if not more so, and has been entirely missed by (other) historians. I first got to it when I noticed that the date of the early-Victorian law forbidding dueling by British military officers – 1844 – almost coincided with (following by perhaps a year or two) the general availability of percussion-cap pistols.
The dominant weapons of the “modern” duel of honor, as it emerged in the Renaissance from judicial and chivalric dueling, had always been swords and pistols. To get why percussion-cap pistols were a big deal, you have to understand that loose-powder pistols were terribly unreliable in damp weather and had a serious charge-containment problem that limited the amount of oomph they could put behind the ball.
This is why early-modern swashbucklers carried both swords and pistols; your danged pistol might very well simply not fire after exposure to damp northern European weather. It’s also why percussion-cap pistols, which seal the powder charge inside a brass casing, were first developed for naval use, the prototype being Sea Service pistols of the Napoleonic era. But there was a serious cost issue with those: each cap had to be made by hand at eye-watering expense.
Then, in the early 1840s, enterprising gunsmiths figured out how to mass-produce percussion caps with machines. And this, I believe, is what actually killed the duel. Here’s how it happened…
January 12, 2016
Colby Cosh on the career(s) of David Bowie:
When we look at the enduring core of what we still awkwardly refer to as “rock music,” what we find is bizarre: a group of people born between about 1940 and 1948, mostly on one island. This cluster of neighbours took black American folk music and electric instruments and used them to hammer out a musical language whose vocabulary and power eventually rivalled that of the old Western orchestral tradition.
Somehow the seeds of war fell on England and sowed giants. It can’t just have been the bombs, even if Bowie did use the V-2 as a metaphor on the Heroes LP. (Has any single person, in retrospect, done so much to make Germany cool? They need to give Bowie a monument the size of the Hermannsdenkmal.)
The awful news of Bowie’s demise on Sunday reveals that he was not quite the same kind of star as notional equals like Ray Davies, Pete Townshend or even Mick Jagger. He was a songwriter of the first rank, an omnivore and a multi-instrumentalist whose taste in collaborators is a legend unto itself — yet his involvement in music seems almost circumstantial. It was the thing one happened to do, born where and when he was.
He gave so many excellent cinema and television performances that one suspects pop stardom’s gain might have been acting’s loss. What might Bowie have achieved if circumstances had steered him into the Royal Shakespeare Company instead of blues clubs? Is there an alternate reality in which people are reminiscing about Sir David Jones’s Richard III and regretting that he never got around to Lear?
One is tempted to add that Bowie could have been a great fashion designer or conceptual artist — but, then, he was both those things, when he wasn’t, ho-hum, dashing off “Life on Mars” or “Sound and Vision.” He was not of the fashion world as such, but slip out on any night, in any city of size from Tokyo to Toronto, and you’ll see homages to Bowie. Any ambitious, expensive pop concert still follows the Bowie idiom. His imprint on world culture is rivalled by few other pop stars, and perhaps none has its breadth. One is tempted to invoke Elvis or Dylan.