The most important weapons of al-Qaeda and the rest of the Islamist terror network are the suicide bomber and the suicide thinker. The suicide bomber is typically a Muslim fanatic whose mission it is to spread terror; the suicide thinker is typically a Western academic or journalist or politician whose mission it is to destroy the West’s will to resist not just terrorism but any ideological challenge at all.
But al-Qaeda didn’t create the ugly streak of nihilism and self-loathing that afflicts too many Western intellectuals. Nor, I believe, is it a natural development. It was brought to us by Department V of the KGB, which was charged during the Cold War with conducting memetic warfare that would destroy the will of the West’s intelligentsia to resist a Communist takeover. This they did with such magnificent effect that the infection outlasted the Soviet Union itself and remains a pervasive disease of contemporary Western intellectual life.
Consider the following propositions:
- There is no truth, only competing agendas.
- All Western (and especially American) claims to moral superiority over Communism/Fascism/Islam are vitiated by the West’s history of racism and colonialism.
- There are no objective standards by which we may judge one culture to be better than another. Anyone who claims that there are such standards is an evil oppressor.
- The prosperity of the West is built on ruthless exploitation of the Third World; therefore Westerners actually deserve to be impoverished and miserable.
- Crime is the fault of society, not the individual criminal. Poor criminals are entitled to what they take. Submitting to criminal predation is more virtuous than resisting it.
- The poor are victims. Criminals are victims. And only victims are virtuous. Therefore only the poor and criminals are virtuous. (Rich people can borrow some virtue by identifying with poor people and criminals.)
- For a virtuous person, violence and war are never justified. It is always better to be a victim than to fight, or even to defend oneself. But “oppressed” people are allowed to use violence anyway; they are merely reflecting the evil of their oppressors.
- When confronted with terror, the only moral course for a Westerner is to apologize for past sins, understand the terrorist’s point of view, and make concessions.
These ideas travel under many labels: postmodernism, nihilism, multiculturalism, Third-World-ism, pacifism, “political correctness”, to name just a few. It is time to recognize them for what they are, and call them by their right name: suicidalism.
Trace any of these back far enough (e.g. to the period between 1930 and 1950 when Department V was at its most effective) and you’ll find a Stalinist at the bottom. Among the more notorious examples are: Paul de Man — racist and Nazi propagandist turned Stalinist, and founder of postmodernism; Jean-Paul Sartre, who described the effects of Stalinism as “humane terror” and helped invent existentialism; and Paul Baran, who developed the thesis that capitalism depended on the immiseration of the Third World after Marx’s immiseration of the proletariat failed to materialize.
Al-Qaeda didn’t launch any of these memes into the noosphere, but it relies on them for political cover. They have another effect as well: when Islamists characterize the West as “decadent”, and aver that it is waiting to collapse in on itself at the touch of jihad, they are describing quite correctly and accurately the effects of Western suicidalism.
Stalinist agitprop created Western suicidalism by successfully building on the Christian idea that self-sacrifice (and even self-loathing) are the primary indicators of virtue. In this way of thinking, when we surrender our well-being to others we store up grace in Heaven that is far more important than the momentary discomfort of submitting to criminals, predatory governments, and terrorists.
Eric S. Raymond, “Suicidalism”, Armed and Dangerous, 2005-09-13.
May 8, 2016
Lenin and Stalin wanted classical-liberal individualism replaced with something less able to resist totalitarianism, not more. Volk-Marxist fantasy and postmodern nihilism served their purposes; the emergence of an adhesive counter-ideology would not have. Thus, the Chomskys and Moores and Fisks are running a program carefully designed to dead-end at nothing.
Religions are good at filling that kind of nothing. Accordingly, if transnational progressivism actually succeeds in smothering liberal individualism, its reward will be to be put to the sword by some flavor of jihadi. Whether the eventual winners are Muslims or Mormons, the future is not going to look like the fuzzy multicultural ecotopia of modern left fantasy. The death of that dream is being written in European banlieues by angry Muslim youths under the light of burning cars.
In the banlieues and elsewhere, Islamist pressure makes it certain that sooner or later the West is going to vomit Stalin’s memes out of its body politic. The worst way would be through a reflex development of Western absolutism — Christian chauvinism, nativism and militarism melding into something like Francoist fascism. The self-panicking leftists who think they see that in today’s Republicans are comically wrong (as witnessed by the fact that they aren’t being systematically jailed and executed), but it is quite a plausible future for the demographically-collapsing nations of Europe.
The U.S., fortunately, is still on a demographic expansion wave and will be till at least 2050. But if the Islamists achieve their dream of nuking “crusader” cities, they’ll make crusaders out of the U.S., too. And this time, a West with a chauvinized America at its head would smite the Saracen with weapons that would destroy entire populations and fuse Mecca into glass. The horror of our victory would echo for a thousand years.
I remain more optimistic than this. I think there is still an excellent chance that the West can recover from suicidalism without going through a fevered fascist episode and waging a genocidal war. But to do so, we have to do more than recognize Stalin’s memes; we have to reject them. We have to eject postmodern leftism from our universities, transnational progressivism from our politics, and volk-Marxism from our media.
The process won’t be pretty. But I fear that if the rest of us don’t hound the po-mo Left and its useful idiots out of public life with attack and ridicule and shunning, the hard Right will sooner or later get the power to do it by means that include a lot of killing. I don’t want to live in that future, and I don’t think any of my readers do, either. If we want to save a liberal, tolerant civilization for our children, we’d better get to work.
Eric S. Raymond, “Gramscian damage”, Armed and Dangerous, 2006-02-11.
April 22, 2016
Americans have never really understood ideological warfare. Our gut-level assumption is that everybody in the world really wants the same comfortable material success we have. We use “extremist” as a negative epithet. Even the few fanatics and revolutionary idealists we have, whatever their political flavor, expect everybody else to behave like a bourgeois.
We don’t expect ideas to matter — or, when they do, we expect them to matter only because people have been flipped into a vulnerable mode by repression or poverty. Thus all our divagation about the “root causes” of Islamic terrorism, as if the terrorists’ very clear and very ideological account of their own theory and motivations is somehow not to be believed.
By contrast, ideological and memetic warfare has been a favored tactic for all of America’s three great adversaries of the last hundred years — Nazis, Communists, and Islamists. All three put substantial effort into cultivating American proxies to influence U.S. domestic policy and foreign policy in favorable directions. Yes, the Nazis did this, through organizations like the “German-American Bund” that was outlawed when World War II went hot. Today, the Islamists are having some success at manipulating our politics through fairly transparent front organizations like the Council on American-Islamic Relations.
But it was the Soviet Union, in its day, that was the master of this game. They made dezinformatsiya (disinformation) a central weapon of their war against “the main adversary”, the U.S. They conducted memetic subversion against the U.S. on many levels at a scale that is only now becoming clear as historians burrow through their archives and ex-KGB officers sell their memoirs.
The Soviets had an entire “active measures” department devoted to churning out anti-American dezinformatsiya. A classic example is the rumor that AIDS was the result of research aimed at building a ‘race bomb’ that would selectively kill black people.
On a different level, in the 1930s members of CPUSA (the Communist Party of the USA) got instructions from Moscow to promote non-representational art so that the US’s public spaces would become arid and ugly.
Americans hearing that last one tend to laugh. But the Soviets, following the lead of Marxist theoreticians like Antonio Gramsci, took very seriously the idea that by blighting the U.S.’s intellectual and esthetic life, they could sap Americans’ will to resist Communist ideology and an eventual Communist takeover. The explicit goal was to erode the confidence of America’s ruling class and create an ideological vacuum to be filled by Marxism-Leninism.
Accordingly, the Soviet espionage apparat actually ran two different kinds of network: one of spies, and one of agents of influence. The agents of influence had the minor function of recruiting spies (as, for example, when Kim Philby was brought in by one of his tutors at Cambridge), but their major function was to spread dezinformatsiya, to launch memetic weapons that would damage and weaken the West.
Eric S. Raymond, “Gramscian damage”, Armed and Dangerous, 2006-02-11.
March 22, 2016
Published on 21 Mar 2016
Propaganda was nothing new at the beginning of World War 1. But the rapid development of mass media and the nations’ total war efforts paved the way to our modern understanding of mass propaganda, especially in Germany and Britain. Iconic images like those of Uncle Sam or Lord Kitchener are still known today and are part of the collective memory.
February 2, 2016
Published on 1 Feb 2016
The execution of British nurse Edith Cavell by German soldiers in 1915 was instrumental to British propaganda at that time, and the story became legend. But who was Edith Cavell really? Find out more about the humble nurse in Brussels, and whether she was really a spy after all.
February 1, 2016
“I’ll find friends to wear my bleeding roses,” cries Edmund Beaufort, Duke of Somerset, in Harey the vjth. Standing in a rose garden, he has plucked a red flower from a great bush that stands between him and his nemesis, Richard Plantagenet, Duke of York. York has selected a white rose – “with this maiden blossom in my hand/I scorn thee,” he spits – and the noblemen standing by have followed suit, choosing the colour of their rose to advertise their allegiance.
In 1592, this image made perfect sense. This was how the Wars of the Roses were generally understood. Against the backdrop of weak kingship and disastrous military defeat in France, two rival branches of the Plantagenet dynasty – Lancaster and York – had gone to war for the throne, using red and white roses as emblems of their causes. The war had shattered the country, causing tens of thousands of deaths and incalculable misery.
Only after decades of chaos had the family rift been healed by the victory of a Lancastrian, Henry Tudor, over a Yorkist, Richard III, at Bosworth in 1485. Henry’s victory, and his subsequent marriage to Elizabeth of York, reconciled the warring factions. Thus had been created the red-and-white ‘Tudor rose’ that seemed to be painted everywhere, reminding the populace that the Tudors stood for unity, reconciliation, peace and the incontestable right to rule.
It was a powerful and easily grasped story that, by Shakespeare’s day, had already been in circulation for 100 years. And, in part thanks to the success of Shakespeare’s brilliant cycle of history plays, this vision of the Wars of the Roses remains in circulation – on television, in film and in popular historical fiction. Lancaster versus York, red versus white: it is a story as easy to grasp as a football match at the end of which everyone swaps shirts. Yet it is misleading, distorted, oversimplified and – in parts – deliberately false.
December 27, 2015
Published on 26 Dec 2015
It’s time for the Chair of Wisdom again. This time Indy explains why he deems Franz Ferdinand a horrible person, why the soldiers did not mutiny all the time, and what the Philippines did in World War 1.
December 8, 2015
Patrick Crozier says we shouldn’t automatically believe the “common wisdom” about the career of Senator Joe McCarthy:
The vast majority of books and articles written on the subject claim that [Senator McCarthy] made it all up. M. Stanton Evans begs to differ. In Blacklisted by History: the Untold Story of Senator Joseph McCarthy and his Fight Against America’s Enemies he argues that in the vast majority of cases those accused by McCarthy of being communists were exactly that. Some were out and out spies. Some were agents of influence. Some were happy to help in the running of communist front groups. But the argument still stands: they were aiding a power that was hostile to the United States.
Evans comes to this judgement mainly by leafing through the files that have become available. These include the FBI files and what have become known as the Venona transcripts: Soviet messages decrypted by the US military in the 1940s.
It is important to realise that these weren’t just spy games. Communist activity had a real impact. In the early 1940s, for instance, John Stewart Service, the State Department’s man in China, produced a string of reports. In them he praised Mao’s Communists to the hilt, claiming that they were democrats and successfully fighting the Japanese, while condemning Chiang Kai-shek’s Kuomintang (KMT) for being incompetent, corrupt and uninterested in prosecuting the war. This was a travesty of the truth. Reports like this led to the KMT being starved of money and weapons, which may well have tipped the balance in the Civil War, leading, in turn, to the misery that was subsequently inflicted on the people of mainland China.
So, if he was right, why has he been condemned and why does he continue to be condemned by history? Some of it appears to have been McCarthy’s own fault. He puffed up his war record. He over-stated his case. He bullied witnesses. He made the odd mistake. He criticised revered war heroes. Some of it was snobbery. McCarthy was from the wrong side of the tracks. There was no Ivy League education for him. He left school early but through hard work still managed to become a lawyer. He was also a Catholic. But most of it was because he was up against the combined forces of the communists and the establishment.
The Tydings Committee – a special sub-committee of the Senate Foreign Relations Committee – was established to get to the bottom of his initial 1950 claim that there were 57 communist agents working in the State Department. It did no such thing. In fact it didn’t even try.
According to Evans it was a cover up from start to finish. There was almost no attempt to get at the facts. Often a denial from the accused was sufficient. At one point they even asked the leader of the US Communist Party if certain people were members. He had to be prompted to say “no”. Most of the hostile questioning was aimed not at the accused — who were often evasive — but at McCarthy himself. An inordinate amount of time was given over to attempting to prove that McCarthy had initially claimed a figure of 205 rather than 57 — as if it mattered. There was a definite suggestion that State Department personnel files had been tampered with. It was no great surprise when the official report concluded that McCarthy had made it all up.
November 13, 2015
J.M. Berger discusses the challenges of having to overcome an extremist narrative in the struggle with ISIS:
“The United States is engaged in a war of ideas — and it’s losing.”
This refrain feels modern, but it has echoed through most of American history. The argument that the U.S. is losing a war of ideas or narratives to ISIS is only the latest iteration. As Scott Atran recently wrote at The Daily Beast, the various military campaigns against the Islamic State obscure “a central and potentially determining fact about the fight” — namely that it “is, fundamentally, a war of ideas that the West has virtually no idea how to wage, and that is a major reason anti-ISIS policies have been such abysmal failures.”
The myth that America’s narrative is losing to ISIS’s persists despite the fact that millions of people are fleeing ISIS territories, while mere thousands have traveled to join the group. It persists despite the fact that the Islamic State’s ideological sympathizers make up less than 1 percent of the world’s population, even using the most hysterically alarmist estimates, and the fact that active, voluntary participants in its caliphate project certainly make up less than a tenth of a percent.
In the United States, the notion of a “war of ideas” dates almost as far back as the Revolutionary War, according to Google Ngrams, which searches the text of English-language books that have been digitized. The phrase appeared during the Civil War, in the context of slavery, and returned during World War I. References soared as the United States entered World War II, and became a fixture of American political discourse during the Cold War. The Korean War was a war of ideas; so was Vietnam.
And in every era, the same alarm bell has sounded.
September 21, 2015
I have never forgotten these visitors, or ceased to marvel at them, at how they have gone on from strength to strength, continuing to lighten our darkness, and to guide, counsel and instruct us. They are unquestionably one of the wonders of the age, and I shall treasure till I die as a blessed memory the spectacle of them travelling with radiant optimism through a famished countryside, wandering in happy bands about squalid, over-crowded towns, listening with unshakeable faith to the fatuous patter of carefully trained and indoctrinated guides, repeating like schoolchildren a multiplication table, the bogus statistics and mindless slogans endlessly intoned to them. There, I would think, an earnest office-holder in some local branch of the League of Nations Union, there a godly Quaker who had once had tea with Gandhi, there an inveigher against the Means Test and the Blasphemy Laws, there a staunch upholder of free speech and human rights, there an indomitable preventer of cruelty to animals, there scarred and worthy veterans of a hundred battles for truth, freedom, and justice – all, all chanting the praises of Stalin and his Dictatorship of the Proletariat. It was as though a vegetarian society had come out with a passionate plea for cannibalism, or Hitler had been nominated posthumously for the Nobel Peace Prize.
Malcolm Muggeridge, Chronicles of Wasted Time, 2006.
August 17, 2015
If twentieth-century history teaches us anything, it’s that political religions spell trouble. Soviet Communism, Italian Fascism, and Nazism aren’t just called “political religions” by scholars today. In all three cases, observers at the time recognized and worried about the movements’ religious natures. Those natures were no accident; Mussolini, for instance, called his ideology “not only a faith, but a religion that is conquering the laboring masses of the Italian people.”
One reason that observers saw the great totalitarianisms as religious was that each had its idol: Mussolini in Italy, Hitler in Germany, and Lenin in Russia, followed by Stalin. Take Grigory Zinoviev’s description of Lenin: “He is really the chosen one of millions. He is the leader by the Grace of God. He is the authentic figure of a leader such as is born once in 500 years.” Stalin’s cult of personality was far more developed and sometimes explicitly idolatrous, as in the poem that addressed the despot as “O Thou mighty one, chief of the peoples, Who callest man to life, Who awakest the earth to fruitfulness.” And in Italy, writes the historian Michael Burleigh, “intellectual sycophants and propagandists characterised [Mussolini] as a prodigy of genius in terms that would not have embarrassed Stalin: messiah, saviour, man of destiny, latterday Caesar, Napoleon, and so forth.”
To point out these words’ uncomfortable similarity to the journalists’ praises of Obama is not to equate the throngs who bowed down to totalitarian dictators with even the most worshipful Obamaphiles. But the manner of worship is related, as perhaps it must be in any human society that chooses to adore a human being. The widespread renaming of villages, schools, and factories after Stalin, for example, finds its modern-day democratic parallel in a rash of schools that have already rechristened themselves after Obama, to say nothing of the hundreds of young sentimentalists who informally adopted the candidate’s middle name during the presidential race. Even the Obama campaign’s ubiquitous logo — the letter O framing a rising sun — would not have surprised the scholar Eric Voegelin. In The Political Religions (1938), Voegelin traced rulers who employed the image of the sun — a symbol of “the radiation of power along a hierarchy of rulers and offices that ranges from God at the top down to the subject at the bottom” — from the pharaoh Akhenaton to Louis XIV and eventually to Hitler.
Benjamin A. Plotinsky, “The Varieties of Liberal Enthusiasm: The Left’s political zealotry increasingly resembles religious experience”, City Journal, 2010-02-20.
July 24, 2015
Matt Ridley on the danger to all scientific fields when one field is willing to subordinate fact to political expediency:
For much of my life I have been a science writer. That means I eavesdrop on what’s going on in laboratories so I can tell interesting stories. It’s analogous to the way art critics write about art, but with a difference: we “science critics” rarely criticise. If we think a scientific paper is dumb, we just ignore it. There’s too much good stuff coming out of science to waste time knocking the bad stuff.
Sure, we occasionally take a swipe at pseudoscience — homeopathy, astrology, claims that genetically modified food causes cancer, and so on. But the great thing about science is that it’s self-correcting. The good drives out the bad, because experiments get replicated and hypotheses put to the test. So a really bad idea cannot survive long in science.
Or so I used to think. Now, thanks largely to climate science, I have changed my mind. It turns out bad ideas can persist in science for decades, and surrounded by myrmidons of furious defenders they can turn into intolerant dogmas.
This should have been obvious to me. Lysenkoism, a pseudo-biological theory that plants (and people) could be trained to change their heritable natures, helped starve millions and yet persisted for decades in the Soviet Union, reaching its zenith under Nikita Khrushchev. The theory that dietary fat causes obesity and heart disease, based on a couple of terrible studies in the 1950s, became unchallenged orthodoxy and is only now fading slowly.
What these two ideas have in common is that they had political support, which enabled them to monopolise debate. Scientists are just as prone as anybody else to “confirmation bias”, the tendency we all have to seek evidence that supports our favoured hypothesis and dismiss evidence that contradicts it—as if we were counsel for the defence. It’s tosh that scientists always try to disprove their own theories, as they sometimes claim, and nor should they. But they do try to disprove each other’s. Science has always been decentralised, so Professor Smith challenges Professor Jones’s claims, and that’s what keeps science honest.
What went wrong with Lysenko and dietary fat was that in each case a monopoly was established. Lysenko’s opponents were imprisoned or killed. Nina Teicholz’s book The Big Fat Surprise shows in devastating detail how opponents of Ancel Keys’s dietary fat hypothesis were starved of grants and frozen out of the debate by an intolerant consensus backed by vested interests, echoed and amplified by a docile press.
July 19, 2015
Theodore Dalrymple discusses the changing opinions about Germany within the European Union, but especially in France:
There seems to be growing anti-German feeling in France, at least if what I read is anything to go by (which it might not be, of course). For example, a book with the title Bismarck Herring (The German Poison) is on sale everywhere. It is not by an unknown person, but rather by a very well-known left-wing French politician, Jean-Luc Mélenchon.
You don’t have to go far in it to discover a tone of sheer hatred. The Germans, according to him, have returned to their old arrogant ways (which, of course, they never really lost); the price of their industrial and financial success is a land of oppressed, impoverished, and fat workers who don’t want any children; their industry spreads pollution all over Europe; and, unlike the French, who purified themselves of collaborationist industrialists after the war, the Germans just went on as if nothing had happened. At the end of the book, Mélenchon says that France (and presumably only France) has the wherewithal to liberate Europe from German imperialism. In a chapter headed “Spitting Out the Poison,” he mentions that, unlike Germany, France still has considerable military capacity. The obvious implication, I am afraid, is that France could, and perhaps should, use it to occupy the Ruhr again if Germany does not change its wicked ways.
Is it not strange that such thoughts should occur to a deputy of the European Parliament? After all, the most commonly used justification for the existence of the European Union is that it ensures the peace of the continent — by which, of course, is meant the pacification of France and Germany, since Belgium was never very likely to send its troops to occupy, say, Portugal. But from the first, the EU has taken Yugoslavia as its model, and Mélenchon’s rant at least has the merit of drawing our attention to a similar possible denouement.
July 15, 2015
Michelle Orange on the ways that photography can mislead and even change reality:
It may be that some of the great philosophical work of our time is taking place, hidden and unheralded, in the field of image forensics. Where but under the scrutiny of digital experts who draw a line separating false representations of the world from truthful ones are contemporary questions of perception and reality brought so keenly to bear? Who but these detectives of the real pursue as explicitly — as intricately — our crime wave of the fake, the contrived, the uncanny, the exponential image? With exquisite, singular focus, photo forensics engages the conundrum that photographic technology has tilted toward, steadily but ever more frankly, since its inception over 150 years ago: Does reality have a tipping point?
Dangling from the cliff edge of that question is the World Press Photo competition. In recent years the annual competition, which recognizes images submitted by photojournalists working across the globe, has dissolved into chaos, recrimination and a round of post-mortem soul-searching. Earlier this year, the WPP was forced to disqualify 22 percent of the competition’s finalists after forensics experts determined that certain images had been altered or manipulated beyond the currently accepted industry standard. This almost triples the number of disqualifications from a year earlier, suggesting a certain forward momentum, a trend larger and more fearsome than any set of standards.
Swedish photographer Paul Hansen won the 2013 World Press Photo competition with an image of a Gaza City funeral procession, led through an alley by men bearing the shrouded bodies of two children killed in an Israeli airstrike. Separate from the horror it depicts, with its fish-eye depth of field, stark figuration and stony matte light, the photo meets the eye as unreal. Complaints in this vein led to an investigation of the image, specifically its manipulation of tone — a quality central to photography’s evolving grammar of realism. Somehow both a beautifying tool and, in the right hands, possessed of the very texture of reality (as every Instagram filter maven knows), tone is transformative. For that reason, “excessive toning” is against WPP rules; Hansen said he adjusted tone only to balance uneven light, “in effect to recreate what the eye sees.” Ultimately, Hansen retained his prize: the judges stood behind what they saw, though it would appear their eyes prefer altered images a good portion of the time.
June 4, 2015
In The Lancet, Richard Horton discusses the problems afflicting scientific research and publishing:
The case against science is straightforward: much of the scientific literature, perhaps half, may simply be untrue.
Afflicted by studies with small sample sizes, tiny effects, invalid exploratory analyses, and flagrant conflicts of interest, together with an obsession for pursuing fashionable trends of dubious importance, science has taken a turn towards darkness. As one participant put it, “poor methods get results”. The Academy of Medical Sciences, Medical Research Council, and Biotechnology and Biological Sciences Research Council have now put their reputational weight behind an investigation into these questionable research practices.

The apparent endemicity of bad research behaviour is alarming. In their quest for telling a compelling story, scientists too often sculpt data to fit their preferred theory of the world. Or they retrofit hypotheses to fit their data. Journal editors deserve their fair share of criticism too. We aid and abet the worst behaviours. Our acquiescence to the impact factor fuels an unhealthy competition to win a place in a select few journals. Our love of “significance” pollutes the literature with many a statistical fairy-tale. We reject important confirmations.

Journals are not the only miscreants. Universities are in a perpetual struggle for money and talent, endpoints that foster reductive metrics, such as high-impact publication. National assessment procedures, such as the Research Excellence Framework, incentivise bad practices. And individual scientists, including their most senior leaders, do little to alter a research culture that occasionally veers close to misconduct.