April 11, 2013
QotD: An underclass that’s too rich
I hear quite a bit of that these days — almost like a local version of East German “ostalgie”. Old British friends say to me, well, say what you like about the 1970s — nothing worked; if you wanted to buy a new car, it was as if post-war rationing was still in effect — but all the same life in the village seemed a lot more pleasant back then. There’s something to this: the benign side of oppressive statism is often a kind of public restraint. And more than a few folks seem to feel, with the benefit of hindsight, that it’s better to have unionised thugs nutting scabs on the picket line than freelance yobs in hideous leisurewear infesting ersatz-American high streets catering to their every frightful whim from one end to the other. For the modern liberal, this is a new dilemma: an underclass that’s too rich.
Mark Steyn, “The Unfinished Revolution”, Daily Telegraph, 2004-05-04 (link goes to Steyn’s own site)
April 8, 2013
The “Winter of Discontent” that brought Margaret Thatcher to power
Megan McArdle explains the temper of the late 1970s in Britain:
To understand the legacy of Margaret Thatcher, you need to understand Britain’s “Winter of Discontent,” in which striking public-sector workers nearly paralyzed the nation. Actually, you have to go back a bit further, to the inflations of the 1970s. Americans remember the “stagflation” of the 1970s as bad, but in Britain it was even worse — the inflation rate peaked in 1975 at over 25 percent.
Governments on both sides of the pond decided that the solution to inflation was to simply declare, by fiat, that prices would not rise so much. In America we got Nixon’s wage and price controls. In Britain, they got the government’s 1978 vow to hold public-sector wage increases to 5 percent — at a time when inflation was running to double digits.
The public-sector workers, as you might imagine, did not like that. And in Britain, the public-sector workers had immense power. Trash piled up in the streets. The truck drivers who ferried goods all over Britain went on strike — and the ones who didn’t, like oil tanker drivers, began feeding their destinations to “flying pickets” — mobile groups of strikers who would go from location to location, blockading them so that workers couldn’t get in and goods couldn’t get out. The BBC called them the “shock troops of industrial action” and that’s an accurate picture; effectively mobilized, flying pickets can grind the wheels of industry to a halt. Which is what they did in the winter of 1978-79.
In Liverpool, the gravediggers went out, leaving bodies unburied for weeks. By the end of January, half the hospitals in Britain were taking only emergency cases. Full of righteous fury, the unions flexed every muscle, demonstrating all the tremendous power that they had amassed by law and custom in the years since the Second World War. Unfortunately, they were pummeling the Labour Party, which had given them most of those powers. And the public, which was also suffering through high inflation and anemic GDP growth, had had enough. They elected Margaret Thatcher, a Conservative grocer’s daughter without roots in the working-class power structure of the labor movement, or the elite power structure of Britain’s famously rigid class system. She systematically went about dismantling the two main sources that gave labor the power to essentially shut down the United Kingdom: lenient strike laws and state ownership of key industrial sectors.
[. . .]
Her detractors should remember that as terrible as it was for the miners when the pits were closed, these mining operations were not sustainable — nor was it even desirable that they be sustained so that further generations could invest their lives in failing coal seams. The work was dreadful. The coal was too dirty for the environment, or the delicate pink tissue of the miners’ lungs. And even if Britain had wanted to keep mining the filthy stuff, it was getting too expensive to dig it out. The mines were playing out, not because Margaret Thatcher was mean, but because the cradle of the Industrial Revolution had burned through much of her coal.
In short, Margaret Thatcher destroyed an industrial system which had, yes, provided workers with a secure livelihood, but, yes, also done so at an unacceptable cost. These two things are the same legacy. They cannot be parted.
Her achievement was not inevitable. But looking back at the Winter of Discontent, I’d argue that it was necessary. The alternate future for a United Kingdom where the labor unions hung on was another decade or two of failing state firms and economic decline. By the early 1980s, the UK’s per-capita GDP was lower than that of Italy. You can maybe argue that there was some alternative Social Democratic future, Sweden-style, or perhaps the discovery of an alternative path to capitalism. But it’s hard to look at the convulsions of 1970s Britain and argue that this was a happier past that the nation should pine after. And I find it hard to argue that Britain’s economy could have been modernized without taking on the unions; their veto power made even such obvious steps as shutting down failing mines effectively impossible.
As I wrote a few years back:
My family left Britain in 1967, which was a good time to go: the economy was still in post-war recovery, but opportunities abroad were still open to British workers. My first visit back was in [mid-winter] 1979, which was a terrible shock to my system. I’d left, as a child, before the strikes-every-day era began, and my memories of the place were still golden-hued and happy. Going back to grey, dismal, cold, smelly, strike-bound Britain left me with a case of depression that lasted a long time. It didn’t help that the occasion of the visit was to attend my grandfather’s funeral: it was rather like the land itself had died and the only remaining activity was a form of national decomposition.
March 28, 2013
“Gaming in the 1970s and 80s felt a little like being into punk rock”
Explanation of the headline: gaming in the ’70s was like being into punk because it was very much an outsider interest, you had to go well out of your way to find it, and it was cool (at least to you, not so much to your family and non-gaming friends). Peter Bebergal finds online caches of some of the classic gaming magazines of the day:
The Internet Archive is one of the great treasures of the internet, housing content in every medium: texts, video, audio. It’s also the home of the Wayback Machine, an archive of the Internet from 1996. I thought I had explored the site pretty thoroughly — at least according to my own interests — but recently came across runs of some of the great gaming magazines of the 1970s and 80s: The Space Gamer, Ares, Polyhedron, The General, and — temporarily — Dragon Magazine. These magazines represent not only the golden age of gaming, but expose the thrill and excitement of gaming when it was still new, still on the margins. It was a time when gaming still felt a little, dare I say, punk.
Today, finding members of your particular community of interest is a Google search away, but in the 1970s the only way to be in contact with others who shared interests was through magazines. For many gamers, even finding the games could be difficult. Discovering the gaming magazines revealed an active gaming industry that still maintained a sense of being on the vanguard.
The earliest issues show off their newsletter origins. The Space Gamer and The General started off on plain paper in black and white. Even the first issues of Dragon look like a teenager’s fanzine, but the enthusiasm and energy are infectious. Who couldn’t love the introduction of new monsters for your campaign such as the Gem Var, a creature composed entirely of gemstone that cannot take damage from bladed weapons? The artists, editors and letter writers were the best friends you had never met. Gaming in the 1970s and 80s felt a little like being into punk rock. You knew it was offbeat, knew that outsiders didn’t get it, but you also knew that this was cool. Even the advertisements and listings of conventions expanded the universe of gaming a thousandfold. Not unlike ordering 45s of unknown bands from punk zines was sending away for microgames, miniatures and supplements from tiny game publishers.
While I wasn’t as much into the early roleplaying games, I was very much into wargaming and that was in the “respectable” part of the gaming ghetto until the boom in RPGs pretty much took all the oxygen out of the room. Of course, even in the “respectable” area, there were the Napoleonic grognards and the frisson-of-insanity East Front fanatics…
March 12, 2013
If consumers were 10% better off … why did they call it a “disease”?
In Maclean’s, Stephen Gordon illustrates the classic case of burying the lede for popular economics:
So Dutch consumers are roughly 10% better off than they would have been, but companies have been able to compete only by paring their profit margins.
“The Dutch disease,” The Economist, November 26, 1977
Talk about burying the lede. That sentence appears at the end of the 10th paragraph of the much-referred-to but rarely read article in The Economist that coined the phrase “Dutch Disease.” In the normal course of things, a 10 per cent increase in consumers’ purchasing power would be the stuff of banner headlines, but, for some reason, The Economist chose to hide that point deep in the story and qualify it with a caveat about how hard it had become for companies to compete. (The answer to that, by the way, is: “So what if producers are struggling?” What really matters is consumer welfare.)
My take on the Dutch Disease debate can be summed up as follows: Why are we calling it a disease?
February 25, 2013
What Argo doesn’t show about “The Canadian Caper” of 1979
In Maclean’s, one of the American diplomats who took part in the actual hostage drama in Tehran provides a bit of supplementary material to the film Argo:
Ben Affleck’s Argo has stormed box offices, collected awards [. . .] yet Canadians of a certain age may find themselves thinking: This is not quite how I remember those days. I was there when Iranians took over the American Embassy in Tehran, and it is not quite how I remember them either. Argo is terrific entertainment, but it tells only a part of our story, and says nothing at all about many of the real heroes — most Canadian — who helped rescue us. Before Argo came along, our rescue was routinely called the “Canadian Caper.” It still should be. The operation consisted of four distinct phases. Three were almost entirely Canadian, and only one involved significant U.S. assistance.
For those not of a certain age, a brief summary is a good starting point. Nov. 4, 1979 brought cold rain and hinted of trouble of a different sort. Two weeks earlier, then-president Jimmy Carter decided to admit the former shah of Iran to the U.S. for cancer treatment. Iranians were outraged; many suspected it was a plot by the U.S. Central Intelligence Agency to remove Iran’s new ruler, Ayatollah Ruhollah Khomeini, and put the shah back in charge. Protests outside Tehran’s U.S. Embassy had become daily occurrences. That November morning, demonstrators climbed the gate and soon controlled the compound.
[. . .]
Phase four always receives the least attention. The U.S. government was desperate to keep the CIA’s role secret, rightly fearing its disclosure might endanger the hostages (who weren’t freed until 1981). This concern was sufficiently real that we were asked to live under false names in Florida until the hostages were set free. I was looking forward to seeing how many speeding tickets my alter ego could accumulate, but La Presse decided to publish Jean Pelletier’s story once the Canadian Embassy in Tehran had closed. We came home to a rousing reception and the Canadians were asked to claim complete credit for our escape. That job understandably fell to ambassador Taylor, who spent the better part of a year on the rubber chicken circuit at receptions to honour the Canadian government and people for helping us. Some have said he did the job too well, or failed to share the credit with other embassy staff. My own experience contradicts this. I heard Taylor speak several times. He always mentioned his staff. I also tried, during press interviews I gave, to mention others, particularly the Sheardowns. My comments were edited out. It seemed the press could handle only one hero at a time. Unfortunately, this meant John Sheardown, who was indispensable in phase one, became invisible in phase four. I truly believe John did not care. He did his duty as he saw it. For those who loved and respected him, it was painful.
[. . .]
As I wrote at the beginning, Argo is a wonderful film. Not because it is historically accurate, but because, aside from its technical brilliance, it reminds us of a time when ordinary people performed great deeds, and two neighbours that feud over many small and not so small things came together and did something magnificent. Maybe it didn’t change history, but for we six house guests it was truly life changing. And it was, and should always remain, the Canadian Caper.
February 20, 2013
Rare praise for obscure movie director of the 1970s
He apparently goes by the same name as one of the most reviled movie figures of the last 20 years:
It’s hard to imagine now, but the original Star Wars movie was more than just a star-spanning, kid-pleasing action flick. It was also a rule-breaking, expectation-thwarting one-film rebel alliance.
For instance, remember how the movie starts with a blare of trumpets and the title, followed by the text crawl, followed by the actual movie? Notice how there aren’t three minutes of “Doopdy Doo Pictures and Skippity-Skip Entertainment Present … A Furfty Fur/Yonker Boo Production … A Glarpton Spitcake Film … Elwee Groodicle … Robbles Pancake … Spankster Carmont … and Bliss Underham … Casting by Arhop Maser, C.S.A … Music by Hambone Jury … Cheese Table Relocation by Hollywood Dairy Movement L.L.C.” and so forth? Lucas was fined $250,000 for that. Specifically, he was fined by the Directors Guild for not having an opening director credit. That’s right, he was fined for not giving himself credit before the film even starts.
Or take the fact that there are two main characters who not only don’t speak English, but whose growlings and bleepings aren’t even translated into subtitles.
Oh, and one more thing. It’s science fiction. These days you can’t swing a large popcorn without hitting a science-fiction blockbuster right in the hyperdrive, but at the time there hadn’t been a really successful science fiction movie in nearly a decade. Just by setting his film in a galaxy far, far away — not to mention long, long ago — Lucas was defying the conventional movie-making wisdom of the time.
The point is that while Star Wars is the spaceship that launched a thousand clichés, it achieved its success by being something profoundly original. So here’s my unsolicited advice to Abrams, and moreover to the hundreds of entertainment bureaucrats who are going to want to have their meddling incorporated into the upcoming Star Wars VII: Action of the Noun: Don’t give in to the Dark Side. Don’t incorporate the following clichés that have increasingly infested sequels for the past 35 years.
February 13, 2013
Debunking the “1970s had a higher standard of living than today” meme
Don Boudreaux produces an anecdotal list of things that refute the inane notion that America’s standard of living peaked in the 1970s:
What follows here is drawn from memory. Perhaps my memory is grossly distorted, but my report of it here is an undistorted reflection of that memory. Here’s some of what I recall, of relevance to this discussion, from middle-class America of the 1970s; I offer the 25 items on this list in no particular order, except as they come to me.
(1) Automobiles broke down much more frequently than they break down today, hence leaving motorists stranded, sometimes for hours, more often than is the case today.
(2) Automobiles rusted faster and more thoroughly than they do today.
(3) Someone in his or her early 70s was widely regarded as being quite old.
(4) “Old” people back then were much more likely to wear dentures than are “old” people today.
(5) Frozen foods in supermarkets were gawdawful by the standards of today – in terms both of quality and of selection.
[. . .]
(21) Coffee sucked. (It was almost all made from robusta beans.) And the selection of teas was pretty much limited to whatever Lipton sold.
(22) A diagnosis of cancer was far more frightening than it is today. Any person so diagnosed was regarded as being as good as dead.
(23) Going to college was much more unusual than it is today.
(24) Contact lenses were much more expensive than they are today. I purchased insurance (!) on my first pair of soft contact lenses (which I bought in 1980) in order to protect myself against the financial consequences of losing or damaging the one pair that I bought. (Such lenses were bought one pair at a time.)
(25) The idea of widespread use of personal computers seemed like science fiction. I very clearly recall overhearing, in the Spring of 1980, one of my economics professors, Wayne Shell (who also taught computer science), telling someone that he believed that, within a few years, many American households would have a computer. I thought at the time that Dr. Shell’s prediction was fancifully far-fetched.
I could go on, listing at least another 50 such recollections. But instead I’ll end this post here.
January 5, 2013
BBC forgets about original (BBC) series, asks for pilot of new Yes, Prime Minister
As a result, the remake will not be shown on the BBC:
The new series of Yes, Prime Minister was made for a rival channel because the BBC asked its creators to make a pilot episode, it has emerged.
Co-writer Jonathan Lynn said the BBC had been given first refusal on the revival out of “courtesy”, because it aired the award-winning original.
But he called the request for a test episode “extraordinary”, as “there were 38 pilots available on DVD”.
The first new episodes for 25 years will be aired on digital channel Gold.
Lynn told comedy website Chortle that the BBC “said it was policy” to order a pilot episode before commissioning a full series.
“So we said our policy was to not write a pilot.”
The original Yes, Minister and Yes, Prime Minister tell you more about the actual workings of parliamentary democracy than a full semester undergraduate course. I hope the new series can recapture the magic (if you can call showing the awful workings of government bureaucrats and politicians “magic”).
The new series was filmed last summer and is based on a recent stage production, which launched in 2010.
Digital network Gold said the Rt Hon Jim Hacker would return as the leader of a coalition government, with plots focussing on the economic crisis, a leadership crisis with his coalition partners and a Scottish independence referendum.
David Haig will take the lead role, with Henry Goodman as Sir Humphrey. Both have appeared in the stage version of the show.
They will be joined by Dame Maggie Smith’s son, Chris Larkin, as Bernard Woolley, and Robbie Coltrane as a guest star.
December 8, 2012
The predator who hid in full view of the cameras
Mark Steyn on the Jimmy Savile investigations:
It’s tempting at this point to offer some musings on the price of fame, the burdens of celebrity. But Savile was cheerfully unburdened. Rather than a celebrity who happens to be a pedophile, he seems to have been a pedophile who became a celebrity in order to facilitate being a pedophile. Robbers rob banks because that’s where the money is. In the Sixties, Savile became a star disc jockey in Britain’s nascent pop biz because that’s where the 14-year-old nymphettes are. In the Seventies, he became a kiddie-TV host because that’s where the nine-year-old moppets are. He became a celebrity volunteer with his own living quarters at children’s hospitals and homes because that’s where the nine-year-olds too infirm to wiggle free or too mentally ill to protest are. He persuaded various institutions to give him keys to the mortuary because that’s where the nine-year-olds unable even to cry out are. (Stoke Mandeville Hospital is now investigating whether he “interacted inappropriately” with corpses.)
His persona was tailored to his appetites: The child-man shtick meant no one would ever ask him to host grown-up telly shows or move to the easy-listening channel. He motored around the country in a famous silver Rolls with a caravan on hand should he espy a comely schoolgirl at the edge of the road. When opportunity for a quickie struck ten minutes before a recording of Savile’s Travels, it was easier to drop the gold lamé sweatpants than unbuckle a belt and unzip a pair of trousers. And he more or less hid in plain sight. When Fleet Street reporters seeking a quote on something or other called him up and said “Is that Jimmy Savile?” he’d shoot back: “I never touched her!” On the one occasion we met, I remember being struck by the physical strength he projected, even at his then-advanced age. A few years ago, an interviewer asked, “You used to be a wrestler, didn’t you?”
“I still am.”
“Are you?”
“I’m feared in every girls’ school in the country.”
November 24, 2012
Israel: where the 1970s never ended
Kathy Shaidle reports on her recent trip to Israel:
Folks who say visiting Israel is like traveling back in time don’t know the half of it.
Say: Do you find yourself missing the 1970s — even though, like me, you vowed you never would?
That is: Do you miss litter, graffiti, off-leash dogs, free-range cats, smoking on the beach, 13 TV channels, no wheelchair ramps — plus polyester everything?
Because if so, Israel is the 70s with cellphones! You’ll love it! Heck, the same war’s still going on!
Seriously: This shiksa just got back from her second trip to Israel — not a moment too soon, from the looks of things — and I’m here with the first of a series of articles that will go from macro to micro.
PJMedia’s own Barry Rubin literally wrote the book on Israel. I read it before I left and recommend it highly. But he’s a Jew who has lived there for years. I’m writing as a gentile two-time visitor.
To that end, I’ll start off with an overview of major cities and regions in Israel, then drill down in the coming weeks to cover specific attractions; define words that don’t mean what you (or more accurately, your dorky grad student nephew) think they mean (e.g., “check point,” “settlement,” “refugee camp”); then offer tips on food, language, manners and more.
Japan’s demographic time-bomb has detonated
Remember the Japan of the 1970s and 1980s? The world-spanning colossus of economic might? The nation that had Wall Street wetting its collective pants with every bold move?
That was then. This is now:
Less than a quarter-century ago, Japan was the economic envy of the world. In 1989, Tokyo-listed shares represented nearly half the planet’s equity value, while the land beneath the city’s royal palace was worth more than all of California. American nightly news anchors practically misted up when they had to report that Rockefeller Center was turning Japanese.
Two lost decades and massive property- and stock-bubble explosions later, Japan is a one-word cautionary tale. Caught in economic and demographic atrophy — and stewarded by countless false-start prime ministers — the country has become a hub for zombie banks, a generation of disenchanted youth, and fading brands such as Sony, Sharp, and Panasonic.
Last year, for the first time, sales of adult diapers in Japan exceeded those for babies.
Regulating food container size as a form of soft protectionism
Terence Corcoran talks about the 1970s-era food packaging regulations that have suddenly become topical:
What started out looking like a regulatory non-event, the Harper government’s plan to repeal scores of petty federal rules governing the size of containers for packaged food in supermarkets, has suddenly become a great national food fight.
It’s industry against industry, food processors versus supply management, Heinz battling Campbell’s, baby-food makers against corn canners — all part of a war over jobs and trade and consumer dollars. Nominally over antiquated federal regulations, it’s also a war that highlights another reason why Canadian consumers pay more for products at the retail level.
[. . .]
Never mind peanut butter. Ottawa has detailed container specs for what looks like every food product on store shelves: canned vegetables, fruit juices, vacuum-packed corn, tomato juice, maple syrup, frozen spinach, pork and beans, bagged potatoes, soups, desserts, pies, sauerkraut, horseradish sauce, wine — and many more.
It is unclear why these detailed container-size regulations exist, but one explanation is that they are a result of Ottawa’s mass conversion to metric measure in the 1970s under then prime minister Pierre Trudeau. Under the metrication rules, the law mandated metric for all prepackaged food products.
Whatever the intent of the detailed regulations, the effect has been to erect trade barriers that have created protected industries that are now opposing the proposed changes. The Food Processors of Canada set up a web page, KeepFoodJobsInCanada, promoting an email campaign to force Agriculture Minister Gerry Ritz to block the plan to repeal the container-size regulations. It seems to have worked, so far.
November 4, 2012
Even “Biblical views” change over time
An older post, but still rather informative:
The ‘biblical view’ that’s younger than the Happy Meal
In 1979, McDonald’s introduced the Happy Meal.
Sometime after that, it was decided that the Bible teaches that human life begins at conception.
Ask any American evangelical, today, what the Bible says about abortion and they will insist that this is what it says. (Many don’t actually believe this, but they know it is the only answer that won’t get them in trouble.) They’ll be a little fuzzy on where, exactly, the Bible says this, but they’ll insist that it does.
That’s new. If you had asked American evangelicals that same question the year I was born you would not have gotten the same answer.
That year, Christianity Today — edited by Harold Lindsell, champion of “inerrancy” and author of The Battle for the Bible — published a special issue devoted to the topics of contraception and abortion. That issue included many articles that today would get their authors, editors — probably even their readers — fired from almost any evangelical institution. For example, one article by a professor from Dallas Theological Seminary criticized the Roman Catholic position on abortion as unbiblical. Jonathan Dudley quotes from the article in his book Broken Words: The Abuse of Science and Faith in American Politics. Keep in mind that this is from a conservative evangelical seminary professor, writing in Billy Graham’s magazine for editor Harold Lindsell:
God does not regard the fetus as a soul, no matter how far gestation has progressed. The Law plainly exacts: “If a man kills any human life he will be put to death” (Lev. 24:17). But according to Exodus 21:22-24, the destruction of the fetus is not a capital offense. … Clearly, then, in contrast to the mother, the fetus is not reckoned as a soul.
Christianity Today would not publish that article in 2012. They might not even let you write that in comments on their website. If you applied for a job in 2012 with Christianity Today or Dallas Theological Seminary and they found out that you had written something like that, ever, you would not be hired.
At some point between 1968 and 2012, the Bible began to say something different. That’s interesting.
Even more interesting is how thoroughly the record has been rewritten. We have always been at war with Eastasia.
October 21, 2012
Nick Gillespie: A libertarian appreciation for the late George McGovern
George McGovern will, unfortunately, be best known to most people as the poor beggar who lost the 1972 election to Richard Nixon in a blowout. Nick Gillespie says there was much more to McGovern than just being on the wrong side of an electoral landslide:
McGovern’s early criticism of the Vietnam War (he first spoke against it as a newly elected Democratic senator from South Dakota in 1963) was out of step with a bipartisan Cold War consensus that smothered serious debate for too long.
Yet when you take a longer view of his career — especially after he got bounced from the Senate in 1980 during the Republican landslide he helped create — what emerges is a rare public figure whose policy positions shifted to an increasingly libertarian stance in response to a world that’s far more complicated than most politicians can ever allow.
Born in 1922 and raised during the Depression, McGovern eventually earned a doctorate in American history before becoming a politician. But it was as a private citizen he became an expert in the law of unintended consequences, which elected officials ignore routinely. He came to recognize that attempts to control the economic and lifestyle choices of Americans aren’t only destructive to cherished national ideals, but ineffective as well. That legacy is more relevant now than ever.
[. . .]
In a 1997 New York Times op-ed article, he emphasized that simply because some people abuse freedom of choice is no reason to reduce it. “Despite the death of my daughter,” he argued, “I still appreciate the differences between use and abuse.” He rightly worried that lifestyle freedom, like economic freedom, was everywhere under attack: “New attempts to regulate behavior are coming from both the right and the left, depending only on the cause. But there are those of us who don’t want the tyranny of the majority (or the outspoken minority) to stop us from leading our lives in ways that have little impact on others.”
McGovern believed that attempts to impose single-value standards were profoundly un-American and “that we cannot allow the micromanaging of each other’s lives.” But as governments at various levels expand their control of everything from health-care to mortgages to the consumption of soda pop and so much more, that’s exactly what’s happening.
October 18, 2012
Domestic terrorism less common in the US now than in the past
At the Cato@Liberty blog, Benjamin Friedman looks at the history and compares it with today’s constant worry about US domestic terror operations:
Homegrown terrorism is not becoming more common and dangerous in the United States, contrary to warnings issued regularly from Washington. American jihadists attempting local attacks are predictably incompetent, making them even less dangerous than their rarity suggests.
Janet Napolitano, Secretary of Homeland Security, and Robert Mueller, Director of the Federal Bureau of Investigation, are among legions of experts and officials who have recently warned of a rise in homegrown terrorism, meaning terrorist acts or plots carried out by American citizens or long-term residents, often without guidance from foreign organisations.
But homegrown American terrorism is not new.
Leon Czolgosz, the anarchist who assassinated President McKinley in 1901, was a native-born American who got no foreign help. The same goes for John Wilkes Booth, Lee Harvey Oswald and James Earl Ray. The deadliest act of domestic terrorism in U.S. history, the 1995 Oklahoma City bombing, was largely the work of New York-born Gulf War vet Timothy McVeigh.
As Brian Michael Jenkins of RAND notes, there is far less homegrown terrorism today than in the 1970s, when the Weather Underground, the Jewish Defense League, anti-Castro Cuban exile groups, and the Puerto Rican Nationalists of the FALN were setting off bombs on U.S. soil.
[. . .]
After the September 11 attacks, the FBI received a massive boost in counterterrorism funding and shifted a small army of agents from crime-fighting to counterterrorism. Many joined new Joint Terrorism Task Forces. Ambitious prosecutors increasingly looked for terrorists to indict. Most states stood up intelligence fusion centers, which the Department of Homeland Security (DHS) soon fed with threat intelligence.
The intensification of the search was bound to produce more arrests, even without more terrorism, just as the Inquisition was sure to find more witches. Of course, unlike the witches, only a minority of those found by this search are innocent. But many seem like suggestible idiots unlikely to have produced workable plots without the help of FBI informants or undercover agents taught to induce criminal conduct without engaging in entrapment.