Everyone thinks America Alone is about Islam and demography, but in fact it has a whole section on cheese, called “The Pasteurization is Prologue”. Page 181:
I’ve never subscribed to that whole “cheese-eating surrender-monkey” sneer promoted by my National Review colleague Jonah Goldberg. As a neocon warmonger, I yield to no one in my contempt for the French, but, that said, cheese-wise I feel they have the edge.
When I’m at the lunch counter in America and I order a cheeseburger and the waitress says, “American, Swiss or Cheddar?” I can’t tell the difference. They all taste of nothing. The only difference is that the slice of alleged Swiss is full of holes, so you’re getting less nothing for your buck. Then again, the holes also taste of nothing, and they’re less fattening. But, either way, cheese is not the battleground on which to demonstrate the superiority of the American way.
Most of the American cheeses bearing European names are bland rubbery eunuch versions of the real thing. I wouldn’t mind if this were merely the market at work, but it’s not. It’s the result of Big Government, of the Brieatollahs at the United States Department of Agriculture:
In America, unpasteurized un-aged raw cheese that would be standard in any Continental fromagerie is banned. Americans, so zealous in defense of their liberties when it comes to guns, are happy to roll over for the nanny state when it comes to the cheese board… The French may be surrender monkeys on the battlefield, but they don’t throw their hands up and flee in terror just because the Brie’s a bit ripe. It’s the Americans who are the cheese-surrendering eating-monkeys — who insist, oh, no, the only way to deal with this sliver of Roquefort is to set up a rigorous ongoing Hans Blix-type inspections regime.
I’m not exaggerating about that. Nothing gets past their eyes, and everything gets pasteurized. That’s why American “cheesemakers” have to keep putting stuff into the “cheddar” — sun-dried tomatoes, red peppers, chocolate chips — to give it some taste, because the cheese itself has none. And, if you try to bring in anything that does taste of something, the US Government’s Brie Team Six seizes it:
The US fate of the bright-orange, mild-tasting French cheese has been in jeopardy for months and the Food and Drug Administration has blocked all further imports.
Why? Because US regulators determined the cantaloupe-like rind of the cheese was covered with too many cheese mites, even though the tiny bugs give mimolette its unique flavor.
No formal ban has been put in place, but 1.5 tonnes (3,300 pounds) of cheese were blocked from being imported, and nothing is going through US customs.
“No formal ban has been put in place” — because that would involve legislators passing laws in a legislature and whatnot. So they just banned it anyway.
Mark Steyn, “Live Brie or Die!” SteynOnline.com, 2014-03-13
March 16, 2014
December 4, 2013
Shikha Dalmia argues that the fight over forcing companies to cover contraceptive prescriptions is based on a mistaken view of women’s rights:
The administration argues that acquiescing to such arguments would mean allowing bosses or corporate CEOs to restrict women’s choices to promote their own religious beliefs. “Our policy is designed to ensure that health care decisions are made between a woman and her doctor,” noted White House spokesman Jay Carney. But it’s not bosses who pose the bigger barrier to birth control but doctors themselves.
The only reason American women need insurance coverage for contraception is because they can’t buy birth control pills without a prescription — which doctors won’t hand them without an annual exam. Few dispute anymore — not even the American College of Obstetricians and Gynecologists — that the pills are perfectly safe, requiring neither a medical diagnosis nor supervision. They have side effects like every other medicine, but none so serious that they can’t be effectively communicated through the usual warning labels. Requiring a medical exam assumes that women can’t be completely trusted with their own health. But such paternalism is counterproductive: Most women who stop taking pills don’t do so because they can’t afford them without insurance. (A one-month generic supply from Costco costs $25.) They do so because they can’t always make the time for a doctor’s visit when their prescription runs out. This problem is especially acute for working women — professional or otherwise.
The birth control issue shouldn’t be cast in terms of women’s rights versus religious rights. That’ll turn it into a lose-lose proposition. Medical paternalism is a far bigger threat to women’s reproductive choices than religious zealotry. Focusing on the first will do more to give women control over their bodies — including the female employees of Hobby Lobby — than a pitched battle against the second.
October 28, 2013
In sp!ked, Christopher Snowdon starts off by listing a few “public health” proposals that have been suggested recently:
An abridged list of policies that have been proposed in the name of ‘public health’ in recent months includes: minimum pricing for alcohol, plain packaging for tobacco, a 20 per cent tax on fizzy drinks, a fat tax, a sugar tax, a fine for not being a member of a gym, graphic warnings on bottles of alcohol, a tax on some foods, subsidies on other foods, a ban on the sale of hot food to children before 5pm, a ban on anyone born after the year 2000 ever buying tobacco, a ban on multi-bag packs of crisps, a ban on packed lunches, a complete ban on alcohol advertising, a ban on electronic cigarettes, a ban on menthol cigarettes, a ban on large servings of fizzy drinks, a ban on parents taking their kids to school by car, and a ban on advertising any product whatsoever to children.
Doubtless many of the proponents of these policies identify themselves as ‘liberals’. We must hope they never lurch towards authoritarianism. [...]
As the definition of ‘health’ has been changed, so too has the meaning of ‘public health’. It once meant vaccinations, sanitation and education. It was ‘public’ only in the sense that it protected people from contagious diseases carried by others. Today, it means protecting people from themselves. The word ‘epidemic’ has also been divorced from its meaning — an outbreak of infectious disease — and is instead used to describe endemic behaviour such as drinking, or non-contagious diseases such as cancer, or physical conditions such as obesity which are neither diseases nor activities. This switch from literal meanings to poetic metaphors helps to maintain the conceit that governments have the same rights and responsibility to police the habits of its citizens as they do to ensure that drinking water is uncontaminated. It masks the hard reality that ‘public health’ is increasingly concerned with regulating private behaviour on private property.
The anti-smoking campaign is where the severe new public-health crusade began, but it is not where it ends. Libertarians warned that the campaign against tobacco would morph into an anti-booze and anti-fat campaign of similar intensity. They were derided; ridiculed for making fallacious ‘slippery slope’ arguments. In retrospect, their greatest failing was not that they were too hysterical in their warnings but that they lacked the imagination to foresee policies as absurd as plain packaging or bans on large servings of lemonade, even as satire.
September 24, 2013
Steven M. Teles on the defining characteristic of modern American government:
The complexity and incoherence of our government often make it difficult for us to understand just what that government is doing, and among the practices it most frequently hides from view is the growing tendency of public policy to redistribute resources upward to the wealthy and the organized at the expense of the poorer and less organized. As we increasingly notice the consequences of that regressive redistribution, we will inevitably also come to pay greater attention to the daunting and self-defeating complexity of public policy across multiple, seemingly unrelated areas of American life, and so will need to start thinking differently about government.
Understanding, describing, and addressing this problem of complexity and incoherence is the next great American political challenge. But you cannot come to terms with such a problem until you can properly name it. While we can name the major questions that divide our politics — liberalism or conservatism, big government or small — we have no name for the dispute between complexity and simplicity in government, which cuts across those more familiar ideological divisions. For lack of a better alternative, the problem of complexity might best be termed the challenge of “kludgeocracy.”
A “kludge” is defined by the Oxford English Dictionary as “an ill-assorted collection of parts assembled to fulfill a particular purpose…a clumsy but temporarily effective solution to a particular fault or problem.” The term comes out of the world of computer programming, where a kludge is an inelegant patch put in place to solve an unexpected problem and designed to be backward-compatible with the rest of an existing system. When you add up enough kludges, you get a very complicated program that has no clear organizing principle, is exceedingly difficult to understand, and is subject to crashes. Any user of Microsoft Windows will immediately grasp the concept.
“Clumsy but temporarily effective” also describes much of American public policy today. To see policy kludges in action, one need look no further than the mind-numbing complexity of the health-care system (which even Obamacare’s champions must admit has only grown more complicated under the new law, even if in their view the system is now also more just), or our byzantine system of funding higher education, or our bewildering federal-state system of governing everything from welfare to education to environmental regulation. America has chosen to govern itself through more indirect and incoherent policy mechanisms than can be found in any comparable country.
June 24, 2013
Richard Smith talks about the British Medical Association’s “official” stance on heterosexual and homosexual “indulgences” from the 1950s on, and also explains why British use of deodorant always lagged the rest of the western world:
I was once responsible for Family Doctor Publications, which were a series of booklets owned by the BMA, had titles like You and Your Bowels, and sold in huge numbers in the 1950s because they were almost the only information on health available to the public. I was much amused that in the 50s the BMA agreed that the booklets could include advertising for cigarettes and alcohol, but under no circumstances could they advertise contraceptives. And at about the same time thousands of copies of one booklet had to be pulped because it seemed to accept the possibility of sex before marriage. Now I’ve learnt more about the prudishness and “severe, restrictive morality” of the BMA.
[. . .]
The BMA was also happy to ignore science and evidence when it launched into explanations of what at the time was perceived as “an epidemic of homosexuality.” “Many men see in homosexual practices a way of satisfying their sexual desires without running the risks of sequelae of heterosexual intercourse. They believe, for example, that there is no danger of contracting venereal disease in homosexual activity. Other men adopt homosexual practices as a substitute for extramarital heterosexual intercourse because there is no fear of causing pregnancy or emotional complications as in the life of a woman.” The idea that “women” equals “emotional complications” was a very 50s idea.
It was unsurprising, thought the BMA, that the public would be hostile to homosexuals because of the propensity of its practitioners in “positions of authority to give preferential treatment to homosexuals or to require homosexual subjection as an expedient for promotion. The existence of practising homosexuals in the Church, Parliament, Civil Service, Armed Forces, Press, radio, stage and other institutions constitutes a special problem.” Medicine is conspicuously absent from that list. God (heterosexual, of course, even though capable of insemination without intercourse) forbid that the BMA would have homosexuals in its membership.
The BMA found sexual acts between men “repulsive” and that “homosexuals congregating blatantly in public houses, streets, and restaurants are an outrage to public decency. Effeminate men wearing make-up and using scent are objectionable to everybody.” Born in 1952 I was infused with this kind of thinking and didn’t use a deodorant until I was 45 for fear of what people might think. My father, born in 1922, didn’t like me to buy half a pint rather than a pint of beer in case I be thought homosexual.
Having made its position clear, the BMA concluded that “if degenerate sodomists” persist then “it would be in the public interest to deal with them in the same way as mentally deranged offenders.” In other words, commit them to state lunatic asylums.
June 23, 2013
In the Independent, James Young reports from Rio de Janeiro:
The most recent wave of protests began at the beginning of the month in Sao Paulo over what may seem an insignificant 20 centavo (7p) bus-fare hike. But the level of the increase was less important than what it represented. Once again, Brazilians felt they were being asked to pay an onerous price for a shoddy service. Buses in big cities are overcrowded, infrequent and journeys can take hours.
Now the leaderless, non-politically affiliated protest movement has a variety of goals. Better public healthcare is one. “I recently spent eight hours in a hospital waiting room with dengue,” says Lee, a bank worker protesting on Friday. “People were sleeping on the floor.” Another is an improved education system. “We work hard and pay high taxes. And we get nothing in return,” he continues.
Frustration over the country’s institutionalised corruption has attracted many to the protests. Influence-peddling scandals such as 2005’s Mensalao (“big monthly allowance”) affair and, more recently, the saga of Carlinhos Cachoeira, accused of running a political bribery network, have left many desperate for change.
Some protesters have focused on the £8bn spent on stadium and infrastructure work for next year’s World Cup, seen as indefensible in a country with so many more pressing needs. The brutal tactics employed by the police have added to the indignation. Rubber bullets and tear gas have been used, often indiscriminately and at close range.
May 12, 2013
Matt Ridley explains why the breathless claims that this or that flu outbreak could rival the Spanish Flu pandemic of 1918-19 should not be taken too seriously:
Here we go again. A new bird-flu virus in China, the H7N9 strain, is spreading alarm. It has infected about 130 people and killed more than 30. Every time this happens, some journalists compete to foment fear, ably assisted by cautious but worried scientists, and then tell the world to keep calm. We need a new way to talk about the risk of a flu pandemic, because the overwhelming probability is that this virus will kill people, yes, but not in vast numbers.
In recent years flu has always proved vastly less perilous than feared. In 1976 more people may have died from bad reactions to swine-flu vaccine than from swine flu. Since 2005, H5N1 bird flu has killed 374 people, not the two million to 7.4 million deemed possible by the World Health Organization. In 2009, H1N1 Mexican swine flu proved to be a normal flu episode despite apocalyptic forecasts.
No doubt some readers will remind me that, in the story of the boy who cried “Wolf!”, there eventually was a wolf. And that in 1918 maybe 50 million people died of influenza world-wide. So we should always worry a bit. But perhaps it’s not just luck that has made every flu pandemic since then mild; it may be evolutionary logic.
Of course, just about every story about influenza you’ll encounter goes the Chicken Little route:
There’s no mystery as to why we talk up the risk every time: All the incentives point that way. Who among the headline-seeking journalists, reader-seeking editors, fund-seeking scientists, contract-seeking vaccine makers or rear-end-covering politicians has even a modest incentive to say: “It may not be as bad as all that”?
March 26, 2013
Yes, yes, I know salt is one of the most dangerous substances known to man. Well, this week, anyway. Next week they may decide to recommend doubling your daily intake instead of reducing it. It’s an example of the nanny state’s long history of providing inconsistent — and sometimes even dangerous — dietary advice:
The government told people to switch from saturated animal fats to unsaturated vegetable fats. But that advice may have killed a lot of people. As David Oliver notes, a recent study “in the British Medical Journal” shows that “those who heeded the advice” from public-health officials “to switch from saturated fats to polyunsaturated vegetable oils dramatically reduced their odds of living to see 2013,” incurring up to a “60% increase in risk of death by switching from animal fats to vegetable oils.” This possibly deadly medical advice has a long history:
Fifty years ago the medical community did an about-face … and instead went all in on polyunsaturated fats. It reasoned that since (a) cholesterol is associated with cardiovascular disease and (b) polyunsaturated fats reduce serum cholesterol levels, it inescapably followed that (c) changing people’s diet from saturated fats to polyunsaturated fats would save a lot of lives. In 1984 Uncle Sam got involved – Time magazine reported on it in “Hold the Eggs and Butter” – and he made a big push for citizens to swap out animal fat in their diet for the vegetable variety and a great experiment on the American people was begun.
As Oliver, an expert on mass torts, points out, it is hard to “think of any mass tort, or combination of mass torts, that has produced as much harm as the advice to change to a plant oil-based diet” may have done.
Some federal food-safety regulations have also harmed public health, such as the “poke and sniff” inspection method “that likely resulted in USDA inspectors transmitting filth from diseased meat to fresh meat on a daily basis.” The Obama administration has foolishly discouraged potato consumption, even though potatoes are highly nutritious, even as it has subsidized certain sugary and fatty foods, and promoted bad advice about salt.
March 14, 2013
In sp!ked, Robin Walsh debunks some of the scare factor from recent reports about antibiotic resistant diseases and the looming pandemic:
The UK’s chief medical officer (CMO), Professor Dame Sally Davies, made a splash in the media this week with her warning that antibiotic resistance is the new climate change. There is a ‘catastrophic threat’ of ‘untreatable’ diseases, she said, which promise to return us to a ‘nineteenth century’ state of affairs. The CMO has form: she warned the House of Commons health select committee about the same problem in similarly strident terms back in January — a case not so much of apocalypse now, as apocalypse again.
As with all such stories, reading the actual CMO’s report leavens some of the hysterical excesses of the press, which were stoked up by the CMO’s excitable media appearances. Setting out the epidemiology of infectious diseases in the UK, the report highlights that while some drug-resistant infections, such as the well-known Clostridium difficile (C diff) and MRSA, are becoming less widespread, there is an increasing occurrence of harder to treat multi-drug resistant bacterial infections, which, although still only in the hundreds of cases per year, are on the rise. The report states that only five antibiotics to fight such infections are currently in phase II or III trials, so the cupboard seems worryingly bare of new, necessary drugs.
So if we’re running short on drugs, how can we make more? A sensible article in the British Medical Journal from 2010 clearly set out the challenges facing the development of new antibiotics. Firstly, there are many regulatory hurdles that make running clinical trials in this area difficult. More importantly, there is a major financial disincentive for drug companies to develop antibiotics. Currently, drugs which are profitable are those for chronic conditions that are prescribed lifelong: painkillers for arthritis, diabetes drugs, and the like. A drug that you take once to cure you is unprofitable; doubly so if it is likely to be husbanded to prevent resistance developing until the patent runs out. A change in government payments to incentivise new antibiotics, like that which already applies to so-called ‘orphan’ drugs for rare diseases, would be an easy and rational step towards producing more drugs that meet our needs.
February 26, 2013
A mind-numbing case of bureaucratic error, death, and ass-covering in Haiti:
International affairs can be complicated, but sometimes a case comes along that’s so simple it’s almost absurd. In 2010, the United Nations made a horrendous mistake that, so far, has claimed more than 8,000 lives. Its officials tried to cover it up. When the evidence came out anyway, lawyers for victims’ families petitioned the U.N. to end the crisis, pay damages, and apologize. For a year and a half, the world’s leading humanitarian organization said nothing. Then, last week, it threw out the case, saying, “The claims are not receivable.”
The background should be well-known by now. But despite the fact that American taxpayers have footed the lion’s share of the bill for the U.N. peacekeepers responsible for this disaster — to the tune of roughly $1.5 billion since 2004 — the story remains largely unknown in the United States.
The place was Haiti. The mistake: a killer combination of cholera and gross negligence. The peacekeeping mission, known by its French initials, MINUSTAH, had been in country since 2004, when it was authorized to protect an interim government installed after a coup. Six years later — thanks to a healthy dose of mission creep — the peacekeepers were still there. While rotating troops into what was now post-quake Haiti, the U.N. neglected to adequately screen a contingent of soldiers coming from an active cholera outbreak in Nepal. Upon arrival, the soldiers were sent to a rural U.N. base, outside the quake zone and long known for leaking sewage into a major river system that millions of Haitians used to drink, bathe, wash, and farm. Within days of their arrival, people downstream began to die. The epidemic then exploded, sickening more than 647,000 people, and killing in its first year more than twice the number of people who died on 9/11.
February 18, 2013
Tim Worstall on some of the issues with demands that all British beef for human consumption be tested for horsemeat:
Now let’s turn to that meat problem. We’re going to test something to make sure that it is indeed what it says. Most of the time, usually, we’d go looking for beef DNA and on finding it say, yup, that’s beef.
But now we’re talking about trace amounts of other species. Some of this horse contamination is someone deliberately substituting, yes. But a lot of it, those trace amounts, is someone not cleaning the pipes between species being processed. Or the knives even. Which leads us to something of a problem.
How many species do we test for? Some minced beef… or pink slime perhaps. Do we test for beef and horse? For beef, horse, mutton, pork, chicken, duck, goose? What about rat and mouse? For I’ll guarantee you that however much people try there will often be the odd molecule of either one of those in there. Sparrow? That’s more of a problem with grain processing but still.
For example, one lovely story about vegetarianism. Those (umm, OK, some) who have moved from the sub-continent to the UK. They carry on eating the (possibly Hindu caste based) vegetarian diet they are used to. And they start falling prey to all sorts of dietary deficiencies. Anaemia, there have even been reports of kwashiorkor (a protein deficiency). The grains and the pulses of the sub-continent have rather more insect and other residue in them than our more modern processing and storage systems provide.
People don’t test for hedgehog DNA in meat supplies, no. But how many species should they test for?
February 17, 2013
February 15, 2013
It’s reasonable to be concerned that your hamburger may once have raced in the Grand National, but worries about chemical contamination from the horse meat are almost certainly overblown. In fact, your health might be more at risk from the burger itself:
There is reasonable public outrage at possible criminal conspiracies to adulterate meat products with horsemeat, and additional concerns raised about the presence of the anti-inflammatory known as bute.
While not in any way questioning this concern about adulteration with a chemical compound, it is helpful to get a sense of magnitude. When bute was given as a human medicine, it was reported to be associated with a serious adverse reaction in 1 in 30,000 (over a whole course of treatment), but at a dose giving concentrations at least 4,000 times that arising from eating a diet of horse meat – see the excellent information from the Science Media Centre.
So making all sorts of heroic assumptions about there being a linear-no-threshold response, we might very roughly assign a pro-rata risk of a serious event as 1 in 100,000,000 per burger.
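The “very roughly 1 in 100,000,000” figure checks out on the back of an envelope: scaling the 1-in-30,000 therapeutic risk down by the (at least) 4,000-fold dose ratio gives about 1 in 120 million. A minimal sketch, using only the numbers quoted above and the same linear-no-threshold assumption:

```python
# Pro-rata bute risk, using the figures quoted above and assuming a
# linear-no-threshold dose-response (the "heroic assumption" in the text).
risk_per_course = 1 / 30_000  # serious adverse reaction per course of human treatment
dose_ratio = 4_000            # therapeutic dose vs. dietary exposure (at least 4,000x)

risk_per_burger = risk_per_course / dose_ratio
print(f"~1 in {round(1 / risk_per_burger):,}")  # ~1 in 120,000,000
```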
February 9, 2013
It is not an original thought to say that public health crusaders often resemble religious zealots, but seldom is the comparison more literal than in the case of Mike Rayner, director of the British Heart Foundation Health Promotion Research Group.
[. . .]
So far, so mundane. Another illiberal battler against the free market with a heightened sense of his own importance and his nose in the trough. The only point of interest is that Mr Rayner is a Church of England priest who is guided by voices.
In all of this I see a sacred dimension. You may not believe that I have heard God aright but I think God is calling me to work towards the introduction of soft-drink taxes in this country and I am looking forward to the day when General Synod debates the ethical issues surrounding this type of tax rather than some of the other issues that august body seems obsessed by.
Golly. Where to begin? On a theological note, I do wonder whether Jesus would really be in favour of a deeply regressive stealth tax that would take from the poor to give to the rich. Perhaps the reason the General Synod does not debate tax policy is because they recall the old “render unto Caesar…” message and realise that it’s none of their business.
If we weren’t already sceptical about the documents coming from Mr Rayner’s team of would-be policy-makers, the fact that its director believes that God has told him to bring about a fat tax in this land should be enough to make us suspect that a tiny bit of research bias might have crept into his work. Considering that the Almighty has approved of the policy, what are the chances of his loyal servant producing evidence that would question its efficacy?
Christopher Snowdon, “Fat tax campaigner: ‘God told me to do it’”, Velvet Glove, Iron Fist, 2012-05-21
February 4, 2013
In the Wall Street Journal, Jonathan Last looks at the demographic changes on tap for the United States as the fertility rate continues to drop below replacement:
The fertility rate is the number of children an average woman bears over the course of her life. The replacement rate is 2.1. If the average woman has more children than that, population grows. Fewer, and it contracts. Today, America’s total fertility rate is 1.93, according to the latest figures from the Centers for Disease Control and Prevention; it hasn’t been above the replacement rate in a sustained way since the early 1970s.
The nation’s falling fertility rate underlies many of our most difficult problems. Once a country’s fertility rate falls consistently below replacement, its age profile begins to shift. You get more old people than young people. And eventually, as the bloated cohort of old people dies off, population begins to contract. This dual problem — a population that is disproportionately old and shrinking overall — has enormous economic, political and cultural consequences.
For two generations we’ve been lectured about the dangers of overpopulation. But the conventional wisdom on this issue is wrong, twice. First, global population growth is slowing to a halt and will begin to shrink within 60 years. Second, as the work of economists Ester Boserup and Julian Simon demonstrated, growing populations lead to increased innovation and conservation. Think about it: Since 1970, commodity prices have continued to fall and America’s environment has become much cleaner and more sustainable — even though our population has increased by more than 50%. Human ingenuity, it turns out, is the most precious resource.
Low-fertility societies don’t innovate because their incentives for consumption tilt overwhelmingly toward health care. They don’t invest aggressively because, with the average age skewing higher, capital shifts to preserving and extending life and then begins drawing down. They cannot sustain social-security programs because they don’t have enough workers to pay for the retirees. They cannot project power because they lack the money to pay for defense and the military-age manpower to serve in their armed forces.
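The aging-then-shrinking dynamic Last describes falls straight out of the arithmetic: each generation is smaller than the last by the ratio of the actual fertility rate to the replacement rate. A toy sketch, using the 1.93 and 2.1 figures from the article (the generational framing itself is an illustrative assumption, not Last’s model):

```python
# Toy model of sub-replacement fertility: each generation shrinks by a
# factor of (TFR / replacement).  The 1.93 and 2.1 figures are from the
# article; everything else here is illustrative.
TFR = 1.93
REPLACEMENT = 2.1

cohort = 100.0  # index the starting generation at 100
for gen in range(5):
    print(f"generation {gen}: {cohort:.1f}")
    cohort *= TFR / REPLACEMENT  # each cohort ~8% smaller than its parents'
```

Four generations on, the youngest cohort is only about 71% the size of the first, which is why the age profile tilts old well before total numbers visibly fall.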
Update: Kelly McParland on the plight of some older workers: “If they’d never worked at all, and gotten by on social assistance, they might still have a financial lifeline.”
It would be cruel (and maybe unfair) to say they made their own beds, but it remains the fact that a great deal of the trouble they face results from the refusal to brook a more prudent approach to public finances for so many years. Programs that were unaffordable were pushed through time and again, paid for by more and more borrowing. When crises developed, the borrowing increased while spending was only rarely curtailed. The curse of deficit financing is its snowball effect: annual shortfalls pile up, pushing up the carrying costs, creating a self-perpetuating ever-expanding spending crisis. When a recession inevitably arrives, there are no reserves to deal with it, and even more borrowing ensues.
After so many decades of pretending it could go on forever, without there being a reckoning, the generation that created it is discovering how wrong they were. Not only is it destroying the retirement dreams of so many near-seniors, it’s preparing a poisoned legacy to hand to the next generation, and perhaps the one after that, unless they recognize the need for greater discipline and finally accept the pain that will be necessary to put the process back on a sustainable track.
Canada is fortunate that it faced up to its debt crisis 15 years ago and is still benefiting from that fact, but the public memory is short and there will always be pressure to turn a blind eye to debt, and legislate for today. No wonder people get more conservative as they get older. They understand the price that has to be paid for putting costs off to tomorrow.