[W]hat exactly is the HDI? The one-line explanation is that it gives “equal weights” to GDP per capita, life expectancy, and education. But it’s more complicated than that, because scores on each of the three measures are bounded between 0 and 1. This effectively means that a country of immortals with infinite per-capita GDP would get a score of .666 (lower than South Africa and Tajikistan) if its population were illiterate and never went to school.
So what are the main problems with the HDI?
1. I can see giving equal weights to GDP per capita and life expectancy. But education? As a professor and a snob, I understand the appeal (though a measure of opera consumption would be even better). But in terms of the actual if not professed values of normal human beings, televisions and cars are a lot more important than books.
2. When you take a closer look at the HDI’s education measure, it’s especially bogus. Two-thirds of the weight comes from the literacy rate. At least that’s not ridiculous. But the other third comes from the Gross Enrollment Index — the fraction of the population enrolled in primary, secondary, or tertiary education. OK, I feel a reductio ad absurdum coming on. To max out your education score, you have to turn 100% of your population into students!
3. The HDI purportedly gives equal weights to three different outcomes, but bounding the results between 0 and 1 builds in a massive bias against GDP. GDP per capita has grown fantastically during the last two centuries, and will continue to do so. In reality, there’s plenty of room left for further improvement even in rich countries. But the HDI doesn’t allow this. Since rich countries are already close to the upper bound, the HDI effectively defines their future progress on this dimension out of existence.
To a lesser extent, the same goes for life expectancy: While it’s roughly doubled over the last two centuries, dying at 85 is not, contrary to the HDI, approximately equal in value to immortality.
The clear winners from this weighting scheme, of course, are the literacy and enrollment measures, both of which have upper bounds that are imposed by logic rather than fiat.
4. The ultimate problem with the HDI, though, is lack of ambition. It effectively proclaims an “end of history” where Scandinavia is the pinnacle of human achievement. […] Scandinavia comes out on top according to the HDI because the HDI is basically a measure of how Scandinavian your country is.
Bryan Caplan, “Against the Human Development Index”, Econlog, 2009-05-22.
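Caplan’s .666 thought experiment falls straight out of the arithmetic. A minimal sketch of the pre-2010 HDI formula he’s criticizing — an equally weighted mean of three sub-indices, each clamped to [0, 1], with the education index itself a two-thirds literacy, one-third enrollment blend (the function and variable names here are mine, not the UNDP’s):

```python
def goalpost(value, lo, hi):
    """Normalize a raw value onto [0, 1] between fixed 'goalposts'."""
    return max(0.0, min(1.0, (value - lo) / (hi - lo)))

def hdi(gdp_index, life_index, literacy_rate, gross_enrollment):
    """Pre-2010 HDI: simple mean of three bounded sub-indices."""
    education = (2/3) * literacy_rate + (1/3) * gross_enrollment
    return (gdp_index + life_index + education) / 3

# Caplan's thought experiment: immortal, infinitely rich, never schooled.
# Income and life expectancy are capped at 1.0, so education drags the
# composite to two-thirds no matter how large GDP per capita grows.
print(round(hdi(1.0, 1.0, 0.0, 0.0), 3))  # 0.667
```

The clamping in `goalpost` is exactly his point 3: once a rich country’s income sub-index saturates at 1.0, further growth is defined out of the measure.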
January 26, 2015
January 10, 2015
Megan McArdle explains why healthcare costs more than you think it should:
Milton Friedman famously divided spending into four kinds, which P.J. O’Rourke once summarized as follows:
- You spend your money on yourself. You’re motivated to get the thing you want most at the best price. This is the way middle-aged men haggle with Porsche dealers.
- You spend your money on other people. You still want a bargain, but you’re less interested in pleasing the recipient of your largesse. This is why children get underwear at Christmas.
- You spend other people’s money on yourself. You get what you want but price no longer matters. The second wives who ride around with the middle-aged men in the Porsches do this kind of spending at Neiman Marcus.
- You spend other people’s money on other people. And in this case, who gives a [damn]?
Most health-care spending in the U.S. falls into category three. In theory, the people who are funding our expenses — the proverbial middle-aged men in Porsches, except that they’re actually insurance executives and government bureaucrats — have every incentive to step in, cut up the charge cards, and substitute a gift-wrapped box of Hanes briefs with the comfort-soft waistband. In practice, legislators frequently intervene to stop them from exercising much cost-control. The managed care revolution of the 1990s died when patients complained to their representatives, and the representatives ran down to their offices to pass laws making it very hard to deny coverage for anything anyone wanted. Medicare cost-controls, such as the famed Sustainable Growth Rate, fell prey to similar maneuvers. The only system that exhibits sustained cost control is Medicaid, because poor people don’t vote, or exit the system for better insurance.
The result is a system where everyone complains that we spend much too much on health care — and the very same people get indignant if anyone suggests that they, personally, should maybe spend a little bit less. Everyone wants to go to heaven — but nobody wants to die.
Unfortunately, this is what cost-control actually looks like, which is to say, like people not being able to spend as much on health care. Oh, to be sure, we could achieve this end differently — instead of asking patients to pay a modest share of their own costs (the article suggests that this amount is less than 10 percent, in the case of Harvard professors) — we could simply set a schedule of covered treatment, and deny patients access to off-schedule treatments, or even better, not even tell them that those treatments exist. But people don’t like that solution either, which is why medical dramas are filled with rants about insurers who won’t cover procedures, and the law books are filled with regulations that sharply curtail the ability of insurers to ration care. And the third option, refusing to pay top-dollar for care, would be a bit tricky for Harvard to implement, given that they run exactly the sort of high-cost research facilities that help drive health-care costs skyward. Nor do I really think that the angry professors would be mollified by being given a cheap insurance package that wouldn’t let them go see the top-flight specialists their elite status now entitles them to access.
Instead, we persist in our mass delusion: that there is some magic pot of money in the health-care system, which can be painlessly tapped to provide universal coverage without dislocating any of the generous arrangements that insured people currently enjoy. Just as there are no leprechauns, there is no free money at the end of the rainbow; there are patients demanding services, and health-care workers making comfortable livings, who have built their financial lives around the expectation that those incomes will continue. Until we shed this delusion, you can expect a lot of ranting and raving about the hard truths of the real world.
December 22, 2014
• The psychosocial causes and consequences of online video game play were evaluated.
• Over a 1- and 2-year period, evidence for social compensation processes was found.
• Among young adults, online games appear to be socially compensating spaces.
• No significant displacement or compensation patterns were found for adolescents.
• No significant displacement or compensation patterns were found for older adults.
Given the worldwide popularity of online video gaming, researchers have grown concerned as to whether engagement within online video gaming environments poses a threat to public health. Previous research has uncovered inverse relationships between frequency of play and a range of psychosocial outcomes; however, a reliance on cross-sectional research designs and opportunity sampling of only the most involved players has limited the broader understanding of these relationships. Enlisting a large representative sample and a longitudinal design, the current study examined these relationships and the mechanisms that underlie them to determine if poorer psychosocial outcomes are a cause (i.e., pre-existing psychosocial difficulties motivate play) or a consequence (i.e., poorer outcomes are driven by use) of online video game engagement. The results dispute previous claims that online game play has negative effects on the psychosocial well-being of its users and instead indicate that individuals play online games to compensate for pre-existing social difficulties.
November 22, 2014
In The Diplomat, James R. Holmes says that we can learn a lot about fighting infectious diseases like ebola by reading what Thucydides wrote about the plague that struck Athens during the opening stages of the Peloponnesian War:
Two panelists from our new partner institution, a pair of Africa hands, offered some striking reflections on the fight against Ebola.
Their presentations put me in mind of … classical Greece. Why? Mainly because of Thucydides. Thucydides’ history of the Peloponnesian War isn’t just a (partly) eyewitness account of a bloodletting from antiquity; it’s the Good Book of politics and strategy. Undergraduates at Georgia used to look skeptical when I told them they could learn ninety percent of what they needed to know about bareknuckles competition from Thucydides. The remainder? Technology, tactics, and other ephemera. Thucydides remains a go-to source on the human factor in diplomacy and warfare.
But I digress. Ancient Greece suffered its own Ebola outbreak, a mysterious plague that struck Athens from overseas during the early stages of the conflict. And the malady struck, perchance, at precisely the worst moment for Athens, after “first citizen” Pericles had arranged for the entire populace of Attica, the Athenian hinterland, to withdraw within the city walls. The idea was to hold the fearsome Spartan infantry at bay with fixed fortifications while the Athenian navy raided around the perimeter of the Spartan alliance.
That’s where the parallel between then and now becomes poignant. Thucydides notes, for example, that doctors died “most thickly” from the plague. The Brown presenters noted that, likewise, public-health workers in Africa — doctors, nurses, stretcher-bearers — are among the few to deliberately make close contact with the stricken. Relief teams, consequently, take extravagant precautions to quarantine the disease within makeshift facilities while shielding themselves from contagion. Sometimes these measures fail.
Now as in ancient Greece, furthermore, the prospect of disease and death deters some would-be healers altogether from succoring the afflicted. Selflessness has limits. Some understandably remain aloof — today as in Athens of yesteryear.
Teams assigned to bury the slain also find themselves in dire peril. Perversely, the dead from Ebola are more contagious than living hosts. That makes disposing of bodies in sanitary fashion a top priority. As the plague ravaged Athens, similarly, corpses piled up in the streets. No one would perform funeral rites — even in this deeply religious society. Classicist Victor Davis Hanson ascribes some of Athens’ barbarous practices late in the war — such as cutting off the hands of captured enemy seamen to keep them from returning to war — in part to the plague’s debasing impact on morals, ethics, and religion.
October 14, 2014
President Obama is sending thousands of U.S. troops to West Africa to fight the deadly Ebola virus. Their mission will be to construct treatment centers and provide medical training to health-care workers in the local communities.
But is it really a good idea to send soldiers to provide this sort of aid?
Here are 3 reasons why militarizing humanitarian aid is a very bad idea
October 13, 2014
David Axe on what he describes as the weirdest ship in the Royal Navy:
The British Royal Navy is deploying the auxiliary ship RFA Argus to Sierra Leone in West Africa in order to help health officials contain the deadly Ebola virus.
If you’ve never heard of Argus, you’re not alone. She’s an odd, obscure vessel — an ungainly combination of helicopter carrier, hospital ship and training platform.
But you’ve probably seen Argus, even if you didn’t realize it. The 33-year-old vessel played a major role in the 2013 zombie movie World War Z, as the floating headquarters of the U.N.
The 575-foot-long Argus launched in 1981 as a civilian container ship. In 1982, the Royal Navy chartered the vessel to support the Falklands War … and subsequently bought her to function as an aviation training ship, launching and landing helicopters.
Argus’ long flight deck features an odd, interrupted layout, with a structure — including the exhaust stack — rising out of the deck near the stern.
Weirdly, the deck’s imperfect arrangement is actually an asset in the training role. Student aviators on Argus must get comfortable landing in close proximity to obstacles, which helps prepare them for flying from the comparatively tiny decks of frigates and other smaller ships.
October 8, 2014
In Forbes, Trevor Butterworth looks at an odd data analysis piece where the “fix” for a discrepancy in reported drinks per capita is to just assume everyone under-reported and to double that number:
“Think you drink a lot? This chart will tell you.”
The chart, reproduced below, breaks down the distribution of drinkers into deciles, and ends with the startling conclusion that 24 million American adults — 10 percent of the adult population over 18 — consume a staggering 74 drinks a week.
The source for this figure is “Paying the Tab,” by Phillip J. Cook, which was published in 2007. If we look at the section where he arrives at this calculation, and go to the footnote, we find that he used data from 2001-2002 from NESARC, the National Epidemiologic Survey on Alcohol and Related Conditions (run by the National Institute on Alcohol Abuse and Alcoholism), which had a representative sample of 43,093 adults over the age of 18. But following this footnote, we find that Cook corrected these data for under-reporting by multiplying the number of drinks each respondent claimed they had drunk by 1.97 in order to comport with the previous year’s sales data for alcohol in the US. Why? It turns out that alcohol sales in the US in 2000 were double what NESARC’s respondents — a nationally representative sample, remember — claimed to have drunk.
While the mills of US dietary research rely on the great National Health and Nutrition Examination Survey to digest our diets and come up with numbers, we know, thanks to the recent work of Edward Archer, that recall-based survey data are highly unreliable: we misremember what we ate, we misjudge by how much; we lie. Were we to live on what we tell academics we eat, life for almost two thirds of Americans would be biologically implausible.
But Cook, who is trying to show that the distribution is uneven, ends up trying to solve an apparent recall problem by creating an aggregate multiplier to plug the sales data gap. And the problem is that this requires us to believe that every drinker misremembered by a factor of almost two. This might not be much of a stretch for moderate drinkers; but did everyone who drank, say, four or eight drinks per week systematically forget that they actually had eight or sixteen? That seems like a stretch.
We are also required to believe that just as those who drank consumed significantly more than they were willing to admit, those who claimed to be consistently teetotal never touched a drop. And, we must also forget that those who aren’t supposed to be drinking at all are also younger than 18, and their absence from Cook’s data may well constitute a greater error.
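The uniform correction Butterworth objects to is easy to reproduce. A toy sketch — the decile figures below are illustrative stand-ins, not Cook’s actual table — assuming, as in Cook’s footnote, that every respondent’s self-reported weekly drinks are scaled by the same 1.97 factor to close the gap with sales data:

```python
# Hypothetical self-reported drinks per week across survey deciles
# (illustrative values only; the top decile is chosen so the corrected
# figure lands near the quoted 74 drinks a week).
reported = [0, 0, 0, 1, 2, 4, 7, 15, 24, 38]

MULTIPLIER = 1.97  # uniform under-reporting correction from Cook's footnote

corrected = [round(d * MULTIPLIER, 1) for d in reported]
print(corrected)
# The same factor turns a moderate drinker's 4 into roughly 8 and the
# top decile's 38 into roughly 75 -- the everyone-misremembers-by-half
# assumption Butterworth finds implausible. Teetotalers, multiplied by
# 1.97, stay at exactly zero.
```

Note what the uniform multiplier cannot do: it never moves anyone out of the zero-drinks bin, which is the flip side of the objection in the next paragraph.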
The number of reported cases of polio is now the highest it has been for more than a decade, and at least some of the blame has to go to the CIA for using health workers as a cover for some of their covert operations.
As world health officials struggle to respond to the Ebola epidemic, Pakistan has passed a grim milestone in its efforts to combat another major global health crisis: the fight against polio.
Over the weekend, Pakistan logged its 200th new polio case of 2014, the nation’s highest transmission rate in more than a dozen years. The spread has alarmed Pakistani and international health experts and is prompting fresh doubt about the country’s ability to combat this or future disease outbreaks.
By Tuesday, the number of new polio cases in Pakistan stood at 202, and officials are bracing for potentially dozens of other cases by year’s end. Pakistan now accounts for 80 percent of global cases and is one of only three countries at risk of exporting the disease outside its borders, according to the World Health Organization.
In far-flung areas of the country, some parents and religious leaders are skeptical of the vaccine, requiring considerable face-to-face outreach by vaccination teams.
But the Pakistani Taliban and other Islamist militants have waged a brutal campaign against those teams, killing more than 50 health workers and security officials since 2012. The attacks began after it was discovered that the CIA had used a vaccination campaign to gain information about Osama bin Laden’s whereabouts.
September 16, 2014
Rob Lyons charts the way our governments and healthcare experts got onboard the anti-fat dietary express, to our long-lasting dietary harm:
… in recent years, the advice to eat a low-fat diet has increasingly been called into question. Despite cutting down on fatty foods, the populations of many Western countries have become fatter. While heart-disease mortality has maintained a steady decline, cases of type-2 diabetes have shot up in recent years. Maybe these changes were in spite of the advice to avoid fat. Maybe they were caused by that advice.
The most notable figure in providing the intellectual ammunition to challenge existing health advice has been the US science writer, Gary Taubes. His 2007 book, Good Calories, Bad Calories, became a bestseller, despite containing long discussions on some fairly complex issues to do with biochemistry, nutrition and medicine. The book’s success triggered a heated debate about what really makes us fat and causes chronic disease.
The move to first discussing and then actively encouraging a low-fat diet was largely due to the work of Dr. Ancel Keys, who is to the low-fat diet movement what Karl Marx is to Communism. His energy, drive, and political savvy helped get the US government and the majority of health experts onboard and pushing his advice. A significant problem with this is that Keys’ advocacy was not statistically backed by even his own data. He drew strong conclusions from tiny, unrepresentative samples, yet managed to persuade most doubters that he was right. A more statistically rigorous analysis might well show that the obesity crisis has actually been driven by the crusading health advisors who have been pushing the low-fat diet all this time … or, as I termed it, “our Woody Allen moment“.
Rob Lyons discussed this with Nina Teicholz, author of the book The Big Fat Surprise:
Once the politically astute Keys had packed the nutrition committee of the AHA and got its backing for the advice to avoid saturated fat, the war on meat and dairy could begin. But a major turning point came in 1977 when the Senate Select Committee on Nutrition, led by Democratic senator George McGovern, held hearings on the issue. The result was a set of guidelines, Dietary Goals for the United States [PDF], which promoted the consumption of ‘complex’ carbohydrates, and reductions in the consumption of fat in general and saturated fat in particular.
By 1980, this report had been worked up into government-backed guidelines — around the same time that obesity appears to have taken off in the US. The McGovern Report inspired all the familiar diet advice around the world that we’ve had ever since, and led to major changes in what food manufacturers offered. Out went fat, though unsaturated fat and hydrogenated oils were deemed less bad than saturated fat, so vegetable oils and margarines became more popular. In came more carbohydrate and more sugar, to give those cardboard-like low-fat ‘treats’ some modicum of flavour.
Yet two recent reviews of the evidence around saturated fat — one led by Ronald Krauss, the other by Rajiv Chowdhury — suggest that saturated fat is not the villain it has been painted as. (The latter paper, in particular, sparked outrage.) As for fat in general, Teicholz tells me: ‘There was no effort until very late in the game to provide evidence for the low-fat diet. It was just assumed that that was reasonable because of the caloric benefit you would see from restricting fat.’
Teicholz also debunks the wonderful reputation of the Mediterranean Diet (“a rose-tinted version of reality tailored to the anti-meat prejudices of American researchers”), points out the role of the olive oil industry in pushing the diet (“Swooning researchers were literally wined and dined into going along with promoting the benefits of olive oil”), and points out that we can’t even blame most of the obesity problem on “Big Food”:
Which leads us to an important third point made by Teicholz: that the blame for our current dietary problems cannot solely, or even mainly, be placed at the door of big food corporations. Teicholz writes about how she discovered that ‘the mistakes of nutrition science could not be primarily pinned on the nefarious interests of Big Food. The source of our misguided dietary advice was in some ways more disturbing, since it seems to have been driven by experts at some of our most trusted institutions working towards what they believed to be the public good.’ Once public-health bureaucracies enshrined the dogma that fat is bad for us, ‘the normally self-correcting mechanism of science, which involves constantly challenging one’s own beliefs, was disabled’.
The war on dietary fat is a terrifying example of what happens when politics and bureaucracy mix with science: provisional conclusions become laws of nature; resources are piled into the official position, creating material as well as intellectual reasons to continue to support it; and any criticism is suppressed or dismissed. As the war on sugar gets into full swing, a reading of The Big Fat Surprise might provide some much-needed humility.
September 2, 2014
After all the salt uproar over the last year or so, perhaps it was inevitable that other public health consensus items would also come under scrutiny. Here’s Ace having a bit of fun with the latest New York Times report on fat and carbohydrates in the modern diet:
One day there will be a book written about this all — how a “Consensus of Experts” decided, against all previous wisdom and with virtually no evidence whatsoever, that Fat Makes You Fat and you can Eat All the Carbohydrates You Like Because Carbohydrates Are Healthy.
This never made a lick of sense to me, even before I heard of the Atkins diet.
Sugar is a carbohydrate. Indeed, it’s the carbohydrate, the one that makes up the others (such as starches, which are just long lines of sugar molecules arranged into sheets and folded over each other).
How the hell could it possibly be that Fat was Forbidden but SUGAR was Sacred?
It made no sense. A long time ago I tried to get a nutritionist to explain this to me. “Eat more fruit,” the nutritionist said.
“Fruit,” I answered, “is sugar in a ball.”
But the nutritionist had an answer. “That is fruit sugar,” she told me.
“Fruit sugar,” I responded, “is yet sugar.”
“But it’s not cane sugar.”
“I don’t think the body really cares much about which particular plant the sugar comes from.”
“Sugar from a fruit,” the nutritionist now gambited, “is more natural than processed sugar.”
“They’re both natural, you know. We don’t synthesize sucrose in a lab. There are no beakers involved.”
“Well, you burn fruit sugar up quicker, so it actually gives you energy, instead of turning into fat!”
“Both sugars are converted into glycogen in the body. There can be no difference in how they produce ‘energy’ in the body because both wind up as glycogen. I have no idea where you’re getting any of this. It sounds like you’re making it all up as you go.”
“This is Science,” the nutritionist closed the argument.
Eh. It’s all nonsense. Even cane sugar contains, yes, fructose, or fruit sugar, and fruits contain sucrose, or cane sugar.
August 13, 2014
Matt Ridley is somewhat uncharacteristically concerned about the major Ebola outbreak in west Africa:
As you may know by now, I am a serial debunker of alarm and it usually serves me in good stead. On the threat posed by diseases, I’ve been resolutely sceptical of exaggerated scares about bird flu and I once won a bet that mad cow disease would never claim more than 100 human lives a year when some “experts” were forecasting tens of thousands (it peaked at 28 in 2000). I’ve drawn attention to the steadily falling mortality from malaria and Aids.
Well, this time, about ebola, I am worried. Not for Britain, Europe or America or any other developed country and not for the human race as a whole. This is not about us in rich countries, and there remains little doubt that this country can achieve the necessary isolation and hygiene to control any cases that get here by air before they infect more than a handful of other people — at the very worst. No, it is the situation in Liberia, Sierra Leone and Guinea that is scary. There it could get much worse before it gets better.
This is the first time ebola has got going in cities. It is the first time it is happening in areas with “fluid population movements over porous borders” in the words of Margaret Chan, the World Health Organisation’s director-general, speaking last Friday. It is the first time it has spread by air travel. It is the first time it has reached the sort of critical mass that makes tracing its victims’ contacts difficult.
One of ebola’s most dangerous features is that it kills so many health workers. Because it requires direct contact with the bodily fluids of patients, and because patients are violently ill, nurses and doctors are especially at risk. The current epidemic has already claimed the lives of 60 healthcare workers, including those of two prominent doctors, Samuel Brisbane in Liberia and Sheik Umar Khan in Sierra Leone. The courage of medics in these circumstances, working in stifling protective gear, is humbling.
July 23, 2014
In the ongoing investigation into why Westerners — especially North Americans — became obese, some of the early studies are being reconsidered. For example, I’ve mentioned the name of Dr. Ancel Keys a couple of times recently: he was the champion of the low-fat diet and his work was highly influential in persuading government health authorities to demonize fat in pursuit of better health outcomes. He was so successful as an advocate for this idea that his study became one of the most frequently cited in medical science. A brilliant success … that unfortunately flew far ahead of its statistical evidence:
So Keys had food records, although that coding and summarizing part sounds a little fishy. Then he followed the health of 13,000 men so he could find associations between diet and heart disease. So we can assume he had dietary records for all 13,000 of them, right?
Uh … no. That wouldn’t be the case.
The poster-boys for his hypothesis about dietary fat and heart disease were the men from the Greek island of Crete. They supposedly ate the diet Keys recommended: low-fat, olive oil instead of saturated animal fats and all that, you see. Keys tracked more than 300 middle-aged men from Crete as part of his study population, and lo and behold, few of them suffered heart attacks. Hypothesis supported, case closed.
So guess how many of those 300-plus men were actually surveyed about their eating habits? Go on, guess. I’ll wait …
And the answer is: 31.
Yup, 31. And that’s about the size of the dataset from each of the seven countries: somewhere between 25 and 50 men. It’s right there in the paper’s data tables. That’s a ridiculously small number of men to survey if the goal is to accurately compare diets and heart disease in seven countries.
Getting the picture? Keys followed the health of more than 300 men from Crete. But he only surveyed 31 of them, with one of those surveys taken during the meat-abstinence month of Lent. Oh, and the original seven-day food-recall records weren’t available later, so he swapped in data from an earlier paper. Then to determine fruit and vegetable intake, he used data sheets about food availability in Greece during a four-year period.
And from this mess, he concluded that high-fat diets cause heart attacks and low-fat diets prevent them.
Keep in mind, this is one of the most-cited studies in all of medical science. It’s one of the pillars of the Diet-Heart hypothesis. It helped to convince the USDA, the AHA, doctors, nutritionists, media health writers, your parents, etc., that saturated fat clogs our arteries and kills us, so we all need to be on low-fat diets – even kids.
Yup, Ancel Keys had a tiny one … but he sure managed to screw a lot of people with it.
H/T to Amy Alkon for the link.
June 12, 2014
France, for all its faults, has genuinely federalized food: a distinctive cheese every 20 miles down the road. In America, meanwhile, the food nannies are lobbying to pass something called the National Uniformity for Food Act. There’s way too much of that already.
The federalization of food may seem peripheral to national security issues, and the taste of American milk — compared with its French or English or even Québécois equivalents — may seem a small loss. But take almost any area of American life: what’s the more common approach nowadays? The excessive government regulation exemplified by American cheese or the spirit of self-reliance embodied in the Second Amendment? On a whole raft of issues from health care to education the United States is trending in an alarmingly fromage-like direction.
Mark Steyn, “Live Brie or Die!” SteynOnline.com, 2014-03-13
June 1, 2014
And in The Economist this week:
Ms Teicholz describes the early academics who demonised fat and those who have kept up the crusade. Top among them was Ancel Keys, a professor at the University of Minnesota, whose work landed him on the cover of Time magazine in 1961. He provided an answer to why middle-aged men were dropping dead from heart attacks, as well as a solution: eat less fat. Work by Keys and others propelled the American government’s first set of dietary guidelines, in 1980. Cut back on red meat, whole milk and other sources of saturated fat. The few sceptics of this theory were, for decades, marginalised.
But the vilification of fat, argues Ms Teicholz, does not stand up to closer examination. She pokes holes in famous pieces of research — the Framingham heart study, the Seven Countries study, the Los Angeles Veterans Trial, to name a few — describing methodological problems or overlooked results, until the foundations of this nutritional advice look increasingly shaky.
The opinions of academics and governments, as presented, led to real change. Food companies were happy to replace animal fats with less expensive vegetable oils. They have now begun abolishing trans fats from their food products and replacing them with polyunsaturated vegetable oils that, when heated, may be as harmful. Advice for keeping to a low-fat diet also played directly into food companies’ sweet spot of biscuits, cereals and confectionery; when people eat less fat, they are hungry for something else. Indeed, as recently as 1995 the AHA itself recommended snacks of “low-fat cookies, low-fat crackers…hard candy, gum drops, sugar, syrup, honey” and other carbohydrate-laden foods. Americans consumed nearly 25% more carbohydrates in 2000 than they had in 1971.
It would be ironic indeed if the modern obesity crisis was actually caused by government dietary recommendations intended to improve public health (and fatten the bottom lines of big agribusiness campaign donors).
May 5, 2014
James Conca on a recent UN report that isn’t getting attention:
It’s always amazing when a United Nations report that has global ramifications comes out with little fanfare. The latest one states that no one will get cancer or die from radiation released from Fukushima, but the fear and overreaction is harming people (UNIS; UNSCEAR Fukushima; UNSCEAR A-68-46 [PDF]). This is what we’ve been saying for almost three years but it’s nice to see it officially acknowledged.
According to the report, drafted last year but only recently finalized by the U.N., “The doses to the general public, both those incurred during the first year and estimated for their lifetimes, are generally low or very low. No discernible increased incidence of radiation-related health effects are expected among exposed members of the public or their descendants. The most important health effect is on mental and social well-being, related to the enormous impact of the earthquake, tsunami and nuclear accident, and the fear and stigma related to the perceived risk of exposure to ionizing radiation. Effects such as depression and post-traumatic stress symptoms have already been reported.”
In addition, the report states, “Increased rates of detection of [thyroid] nodules, cysts and cancers have been observed during the first round of screening; however, these are to be expected in view of the high detection efficiency [using modern high-efficiency ultrasonography]. Data from similar screening protocols in areas not affected by the accident imply that the apparent increased rates of detection among children in Fukushima Prefecture are unrelated to radiation exposure.”
So the Japanese people can start eating their own food again, and moving back into areas contaminated with radiation levels similar to many areas of the world like Colorado and Brazil, which includes most of the exclusion zone. Only a few places shouldn’t be repopulated.
But if you want to continue feeling afraid, and want to make sure others keep being afraid, by all means ignore this report on Fukushima. But then you really can’t keep quoting previous UNSCEAR policy and application of LNT (the Linear No-Threshold dose hypothesis) to support more fear.
Note – LNT is a leftover Cold War ideology that states all radiation is bad, even the background radiation we are bathed in every day, even the 3,200 pCi of radiation in a bag of potato chips (yes, potato chips have the most radioactivity of any food, but they taste sooo good!).
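The LNT assumption Conca is needling can be stated in one line: excess risk is proportional to dose, with no safe floor. A schematic comparison against a threshold model — the risk coefficient here is purely illustrative (loosely inspired by the ICRP’s nominal ~5% per sievert figure), and the 100 mSv threshold is a common talking point, not an established cutoff:

```python
def lnt_risk(dose_msv, coeff=5.5e-5):
    """Linear no-threshold: excess risk proportional to dose,
    all the way down to zero -- even background counts."""
    return coeff * dose_msv

def threshold_risk(dose_msv, threshold_msv=100.0, coeff=5.5e-5):
    """Threshold model: no excess risk attributed below the cutoff."""
    return coeff * max(0.0, dose_msv - threshold_msv)

# Under LNT, ordinary background exposure (~3 mSv/yr in much of the
# world, higher in Colorado or Brazil) carries a nonzero computed
# "risk"; a threshold model attributes none below the cutoff.
print(lnt_risk(3.0), threshold_risk(3.0))
```

That difference is the whole argument: quoting LNT numbers for Fukushima while dismissing the same arithmetic for potato chips and Denver is having it both ways.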
Of course, if you’ve been actually following the events from three years back, this report will contain few surprises.