The Sierra Club campaign against coal is motivated by a desire to reduce CO2 emissions to prevent global warming. But since global warming skepticism and global warming fatigue are widespread, the club has opted for a junk science approach to reach its goals. The club tells people that their babies will die, or at least get asthma, if coal plants continue to operate. Although the cause of asthma is not known, it is suspected to be related to the high levels of cleanliness in advanced countries, which deny children and their immune systems exposure to the dirt and filth found in primitive places. This is known as the hygiene hypothesis. The incidence of asthma is about 50 times higher in developed countries than in rural Africa. For all the Sierra Club knows, coal plants may prevent asthma. Given the hygiene hypothesis, that seems plausible.
With junk science, it is easy to scare people. There are many things that are bad for us that are present at low levels in the environment — for example, mercury, lead, radiation, or tobacco smoke. The junk science approach to trace toxins is to claim that if a high level of the bad thing would cause X people to get sick, then a level 10,000 times smaller must cause 1/10,000 as many people to get sick. Given 300 million people in the country, this math can give you thousands of people getting sick from low levels of mercury, lead, radiation, or secondhand tobacco smoke. This approach is known as the linear no threshold hypothesis.
The Sierra Club and its ally, the Environmental Protection Agency, lean on the small emissions of mercury from burning coal to work up a calculation of deaths from coal. They minimize the fact that much of the mercury falling on the U.S. comes from China, volcanoes, or even from burning dead bodies with mercury-based fillings in their teeth. Mercury pollution becomes an excuse to get rid of coal. Arguing the science behind such claims often degenerates into a paper chase about statistics and what studies are good or bad. From the bureaucratic point of view, the linear no threshold hypothesis is wonderful because it means that problems are never solved and there is always a need for more bureaucratic activity.
Norman Rogers, “Sierra Club at the Metropolitan Club”, American Thinker, 2011-11-11
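The "linear no threshold" arithmetic Rogers describes is simple enough to sketch in a few lines of Python. The numbers below are hypothetical, chosen only to reproduce the scaling in the quote, not drawn from any actual risk assessment:

```python
# Sketch of the linear no-threshold (LNT) extrapolation described above.
# All inputs are illustrative, not real toxicology data.

def lnt_cases(high_dose_rate: float, dose_ratio: float, population: int) -> float:
    """Cases predicted by scaling a high-dose illness rate linearly down to a low dose.

    high_dose_rate: fraction of exposed people made sick at the high reference dose
    dose_ratio: low dose divided by high dose (e.g. 1/10_000)
    population: number of people exposed at the low dose
    """
    return high_dose_rate * dose_ratio * population

# Hypothetical: a dose that sickens 10% of those exposed, scaled down
# 10,000-fold across a population of 300 million.
predicted = lnt_cases(high_dose_rate=0.10, dose_ratio=1 / 10_000,
                      population=300_000_000)
print(predicted)  # ~3000 -- "thousands of people getting sick"
```

The point of the sketch is that under a strictly linear model, any nonzero dose multiplied across a large enough population yields a nonzero body count, which is exactly the property the quote objects to.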
November 12, 2011
QotD: The uses of junk science
November 7, 2011
“It is a sad state of affairs when an amazing feat of engineering is only seen in bleak environmental and misanthropic terms”
Nick Thorne wonders why the soon-to-be-completed road link between Peru and Brazil, the Interoceánica, has gotten such poor press in the west:
In July this year, an amazing feat of engineering and testament to the progress of Latin America went almost completely unreported in the Western media. With the opening of the Puente Billinghurst, a half-mile suspension bridge across the Madre de Dios river, the interoceanic highway or Interoceánica neared completion. Soon, a long-held dream will finally come true: for the first time, a road will stretch all the way from the Pacific to the Atlantic, crossing the whole of South America.
The road starts in the Peruvian capital, Lima. It crosses the Andes, reaching a highest point of 4,850 metres (higher than Mont Blanc), then plunges into the rainforest, crosses several tributaries of the mighty Amazon, and after a total of 3,400 miles it reaches the Atlantic coast of Brazil. Much of the route has been in place for decades, but a 460-mile middle section was still missing. With the opening of the bridge, the road’s completion is in sight.
Comparisons have rightly been made between the Interoceánica and the first North American transcontinental railroad completed in 1869. In Brazil, the highway has already been dubbed ‘the road to China’. In 2009, China overtook the US as Brazil’s largest trading partner. Chinese trade will be able to use the Peruvian ports on the Pacific coast, cutting out a long detour via Cape Horn or the Panama Canal. While Brazil will be the main beneficiary, Peru will benefit as a middle-man. The think tank Bank Information Centre estimates that the highway will lead to a 1.5 per cent annual increase in GDP in Peru. The highway will facilitate greater regional integration and is a real symbol of Latin America’s economic awakening. A triumphant banner along the highway reads ‘once a promise, now a reality’. In 2006, a mere 3,500 people crossed the border from Peru to Brazil. By 2009, with the partial completion of the highway, this had already increased ten-fold to 35,000.
November 6, 2011
The “shale gale” blows away Canada’s illusions of being an “energy superpower”
Terence Corcoran pours cold water on the notion that this is the moment for Canada to become a major player in the world energy markets:
In recent weeks, Canada — a self-proclaimed global energy superpower — has been trying to throw its weight around over the Keystone XL pipeline, TransCanada Corp.’s $7-billion project to ship oil sands production from Alberta to Texas. In Houston on Tuesday, Natural Resources Minister Joe Oliver let the Americans know that Canada had other options. “What will happen if there wasn’t approval [of Keystone] — and we think there will be — is that we’ll simply have to intensify our efforts to sell the oil elsewhere.”
Canadian oil executives, who have a lot invested in the superpower notion, are also issuing aggressive-sounding statements aimed at the United States. A headline in The Globe and Mail Friday sounded like a threat: “Oil patch to U.S.: OK pipe or lose our oil.” The story didn’t quite back up the headline, but the sense was that Canada was developing alternatives and that China is the big alternative.
[. . .]
While Canadian government and industry officials have a lot invested in the idea of energy superpowerdom, few outside observers share the vision. Canada barely rates a mention in The Quest: Energy, Security and the Remaking of the Modern World, Daniel Yergin’s new book on the world energy market. A few pages are devoted to the oil sands, mostly to review the high costs and technical difficulties. “As the industry grows in scale, it will require wider collaboration on the R&D challenges, not only among oil companies and the province of Alberta, but also with Canada’s federal government.”
Far more impressive for the world’s energy future will be the impact of shale gas and shale oil. The “shale gale,” as Mr. Yergin calls it, has already transformed the U.S. gas market and shale oil could be next. Since Mr. Yergin’s book was written, the shale revolution has swept Europe and is about to transform China’s energy market.
November 3, 2011
October 31, 2011
QotD: Economics is not a “hard science”
The problem at base is that economics is not a branch of mathematics or statistics, no matter how much economists wish it was. Never forget that the economics equations you see, the pretty graphs and charts, are just educated guesses that are wrong more often than not — economists love the gloss of the hard sciences, but the truth is that the field is firmly placed among the philosophical and sociological disciplines. Economics is a study of human behavior more than anything else, with all the uncertainties and confusion that entails.
“Monty”, “DOOM: I like that Doom Doom Pow”, Ace of Spades H.Q., 2011-10-31
China’s increased output of scientific papers masks deeper problems
Colby Cosh linked to an interesting press release from the Chinese Academy of Sciences, which shows a surge in published papers from China, but a significant drop in the rate at which those papers are cited:
Chinese researchers published more than 1.2 million papers from 2006 to 2010 — second only to the United States but well ahead of Britain, Germany and Japan, according to data recently published by Elsevier, a leading international scientific publisher and data provider. This figure represents a 14 percent increase over the period from 2005 to 2009.
The number of published academic papers in science and technology is often seen as a gauge of national scientific prowess.
But these impressive numbers mask an uncomfortable fact: most of these papers are of low quality or have little impact. Citation per article (CPA) measures the quality and impact of papers. China’s CPA is 1.47, the lowest figure among the top 20 publishing countries, according to Elsevier’s Scopus citation database.
China’s CPA dropped from 1.72 for the period from 2005 to 2009, and is now below emerging countries such as India and Brazil. Among papers lead-authored by Chinese researchers, most citations were by domestic peers and, in many cases, were self-citations.
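A quick back-of-the-envelope check of the figures just quoted, in Python. The citation totals are back-calculated from the published paper count and CPA values; Elsevier reports the CPA, not the totals:

```python
# Citation per article (CPA) is simply total citations divided by total papers.
# Paper count and CPA values are taken from the quoted Elsevier/Scopus figures.

papers_2006_2010 = 1_200_000
cpa_2006_2010 = 1.47
cpa_2005_2009 = 1.72

# Implied total citations for the 2006-2010 window (back-calculated).
implied_citations = papers_2006_2010 * cpa_2006_2010
print(f"Implied citations, 2006-2010: {implied_citations:,.0f}")

# Relative decline in CPA between the two five-year windows.
drop = (cpa_2005_2009 - cpa_2006_2010) / cpa_2005_2009
print(f"CPA fell {drop:.0%} between the windows")
```

So even as output grew 14 percent, the average paper was being cited about 15 percent less often, which is what the press release means by volume masking impact.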
Being published is very important for sharing discoveries and advancing the careers of the scientists, but it's more important that those publications be read and referenced by other scientists. Self-citation is akin to self-publishing: it doesn't guarantee that the work is useless, but it increases the chances that it is.
Perhaps worse than merely useless publication is the culture of corruption that has grown up around the scientific community:
In China, the avid pursuit of publishing sometimes gives rise to scientific fraud. In the most high-profile case in recent years, two lecturers from central China’s Jinggangshan University were sacked in 2010 after a journal that published their work admitted 70 papers they wrote over two years had been falsified.
[. . .]
A study done by researchers at Wuhan University in 2010 says more than 100 million U.S. dollars changes hands in China every year for ghost-written academic papers. The market in buying and selling scientific papers has grown five-fold in the past three years.
The study says Chinese academics and students often buy and sell scientific papers to swell publication lists and many of the purported authors never write the papers they sign. Some master’s or doctoral students are making a living by churning out papers for others. Others mass-produce scientific papers in order to get monetary rewards from their institutions.
Shipwrecks: salvage or preserve?
An article at the BBC website looks at some of the issues involving shipwrecks in international waters:
When a ship sinks and lives are lost, it is a tragedy for the families involved.
For the relatives of the dead, the ship becomes an underwater grave but as the years pass the wreck can become a site of archaeological interest.
In recent years technological innovations have allowed commercial archaeologists, decried by some as “treasure hunters”, to reach wrecks far below the surface.
[. . .]
In November 2001, the Unesco Convention on the Protection of Underwater Cultural Heritage was finally adopted.
But 10 years on, it still has not been ratified by the UK, France, Russia, China or the US, and commercial archaeologists continue to locate wrecks, remove their cargoes and sell them off.
“The convention has not been ratified yet because of the issues it throws up about the cost of implementing and policing it,” a spokesman for the UK Department for Culture, Media and Sport, says. “Discussions continue within government, but ratification is not currently seen as a priority.”
It’s telling that the convention has not been ratified by five of the nations most likely to have both the technology and the interest to take on major underwater archaeological or salvage projects.
Robert Yorke, chairman of the Joint Nautical Archaeology Policy Committee, argues that the real reason the government, and the Ministry of Defence in particular, have not ratified the convention is a misplaced fear about the implications for British warships around the world.
The internationally recognised concept of “sovereign immunity” means nations should not interfere with foreign warships.
Under the Military Remains Act 1986, a number of British warships around the world are protected, including several ships sunk during the Falklands conflict. Also covered are several German U-boats in UK waters.
October 29, 2011
Canadian Air and Space Museum to be evicted in favour of ice rinks
A sad tale at the CBC website about the impending eviction of the museum and other tenants of the historic (but not historically designated) de Havilland plant in Downsview:
A building that played a major role in the production of aircraft for the Allies in their fight against Hitler during the Second World War is facing the wrecking ball.
It’s located in Toronto’s Downsview Park and is described in federal heritage documents simply as “CFB Plant .1, Building .1.”
Just one month after the federal government celebrated Canada’s aviation history by reintroducing the name, “Royal Canadian Air Force,” it was sending an eviction notice to a building where RCAF planes were assembled.
Built in 1929, the plant housed the operations of the de Havilland Aircraft company which provided 17 per cent of Canada’s planes during the war years.
[. . .]
David Soknacki, the chairman of Parc Downsview Park, says the building at 65 Carl Hall Road is not currently classified as a heritage building.
Up until Oct. 26, the Canada's Historic Places website listed the facility as "a recognized Federal Heritage Building because of its historical associations and its architectural and environmental value."
Then the listing disappeared.
H/T to Michael O’Connor Clarke for the link.
October 26, 2011
Mis-perception of relative risks
Gregg Easterbrook provides a good example of how difficult people often find it to discern the relative weight of risks:
The first consideration is that both absolute numbers of football deaths and rates of death compared to participants are in long-term decline — mirroring the decline in many forms of risk in society. Age-adjusted rates of all deaths in the United States have declined for 10 consecutive years. Auto fatalities have been declining for more than a generation. Winning the War on War, an important new book by Joshua Goldstein [. . .] shows that despite the impression created by cable news, exposure to violence is in decline both in the United States and worldwide.
[. . .]
Data from the National Center for Catastrophic Sport Injury Research reflects a steady decline in deaths caused by football. Table 1 of the center’s most recent report shows that in the past decade, 34 high school, three pro and two college football players have died as the direct result of games or practices, with the primary cause of deaths being heat stroke. That is entirely awful — but much lower than the rate of a generation ago. In 1968 alone, 26 high school players died as a direct result of football; last year, the number was two. Table 3 of the report shows the direct fatality rate from high school football peaked at 2.6 deaths per 100,000 players in 1969 and declined steadily to 0.13 deaths per 100,000 in 2010. That means a 1968 high school football player was 20 times more likely to die than a 2010 player. (The main reason for declining deaths was that football helmets were improved to eliminate skull fractures.)
[. . .]
How to compare the slight risk of a terrible football outcome to other common risks experienced by the young? Consider the risk of being in a car. About 3,000 teens die each year in car crashes. There are about 21.3 million Americans between 15 and 19 years of age. Teens average about 146 miles driven per week, roughly 150 hours per year of driving. These figures yield a roughly one in 1 million chance that a teen will die in an hour of driving. The National Federation of State High School Associations reports that 1.1 million boys (and a few girls) played high school football last academic year. A typical high school football season would include, in games and practice, perhaps 75 hours of exposure to contact. That’s about 80 million total hours of exposure to contact on the part of high school football players. The National Center for Catastrophic Sport Injury Research reports a recent average of three deaths per year directly caused by high school football. That’s a roughly one in 27 million chance of a high school player dying from an hour of football contact.
These are all rough estimates. Taking them together, a teenager has a one in 1 million chance of dying in an hour behind the wheel, compared to a one in 27 million chance of dying in an hour of football contact. Being in pads on a football field is less deadly than driving to high school for class. Many contemporary parents, especially moms, might say, “I don’t want you playing football because it’s so dangerous, but it’s fine for you to drive to the mall.” As regards mortality, this misperceives the risks.
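Easterbrook's comparison can be reproduced directly from the figures quoted above. The inputs below are his rough estimates, not fresh data, so the outputs inherit all of their imprecision:

```python
# Per-hour mortality risk, using the estimates from the quoted passage.

def risk_per_hour(deaths_per_year: float, people: float, hours_each: float) -> float:
    """Chance of death per person-hour of exposure."""
    return deaths_per_year / (people * hours_each)

# Teen driving: ~3,000 deaths/year, ~21.3 million teens, ~150 hours/year each.
driving = risk_per_hour(deaths_per_year=3_000,
                        people=21_300_000, hours_each=150)

# High school football: ~3 direct deaths/year, ~1.1 million players,
# ~75 hours of contact exposure per season each.
football = risk_per_hour(deaths_per_year=3,
                         people=1_100_000, hours_each=75)

print(f"Driving:  1 in {1 / driving:,.0f} per hour")   # ~1 in 1.1 million
print(f"Football: 1 in {1 / football:,.0f} per hour")  # ~1 in 27.5 million
print(f"Driving is roughly {driving / football:.0f}x riskier per hour")
```

Dividing one rate by the other shows an hour behind the wheel is about 26 times deadlier than an hour of football contact, which is the nub of Easterbrook's point about misperceived risk.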
October 24, 2011
October 23, 2011
DARPA’s new project: space vampires
Yep. DARPA is hoping to release “swarming robot space vampires”* in geosynchronous orbit:
More than $300 billion worth of satellites are estimated to be in the geosynchronous orbit (GEO—22,000 miles above the earth). Many of these satellites have been retired due to normal end of useful life, obsolescence or failure; yet many still have valuable components, such as antennas, that could last much longer than the life of the satellite. When satellites in GEO “retire,” they are put into a GEO disposal or “graveyard” orbit. That graveyard potentially holds tens to more than a hundred retired satellites that have components that could be repurposed — with the willing knowledge and sanction of the satellite’s owner. Today, DoD deploys new, replacement satellites at high cost — one of the primary drivers of the high cost is the launch costs, which is dependent on the weight and volume of antennas. The repurposing of existing, retired antennas from the graveyard represents a potential for significant cost savings.
DARPA’s Phoenix program seeks to develop technologies to cooperatively harvest and re-use valuable components from retired, nonworking satellites in GEO and demonstrate the ability to create new space systems at greatly reduced cost. “If this program is successful, space debris becomes space resource,” said DARPA Director, Regina E. Dugan.
[. . .]
“Satellites in GEO are not designed to be disassembled or repaired, so it’s not a matter of simply removing some nuts and bolts,” said David Barnhart, DARPA program manager. “This requires new remote imaging and robotics technology and special tools to grip, cut, and modify complex systems, since existing joints are usually molded or welded. Another challenge is developing new remote operating procedures to hold two parts together so a third robotic ‘hand’ can join them with a third part, such as a fastener, all in zero gravity. For a person operating such robotics, the complexity is similar to trying to assemble via remote control multiple Legos at the same time while looking through a telescope.”
* “Swarming robot space vampires”, courtesy of jwz.org.
October 22, 2011
IPCC authors: “They are people who are at the top of their profession”
Whether you’re a global warming/climate change skeptic or not, Donna Laframboise has a book that might be of interest to you:
The people who write the IPCC’s report — which is informally known as the Climate Bible — are supposedly the crème de la crème of world science. Rajendra Pachauri, the person who has been the IPCC’s chairman since 2002, tells us this repeatedly. In 2007 he explained to a newspaper how his organization selects individuals to help write the Climate Bible: “These are people who have been chosen on the basis of their track record, on their record of publications, on the research that they have done,” he said. “They are people who are at the top of their profession.”
Two years later, when testifying before a committee of the U.S. Senate, Pachauri argued that “all rational persons” should be persuaded by the IPCC’s conclusions since his organization mobilizes “the best talent available across the world.”
[. . .]
A close look at the IPCC’s roster of authors reveals that — on a wide range of topics including hurricanes, sea-level rise, and malaria — some of the world’s most seasoned specialists have been left out in the cold. In their stead, the IPCC has been recruiting 20-something graduate students.
For example, Laurens Bouwer is currently employed by an environmental studies institute at the VU University Amsterdam. In 1999-2000, he served as an IPCC lead author before earning his master's degree in 2001.
How can a young man without even a master’s degree become an IPCC lead author? Bouwer’s expertise is in climate change and water resources. Yet the chapter for which he first served as a lead author was titled Insurance and Other Financial Services.
It turns out that, during part of 2000, Bouwer was a trainee at Munich Reinsurance Company. This means the IPCC chose as a lead author someone who was a trainee, who lacked a master’s degree, and was still a full decade away from receiving his 2010 PhD.
October 21, 2011
Neuroscientists and neurononsense
Stuart Derbyshire and Nina Powell review Cordelia Fine’s Delusions of Gender:
Given that objective measures show gender differences are in decline, it is surprising that there has been such an increase in books and reports describing hard-wired differences in male and female brains and that so many people are using them to explain why men and women live different lives. The most famous British example is Simon Baron-Cohen, who has extrapolated from his research on autism (a predominantly male disorder) the more general conclusion that the female brain is predominantly hard-wired for empathy while the male brain is predominantly hard-wired for understanding and building systems.
Cordelia Fine's Delusions of Gender brilliantly demolishes these overly simplistic and, essentially, wrong conclusions about male and female brains. She does this in two different ways. First, she points out that supposedly fixed differences between men and women are quite plastic. For example, merely asking men to consider the social value and benefits of empathising will lead them to be more empathic. And when men are paid to detect and correctly identify emotional states they perform as well as women. Similarly, when women are told that women perform better on a spatial rotation task their performance matches that of men. It appears that male and female differences in task performance can be fairly easily overcome by changing the motivation to do well or by changing the way the task is framed. That doesn't sound like something hard-wired or fixed in the brain.
[. . .]
Fine also points to a problem that is, perhaps, more important. The brain is a complicated organ that we barely understand in anything but the most basic detail. Furthermore, brain imaging is a technology that is in its infancy and the data generated by imaging is also highly complicated. A typical brain imaging study will generate a matrix involving hundreds of thousands of numbers replicated across time and people. Analysis of these kinds of data sets is difficult, tedious and complicated, often requiring many years of experience and containing a surprising element of subjectivity and argument about what is the right and wrong thing to do. It is perhaps understandable that brain imaging throws up contradictory results and that brain researchers reach contradictory conclusions. Fine notes that this can lead to theories about brain function being untouched by the collection of brain activation data:
‘As the contradictory data come in, researchers can draw on both the hypothesis that men are better at mental rotation because they use just one hemisphere, as well as the completely contrary hypothesis that men are better at mental rotation because they use both hemispheres. So flexible is the theoretical arrangement that researchers can even present these opposing hypotheses, quite without embarrassment, within the very same article.’
It is a strange science where exactly opposite data support the same interpretation. Fine’s conclusion is scathing. She suggests that neuroscientists are merely projecting cultural assumptions about the sexes on to the vast unknown that is the brain. This process she dismisses as ‘neurosexism’, which is part of a larger discipline called ‘neurononsense’. It is hard to argue.
October 20, 2011
Polls indicate 50% of Americans now support legalizing marijuana
Cue all the “what are they smoking?” jokes:
Once in office, Jimmy Carter didn’t abandon his temperate approach to cannabis. He proposed that the federal government stop treating possession of small amounts as a crime, making a sensible but novel argument: “Penalties against possession of a drug should not be more damaging to an individual than the use of the drug itself.”
Nothing came of it, of course. Carter’s logic was unassailable even 35 years ago, but it has yet to be translated into federal policy. The American experience with prohibition of alcohol proved that we are capable of learning from our mistakes. The experience with prohibition of marijuana proves that we are also capable of doing just the opposite.
The stupidity and futility of the federal war on weed, however, has slowly permeated the mass consciousness. This week, the Gallup organization reported that fully 50 percent of Americans now think marijuana should be made legal. This is the first time since Gallup began asking in 1969 that more Americans support legalization than oppose it.
[. . .]
Over the past 30 years, federal spending to fight drugs has risen seven times over, after inflation. Since 1991, arrests for possession of pot have nearly tripled. But all for naught.
As a report last year by the International Centre for Science in Drug Policy noted, more high school students and young adults get high today than 20 years ago. More than 16 million Americans smoke dope at least once a month. Pot is just as available to kids as it ever was, and cheaper than before.
If we had gotten results like this after reducing enforcement, the new policy would be blamed. But politicians who support the drug war never consider that their remedies may be aggravating the disease. They follow the customary formula for government programs: If it works, spend more on it, and if it fails, spend more on it.
October 19, 2011
Regulators back off (temporarily)
Walter Olson reports on some recent regulatory retreats:
Are there enough data points yet to call it a trend? I think there are: the Environmental Protection Agency is now backing off a whole series of deeply unpopular Obama-era initiatives. This time it’s the idea of tightening the federal standard for coarse airborne particulates — better known as “dust” — from the current 150 micrograms per cubic meter to a figure somewhere between 65 and 85, depending on what assumptions are used. That change could have dealt a tough economic blow to businesses, notably farms and ranches, that kick up quantities of dirt in the ordinary course of operation. Unfortunately, the EPA — unable to resist the urge to lash out against its critics — is being less than candid about its latest turnabout.
The retreats have been coming steadily in recent months, since President Obama’s popularity ratings began to tank. In July, following protests from Sen. Olympia Snowe (R-ME) and other lawmakers, the administration dropped a proposal that would have required lead-dust lab testing as part of even relatively minor renovations to older homes. Last month it scuttled costly new smog regulations. A couple of weeks ago it relaxed its so-called Cross-State Air Pollution Rule, which was menacing the continued operation of power plants. And it remains under heavy pressure to scrap its ultra-expensive “Boiler MACT” rules, another utility nemesis.
EPA administrator Lisa Jackson has made it clear that she isn’t happy about some of these about-faces, and her staff spun the latest dust decision in remarkably graceless fashion, accusing critics of spreading “myths” and claiming the agency never had any intention of going after farm dust in the first place.