Quotulatiousness

November 13, 2013

The environmental damage from “green” ethanol production

Filed under: Economics, Environment, Government, USA — Tags: , , , , — Nicholas @ 09:46

Ethanol was supposed to be an environmentally friendly substitute for gasoline, and it was renewable … but it’s not living up to promises:

With the Iowa political caucuses on the horizon in 2007, presidential candidate Barack Obama made homegrown corn a centerpiece of his plan to slow global warming. And when President George W. Bush signed a law that year requiring oil companies to add billions of gallons of ethanol to their gasoline each year, Bush predicted it would make the country “stronger, cleaner and more secure.”

But the ethanol era has proven far more damaging to the environment than politicians promised and much worse than the government admits today.

As farmers rushed to find new places to plant corn, they wiped out millions of acres of conservation land, destroyed habitat and polluted water supplies, an Associated Press investigation found.

Five million acres of land set aside for conservation — more than Yellowstone, Everglades and Yosemite National Parks combined — have vanished on Obama’s watch.

Landowners filled in wetlands. They plowed into pristine prairies, releasing carbon dioxide that had been locked in the soil.

Sprayers pumped out billions of pounds of fertilizer, some of which seeped into drinking water, contaminated rivers and worsened the huge dead zone in the Gulf of Mexico where marine life can’t survive.

The consequences are so severe that environmentalists and many scientists have now rejected corn-based ethanol as bad environmental policy. But the Obama administration stands by it, highlighting its benefits to the farming industry rather than any negative impact.

October 30, 2013

Human Progress

Filed under: Economics, Education, Environment, Food, Health, History — Tags: , — Nicholas @ 09:57

At Reason, Marian Tupy introduces a new website celebrating Human Progress:

In a world where we are constantly bombarded with bad news, it can sometimes be difficult to think of “progress” and “humanity” in the same sentence. Are there not wars taking place, people going hungry, children at work, women being abused, and mass poverty around the world?

In fact, for most of human history, life was very difficult for most people. People lacked basic medicines and died relatively young. They had no painkillers and people with ailments spent much of their lives in agonizing pain. Entire families lived in bug-infested dwellings that offered neither comfort nor privacy. They worked in the fields from sunrise to sunset, yet hunger and famines were commonplace. Transportation was primitive and most people never traveled beyond their native villages or nearest towns. Ignorance and illiteracy were rife. The “good old days” were, by and large, very bad for the great majority of humankind.

Average global life expectancy at birth hovered around 30 years from the Upper Paleolithic to 1900. Even in the richest countries, like those of Western Europe, life expectancy at the start of the 20th century rarely exceeded 50 years. Incomes were quite stagnant, too. At the beginning of the Christian era, annual incomes per person around the world ranged from $1,073 to $1,431. As late as 1820, average global income was only $1,274 per person. (Angus Maddison, whose income estimates I use here, gives his data in 1990 dollars. I have adjusted Maddison’s figures for inflation.)

Humanity has made enormous progress — especially over the course of the last two centuries. For example, average life expectancy in the world today is 67.9 years. In 2010, global per capita income stood at $13,037 — over 10 times what it was two centuries ago.
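Tupy’s “over 10 times” figure checks out against the numbers he quotes. A quick sketch (the variable names and the compound-growth calculation are mine, not his):

```python
# Income figures quoted above, both in inflation-adjusted dollars per person.
income_1820 = 1274.0   # average global income, 1820
income_2010 = 13037.0  # global per capita income, 2010

ratio = income_2010 / income_1820      # how many times richer we are now
years = 2010 - 1820
cagr = ratio ** (1 / years) - 1        # implied compound annual growth rate

print(f"{ratio:.1f}x richer, roughly {cagr * 100:.2f}% per year")
# → 10.2x richer, roughly 1.23% per year
```

Two centuries of compounding at barely over one percent a year is all it takes to produce a ten-fold rise.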

The new website is called Human Progress:

It is perhaps best to start by explaining what the Human Progress website is not trying to accomplish. It will not try to convince you that the world is a perfect place. As long as there are people who go hungry or die from preventable diseases, there will always be room for improvement. To that end, we all have a role to play in helping the destitute in our communities and beyond.

Our goal, then, is not to paint a rosy picture of the state of humanity, but a realistic one. A realistic account of the world should focus on long-term trends, comparing living standards between two or more generations. Crucially, it should compare the imperfect present with a much more imperfect past, rather than with an imagined utopia in the future.

As such, this website has two main aims. First is to inform you about the many ways in which the world has become a better place. Second is to allow you to search for reasons that brought that improvement about. While we think that policies and institutions compatible with freedom and openness are important factors in promoting human progress, we let the evidence speak for itself and hope the website stimulates an intelligent debate on the drivers of human progress.

October 29, 2013

Even selling the USS Forrestal for one cent was a win for the US Navy

Filed under: Economics, Environment, Military, USA — Tags: , , , , — Nicholas @ 07:21

Several people have commented on the headlines proclaiming that the very first supercarrier had been sold for the princely sum of one cent, but Strategy Page explains why even that token penny was better than all the other options:

The U.S. Navy recently sold a decommissioned (in 1993) aircraft carrier (USS Forrestal) for scrap. The ship yard that will take the Forrestal apart (All Star Metals of Texas) paid the navy one cent ($.01) for the ship, because this was the best deal the navy could get: it will cost many millions to take the ship apart in a legal fashion (being careful to avoid releasing any real or imagined harmful substances into the environment). The other alternative was to sink the Forrestal at sea, but that requires partial disassembly (to remove anything that could or might pollute the ocean), which would be even more expensive.

[…]

Since the 1990s, sending warships to the scrap yard has not been considered a viable alternative. It’s all about pollution, bad press, and cost. That was because of the experience with the largest warship to be scrapped to date, the 45,000 ton carrier USS Coral Sea. This ship took until 2000 (seven years) to be broken up. Thus, the new ecologically correct process was not only expensive but it took a long time. Then the navy discovered that the cost of scrapping a nuclear powered carrier like the USS Enterprise would be close to a billion dollars. This was largely the result of a lot more environmental and safety regulations. With so many navy ships (especially nuclear subs) being broken up in the 1990s, and all these new regulations arriving, the cost of disposing of these ships skyrocketed. This was especially true with carriers.

So for over a decade the navy just tied up retired ships and waited for some better solution to appear. That never happened. In fact, the situation has gotten worse. The navy only has one ship scrapping facility (Brownsville, Texas), so only one carrier at a time can be dismantled. Using official estimates of the time required to dismantle each of the biggest ships, it’ll take seven decades to get rid of the surviving conventionally powered carriers. Note also that the conventional carrier in the absolute worst shape, the USS John F Kennedy, is the one being officially retained in category B reserve (but only until Congress forgets all about her, of course). Name recognition really does count.

It gets worse. With the really vast number of single hull tankers being scrapped and large numbers of old, smaller-capacity container ships laid up and likely to be offered for scrap fairly soon, the market for difficult-to-scrap naval ships is going to shrivel and the price for scrap steel will drop. Efforts to get the navy to include the costs of disposal in the budget for lifetime costs have never caught on, and now it’s obvious why not. The real nightmare begins with the first nuclear powered carrier (the 93,000 ton USS Enterprise), which began the decommissioning process in late 2012 (with the lengthy removal of all classified or reusable equipment). The cost of dismantling this ship (and disposing of radioactive components) may be close to $2 billion.

October 5, 2013

Climate models, trust, and spin

Filed under: Environment, Media — Tags: , , — Nicholas @ 08:58

In Reason, Ronald Bailey asks whether we can trust the IPCC’s climate models:

On Monday, the U.N.’s Intergovernmental Panel on Climate Change (IPCC) released the final draft of Climate Change 2013: The Physical Science Basis. The report’s Summary for Policymakers flatly states: “Warming of the climate system is unequivocal, and since the 1950s, many of the observed changes are unprecedented over decades to millennia. The atmosphere and ocean have warmed, the amounts of snow and ice have diminished, sea level has risen, and the concentrations of greenhouse gases have increased.” Pretty much everyone concerned with this issue agrees that those are the facts. But what is causing the planet to warm up? Here is where it gets interesting.

[…]

The IPCC report acknowledges that almost all of the “historical simulations do not reproduce the observed recent warming hiatus.” Not to worry, it assures us; 15-year pauses just happen, and you can’t really expect the models to simulate these kinds of random natural fluctuations in the climate. Once this little slow-down passes, “It is more likely than not that internal climate variability in the near-term will enhance and not counteract the surface warming expected to arise from the increasing anthropogenic forcing.” In other words, when the warm-up resumes it will soar.

John Christy, a climatologist at the University of Alabama in Huntsville, has come to a different conclusion. Christy compared the outputs of 73 climate models for the tropical troposphere used by the IPCC in its latest report with satellite and weather balloon temperature trends since 1979. “The tropics is so important because that is where models show the clearest and most distinct signal of greenhouse warming — so that is where the comparison should be made (rather than say for temperatures in North Dakota),” Christy explains in an email. “Plus, the key cloud and water vapor feedback processes occur in the tropics.” When it comes to simulating the atmospheric temperature trends of the past 35 years, Christy found, all of the IPCC models are running hotter than the actual climate.

[…]

Average of model results compared with temperature trends

To defend himself against any accusations of cherry-picking his data, Christy notes that his “comparisons start in 1979, so these are 35-year time series comparisons” — rather longer than the 15-year periods whose importance the IPCC disputes.

Why the discrepancy between the IPCC and Christy results? As Georgia Tech climatologist Judith Curry notes, data don’t speak for themselves; researchers have to put them into a context. And your choice of context — say, the year you choose to begin with — can influence your conclusions considerably. While there may be nothing technically wrong with the way the IPCC chose to display the comparison between model data and observation data, Curry observes, “it will mislead the public to infer that climate models are better than we thought.” She adds, “What is wrong is the failure of the IPCC to note the failure of nearly all climate model simulations to reproduce a pause of 15+ years.”

October 1, 2013

No mistakes were made, no problems uncovered, but 19 firefighters died

Filed under: Bureaucracy, Environment, USA — Tags: , , , — Nicholas @ 09:57

The official report on the Yarnell Hill fire which claimed the lives of 19 firefighters has managed to find no issues whatsoever with the incident. Apparently no mistakes were made by any of the firefighters or their leadership, and there are no lessons to be learned from this tragedy.

Nothing went wrong in the Yarnell Hill Fire, which killed 19 wildland firefighters in June.

This according to the “Serious Accident Investigation Report” into the fire, released this weekend by federal, state, and local firefighting officials in Prescott.

“The Team found no indication of negligence, reckless actions, or violations of policy or protocol,” the report states.

It certainly seems that something must have gone wrong when 19 men, most of them young men, are dead.

In fact, certain fire officials who now say everything went according to protocol had been among those assessing blame and pointing out mistakes leading up to the deaths of the Granite Mountain Hotshots.

Arizona Deputy State Forester Jerry Payne previously said it looked like Eric Marsh, superintendent of the hotshot crew, had violated basic wildfire-safety rules, although Payne added that many decisions made by those leading wildfire-fighting crews are calculated risks, rather than strictly rule-book decisions.

Prescott Wildland Division Chief Darrell Willis suggested in an interview with ABC News that the crew “could have made it” had the U.S. Forest Service delivered all the air-tankers that were requested for the Yarnell Hill Fire.

Neither of these findings was included in the report, despite Payne’s and Willis’s presence among the fire officials presenting investigators’ conclusions at Prescott High School on Saturday.

Not everyone is convinced, however:

Here is my analysis of what is going on with this report: Substantial mistakes were made by both the fire team and by their leaders. Their leaders wrote the report, and certainly were not going to incriminate themselves, particularly given that they likely face years of litigation. They could have perhaps outlined the mistakes the team made, but the families and supporters of the dead men would have raised a howl if the dead firefighters were blamed for mistakes while the leadership let themselves off the hook, and surely would have pushed back on the culpability of the firefighting effort’s management.

So this report represents an implicit deal being offered to the families — we will let your dead rest in peace by not highlighting the mistakes they made if you will lay off of us and the mistakes we made. We will just blame it on God (I kid you not, see Prescott chief’s statements here). Most Arizonans I know seem willing to have these folks die as heroes who succumbed to the inherent risks of the profession, rather than stupid errors, so we may never have an honest assessment of what happened. And yet again the opportunity to do a major housecleaning of wildland firefighting is missed.

September 27, 2013

Harper and climate change … spending

David Akin points out that all the major federal parties believe the same thing about climate change, except that the Tories are the ones who’ve been chucking around the money on climate change programs:

The simple fact of Canadian politics here is that, if you do not believe in climate change, there is no federal political party that shares your view. There almost was one in Alberta in its last provincial election but, boy, did that idea get shouted down.

But back to what [former environment minister Peter] Kent said to me in that interview:

“There is no question that since the Industrial Revolution there have been anthropogenic, man-made effects on our global climate. The argument continues in the scientific community how much is evolution and how much is man-made but there is certainly something we can do.”

So what is the something that the Harper government has been doing? Well, truth be told, the Harper Conservatives, like the Martin and Chretien Liberals before them, have not been doing very much. None of them, in fact, got the job done. Which might, come to think of it, be a good reason — if climate change is the only thing you’re voting on — to consider choosing the NDP or the Greens next time around. Not to say they’d actually get it done but it’s pretty clear the other two parties, while they talk a good game, just don’t have the political stomach for the job. Those New Democrats brought us universal health care. Maybe they can fix the environment, too.

Still, that doesn’t mean Conservatives aren’t prepared to spend hundreds of millions of dollars — billions even — on a problem they are accused of not admitting even exists. Take biofuels, for example. Early on, the Harper government got the idea that if corn- or plant-based ethanol displaced enough fossil fuels, we’d easily roll back greenhouse gas emissions. Apparently no one bothered to point out that there is serious doubt that corn-based ethanol is actually a lower-emission alternative to fossil fuels but why complicate things? Ethanol is a good, solid, job-creating green story!

In the long run, the subsidies and outright gifts of government money to green-ish sounding companies will likely be the only reminders of the great global warming panic of the last decade. Certainly little or no actual environmental improvements will be traced to the billions of dollars doled out to cronies under this government.

September 22, 2013

Statistical fail for political axe-grinding

Filed under: Environment, Media, Politics, USA — Tags: , , — Nicholas @ 11:29

Coyote Blog views with alarm a recent article in Rolling Stone which abuses statistics to make a point that apparently isn’t true:

What I want to delve into is the claim by the author that wildfires are increasing due to global warming, and only evil Republicans (who suck) could possibly deny this obvious trend […]

These are the 8 statements I can find to support an upward trend in fires. And you will note, I hope, that none of them include the most obvious data — what has the actual trend been in number of US wildfires and acres burned. Each of these is either a statement of opinion or a data point related to fire severity in a particular year, but none actually address the point at hand: are we getting more and larger fires?

Maybe the data does not exist. But in fact it does, and I will say there is absolutely no way, no way, the author has not seen the data. The reason it is not in this article is that it does not fit the “reporter’s” point of view, so it is left out. Here is where the US government tracks fires by year, at the National Interagency Fire Center. To save you clicking through, here is the data as of this moment:

Wildfire averages 2004-2013

Well, what do you know? The number of fires and the acres burned in 2013 are not some sort of record high — in fact they are, respectively, the lowest and second-lowest numbers of the last 10 years. Indeed, both the number of fires and the total acres burned are running a third below average.

The one thing this does not address is the size of fires. The author implies that there are more fires burning more acres, which we see is clearly wrong, but perhaps the fires are getting larger? Well, 2012 was indeed an outlier year in that fires were larger than average, but 2013 has returned to the longer-term trend, which has actually been flat to down — again, exactly the opposite of the author’s contention (the data below is just math from the chart above).

Wildfires average acres per fire 2004-2013

In the rest of the post, I will briefly walk through his 8 statements highlighted above and show why they exhibit many of the classic fallacies in trying to assert a trend where none exists. In the postscript, I will address one other inconsistency from the article as to the cause of these fires, which is a pretty hilarious example of how to turn any data to support your hypothesis, even if it is unrelated.

September 21, 2013

Why wind and solar power can’t meet our needs

Filed under: Economics, Environment, Technology — Tags: , , , — Nicholas @ 10:32

Robert Bryce explains why — no matter how much we might want it to be so — alternate forms of energy like wind and solar power cannot cover our demands:

That 32 percent increase in global carbon dioxide emissions reflects the central tension in any discussion about cutting the use of coal, oil and natural gas: Developing countries — in particular, fast-growing economies such as Vietnam, China and India — simply cannot continue to grow if they limit the use of hydrocarbons. Those countries’ refusal to enact carbon taxes or other restrictions illustrates what Roger Pielke Jr., a professor of environmental studies at the University of Colorado, calls the “iron law of climate policy”: Whenever policies “focused on economic growth confront policies focused on emissions reduction, it is economic growth that will win out every time.”

Over the past 10 years, despite great public concern, carbon dioxide emissions have soared because some 2.6 billion people still live in dire energy poverty. More than 1.3 billion have no access to electricity at all.

Now to the second number: 1. That’s the power density of wind in watts per square meter. Power density is a measure of the energy flow that can be harnessed from a given area, volume or mass. Six different analyses of wind (one of them is my own) have all arrived at that same measurement.

Wind energy’s paltry power density means that enormous tracts of land must be set aside to make it viable. And that has spawned a backlash from rural and suburban landowners who don’t want 500-foot wind turbines near their homes. To cite just one recent example, in late July, some 2,000 protesters marched against the installation of more than 1,000 wind turbines in Ireland’s Midlands Region.

Consider how much land it would take for wind energy to replace the power the U.S. now gets from coal. In 2011, the U.S. had more than 300 billion watts of coal-fired capacity. Replacing that with wind would require placing turbines over about 116,000 square miles, an area about the size of Italy. And because of the noise wind turbines make — a problem that has been experienced from Australia to Ontario — no one could live there.
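Bryce’s “about the size of Italy” figure follows directly from the two numbers he gives — 300 billion watts of coal capacity and wind’s 1 watt per square metre. A back-of-the-envelope sketch:

```python
# Land needed for wind to replace US coal capacity, using the figures above.
coal_capacity_w = 300e9    # US coal-fired capacity in watts (2011)
wind_density_w_m2 = 1.0    # wind's power density, watts per square metre

area_m2 = coal_capacity_w / wind_density_w_m2
M2_PER_SQ_MILE = 1609.344 ** 2       # ~2.59 million square metres per square mile
area_sq_miles = area_m2 / M2_PER_SQ_MILE

print(f"{area_sq_miles:,.0f} square miles")  # → 115,831 square miles
```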

[…]

In 2012, the contribution from all of those sources amounted to about 4.8 million barrels of oil equivalent per day, or roughly one-half of a Saudi Arabia. Put another way, we get about 50 times as much energy from all other sources — coal, oil, natural gas, nuclear and hydropower — as we do from wind, solar, geothermal and biomass.
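Those ratios are internally consistent: if renewables supply 4.8 million barrels of oil equivalent per day and conventional sources supply 50 times that, renewables work out to about 2% of the total. A quick check (my arithmetic, not Bryce’s):

```python
# Consistency check on the energy ratios quoted above.
renewables_mboe_day = 4.8    # wind, solar, geothermal, biomass combined (2012)
conventional_multiple = 50   # conventional sources supply ~50x as much

conventional_mboe_day = renewables_mboe_day * conventional_multiple
renewable_share = renewables_mboe_day / (renewables_mboe_day + conventional_mboe_day)

print(f"renewables supply about {renewable_share:.1%} of the total")
# → renewables supply about 2.0% of the total
```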

September 20, 2013

The IPCC’s new, more cautious tone

Filed under: Environment, Media, Science — Tags: , , , , — Nicholas @ 07:41

In The Spectator, a muted tone of “we told you so” about the upcoming IPCC report:

Next week, those who made dire predictions of ruinous climate change face their own inconvenient truth. The summary of the fifth assessment report by the Intergovernmental Panel on Climate Change (IPCC) will be published, showing that global temperatures are refusing to follow the path which was predicted for them by almost all climatic models. Since its first report in 1990, the IPCC has been predicting that global temperatures would be rising at an average of 0.2° Celsius per decade. Now, the IPCC acknowledges that there has been no statistically significant rise at all over the past 16 years.

It is difficult to over-emphasise the significance of this report. The IPCC is not simply a research body making reports and declarations which are merely absorbed into political debate. Its word has been taken as gospel, and its research has been used to justify all manner of schemes to make carbon-based energy more expensive while subsidising renewable energy.

The failure of its predictions undermines the certainties which have been placed upon the science of climate change. Previous IPCC reports — and much of the debate over how to react to them — have appeared to treat the Earth’s climate as if it were a domestic central heating system, with carbon emissions analogous to the dial on the thermostat: a small tweak here will result in a temperature rise of precisely 0.2°C and so on. What is clear from the new IPCC report is that the science is not nearly advanced enough to make useful predictions on the future rise of global temperatures. Perhaps it never will be.

Some climate scientists themselves, to give them credit, have admitted as much. Their papers now incorporate a degree of caution, as you would expect from genuine scientists. The problems arise when the non-scientists leap upon the climate change bandwagon and assume that anything marked ‘science’ must be the final word. As the chemist and novelist C.P. Snow once warned in his lecture about the ‘two cultures’, you end up in a situation where non-scientists use half-understood reports to silence debate — not realising that proper science welcomes refutation and is wary of the notion of absolute truths.
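The scale of the mismatch described in the excerpt above is simple arithmetic: at the 0.2°C-per-decade rate the IPCC has projected since 1990, a 16-year span should have produced about a third of a degree of warming:

```python
# Expected warming over the "pause" at the IPCC's projected rate.
rate_per_decade = 0.2   # °C per decade, projected since 1990
pause_years = 16        # years with no statistically significant rise

expected_warming = rate_per_decade * (pause_years / 10)
print(f"{expected_warming:.2f} °C expected over the pause")
# → 0.32 °C expected over the pause
```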

August 28, 2013

Hurricane and cyclone paths since 1842

Filed under: Americas, Environment, Pacific, Science — Tags: , , — Nicholas @ 08:59

Wired‘s MapLab has a lovely visualization of both hurricane and cyclone tracks starting with the earliest records in 1842:

Historical Hurricane and Cyclone paths since 1842

This map shows the paths of every hurricane and cyclone detected since 1842. Nearly 12,000 tropical cyclones have been tracked and recorded, and the National Oceanic and Atmospheric Administration keeps them all in a single database. Long-term datasets can be really interesting and scientifically valuable, and this one is undoubtedly both.

In the image above, you can clearly see that more storm tracks have overlapped in the western Pacific ocean and northern Indian ocean. This is largely because of the length of the typhoon season, which basically never stops in the warmer waters there.

The tracks of the earliest storms are based on mariners’ logs and storm records, collected from various countries, agencies and other sources. Reconciling data from these different entities was tough. Most international agencies had their own set of codes for cyclone intensity, and only recorded this information once per day. India was even using different wind thresholds to designate cyclone stages.

August 17, 2013

Fracking and the environment

Filed under: Environment, Media, Technology — Tags: , , , , — Nicholas @ 09:03

Matt Ridley debunks five common myths about environmental issues with fracking:

The movie Gasland showed a case of entirely natural gas contamination of water and the director knew it, but he still pretended it might have been caused by fracking. Ernest Moniz, the US Energy Secretary, said earlier this month: “I still have not seen any evidence of fracking per se contaminating groundwater.” Tens of thousands of wells drilled, two million fracking operations completed and not a single proven case of groundwater contamination. Not one. It may happen one day, of course, but there are few industries that can claim a pollution record that good.

Next comes the claim that shale gas production results in more methane release to the atmosphere and hence could be as bad for climate change as coal. (Methane is a more powerful greenhouse gas than carbon dioxide, but stays in the atmosphere for a shorter time and its concentration is not currently rising fast.) This claim originated with a Cornell biology professor with an axe to grind. Study after study has refuted it. As a team from Massachusetts Institute of Technology put it: “It is incorrect to suggest that shale gas-related hydraulic fracturing has substantially altered the overall [greenhouse gas] intensity of natural gas production.”

Third comes the claim that fracking uses too much water. The Guardian carried a report this week implying that a town in Texas is running dry because of the water used for fracking. Yet in Texas 1% of water use is for fracking, in the United States as a whole 0.3% — less than is used by golf courses. If parts of Texas run out of water, blame farming, by far the biggest user.

Fourth, the ever-so-neutral BBC in a background briefing this week described fracking as releasing “hundreds of chemicals” into the rock. Out by an order of magnitude, Auntie. Fracking fluid is 99.51% water and sand. In the remaining 0.49% there are just 13 chemicals, all of which can be found in your kitchen, garage or bathroom: citric acid (lemon juice), hydrochloric acid (swimming pools), glutaraldehyde (disinfectant), guar (ice cream), dimethylformamide (plastics), isopropanol (deodorant), borate (hand soap), ammonium persulphate (hair dye), potassium chloride (intravenous drips), sodium carbonate (detergent), ethylene glycol (de-icer), ammonium bisulphite (cosmetics), petroleum distillate (cosmetics).

As for earthquakes, Durham University’s definitive survey of all induced earthquakes over many decades concluded that “almost all of the resultant seismic activity [from fracking] was on such a small scale that only geoscientists would be able to detect it” and that mining, geothermal activity or reservoir water storage causes more and bigger tremors.


July 31, 2013

“What LEED designers deliver is what most LEED building owners want – namely, green publicity, not energy savings”

Filed under: Business, Environment, Media, USA — Tags: , , , — Nicholas @ 10:24

A bit of LEED debunking at The New Republic:

When the Bank of America Tower opened in 2010, the press praised it as one of the world’s “most environmentally responsible high-rise office building[s].” It wasn’t just the waterless urinals, daylight dimming controls, and rainwater harvesting. And it wasn’t only the Leadership in Energy and Environmental Design (LEED) Platinum certification — the first ever for a skyscraper — and the $947,583 in incentives from the New York State Energy Research and Development Authority. It also had as a tenant the environmental movement’s biggest celebrity. The Bank of America Tower had Al Gore.

The former vice president wanted an office for his company, Generation Investment Management, that “represents the kind of innovation the firm is trying to advance,” his real-estate agent said at the time. The Bank of America Tower, a billion-dollar, 55-story crystal skyscraper on the northwest corner of Manhattan’s Bryant Park, seemed to fit the bill. It would be “the most sustainable in the country,” according to its developer Douglas Durst. At the Tower’s ribbon-cutting ceremony, Gore powwowed with Mayor Michael Bloomberg and praised the building as a model for fighting climate change. “I applaud the leadership of the mayor and all of those who helped make this possible,” he said.

Gore’s applause, however, was premature. According to data released by New York City last fall, the Bank of America Tower produces more greenhouse gases and uses more energy per square foot than any comparably sized office building in Manhattan. It uses more than twice as much energy per square foot as the 80-year-old Empire State Building. It also performs worse than the Goldman Sachs headquarters, maybe the most similar building in New York — and one with a lower LEED rating. It’s not just an embarrassment; it symbolizes a flaw at the heart of the effort to combat climate change.

[…]

“What LEED designers deliver is what most LEED building owners want — namely, green publicity, not energy savings,” John Scofield, a professor of physics at Oberlin, testified before the House last year.

Governments, nevertheless, have been happy to rely on LEED rather than design better metrics. Which is why New York’s release of energy data last fall was significant. It provided more public-energy data for a U.S. city than has ever existed. It found the worst-performing buildings use three to five times more energy per square foot than the best ones. It also found that, if the most energy-intensive large buildings were brought up to the current seventy-fifth percentile, the city’s total greenhouse gases could be reduced by 9 percent.

July 20, 2013

Investigators still don’t know what caused the explosion in the Lac-Mégantic derailment

Filed under: Cancon, Environment, Railways — Tags: , — Nicholas @ 00:02

In the Globe and Mail, Jacquie McNish and Grant Robertson report on the ongoing investigations into the causes of the fatal explosion:

Federal officials probing the Lac-Mégantic disaster are testing the chemical composition of crude oil carried by the runaway train as they seek to answer the crucial question of what triggered the unusual and devastating explosion after the derailment.

[…]

Edward Burkhardt, chairman of Montreal, Maine & Atlantic Railway Inc., which operated the derailed train, said Canadian authorities have impounded the rail cars to take “a huge number of samples of oil.” He said the investigators and officials in the rail and oil industries “are asking how come there were explosions here. Crude does not blow up.”

People familiar with the investigation said the TSB is examining the composition of the oil that fuelled the explosion.

Industry sources said there are several possibilities. One is whether the crude, which came from the Bakken oil region of North Dakota, contained volatile chemicals. A possible scenario is that additives were intentionally combined with the crude oil to speed up the transfer of the syrupy oil, common for pipelines but rare in the rail industry. Another possibility is that the tanker cars had chemical contaminants from a previous shipment. Another question is whether the oil contained high levels of flammable hydrogen sulphide gas, which is sometimes present in Bakken oil.

[…]

Regulators in the United States say rail carriers are responsible for knowing what they are carrying, and that the shipper and the railway company are required to work out such details when the train is being loaded.

“The carriers have to know exactly what it is that they’re hauling at all times,” said Warren Flateau, a spokesman for the Federal Railroad Administration in Washington.

Mr. Burkhardt said MM&A received a detailed bill of lading from the U.S. oil services company, which he declined to identify, and no chemicals were identified as being present in the crude. The intermediary oil services company leased the rail cars, loaded them with oil and then contracted three separate railway companies to transport them.

The first carrier was Canadian Pacific Railway, which handed over the train to MM&A in Montreal. From there, MM&A was to deliver the oil cars to a small rail company in New Brunswick owned by the Irving family.

July 17, 2013

Keep calm, and don’t panic about bee-pocalypse now

Filed under: Environment, Food, Media, Science — Tags: , , , , — Nicholas @ 08:17

You’ve heard about the mysterious colony collapse disorder (CCD) that has been devastating bee colonies across the world, right? This is serious, as bees are a very important part of the pollination of many crops. As you’ll know from many media reports, this is a food disaster unfolding before us and we’re all going to starve! Or, looking at the facts, perhaps not:

In a rush to identify the culprit of the disorder, many journalists have made exaggerated claims about the impacts of CCD. Most have uncritically accepted that continued bee losses would be a disaster for America’s food supply. Others speculate about the coming of a second “silent spring.” Worse yet, many depict beekeepers as passive, unimaginative onlookers that stand idly by as their colonies vanish.

This sensational reporting has confused rather than informed discussions over CCD. Yes, honey bees are dying in above-average numbers, and it is important to uncover what’s causing the losses, but it hardly spells disaster for bees or America’s food supply.

Consider the following facts about honey bees and CCD.

For starters, US honey bee colony numbers are stable, and they have been since before CCD hit the scene in 2006. In fact, colony numbers were higher in 2010 than in any year since 1999. How can this be? Commercial beekeepers, far from being passive victims, have actively rebuilt their colonies in response to increased mortality from CCD. Although average winter mortality rates have increased from around 15% before 2006 to more than 30%, beekeepers have been able to adapt to these changes and maintain colony numbers.

[…]

“The state of the honey bee population — numbers, vitality, and economic output — are the products of not just the impact of disease but also the economic decisions made by beekeepers and farmers,” economists Randal Rucker and Walter Thurman write in a summary of their working paper on the impacts of CCD. Searching through a number of economic measures, the researchers came to a surprising conclusion: CCD has had almost no discernible economic impact.

But you don’t need to rely on their study to see that CCD has had little economic effect. Data on colonies and honey production are publicly available from the USDA. Like honey bee numbers, US honey production has shown no pattern of decline since CCD was first detected. In 2010, honey production was 14% greater than it was in 2006. (To be clear, US honey production and colony numbers are lower today than they were 30 years ago, but as Rucker and Thurman explain, this gradual decline happened prior to 2006 and cannot be attributed to CCD.)

H/T to Tyler Cowen for the link.

July 10, 2013

Next up on our agenda of things to panic about is “peak water”

Filed under: Economics, Environment, Technology — Tags: , , , , — Nicholas @ 00:02

sp!ked editor Rob Lyons explains that “peak water” just isn’t something to worry too much about:

Disappointed by the failure of the peak-oil disaster to come to fruition, our doom-mongering, Malthusian friends have alighted on other scary narratives to confirm their suspicions of humanity as a rapacious blight on the planet. Their latest is ‘peak water’.

On the face of it, peak water is a boneheaded concept on a planet where two-thirds of the surface is covered in, er, water. According to the US Geological Survey, there are 332 million cubic miles of water on Earth. What we tend to need, however, is not sea water but fresh water, of which there is much less: nearer 2.5 million cubic miles. And much of that is too deep underground to be accessed. Surface water in rivers and lakes is a small fraction of overall fresh water: 22,339 cubic miles. Handily, though, natural processes cause sea water to evaporate and form clouds, which then dump their contents on to land — so in most populated parts of the world there is currently sufficient water to supply our needs in an endlessly renewable way. As for the future, it is clear there is no shortage of H2O on the planet. What we really have is a shortage of cheap energy and the necessary technology to take advantage of the saline stuff.
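Those USGS figures make Lyons’s point plainly. A quick back-of-envelope calculation — using only the volumes quoted above, and nothing else — shows just how small the readily accessible fraction is:

```python
# Back-of-envelope check of the USGS figures quoted above.
# All volumes are in cubic miles, taken directly from the excerpt.
total_water = 332_000_000    # all water on Earth
fresh_water = 2_500_000      # fresh water (much of it deep underground)
surface_fresh = 22_339       # rivers and lakes

fresh_share = fresh_water / total_water        # fresh water as a share of all water
surface_share = surface_fresh / total_water    # rivers and lakes as a share of all water

print(f"Fresh water: {fresh_share:.2%} of all water")          # ~0.75%
print(f"Rivers and lakes: {surface_share:.4%} of all water")   # ~0.0067%
```

In other words, the argument is about the cost of converting or moving water, not its existence: over 99% of the planet’s water is there, waiting on cheaper desalination energy.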

The ‘peak water’ theorists focus on groundwater supplies that are either being used faster than they are replenished, or supplies that are not replenished at all: so-called ‘fossil water’. According to leading environmentalist Lester Brown, writing in the Guardian last weekend, the rapid exhaustion of these supplies in some parts of the world is leading to the decline of food production. And at a time of fast-growing populations, this apparently promises disaster for these countries.

But often, the problem is a political rather than a practical one. […]

In reality, all of the fixes that apply to peak oil also apply to peak water. New technology may make water desalination far cheaper than it is now, a claim being made for new water filtration methods based on nanotechnology. Better use of water in irrigation, through careful management of when and how water is applied to crops, could cut usage dramatically — something that is already happening in dry countries such as Israel and Australia and in parts of the US. Current uses of water, like flush toilets, may be superseded in places where water is in high demand. Through civil engineering projects, water can be shifted from places where it is plentiful to places where it is needed most, something societies have been doing for thousands of years.
