Quotulatiousness

December 2, 2013

Sea level changes during recorded history

Filed under: Environment, Europe, History — Nicholas @ 10:50

Some interesting points in this guest post by Robert W. Endlich:

Sea level changes over relatively recent geologic and human history demonstrate that alarmist claims do not withstand scrutiny. Sea levels rose significantly after the last ice age, fell during the Little Ice Age, and have been rising again since the LIA ended around 1850. In fact, Roman Empire and Medieval port cities are now miles from the Mediterranean, because sea levels actually fell during the Little Ice Age.

[…]

Those rising oceans created new ports for Greek and Roman naval and trade vessels. But today many of those structures and ruins are inland, out in the open, making them popular tourist destinations. How did that happen? The Little Ice Age once again turned substantial ocean water into ice, lowering sea levels, and leaving former ports stranded. Not enough ice has melted since 1850 to make them harbors again.

The ancient city of Ephesus was an important port city and commercial hub from the Bronze Age to the Minoan Warm Period, and continuing through the Roman Empire. An historic map shows its location right on the sea. But today, in modern-day Turkey, Ephesus is 5 km from the Mediterranean. Some historians erroneously claim “river silting” caused the change, but the real “culprit” was sea level change.

Ruins of the old Roman port of Ostia Antica are extremely well preserved – with intact frescoes, maps and plans. Maps from the time show the port located at the mouth of the Tiber River, where it emptied into the Tyrrhenian Sea. The Battle of Ostia in 849, depicted in a painting attributed to Raphael, shows sea level high enough for warships to assemble at the mouth of the Tiber. However, today this modern-day tourist destination is two miles up-river from the mouth of the Tiber. Sea level was significantly higher in the Roman Warm Period than today.

An important turning point in British history occurred in 1066, when William the Conqueror defeated King Harold II at the Battle of Hastings. Less well-known is that, when William landed, he occupied an old Roman fort now known as Pevensey Castle, which at the time was located on a small island in a harbor on England’s south coast. A drawbridge connected it to the mainland. Pevensey is infamous because unfortunate prisoners were thrown into this “Sea Gate,” so that their bodies would be washed away by the tide. Pevensey Castle is now a mile from the coast – further proof of a much higher sea level less than 1,000 years ago.

November 27, 2013

OMG! There are scary-sounding chemicals in your Thanksgiving Dinner!

Filed under: Environment, Food, Health, Media — Nicholas @ 09:23

Our American friends are about to celebrate their (weirdly late) Thanksgiving this week, so junk science food scares are also making another annual appearance. Angela Logomasini explains why you can safely ignore most of the advice you may receive about food safety this Thanksgiving:

Toxic chemicals lurk in the “typical” Thanksgiving meal, warns a green activist website. Eat organic, avoid canned food, and you might be okay, according to their advice. Fortunately, there’s no need to buy this line. In fact, the trace levels of man-made chemicals found in these foods warrant no concern and are no different from trace chemicals that appear in food naturally.

The American Council on Science and Health (ACSH) illustrates this reality best with their Holiday Dinner Menu, which outlines all the “toxic” chemicals found naturally in food. The point is that, at such low levels, both the man-made and naturally occurring chemicals pose little risk. This year the ACSH puts the issue in perspective, explaining:

    Toxicologists have confirmed that food naturally contains a myriad of chemicals traditionally thought of as “poisons.” Potatoes contain solanine, arsenic, and chaconine. Lima beans contain hydrogen cyanide, a classic suicide substance. Carrots contain carototoxin, a nerve poison. And nutmeg, black pepper, and carrots all contain the hallucinogenic compound myristicin. Moreover, all chemicals, whether natural or synthetic, are potential toxicants at high doses but are perfectly safe when consumed in low doses.

Typically, these kinds of food safety scares depend on using unfamiliar scientific names of various chemicals, knowing that most people’s memories of high school science have long since faded away. Anything “safe” has an ordinary name, while anything “toxic” goes by a tongue-twisting science-y name that conceals far more than it reveals to non-scientists. Remember how many times the dangers of dihydrogen monoxide (DHMO) have been used to whip up support for petitions to ban the stuff (see the Material Safety Data Sheet (pdf) for it). Dihydrogen monoxide is a science-y way of describing a molecule with two hydrogen atoms and one oxygen atom … it’s another name for water, but it sounds so much more ominous that way, doesn’t it?

November 14, 2013

Scientific facts and theories

Filed under: Environment, Science, Space — Nicholas @ 10:32

Christopher Taylor wants to help you avoid mis-using the word “fact” when you’re talking about “theories”:

These days, criticizing or questioning statements on science can get you called an idiot or even a heretic; science has become a matter of religious faith for some. If a scientist said it, they believe it, and that’s that. Yet the very nature of science is not to be an authoritative voice, but a method of inquiry; science is about asking questions and wondering if something is valid and factual, not a system of producing absolute statements of unquestioned truth.

It is true that people need a source of truth, and it is true that we’re all inescapably religious creatures, so that need will find an outlet somewhere. Science just isn’t the proper outlet for it.

[…]

The problem is that there’s no way to test or confirm this theory [plate tectonics]. You can make a model and see it work, you can check out types of rock and examine fault lines, and you can make measurements, but that’s only going to tell you small portions of information in very limited time frames. Because the earth is so huge, and because there are so very many different pressures and influences on everything on a planet, you can’t be sure without observation over time.

And since the theory posits that it would take millions of years to really demonstrate this to be true, humanity cannot test it enough to be certain. So all we’re left with is a scientific theory: a functional method of interpreting data. In other words, it cannot be properly or accurately described as fact.

This is true about other areas. The word “fact” is thrown around so casually with science and is defended angrily by people who really ought to know better. Cosmology does this a lot. It’s a fact that the universe is expanding from an unknown central explosive point (although there is a fair amount of data that’s throwing this into question). We can’t know, because we can’t have enough data and there hasn’t been long enough to really test this.

Michael Crichton’s criticism of global warming was along these lines. He didn’t deny anything; he just said it’s too big and complex a system, and one we understand far too little about, to even attempt any absolute or authoritative statements. Science has gotten us far beyond our ability to properly measure or interpret the data at hand, but some still keep trying to make absolute statements anyway.

[…]

And that’s the heart of a scientific theory. It isn’t like a geometric theorem (a statement or formula that can be deduced from the axioms of a formal system by means of its rules of inference), or a theory that Sherlock Holmes might develop (a proposed explanation whose status is still conjectural). A scientific theory is a system of interpreting data (a coherent group of general propositions used as principles of explanation for a class of phenomena). It’s a step beyond a hypothesis, which is simply speculation or a guess, but is not proven fact.

Confusing theory with fact is really not excusable for an educated person, but some theories are so wedded to worldviews and hopes that they become a matter of argument and even rage. Questioning that theory means you’re an idiot, uneducated, worthless. If you doubt this theory, you’re clearly someone who is wrong about everything and should be totally ignored in life, even showered with contempt.

For all its rich vocabulary, English fails to correctly differentiate among the various uses of the word “theory”, which allows propagandists and outright frauds to confuse the issues and obscure the difference between what science can say about an issue and what believers desperately want to be true.

November 13, 2013

The environmental damage from “green” ethanol production

Filed under: Economics, Environment, Government, USA — Nicholas @ 09:46

Ethanol was supposed to be an environmentally friendly substitute for gasoline, and it was renewable … but it’s not living up to promises:

With the Iowa political caucuses on the horizon in 2007, presidential candidate Barack Obama made homegrown corn a centerpiece of his plan to slow global warming. And when President George W. Bush signed a law that year requiring oil companies to add billions of gallons of ethanol to their gasoline each year, Bush predicted it would make the country “stronger, cleaner and more secure.”

But the ethanol era has proven far more damaging to the environment than politicians promised and much worse than the government admits today.

As farmers rushed to find new places to plant corn, they wiped out millions of acres of conservation land, destroyed habitat and polluted water supplies, an Associated Press investigation found.

Five million acres of land set aside for conservation — more than Yellowstone, Everglades and Yosemite National Parks combined — have vanished on Obama’s watch.

Landowners filled in wetlands. They plowed into pristine prairies, releasing carbon dioxide that had been locked in the soil.

Sprayers pumped out billions of pounds of fertilizer, some of which seeped into drinking water, contaminated rivers and worsened the huge dead zone in the Gulf of Mexico where marine life can’t survive.

The consequences are so severe that environmentalists and many scientists have now rejected corn-based ethanol as bad environmental policy. But the Obama administration stands by it, highlighting its benefits to the farming industry rather than any negative impact.

October 30, 2013

Human Progress

Filed under: Economics, Education, Environment, Food, Health, History — Nicholas @ 09:57

At Reason, Marian Tupy introduces a new website celebrating Human Progress:

In a world where we are constantly bombarded with bad news, it can sometimes be difficult to think of “progress” and “humanity” in the same sentence. Are there not wars taking place, people going hungry, children at work, women being abused, and mass poverty around the world?

In fact, for most of human history, life was very difficult for most people. People lacked basic medicines and died relatively young. They had no painkillers and people with ailments spent much of their lives in agonizing pain. Entire families lived in bug-infested dwellings that offered neither comfort nor privacy. They worked in the fields from sunrise to sunset, yet hunger and famines were commonplace. Transportation was primitive and most people never traveled beyond their native villages or nearest towns. Ignorance and illiteracy were rife. The “good old days” were, by and large, very bad for the great majority of humankind.

Average global life expectancy at birth hovered around 30 years from the Upper Paleolithic to 1900. Even in the richest countries, like those of Western Europe, life expectancy at the start of the 20th century rarely exceeded 50 years. Incomes were quite stagnant, too. At the beginning of the Christian era, annual incomes per person around the world ranged from $1,073 to $1,431. As late as 1820, average global income was only $1,274 per person. (Angus Maddison, whose income estimates I use here, gives his data in 1990 dollars. I have adjusted Maddison’s figures for inflation.)

Humanity has made enormous progress — especially over the course of the last two centuries. For example, average life expectancy in the world today is 67.9 years. In 2010, global per capita income stood at $13,037 — over 10 times what it was two centuries ago.
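
A quick back-of-the-envelope check of those figures, sketched in Python using only the numbers quoted above (Maddison’s inflation-adjusted estimates), confirms the “over 10 times” claim and shows the modest annual growth rate it implies:

```python
# Back-of-the-envelope check of the income figures quoted above.
# Both inputs come straight from the text.
income_1820 = 1274    # average global income per person, 1820
income_2010 = 13037   # global per capita income, 2010

ratio = income_2010 / income_1820
print(f"Growth over ~190 years: {ratio:.1f}x")  # a bit over 10x

# The implied average compound growth rate per year
years = 2010 - 1820
cagr = (income_2010 / income_1820) ** (1 / years) - 1
print(f"Implied average growth: {cagr:.2%} per year")
```

The striking part is how small the implied annual rate is: a little over one percent per year, compounded over two centuries, is all it takes to multiply incomes tenfold.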

The new website is called Human Progress:

It is perhaps best to start by explaining what the Human Progress website is not trying to accomplish. It will not try to convince you that the world is a perfect place. As long as there are people who go hungry or die from preventable diseases, there will always be room for improvement. To that end, we all have a role to play in helping the destitute in our communities and beyond.

Our goal, then, is not to paint a rosy picture of the state of humanity, but a realistic one. A realistic account of the world should focus on long-term trends, comparing living standards between two or more generations. Crucially, it should compare the imperfect present with a much more imperfect past, rather than with an imagined utopia in the future.

As such, this website has two main aims. First is to inform you about the many ways in which the world has become a better place. Second is to allow you to search for reasons that brought that improvement about. While we think that policies and institutions compatible with freedom and openness are important factors in promoting human progress, we let the evidence speak for itself and hope the website stimulates an intelligent debate on the drivers of human progress.

October 29, 2013

Even selling the USS Forrestal for one cent was a win for the US Navy

Filed under: Economics, Environment, Military, USA — Nicholas @ 07:21

Several people have commented about the headlines proclaiming that the very first supercarrier had been sold for the princely sum of one cent, but Strategy Page explains why even that token penny was better than all the other options:

The U.S. Navy recently sold the decommissioned (in 1993) aircraft carrier USS Forrestal for scrap. The shipyard that will take the Forrestal apart (All Star Metals of Texas) paid the navy one cent ($.01) for the ship, because that was the best deal the navy could get: it will cost many millions to take the ship apart in a legal fashion (being careful to avoid releasing any real or imagined harmful substances into the environment). The other alternative was to sink the Forrestal at sea, but that requires partial disassembly (to remove anything that could or might pollute the ocean), which would be even more expensive.

[…]

Since the 1990s, sending warships to the scrap yard has not been considered a viable alternative. It’s all about pollution, bad press, and cost. That was because of the experience with the largest warship scrapped to date, the 45,000-ton carrier USS Coral Sea. This ship took until 2000 (seven years) to be broken up. Thus, the new ecologically correct process was not only expensive but also slow. Then the navy discovered that the cost of scrapping a nuclear-powered carrier like the USS Enterprise would be close to a billion dollars, largely the result of a lot more environmental and safety regulations. With so many navy ships (especially nuclear subs) being broken up in the 1990s, and all these new regulations arriving, the cost of disposing of these ships skyrocketed. This was especially true with carriers.

So for over a decade the navy just tied up retired ships and waited for some better solution to appear. That never happened. In fact, the situation has gotten worse. The navy only has one ship scrapping facility (Brownsville, Texas), so only one carrier at a time can be dismantled. Using official estimates of the time required to dismantle each of the biggest ships, it’ll take seven decades to get rid of the surviving conventionally powered carriers. Note also that the conventional carrier in the absolute worst shape, the USS John F Kennedy, is the one being officially retained in category B reserve (but only until Congress forgets all about her, of course). Name recognition really does count.

It gets worse. With the really vast number of single-hull tankers being scrapped and large numbers of old, smaller-capacity container ships laid up and likely to be offered for scrap fairly soon, the market for difficult-to-scrap naval ships is going to shrivel and the price for scrap steel will drop. Efforts to get the navy to include disposal costs in its lifetime-cost budgets have never caught on, and now it’s obvious why not. The real nightmare begins with the first nuclear-powered carrier (the 93,000-ton USS Enterprise), which began the decommissioning process in late 2012 (with the lengthy removal of all classified or reusable equipment). The cost of dismantling this ship (and disposing of radioactive components) may be close to $2 billion.

October 5, 2013

Climate models, trust, and spin

Filed under: Environment, Media — Nicholas @ 08:58

In Reason, Ronald Bailey asks whether we can trust the IPCC’s climate models:

On Monday, the U.N.’s Intergovernmental Panel on Climate Change (IPCC) released the final draft of Climate Change 2013: The Physical Science Basis. The report’s Summary for Policymakers flatly states: “Warming of the climate system is unequivocal, and since the 1950s, many of the observed changes are unprecedented over decades to millennia. The atmosphere and ocean have warmed, the amounts of snow and ice have diminished, sea level has risen, and the concentrations of greenhouse gases have increased.” Pretty much everyone concerned with this issue agrees that those are the facts. But what is causing the planet to warm up? Here is where it gets interesting.

[…]

The IPCC report acknowledges that almost all of the “historical simulations do not reproduce the observed recent warming hiatus.” Not to worry, it assures us; 15-year pauses just happen, and you can’t really expect the models to simulate these kinds of random natural fluctuations in the climate. Once this little slow-down passes, “It is more likely than not that internal climate variability in the near-term will enhance and not counteract the surface warming expected to arise from the increasing anthropogenic forcing.” In other words, when the warm-up resumes it will soar.

John Christy, a climatologist at the University of Alabama in Huntsville, has come to a different conclusion. Christy compared the outputs of 73 climate models for the tropical troposphere used by the IPCC in its latest report (model runs spanning 1979 to 2030) with satellite and weather balloon temperature trends since 1979. “The tropics is so important because that is where models show the clearest and most distinct signal of greenhouse warming — so that is where the comparison should be made (rather than say for temperatures in North Dakota),” Christy explains in an email. “Plus, the key cloud and water vapor feedback processes occur in the tropics.” When it comes to simulating the atmospheric temperature trends of the past 35 years, Christy found, all of the IPCC models are running hotter than the actual climate.

[…]

Average of model results compared with temperature trends

To defend himself against any accusations of cherry-picking his data, Christy notes that his “comparisons start in 1979, so these are 35-year time series comparisons” — rather longer than the 15-year periods whose importance the IPCC disputes.

Why the discrepancy between the IPCC and Christy results? As Georgia Tech climatologist Judith Curry notes, data don’t speak for themselves; researchers have to put them into a context. And your choice of context — say, the year you choose to begin with — can influence your conclusions considerably. While there may be nothing technically wrong with the way the IPCC chose to display the comparison between model data and observation data, Curry observes, “it will mislead the public to infer that climate models are better than we thought.” She adds, “What is wrong is the failure of the IPCC to note the failure of nearly all climate model simulations to reproduce a pause of 15+ years.”

October 1, 2013

No mistakes were made, no problems uncovered, but 19 firefighters died

Filed under: Bureaucracy, Environment, USA — Nicholas @ 09:57

The official report on the Yarnell Hill fire which claimed the lives of 19 firefighters has managed to find no issues whatsoever with the incident. Apparently no mistakes were made by any of the firefighters or their leadership, and there are no lessons to be learned from this tragedy.

Nothing went wrong in the Yarnell Hill Fire, which killed 19 wildland firefighters in June.

This according to the “Serious Accident Investigation Report” into the fire, released this weekend by federal, state, and local firefighting officials in Prescott.

“The Team found no indication of negligence, reckless actions, or violations of policy or protocol,” the report states.

It certainly seems that something must have gone wrong when 19 men, most of them young men, are dead.

In fact, certain fire officials who now say everything went according to protocol had been among those assessing blame and pointing out mistakes leading up to the deaths of the Granite Mountain Hotshots.

Arizona Deputy State Forester Jerry Payne previously said it looked like Eric Marsh, superintendent of the hotshot crew, had violated basic wildfire-safety rules, although Payne added that many decisions made by those leading wildfire-fighting crews are calculated risks, rather than strictly rule-book decisions.

Prescott Wildland Division Chief Darrell Willis suggested in an interview with ABC News that the crew “could have made it” had the U.S. Forest Service delivered all the air-tankers that were requested for the Yarnell Hill Fire.

Neither of these findings was included in the report, despite Payne and Willis’ presence among fire officials presenting investigators’ conclusions at Prescott High School on Saturday.

Not everyone is convinced, however:

Here is my analysis of what is going on with this report: Substantial mistakes were made by both the fire team and by their leaders. Their leaders wrote the report, and certainly were not going to incriminate themselves, particularly given that they likely face years of litigation. They could have perhaps outlined the mistakes the team made, but the families and supporters of the dead men would have raised a howl if the dead firefighters were blamed for mistakes while the leadership let themselves off the hook, and surely would have pushed back on the culpability of the firefighting effort’s management.

So this report represents an implicit deal being offered to the families — we will let your dead rest in peace by not highlighting the mistakes they made if you will lay off of us and the mistakes we made. We will just blame it on God (I kid you not, see Prescott chief’s statements here). Most Arizonans I know seem willing to have these folks die as heroes who succumbed to the inherent risks of the profession, rather than stupid errors, so we may never have an honest assessment of what happened. And yet again the opportunity to do a major housecleaning of wildland firefighting is missed.

September 27, 2013

Harper and climate change … spending

David Akin points out that all the major federal parties believe the same thing about climate change, except that the Tories are the ones who’ve been chucking around the money on climate change programs:

The simple fact of Canadian politics here is that, if you do not believe in climate change, there is no federal political party that shares your view. There almost was one in Alberta in its last provincial election but, boy, did that idea get shouted down.

But back to what [former environment minister Peter] Kent said to me in that interview:

“There is no question that since the Industrial Revolution there have been anthropogenic, man-made effects on our global climate. The argument continues in the scientific community how much is evolution and how much is man-made but there is certainly something we can do.”

So what is the something that the Harper government has been doing? Well, truth be told, the Harper Conservatives, like the Martin and Chretien Liberals before them, have not been doing very much. None of them, in fact, got the job done. Which might, come to think of it, be a good reason — if climate change is the only thing you’re voting on — to consider choosing the NDP or the Greens next time around. Not to say they’d actually get it done but it’s pretty clear the other two parties, while they talk a good game, just don’t have the political stomach for the job. Those New Democrats brought us universal health care. Maybe they can fix the environment, too.

Still, that doesn’t mean Conservatives aren’t prepared to spend hundreds of millions of dollars — billions even — on a problem they are accused of not admitting even exists. Take biofuels, for example. Early on, the Harper government got the idea that if corn- or plant-based ethanol displaced enough fossil fuels, we’d easily roll back greenhouse gas emissions. Apparently no one bothered to point out that there is serious doubt that corn-based ethanol is actually a lower-emission alternative to fossil fuels, but why complicate things? Ethanol is a good, solid, job-creating green story!

In the long run, the subsidies and outright gifts of government money to green-ish sounding companies will likely be the only reminders of the great global warming panic of the last decade. Certainly little or no actual environmental improvements will be traced to the billions of dollars doled out to cronies under this government.

September 22, 2013

Statistical fail for political axe-grinding

Filed under: Environment, Media, Politics, USA — Nicholas @ 11:29

Coyote Blog views with alarm a recent article in Rolling Stone which abuses statistics to make a point that apparently isn’t true:

What I want to delve into is the claim by the author that wildfires are increasing due to global warming, and only evil Republicans (who suck) could possibly deny this obvious trend […]

These are the 8 statements I can find to support an upward trend in fires. And you will note, I hope, that none of them include the most obvious data — what has the actual trend been in number of US wildfires and acres burned. Each of these is either a statement of opinion or a data point related to fire severity in a particular year, but none actually address the point at hand: are we getting more and larger fires?

Maybe the data does not exist. But in fact it does, and I will say there is absolutely no way, no way, the author has not seen the data. The reason it is not in this article is that it does not fit the “reporter’s” point of view, so it is left out. Here is where the US government tracks fires by year, at the National Interagency Fire Center. To save you clicking through, here is the data as of this moment:

Wildfire averages 2004-2013

Well, what do you know? The number of fires and the acres burned in 2013 are not some sort of record high — in fact they are actually, respectively, the lowest and second-lowest numbers of the last 10 years. In fact, both the number of fires and the total acres burned are running a third below average.

The one thing this does not address is the size of fires. The author implies that there are more fires burning more acres, which we see is clearly wrong, but perhaps the fires are getting larger? Well, 2012 was indeed an outlier year in that fires were larger than average, but 2013 has returned to the trend, which has actually been flat to down, again exactly the opposite of the author’s contention (the data below is just math from the chart above).

Wildfires average acres per fire 2004-2013
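
The “math from the chart” is nothing more than acres burned divided by number of fires. A minimal sketch of that calculation — note that the year-by-year figures below are made-up placeholders, not the NIFC numbers from the chart, so substitute the real data before drawing any conclusions:

```python
# Average fire size = total acres burned / number of fires.
# NOTE: these figures are hypothetical placeholders, NOT the actual
# NIFC data -- swap in the real year-by-year numbers from nifc.gov.
fires_by_year = {2012: 68_000, 2013: 47_000}        # hypothetical fire counts
acres_by_year = {2012: 9_300_000, 2013: 4_300_000}  # hypothetical acres burned

avg_acres = {y: acres_by_year[y] / fires_by_year[y] for y in fires_by_year}
for year, avg in sorted(avg_acres.items()):
    print(f"{year}: {avg:,.1f} acres per fire")
```

Run against the real NIFC table, this is enough to check the “fires are getting larger” claim for yourself.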

In the rest of the post, I will briefly walk through his 8 statements highlighted above and show why they exhibit many of the classic fallacies in trying to assert a trend where none exists. In the postscript, I will address one other inconsistency from the article as to the cause of these fires, which is a pretty hilarious example of how to turn any data to support your hypothesis, even if it is unrelated.

September 21, 2013

Why wind and solar power can’t meet our needs

Filed under: Economics, Environment, Technology — Nicholas @ 10:32

Robert Bryce explains why — no matter how much we might want it to be so — alternate forms of energy like wind and solar power cannot cover our demands:

That 32 percent increase in global carbon dioxide emissions reflects the central tension in any discussion about cutting the use of coal, oil and natural gas: Developing countries — in particular, fast-growing economies such as Vietnam, China and India — simply cannot continue to grow if they limit the use of hydrocarbons. Those countries’ refusal to enact carbon taxes or other restrictions illustrates what Roger Pielke Jr., a professor of environmental studies at the University of Colorado, calls the “iron law of climate policy”: Whenever policies “focused on economic growth confront policies focused on emissions reduction, it is economic growth that will win out every time.”

Over the past 10 years, despite great public concern, carbon dioxide emissions have soared because some 2.6 billion people still live in dire energy poverty. More than 1.3 billion have no access to electricity at all.

Now to the second number: 1. That’s the power density of wind in watts per square meter. Power density is a measure of the energy flow that can be harnessed from a given area, volume or mass. Six different analyses of wind (one of them is my own) have all arrived at that same measurement.

Wind energy’s paltry power density means that enormous tracts of land must be set aside to make it viable. And that has spawned a backlash from rural and suburban landowners who don’t want 500-foot wind turbines near their homes. To cite just one recent example, in late July, some 2,000 protesters marched against the installation of more than 1,000 wind turbines in Ireland’s Midlands Region.

Consider how much land it would take for wind energy to replace the power the U.S. now gets from coal. In 2011, the U.S. had more than 300 billion watts of coal-fired capacity. Replacing that with wind would require placing turbines over about 116,000 square miles, an area about the size of Italy. And because of the noise wind turbines make — a problem that has been experienced from Australia to Ontario — no one could live there.
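
The land-area figure follows directly from the power-density number. A minimal sketch of the arithmetic, using only the article’s two inputs (300 billion watts of coal capacity, roughly 1 W/m² for wind); the unit conversion is the only thing added:

```python
# Land area needed to replace U.S. coal capacity with wind,
# using the article's own inputs.
coal_capacity_w = 300e9         # 300 billion watts of coal-fired capacity (2011)
wind_density_w_per_m2 = 1.0     # ~1 W/m^2, the piece's "second number"

area_m2 = coal_capacity_w / wind_density_w_per_m2
SQ_MILE_IN_M2 = 1609.344 ** 2   # square metres in one statute square mile
area_sq_miles = area_m2 / SQ_MILE_IN_M2
print(f"Required area: {area_sq_miles:,.0f} square miles")  # ~116,000, about Italy
```

The calculation lands within rounding of the article’s 116,000 square miles, which is the whole point of the power-density argument: at 1 W/m², the land requirement scales linearly and brutally with the capacity being replaced.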

[…]

In 2012, the contribution from all of those sources amounted to about 4.8 million barrels of oil equivalent per day, or roughly one-half of a Saudi Arabia. Put another way, we get about 50 times as much energy from all other sources — coal, oil, natural gas, nuclear and hydropower — as we do from wind, solar, geothermal and biomass.

September 20, 2013

The IPCC’s new, more cautious tone

Filed under: Environment, Media, Science — Nicholas @ 07:41

In The Spectator, a muted tone of “we told you so” about the upcoming IPCC report:

Next week, those who made dire predictions of ruinous climate change face their own inconvenient truth. The summary of the fifth assessment report by the Intergovernmental Panel on Climate Change (IPCC) will be published, showing that global temperatures are refusing to follow the path which was predicted for them by almost all climatic models. Since its first report in 1990, the IPCC has been predicting that global temperatures would be rising at an average of 0.2° Celsius per decade. Now, the IPCC acknowledges that there has been no statistically significant rise at all over the past 16 years.
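
The mismatch the Spectator is pointing at is simple arithmetic: a projected 0.2°C per decade, applied over a 16-year pause, implies roughly a third of a degree of warming that did not show up. A one-line sketch:

```python
# Projected warming over the pause, from the IPCC rate quoted above.
predicted_per_decade = 0.2   # degrees C per decade, projected since 1990
pause_years = 16             # years with no statistically significant rise

expected_rise = predicted_per_decade * pause_years / 10
print(f"Projected rise over {pause_years} years: {expected_rise:.2f} C")
```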

It is difficult to over-emphasise the significance of this report. The IPCC is not simply a research body making reports and declarations which are merely absorbed into political debate. Its word has been taken as gospel, and its research has been used to justify all manner of schemes to make carbon-based energy more expensive while subsidising renewable energy.

The failure of its predictions undermines the certainties which have been placed upon the science of climate change. Previous IPCC reports — and much of the debate over how to react to them — have appeared to treat the Earth’s climate as if it were a domestic central heating system, with carbon emissions analogous to the dial on the thermostat: a small tweak here will result in a temperature rise of precisely 0.2°C and so on. What is clear from the new IPCC report is that the science is not nearly advanced enough to make useful predictions on the future rise of global temperatures. Perhaps it never will be.

Some climate scientists themselves, to give them credit, have admitted as much. Their papers now incorporate a degree of caution, as you would expect from genuine scientists. The problems arise when the non-scientists leap upon the climate change bandwagon and assume that anything marked ‘science’ must be the final word. As the chemist and novelist C.P. Snow once warned in his lecture about the ‘two cultures’, you end up in a situation where non-scientists use half-understood reports to silence debate — not realising that proper science welcomes refutation and is wary of the notion of absolute truths.

August 28, 2013

Hurricane and cyclone paths since 1842

Filed under: Americas, Environment, Pacific, Science — Tags: , , — Nicholas @ 08:59

Wired‘s MapLab has a lovely visualization of both hurricane and cyclone tracks starting with the earliest records in 1842:

Historical Hurricane and Cyclone paths since 1842

This map shows the paths of every hurricane and cyclone detected since 1842. Nearly 12,000 tropical cyclones have been tracked and recorded, and the National Oceanic and Atmospheric Administration keeps them all in a single database. Long-term datasets can be really interesting and scientifically valuable, and this one is undoubtedly both.

In the image above, you can clearly see that more storm tracks have overlapped in the western Pacific Ocean and northern Indian Ocean. This is largely because of the length of the typhoon season, which basically never stops in the warmer waters there.

The tracks of the earliest storms are based on mariners’ logs and storm records, collected from various countries, agencies and other sources. Reconciling data from these different entities was tough. Most international agencies had their own set of codes for cyclone intensity, and only recorded this information once per day. India was even using different wind thresholds to designate cyclone stages.

August 17, 2013

Fracking and the environment

Filed under: Environment, Media, Technology — Tags: , , , , — Nicholas @ 09:03

Matt Ridley debunks five common myths about environmental issues with fracking:

The movie Gasland showed a case of naturally occurring gas contaminating water, and the director knew it, but he still pretended it might have been caused by fracking. Ernest Moniz, the US Energy Secretary, said earlier this month: “I still have not seen any evidence of fracking per se contaminating groundwater.” Tens of thousands of wells drilled, two million fracking operations completed and not a single proven case of groundwater contamination. Not one. It may happen one day, of course, but few industries can claim a pollution record that good.

Next comes the claim that shale gas production results in more methane release to the atmosphere and hence could be as bad for climate change as coal. (Methane is a more powerful greenhouse gas than carbon dioxide, but stays in the atmosphere for a shorter time and its concentration is not currently rising fast.) This claim originated with a Cornell biology professor with an axe to grind. Study after study has refuted it. As a team from Massachusetts Institute of Technology put it: “It is incorrect to suggest that shale gas-related hydraulic fracturing has substantially altered the overall [greenhouse gas] intensity of natural gas production.”

Third comes the claim that fracking uses too much water. The Guardian carried a report this week implying that a town in Texas is running dry because of the water used for fracking. Yet in Texas 1% of water use is for fracking, in the United States as a whole 0.3% — less than is used by golf courses. If parts of Texas run out of water, blame farming, by far the biggest user.

Fourth, the ever-so-neutral BBC in a background briefing this week described fracking as releasing “hundreds of chemicals” into the rock. Out by an order of magnitude, Auntie. Fracking fluid is 99.51% water and sand. In the remaining 0.49% there are just 13 chemicals, all of which can be found in your kitchen, garage or bathroom: citric acid (lemon juice), hydrochloric acid (swimming pools), glutaraldehyde (disinfectant), guar (ice cream), dimethylformamide (plastics), isopropanol (deodorant), borate (hand soap), ammonium persulphate (hair dye), potassium chloride (intravenous drips), sodium carbonate (detergent), ethylene glycol (de-icer), ammonium bisulphite (cosmetics), petroleum distillate (cosmetics).
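The composition claim is easy to tally. A minimal sketch that records the 13 additives and their household analogues as quoted above, and checks that the stated fractions account for the whole fluid:

```python
# Tally of the fracking-fluid composition quoted in the article:
# 99.51% water and sand, 0.49% split among 13 household-familiar additives.
additives = {
    "citric acid": "lemon juice",
    "hydrochloric acid": "swimming pools",
    "glutaraldehyde": "disinfectant",
    "guar": "ice cream",
    "dimethylformamide": "plastics",
    "isopropanol": "deodorant",
    "borate": "hand soap",
    "ammonium persulphate": "hair dye",
    "potassium chloride": "intravenous drips",
    "sodium carbonate": "detergent",
    "ethylene glycol": "de-icer",
    "ammonium bisulphite": "cosmetics",
    "petroleum distillate": "cosmetics",
}

assert len(additives) == 13                        # "just 13 chemicals"
assert abs((99.51 + 0.49) - 100.0) < 1e-9          # fractions sum to 100%
print(f"{len(additives)} additives in 0.49% of the fluid")
```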

As for earthquakes, Durham University’s definitive survey of all induced earthquakes over many decades concluded that “almost all of the resultant seismic activity [from fracking] was on such a small scale that only geoscientists would be able to detect it” and that mining, geothermal activity or reservoir water storage causes more and bigger tremors.


July 31, 2013

“What LEED designers deliver is what most LEED building owners want – namely, green publicity, not energy savings”

Filed under: Business, Environment, Media, USA — Tags: , , , — Nicholas @ 10:24

A bit of LEED debunking at The New Republic:

When the Bank of America Tower opened in 2010, the press praised it as one of the world’s “most environmentally responsible high-rise office building[s].” It wasn’t just the waterless urinals, daylight dimming controls, and rainwater harvesting. And it wasn’t only the Leadership in Energy and Environmental Design (LEED) Platinum certification — the first ever for a skyscraper — and the $947,583 in incentives from the New York State Energy Research and Development Authority. It also had as a tenant the environmental movement’s biggest celebrity. The Bank of America Tower had Al Gore.

The former vice president wanted an office for his company, Generation Investment Management, that “represents the kind of innovation the firm is trying to advance,” his real-estate agent said at the time. The Bank of America Tower, a billion-dollar, 55-story crystal skyscraper on the northwest corner of Manhattan’s Bryant Park, seemed to fit the bill. It would be “the most sustainable in the country,” according to its developer Douglas Durst. At the Tower’s ribbon-cutting ceremony, Gore powwowed with Mayor Michael Bloomberg and praised the building as a model for fighting climate change. “I applaud the leadership of the mayor and all of those who helped make this possible,” he said.

Gore’s applause, however, was premature. According to data released by New York City last fall, the Bank of America Tower produces more greenhouse gases and uses more energy per square foot than any comparably sized office building in Manhattan. It uses more than twice as much energy per square foot as the 80-year-old Empire State Building. It also performs worse than the Goldman Sachs headquarters, maybe the most similar building in New York — and one with a lower LEED rating. It’s not just an embarrassment; it symbolizes a flaw at the heart of the effort to combat climate change.

[…]

“What LEED designers deliver is what most LEED building owners want — namely, green publicity, not energy savings,” John Scofield, a professor of physics at Oberlin, testified before the House last year.

Governments, nevertheless, have been happy to rely on LEED rather than design better metrics. Which is why New York’s release of energy data last fall was significant. It provided more public energy data for a U.S. city than had ever existed. It found the worst-performing buildings use three to five times more energy per square foot than the best ones. It also found that, if the most energy-intensive large buildings were brought up to the current seventy-fifth percentile, the city’s total greenhouse gases could be reduced by 9 percent.
