Published on 23 Dec 2013
Archive film, moving a new power station transformer by rail to Blaenau Ffestiniog, Wales.
H/T to Roger Henry for the link.
Matt Ridley on the piously hoped-for breakthroughs in renewable energy sources … that still seem as distant as they did decades ago:
The environmental movement has advanced three arguments in recent years for giving up fossil fuels: (1) that we will soon run out of them anyway; (2) that alternative sources of energy will price them out of the marketplace; and (3) that we cannot afford the climate consequences of burning them.
These days, not one of the three arguments is looking very healthy. In fact, a more realistic assessment of our energy and environmental situation suggests that, for decades to come, we will continue to rely overwhelmingly on the fossil fuels that have contributed so dramatically to the world’s prosperity and progress.
In 2013, about 87% of the energy that the world consumed came from fossil fuels, a figure that — remarkably — was unchanged from 10 years before. This roughly divides into three categories of fuel and three categories of use: oil used mainly for transport, gas used mainly for heating, and coal used mainly for electricity.
So those who predict the imminent exhaustion of fossil fuels are merely repeating the mistakes of the U.S. presidential commission that opined in 1922 that “already the output of gas has begun to wane. Production of oil cannot long maintain its present rate.” Or President Jimmy Carter when he announced on television in 1977 that “we could use up all the proven reserves of oil in the entire world by the end of the next decade.”
That fossil fuels are finite is a red herring. The Atlantic Ocean is finite, but that does not mean that you risk bumping into France if you row out of a harbor in Maine. The buffalo of the American West were infinite, in the sense that they could breed, yet they came close to extinction. It is an ironic truth that no nonrenewable resource has ever run dry, while renewable resources — whales, cod, forests, passenger pigeons — have frequently done so.
As for renewable energy, hydroelectric is the biggest and cheapest supplier, but it has the least capacity for expansion. Technologies that tap the energy of waves and tides remain unaffordable and impractical, and most experts think that this won’t change in a hurry. Geothermal is a minor player for now. And bioenergy — that is, wood, ethanol made from corn or sugar cane, or diesel made from palm oil — is proving an ecological disaster: It encourages deforestation and food-price hikes that cause devastation among the world’s poor, and per unit of energy produced, it creates even more carbon dioxide than coal.
Wind power, for all the public money spent on its expansion, has inched up to — wait for it — 1% of world energy consumption in 2013. Solar, for all the hype, has not even managed that: If we round to the nearest whole number, it accounts for 0% of world energy consumption.
Jared Newman on Tesla’s plans to move into a new marketplace:
Tesla’s crusade against fossil fuels could soon hit home with a battery-powered energy pack.
The company plans to start producing a home battery within six months, Bloomberg reports, and will reveal more details in the next month or two.
One obvious application would be a source for backup power, replacing conventional fuel-powered generators. The upcoming Toyota Mirai hydrogen-powered car can also function as a backup power source for a house. But the Tesla pack could also help shift energy usage to reduce peak demand on the electric grid, thereby cutting down on energy bills. In an earnings call, Tesla Chief Technology Officer JB Straubel hinted at strong interest from utility companies for that very reason.
Potentially, homes with renewable energy sources such as solar panels could also power the pack, allowing users to wean or remove themselves from the electric grid.
If the price point is low enough, this could be a big boost to fans of locally produced electricity from solar or wind power … being able to store the energy you generate so you can use it when you need it makes it much more attractive to invest in those technologies.
Robyn Arianrhod on the sesquicentennial of one of the most important discoveries that helped create the world we live in today:
It’s hard to imagine life without mobile phones, radio and television. Yet the discovery of the electromagnetic waves that underpin such technologies grew out of an abstract theory that’s 150 years old.
Our knowledge of the existence of such waves is a direct result of James Clerk Maxwell’s theory of electromagnetism which was first published in January 1865.
Electromagnetism itself was discovered physically rather than theoretically. Some time around 1820, the Danish physicist Hans Oersted noticed that when you switch on an electric current, a nearby magnet – such as the needle of a compass – actually jumps, as if the changing electric current was itself a magnet.
Then, in 1831 (the year Maxwell was born in Edinburgh) the English physicist and chemist Michael Faraday discovered that if you move a magnet through a coil of wire, you create an electric current in the wire without the aid of batteries or other electricity supply.
Faraday was so intrigued by the surprising ability of moving magnets to create electricity that he created a tiny prototype of the electric generator. He also created a prototype of the electric motor, but it would take decades before engineers were able to develop working motors and generators.
Nevertheless, basic technologies had begun to flow almost immediately after the phenomenon of electromagnetism was discovered: in particular, the telegraph – the first high-speed global telecommunications system.
In the Wall Street Journal, Jo Craven McGinty examines the pros and cons of Daylight Saving Time. The US government, of course, says it saves electricity, at least by its own measurement:
The historic reason for observing daylight-saving time — which ends at 2 a.m. on Sunday when clocks revert to standard time — is to conserve energy, by pushing sunlight forward into the evening, reducing the need for electric lights.
The U.S. government has found the strategy works. But two academic studies published in peer-reviewed journals rebut the idea, and one even concludes the policy increases demand for electricity.
The most recent government study, by the Department of Energy, tested whether expanding daylight-saving time by four weeks in 2007 reduced the use of electricity, as intended.
The study examined the additional weeks of daylight-saving time using data provided by 67 utilities accounting for two-thirds of U.S. electricity consumption. It compared average daily use in 2006, when there was no daylight saving, with the same period in 2007 when the extension took effect and found a reduction in electricity use of 0.5% in the spring and 0.38% in the fall.
However, non-government studies don’t agree:
One such study, published in the Review of Economics and Statistics, examined residential data only, but the researchers didn’t believe commercial use would alter their findings.
“Big-box stores don’t turn on or off lights based on whether it’s light outside or dark,” Mr. Kotchen said. “In a commercial building, the lights are on when people are working no matter what.”
Rather than conserving electricity, the study found that daylight-saving time increased demand for electricity. Conditions may vary in other parts of the country, but the study concluded that Indiana is representative of much of the country.
That doesn’t mean daylight-saving time has never worked since its introduction during World War I. But, said Mr. Kotchen, “the world has changed. Lighting is a small amount of energy and electricity use in households. The big things are heating and cooling, particularly as air conditioning has become more prevalent. We’re fooling ourselves to continue calling it an energy policy given the studies that show it doesn’t save energy.”
H/T to Terence Corcoran for the link.
Markus Krajewski writes about the formation of a multinational industrial cartel shortly after the First World War that helped create the very concept of “planned obsolescence” for (no) fun and (their) profit:
On 23 December 1924, a group of leading international businessmen gathered in Geneva for a meeting that would alter the world for decades to come. Present were top representatives from all the major lightbulb manufacturers, including Germany’s Osram, the Netherlands’ Philips, France’s Compagnie des Lampes, and the United States’ General Electric. As revelers hung Christmas lights elsewhere in the city, the group founded the Phoebus cartel, a supervisory body that would carve up the worldwide incandescent lightbulb market, with each national and regional zone assigned its own manufacturers and production quotas. It was the first cartel in history to enjoy a truly global reach.
The cartel’s grip on the lightbulb market lasted only into the 1930s. Its far more enduring legacy was to engineer a shorter life span for the incandescent lightbulb. By early 1925, this became codified at 1,000 hours for a pear-shaped household bulb, a marked reduction from the 1,500 to 2,000 hours that had previously been common. Cartel members rationalized this approach as a trade-off: Their lightbulbs were of a higher quality, more efficient, and brighter burning than other bulbs. They also cost a lot more. Indeed, all evidence points to the cartel’s being motivated by profits and increased sales, not by what was best for the consumer. In carefully crafting a lightbulb with a relatively short life span, the cartel thus hatched the industrial strategy now known as planned obsolescence.
How exactly did the cartel pull off this engineering feat? It wasn’t just a matter of making an inferior or sloppy product; anybody could have done that. But to create one that reliably failed after an agreed-upon 1,000 hours took some doing over a number of years. The household lightbulb in 1924 was already technologically sophisticated: The light yield was considerable; the burning time was easily 2,500 hours or more. By striving for something less, the cartel would systematically reverse decades of progress.
The details of this effort have been very slow to emerge. Some facts came to light in the 1940s, when the U.S. government investigated GE and a number of its business partners for anticompetitive practices. Others were uncovered more recently, when I and the German journalist Helmut Höge delved into the corporate archives of Osram in Berlin. Jointly founded in 1920 by three German companies, Osram remains one of the world’s leading makers of all kinds of lighting, including state-of-the-art LEDs. In the archives, we found meticulous correspondence between the cartel’s factories and laboratories, which were researching how to modify the filament and other measures to shorten the life span of their bulbs.
The cartel took its business of shortening the lifetime of bulbs every bit as seriously as earlier researchers had approached their job of lengthening it. Each factory bound by the cartel agreement—and there were hundreds, including GE’s numerous licensees throughout the world—had to regularly send samples of its bulbs to a central testing laboratory in Switzerland. There, the bulbs were thoroughly vetted against cartel standards. If any factory submitted bulbs lasting longer or shorter than the regulated life span for its type, the factory was obliged to pay a fine.
Companies were also fined for exceeding their sales quotas, which were constantly being adjusted. In 1927, for example, Tokyo Electric noted in a memo to the cartel that after shortening the lives of its vacuum and gas-filled lightbulbs, sales had jumped fivefold. “But if the increase in our business resulting from such endeavors directly mean[s] a heavy penalty, it must be a thing out of reason and shall quite discourage us,” the memo stated.
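The commercial logic behind the shorter lifespan is simple arithmetic. A quick illustrative sketch, using only the lifespan figures quoted above (not actual cartel accounting data):

```python
# Steady-state replacement demand is inversely proportional to bulb lifespan:
# bulbs that die sooner must be replaced more often.
old_life_hours = 2500   # typical pre-cartel bulb life, per the article
new_life_hours = 1000   # the cartel's codified standard

replacement_multiplier = old_life_hours / new_life_hours
print(replacement_multiplier)  # 2.5x as many replacement sales over time
```

That 2.5x multiplier on replacement demand is consistent in direction with Tokyo Electric's reported fivefold sales jump, though the memo's figure presumably reflects other factors as well.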
The great Adam Smith, of course, saw this coming in 1776: “People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices.” Some things never change.
At The Diplomat, Mohamed Zeeshan talks about India’s self-imposed disadvantages in manufacturing both for domestic and export consumption:
Indian Prime Minister Narendra Modi’s maiden Independence Day speech was laced with inspiring rhetoric. But of the many things he said, the one slogan that inevitably caught public attention was this: “Come, make in India!” With those words, Modi was trying to make the case for turning India into the world’s next great manufacturing hub. Understandably, the Indian populace was thrilled.
India is one of the world’s ten largest economies (and is third largest on a purchasing power parity basis), with a total annual output of nearly $2 trillion. As much as 57 percent of this output is produced by a service sector that employs just 28 percent of the population, largely concentrated in urban parts of the country. That is no surprise, because most Indians lack the skills and education to join the more knowledge-intensive service sector. What they need is what successful developing nations all over the world have had ever since the Industrial Revolution: a robust and productive manufacturing sector.
Yet India’s manufacturing sector contributes just 16 percent to the total GDP pie (China’s, by contrast, accounts for almost half of its total economic output). Victor Mallet, writing in the McKinsey book Reimagining India, recently offered an anecdote that was illuminating. “One of India’s largest carmakers recently boasted that it was selling more vehicles than ever and that it was hiring an extra eight hundred workers for its factory,” he wrote, “But the plant employing those workers belongs to the Jaguar Land Rover subsidiary of Tata Motors and is in the English Midlands, not in job-hungry India.”
Mallet goes on to make a point that has been made frequently by Indian economists: The world doesn’t want to “make in India,” because it is simply too painful. There’s bureaucratic red tape, a difficult land acquisition act, troublesome environmental legislation, a shortage of electricity, and a lack of water resources. The only thing India doesn’t seem to lack is labor, but that merely adds to the problem. As Mallet points out in the same essay, aptly titled “Demographic dividend – or disaster?”, “India’s population grew by 181 million in the decade to 2011 – and (despite falling fertility rates) a rise of nearly 50% in the total number of inhabitants is unavoidable.” But the number of jobs being added to feed that population is inadequate.
However, the labor dividend is still important. India doesn’t need to reduce the number of hands on deck. It needs to weed out the challenges that stop them from being productive.
In Maclean’s, Kate Lunau recounts the history of the venerable incandescent light bulb as new regulations kick in today to phase them out of use in Canada:
The incandescent light bulb was born on Jan. 27, 1880, when U.S. inventor Thomas Edison famously patented his “electric lamp.” Others had paved the way, including Canadians Henry Woodward and Mathew Evans, whose 1874 light bulb patent was bought by Edison. But it was the latter who perfected and would commercialize the technology.
The light bulb — in which an electric current passes through a filament that heats up and glows inside a glass bulb — yanked North America into the electric age. Before then, “all street lamps were gas,” says Anna Adamek, who curates the energy collection at the Canada Science and Technology Museum, which includes about 2,000 light bulbs. “Wealthy people could afford gas lamps for interior lighting, but most would use kerosene, oil, or candles.” In 1882, the Canada Cotton Co., in Cornwall, Ont., became the first Canadian company to install electric lights. “Edison personally supervised the installation,” she says. In 1884, the lights went on in the Parliament buildings and, by 1905, the lighting of Canadian cities was well under way. Electric light changed the way people spent their evenings, and the way businesses operated — allowing people to work around the clock. Once electric wiring was installed, manufacturers were spurred to make all sorts of new gadgets and appliances for the home, from electric irons to refrigerators.
As the ban approached, many fretted over the cost of replacing their household lights with CFLs and LEDs, as well as the small amount of mercury inside fluorescents — not to mention the loss of pleasant-coloured lighting at home. Traditionalists have responded by stockpiling their beloved bulbs. In the U.K., the Daily Mail carried a story of a 62-year-old pensioner, who hoarded enough to see her “into the grave.” Riffing on the old joke, Freedom Light Bulb, a U.S. blog, asked: “How many politicians or bureaucrats should it take to change a light bulb?” The answer: “None.” On Jan. 1, 2014, Canada’s new regulations will be phased in. Stores will sell through existing inventory; not long after, that warm familiar glow will be gone for good.
Robert Bryce explains why — no matter how much we might want it to be so — alternate forms of energy like wind and solar power cannot cover our demands:
That 32 percent increase in global carbon dioxide emissions reflects the central tension in any discussion about cutting the use of coal, oil and natural gas: Developing countries — in particular, fast-growing economies such as Vietnam, China and India — simply cannot continue to grow if they limit the use of hydrocarbons. Those countries’ refusal to enact carbon taxes or other restrictions illustrates what Roger Pielke Jr., a professor of environmental studies at the University of Colorado, calls the “iron law of climate policy”: Whenever policies “focused on economic growth confront policies focused on emissions reduction, it is economic growth that will win out every time.”
Over the past 10 years, despite great public concern, carbon dioxide emissions have soared because some 2.6 billion people still live in dire energy poverty. More than 1.3 billion have no access to electricity at all.
Now to the second number: 1. That’s the power density of wind in watts per square meter. Power density is a measure of the energy flow that can be harnessed from a given area, volume or mass. Six different analyses of wind (one of them is my own) have all arrived at that same measurement.
Wind energy’s paltry power density means that enormous tracts of land must be set aside to make it viable. And that has spawned a backlash from rural and suburban landowners who don’t want 500-foot wind turbines near their homes. To cite just one recent example, in late July, some 2,000 protesters marched against the installation of more than 1,000 wind turbines in Ireland’s Midlands Region.
Consider how much land it would take for wind energy to replace the power the U.S. now gets from coal. In 2011, the U.S. had more than 300 billion watts of coal-fired capacity. Replacing that with wind would require placing turbines over about 116,000 square miles, an area about the size of Italy. And because of the noise wind turbines make — a problem that has been experienced from Australia to Ontario — no one could live there.
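The article's land-area figure can be checked from its own numbers. A quick sketch (the capacity and power-density figures are the article's; the unit conversion is mine):

```python
coal_capacity_watts = 300e9      # ~300 billion watts of US coal-fired capacity (2011)
wind_power_density = 1.0         # watts per square metre, per the article

area_m2 = coal_capacity_watts / wind_power_density
m2_per_sq_mile = 1609.344 ** 2   # metres per mile, squared

print(round(area_m2 / m2_per_sq_mile))  # ~115,800 square miles, matching the ~116,000 cited
```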
In 2012, the contribution from all of those sources amounted to about 4.8 million barrels of oil equivalent per day, or roughly one-half of a Saudi Arabia. Put another way, we get about 50 times as much energy from all other sources — coal, oil, natural gas, nuclear and hydropower — as we do from wind, solar, geothermal and biomass.
The official US inflation rate is around 1% annually. That doesn’t seem quite right to a lot of people, who find themselves spending more money for the same goods:
… what Bernanke will never admit is that the official inflation rate is a total sham. The way that inflation is calculated has changed more than 20 times since 1978, and each time it has been changed the goal has been to make it appear to be lower than it actually is.
If the rate of inflation was still calculated the way that it was back in 1980, it would be about 8 percent right now and everyone would be screaming about the fact that inflation is way too high.
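The gap between a 1 percent and an 8 percent rate compounds dramatically over time. A quick illustration using the two rates claimed above (the 10-year horizon is an arbitrary choice for illustration):

```python
official_rate = 0.01   # the official figure cited above
claimed_rate = 0.08    # the "1980 methodology" figure claimed above
years = 10

# Cumulative price-level multiplier after compounding annually
print(round((1 + official_rate) ** years, 3))  # 1.105 -- prices up ~10%
print(round((1 + claimed_rate) ** years, 3))   # 2.159 -- prices more than double
```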
But instead, Bernanke can get away with claiming that inflation is “too low” because the official government numbers back him up.
Of course many of us already know that inflation is out of control without even looking at any numbers. We are spending a lot more on the things that we buy on a regular basis than we used to.
For example, when Barack Obama first entered the White House, the average price of a gallon of gasoline was $1.84. Today, the average price of a gallon of gasoline has nearly doubled. It is currently sitting at $3.49, but when I filled up my vehicle yesterday I paid nearly $4.00 a gallon.
And of course the price of gasoline influences the price of almost every product in the entire country, since almost everything that we buy has to be transported in some manner.
But that is just one example.
Our monthly bills also seem to keep growing at a very brisk pace.
Electricity bills in the United States have risen faster than the overall rate of inflation for five years in a row, and according to USA Today water bills have actually tripled over the past 12 years in some areas of the country.
No inflation there, eh?
A short video of Kirk Sorensen taking us through the benefits of Liquid Fluoride Thorium Reactors, a revolutionary liquid-fuel reactor design that runs not on uranium but on thorium. These reactors work and have been built before; search for either LFTRs or Molten Salt Reactors (MSRs).
The main downsides to this technology are politics, corrosion, and fear of nuclear radiation. Liquid Fluoride Thorium Reactors were created 50 years ago by an American chap named Alvin Weinberg, but the American government realised you can’t weaponise the by-products, and so it wasn’t interested.
Another point: yes, it WAS corrosive, but those reactor tests were 50 years ago. Our materials technology has definitely improved since then, so the leap to build this reactor shouldn’t be too hard.
And nuclear fear is extremely common in the average person, rather irrational though it may be. More people have died from fossil fuels, and even from hydroelectric power, than from nuclear power. I added this video for a project regarding Liquid Fluoride Thorium Reactors; watch and enjoy.
No, it would not collapse the economy, just as the adoption of uranium reactors didn’t, and neither did coal. That’s because there wouldn’t be an instant transition from coal, oil, and everything else to thorium; the engineering simply wouldn’t allow it. Give it 50 years and we might be using thorium instead of coal and oil (too late in terms of global warming, but that’s another debate entirely), but we certainly won’t destroy the earth’s economy. Duh.
And yes, he said we’d never run out. Not strictly true … bloody skeptics … but LFTRs can harness 3.5 million kWh per kg of thorium: 70 times greater than uranium, and 10,000 times greater than oil. And there are over 2.6 million tonnes of it on earth. Anyone with a calculator, or a brain, will understand that is a lot of energy!
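Taking the comment's own figures at face value, the arithmetic is easy to check. One caveat: the world-consumption figure below is my rough assumption (about 160,000 TWh of primary energy per year), not a number from the source:

```python
energy_per_kg_kwh = 3.5e6        # kWh per kg of thorium, as claimed above
reserves_kg = 2.6e6 * 1000       # 2.6 million tonnes, converted to kilograms

total_kwh = energy_per_kg_kwh * reserves_kg
print(total_kwh)                 # 9.1e15 kWh in total

# Rough assumption (mine, not the source's): ~160,000 TWh/year of world primary energy
world_annual_kwh = 1.6e14
print(round(total_kwh / world_annual_kwh))  # ~57 years of total world energy use
```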
H/T to Rob Fisher for the link.
The Economist on a promising new development in battery technology:
LITHIUM-ION batteries are hot stuff. Affordable, relatively lightweight and packing a lot of energy, they are the power source of choice for everything from mobile phones to electric cars. Unfortunately, the heat can be more than figurative. Occasionally, such batteries suffer malfunctions that lead to smoke, flames and even explosions. In gadgets, such meltdowns can be distressing and dangerous. In aircraft, they can be fatal. Earlier this year airlines grounded their entire fleets of Boeing’s next-generation 787 passenger jets after the lithium-ion batteries installed in two planes caught fire. Last month they were permitted back in the air after being retrofitted with a protection system in the form of a tough steel box that vents directly outside in the event of a fire.
A more comforting solution, of course, would be to build a lithium-ion battery that could not burst into flames in the first place. Katie Zhong at Washington State University might have just such a device. For the last few years, she has been working on battery technology for flexible and bendable electronic gadgets. By blending a polymer called polyethylene oxide (PEO) with natural soy protein, she had made a solid electrolyte for lithium ion batteries that could be bent or stretched to twice its normal size without affecting its performance.
Like all batteries, lithium-ion rechargeables consist of two electrodes separated by an electrolyte. In a typical lithium-ion cell, the electrolyte is a solution of lithium salts and organic solvents. Charging drives lithium ions from the electrolyte into a graphite anode. On discharge, the reverse happens, with a balancing flow of electrons through the device being powered.
Matt Ridley explains why replacing natural gas (or even coal) electrical generation with biomass is an absurd “solution”:
Under the Government’s plan, biomass power stations will soon be burning much more wood than the country can possibly produce. There is a comforting myth out there that biomass imports are mainly waste that would otherwise decompose: peanut husks, olive pips, bark trimmings and the like. Actually, the bulk of the imports are already and will continue to be of wood pellets.
It is instructive to trace these back to their origin. Reporters for The Wall Street Journal recently found that the two pelleting plants established in the southern US specifically to supply Drax are not just taking waste or logs from thinned forest, but also taking logs from cleared forest, including swamp woodlands in North Carolina cleared by “shovel-logging” with giant bulldozers (running on diesel). Local environmentalists are up in arms.
The logs are taken to the pelleting plants where they are dried, chopped and pelleted, in an industrial process that emits lots of carbon dioxide and pollutants. They are then trucked (more diesel) to ports, loaded on ships (diesel again), offloaded at the Humber on to (yet more diesel) trains, 40 of which arrive at Drax each day.
[. . .]
Over 20 or 40 years, study after study shows that wood burning is far worse than gas, and worse even than coal, in terms of its greenhouse gas emissions. The effect on forest soil, especially if it is peaty, only exacerbates the disparity. The peat dries out and oxidises.
Yet the Government persists in regarding biomass burning as zero-carbon and therefore deserving of subsidy. It does so by the Orwellian feat of defining sustainability as a 60 per cent reduction in emissions from fossil fuels. As Calor Gas puts it: “This is a logical somersault too far, conveniently — for the sake of cherry-picking the technology — equating 40 per cent to 0 per cent.” (Calor Gas supplies rural gas and is understandably miffed at being punitively treated while a higher-carbon rival industry is subsidised. […]) Moreover, unlike gas or coal, you are pinching nature’s lunch when you cut down trees. Unfelled, the trees would feed beetles, woodpeckers, fungi and all sorts of other wildlife when they died, let alone when they lived. Nothing eats coal.
So, compared with gas, the biomass dash is bad for the climate, bad for energy security and dependence on imports, bad for human health, bad for wildlife and very bad for the economy. Apart from that, what’s not to like?
It’s mighty handy to have thoughtfully passed a law against deleting official records — that includes no penalties whatsoever — just before you start breaking that law with abandon:
Top Liberal staffers — even in former premier Dalton McGuinty’s office — illegally deleted emails tied to the $585-million gas plant scandal, a parliamentary watchdog has found.
“It’s clear they didn’t want anything left behind in terms of a record on these issues,” Information and Privacy Commissioner Ann Cavoukian said Wednesday.
Her findings came in a scathing 35-page report prompted by NDP complaints that key Liberal political staff have no records on the controversial closures of plants in Mississauga and Oakville before the 2011 election.
However, despite breaking the Archives and Recordkeeping Act and “undermining” freedom-of-information legislation, the scofflaws will not face penalties because there are none, said Cavoukian.
“That’s the problem,” she said, noting the inadequate legislation was passed by the McGuinty Liberals. “It’s untenable. It has to have teeth so people just don’t engage in indiscriminate practices.”
Attorney General John Gerretsen said the government would consider changes.
“Any law, in order to be effective, there have to be some sort of penalty provisions,” he said. “We’ll take a look.”
If I were a betting man, I’d say that the chances of this “look” producing anything useful would be less than 1 in 10. If this were a private firm or an individual accused of deleting records that the government had an interest in seeing, I rather suspect they’d creatively find something in the existing body of law to use as a bludgeon. It’s charming that they didn’t think to include any penalties if the culprit was a government employee.
In The Register, Tony Smith discusses a new prototype battery that might be coming to your electronic devices … eventually:
Electronics continue to shrink to ever smaller sizes, but researchers are having a tough time miniaturising the batteries powering today’s mobile gadgets. Step forward, bicontinuous nanoporous electrodes.
Smartphones use smaller power packs than they did five years ago, it’s true, but that’s because their chips and radios are more power efficient, not because of any major new battery technology.
Now boffins from the University of Illinois at Urbana-Champaign reckon they have come up with a new pocket-friendly electricity supply.
Enter the “microbattery”, a compact power cell constructed from many three-dimensional nanoporous electrodes capable, its developers claim, of delivering both high power and a large energy capacity.
The cathode was devised by another team at the university, but graduate student James Pikul, working under Bliss Professor of mechanical science and engineering William King, figured out how to create a compatible anode and put the two together into a battery.
[. . .]
The cathode design, devised by a team led by the University’s Professor Paul Braun, is fast charging. Pikul reckons building a battery out of it yields a rechargeable that can be filled up in a thousandth of the time it takes to charge a comparably sized regular rechargeable cell.
Building a battery in a lab is one thing. Working out how to manufacture it commercially at a price that makes it a realistic power source for future devices is another thing altogether. Pikul and King will be working on that next.