Published on 8 Feb 2016
Are electric cars greener than conventional gasoline cars? If so, how much greener? What about the CO2 emissions produced during electric cars’ production? And where does the electricity that powers electric cars come from? Environmental economist Bjorn Lomborg, director of the Copenhagen Consensus Center, examines how environmentally friendly electric cars really are.
March 13, 2016
October 17, 2015
Tesla does over-the-air updates for their electric cars (which is kinda neat). The latest update includes an almost-but-not-quite self-driving feature:
Tonight, Tesla makes its cars autonomous. Well, semi-autonomous. And it did it with an over-the-air update, effectively making tens of thousands of cars already sold to customers way better.
There are two things to talk about here. There’s the small story about the features and what the upgrade actually looks like and how it works. That’s a good place to start: This is the biggest change to the visual display of the Model S and X ever. There are new instrument panels; app windows are larger and take up more of the 17-inch touchscreen. Drivers will now get more information about what their cars are doing when in Autopilot, and they can lock and unlock their car from the status bar. There’s a new clock!
These are simple cosmetic changes. The Big Story is that all of this—and really, who cares about anything beyond autopilot mode?—is being pushed through to customers’ Teslas overnight. The update will begin being pushed out tonight, and will hit every Tesla made and sold in the US in the past year over the course of this week.
Before you get too excited about an autonomous, hands-free present, you need to know that you can’t nap in the back, chauffeured around in beautiful, electric silence.
Even in Autopilot, you keep your hands on the steering wheel. Well… you don’t have to keep your hands on the steering wheel. You can rest them on your knees (resting on knees, palms up, fingertips touching the wheel is advised), or keep one pinky on the wheel. And okay, you can take your hands off altogether for a moment. But after a few seconds, your car will give you a little message, asking you to touch the wheel in some capacity.
September 21, 2015
Published on 18 Mar 2015
What happened to the cleanliness of your clothes after the U.S. Department of Energy issued new washing machine requirements? The standards, which mandate that washers use 21% less energy, mean that washers actually clean clothes less effectively than they used to. Is “command and control” an efficient way to achieve the desired outcome (less pollution)? Rather than a blanket standard like the Department of Energy’s, a tax on electricity would give users greater flexibility in their washing, and would prompt people to buy machines that use energy more efficiently while still keeping their clothes clean.
Are there times when a command and control solution to a problem makes the most sense? We look at the eradication of smallpox as one example.
August 5, 2015
Prematurely shutting down nuclear plants may make politicians and activists feel empowered and righteous, but it has negative aspects that don’t seem to get the same level of attention as the “feel good” rhetoric does:
Nuclear power faces an uncertain future in Sweden. Major political parties, including the Green party of the coalition government, have recently strongly advocated for a policy to decommission the Swedish nuclear fleet prematurely. Here we examine the environmental, health and (to a lesser extent) economic impacts of implementing such a plan. The process has already been started through the early shutdown of the Barsebäck plant. We estimate that the political decision to shut down Barsebäck has resulted in ~2400 avoidable energy-production-related deaths and an increase in global CO2 emissions of 95 million tonnes to date (October 2014). The Swedish reactor fleet as a whole has reached just past the halfway point of its production, and has a remaining potential production of up to 2100 TWh. The reactors have the potential of preventing 1.9–2.1 gigatonnes of future CO2 emissions if allowed to operate their full lifespans. The potential for future prevention of energy-related deaths is 50,000–60,000. We estimate an 800 billion SEK (120 billion USD) lower-bound estimate for the lost tax revenue from an early phase-out policy. In sum, the evidence shows that implementing a ‘nuclear-free’ policy for Sweden (or countries in a similar situation) would constitute a highly retrograde step for climate, health and economic protection.
April 4, 2015
Published on 23 Dec 2013
Archive film of a new power station transformer being moved by rail to Blaenau Ffestiniog, Wales.
H/T to Roger Henry for the link.
April 1, 2015
Matt Ridley on the piously hoped-for breakthroughs in renewable energy sources … that still seem as distant as they did decades ago:
The environmental movement has advanced three arguments in recent years for giving up fossil fuels: (1) that we will soon run out of them anyway; (2) that alternative sources of energy will price them out of the marketplace; and (3) that we cannot afford the climate consequences of burning them.
These days, not one of the three arguments is looking very healthy. In fact, a more realistic assessment of our energy and environmental situation suggests that, for decades to come, we will continue to rely overwhelmingly on the fossil fuels that have contributed so dramatically to the world’s prosperity and progress.
In 2013, about 87% of the energy that the world consumed came from fossil fuels, a figure that — remarkably — was unchanged from 10 years before. This roughly divides into three categories of fuel and three categories of use: oil used mainly for transport, gas used mainly for heating, and coal used mainly for electricity.
So those who predict the imminent exhaustion of fossil fuels are merely repeating the mistakes of the U.S. presidential commission that opined in 1922 that “already the output of gas has begun to wane. Production of oil cannot long maintain its present rate.” Or President Jimmy Carter when he announced on television in 1977 that “we could use up all the proven reserves of oil in the entire world by the end of the next decade.”
That fossil fuels are finite is a red herring. The Atlantic Ocean is finite, but that does not mean that you risk bumping into France if you row out of a harbor in Maine. The buffalo of the American West were infinite, in the sense that they could breed, yet they came close to extinction. It is an ironic truth that no nonrenewable resource has ever run dry, while renewable resources — whales, cod, forests, passenger pigeons — have frequently done so.
As for renewable energy, hydroelectric is the biggest and cheapest supplier, but it has the least capacity for expansion. Technologies that tap the energy of waves and tides remain unaffordable and impractical, and most experts think that this won’t change in a hurry. Geothermal is a minor player for now. And bioenergy — that is, wood, ethanol made from corn or sugar cane, or diesel made from palm oil — is proving an ecological disaster: It encourages deforestation and food-price hikes that cause devastation among the world’s poor, and per unit of energy produced, it creates even more carbon dioxide than coal.
Wind power, for all the public money spent on its expansion, has inched up to — wait for it — 1% of world energy consumption in 2013. Solar, for all the hype, has not even managed that: If we round to the nearest whole number, it accounts for 0% of world energy consumption.
February 22, 2015
Jared Newman on Tesla’s plans to move into a new marketplace:
Tesla’s crusade against fossil fuels could soon hit home with a battery-powered energy pack.
The company plans to start producing a home battery within six months, Bloomberg reports, and will reveal more details in the next month or two.
One obvious application would be as a source of backup power, replacing conventional fuel-powered generators. The upcoming Toyota Mirai hydrogen-powered car can also function as a backup power source for a house. But the Tesla pack could also help shift energy usage to reduce peak demand on the electric grid, thereby cutting down on energy bills. In an earnings call, Tesla Chief Technology Officer JB Straubel hinted at strong interest from utility companies for that very reason.
Potentially, homes with renewable energy sources such as solar panels could also power the pack, allowing users to wean or remove themselves from the electric grid.
If the price point is low enough, this could be a big boost to fans of locally produced electricity from solar or wind power … being able to store the energy you generate so you can use it when you need it makes it much more attractive to invest in those technologies.
February 4, 2015
Robyn Arianrhod on the sesquicentennial of one of the most important discoveries that helped create the world we live in today:
It’s hard to imagine life without mobile phones, radio and television. Yet the discovery of the electromagnetic waves that underpin such technologies grew out of an abstract theory that’s 150 years old.
Our knowledge of the existence of such waves is a direct result of James Clerk Maxwell’s theory of electromagnetism which was first published in January 1865.
Electromagnetism itself was discovered physically rather than theoretically. Some time around 1820, the Danish physicist Hans Oersted noticed that when you switch on an electric current, a nearby magnet – such as the needle of a compass – actually jumps, as if the changing electric current was itself a magnet.
Then, in 1831 (the year Maxwell was born in Edinburgh) the English physicist and chemist Michael Faraday discovered that if you move a magnet through a coil of wire, you create an electric current in the wire without the aid of batteries or other electricity supply.
Faraday was so intrigued by the surprising ability of moving magnets to create electricity that he created a tiny prototype of the electric generator. He also created a prototype of the electric motor, but it would take decades before engineers were able to develop working motors and generators.
Nevertheless, basic technologies had begun to flow almost immediately after the phenomenon of electromagnetism was discovered: in particular, the telegraph – the first high-speed global telecommunications system.
November 1, 2014
In the Wall Street Journal, Jo Craven McGinty examines the pros and cons of Daylight Saving Time. The US government, of course, says it saves electricity by its own measurement:
The historic reason for observing daylight-saving time — which ends at 2 a.m. on Sunday when clocks revert to standard time — is to conserve energy, by pushing sunlight forward into the evening, reducing the need for electric lights.
The U.S. government has found the strategy works. But two academic studies published in peer-reviewed journals rebut the idea, and one even concludes the policy increases demand for electricity.
The most recent government study, by the Department of Energy, tested whether expanding daylight-saving time by four weeks in 2007 reduced the use of electricity, as intended.
The study examined the additional weeks of daylight-saving time using data provided by 67 utilities accounting for two-thirds of U.S. electricity consumption. It compared average daily use in 2006, when there was no daylight saving, with the same period in 2007 when the extension took effect and found a reduction in electricity use of 0.5% in the spring and 0.38% in the fall.
However, non-government studies don’t agree. One, which examined electricity use in Indiana, found the opposite:
The study, which was published in the Review of Economics and Statistics, examined residential data only, but the researchers didn’t believe commercial use would alter their findings.
“Big-box stores don’t turn on or off lights based on whether it’s light outside or dark,” Mr. Kotchen said. “In a commercial building, the lights are on when people are working no matter what.”
Rather than conserving electricity, the study found that daylight-saving time increased demand for electricity. Conditions may vary in other parts of the country, but the study concluded that Indiana is representative of much of the country.
That doesn’t mean daylight-saving time has never worked since its introduction during World War I. But, said Mr. Kotchen, “the world has changed. Lighting is a small amount of energy and electricity use in households. The big things are heating and cooling, particularly as air conditioning has become more prevalent. We’re fooling ourselves to continue calling it an energy policy given the studies that show it doesn’t save energy.”
H/T to Terence Corcoran for the link.
October 9, 2014
Markus Krajewski writes about the formation of a multinational industrial cartel shortly after the First World War that helped create the very concept of “planned obsolescence” for (no) fun and (their) profit:
On 23 December 1924, a group of leading international businessmen gathered in Geneva for a meeting that would alter the world for decades to come. Present were top representatives from all the major lightbulb manufacturers, including Germany’s Osram, the Netherlands’ Philips, France’s Compagnie des Lampes, and the United States’ General Electric. As revelers hung Christmas lights elsewhere in the city, the group founded the Phoebus cartel, a supervisory body that would carve up the worldwide incandescent lightbulb market, with each national and regional zone assigned its own manufacturers and production quotas. It was the first cartel in history to enjoy a truly global reach.
The cartel’s grip on the lightbulb market lasted only into the 1930s. Its far more enduring legacy was to engineer a shorter life span for the incandescent lightbulb. By early 1925, this became codified at 1,000 hours for a pear-shaped household bulb, a marked reduction from the 1,500 to 2,000 hours that had previously been common. Cartel members rationalized this approach as a trade-off: Their lightbulbs were of a higher quality, more efficient, and brighter burning than other bulbs. They also cost a lot more. Indeed, all evidence points to the cartel’s being motivated by profits and increased sales, not by what was best for the consumer. In carefully crafting a lightbulb with a relatively short life span, the cartel thus hatched the industrial strategy now known as planned obsolescence.
How exactly did the cartel pull off this engineering feat? It wasn’t just a matter of making an inferior or sloppy product; anybody could have done that. But to create one that reliably failed after an agreed-upon 1,000 hours took some doing over a number of years. The household lightbulb in 1924 was already technologically sophisticated: The light yield was considerable; the burning time was easily 2,500 hours or more. By striving for something less, the cartel would systematically reverse decades of progress.
The details of this effort have been very slow to emerge. Some facts came to light in the 1940s, when the U.S. government investigated GE and a number of its business partners for anticompetitive practices. Others were uncovered more recently, when I and the German journalist Helmut Höge delved into the corporate archives of Osram in Berlin. Jointly founded in 1920 by three German companies, Osram remains one of the world’s leading makers of all kinds of lighting, including state-of-the-art LEDs. In the archives, we found meticulous correspondence between the cartel’s factories and laboratories, which were researching how to modify the filament and other measures to shorten the life span of their bulbs.
The cartel took its business of shortening the lifetime of bulbs every bit as seriously as earlier researchers had approached their job of lengthening it. Each factory bound by the cartel agreement—and there were hundreds, including GE’s numerous licensees throughout the world—had to regularly send samples of its bulbs to a central testing laboratory in Switzerland. There, the bulbs were thoroughly vetted against cartel standards. If any factory submitted bulbs lasting longer or shorter than the regulated life span for its type, the factory was obliged to pay a fine.
Companies were also fined for exceeding their sales quotas, which were constantly being adjusted. In 1927, for example, Tokyo Electric noted in a memo to the cartel that after shortening the lives of its vacuum and gas-filled lightbulbs, sales had jumped fivefold. “But if the increase in our business resulting from such endeavors directly mean[s] a heavy penalty, it must be a thing out of reason and shall quite discourage us,” the memo stated.
The great Adam Smith, of course, saw this coming in 1776: “People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices.” Some things never change.
September 28, 2014
At The Diplomat, Mohamed Zeeshan talks about India’s self-imposed disadvantages in manufacturing both for domestic and export consumption:
Indian Prime Minister Narendra Modi’s maiden Independence Day speech was laced with inspiring rhetoric. But of the many things he said, the one slogan that inevitably caught public attention was this: “Come, make in India!” With those words, Modi was trying to make the case for turning India into the world’s next great manufacturing hub. Understandably, the Indian populace was thrilled.
India is one of the world’s ten largest economies (and is third largest on a purchasing power parity basis), with a total annual output of nearly $2 trillion. As much as 57 percent of this output is produced by a service sector that employs just 28 percent of the population, largely concentrated in urban parts of the country. That is no surprise, because most Indians lack the skills and education to join the more knowledge-intensive service sector. What they need is what successful developing nations all over the world have had ever since the Industrial Revolution: a robust and productive manufacturing sector.
Yet India’s manufacturing sector contributes just 16 percent to the total GDP pie (China’s, by contrast, accounts for almost half of its total economic output). Victor Mallet, writing in the McKinsey book Reimagining India, recently offered an anecdote that was illuminating. “One of India’s largest carmakers recently boasted that it was selling more vehicles than ever and that it was hiring an extra eight hundred workers for its factory,” he wrote, “But the plant employing those workers belongs to the Jaguar Land Rover subsidiary of Tata Motors and is in the English Midlands, not in job-hungry India.”
Mallet goes on to make a point that has been made frequently by Indian economists: The world doesn’t want to “make in India,” because it is simply too painful. There’s bureaucratic red tape, a difficult land acquisition act, troublesome environmental legislation, a shortage of electricity, and a lack of water resources. The only thing India doesn’t seem to lack is labor, but that merely adds to the problem. As Mallet points out in the same essay, aptly titled “Demographic dividend – or disaster?”, “India’s population grew by 181 million in the decade to 2011 – and (despite falling fertility rates) a rise of nearly 50% in the total number of inhabitants is unavoidable.” But the number of jobs being added to feed that population is inadequate.
However, the labor dividend is still important. India doesn’t need to reduce the number of hands on deck. It needs to weed out the challenges that stop them from being productive.
January 1, 2014
In Maclean’s, Kate Lunau recounts the history of the venerable incandescent light bulb as new regulations kick in today to phase them out of use in Canada:
The incandescent light bulb was born on Jan. 27, 1880, when U.S. inventor Thomas Edison famously patented his “electric lamp.” Others had paved the way, including Canadians Henry Woodward and Mathew Evans, whose 1874 light bulb patent was bought by Edison. But it was Edison who perfected and would commercialize the technology.
The light bulb — in which an electric current passes through a filament that heats up and glows inside a glass bulb — yanked North America into the electric age. Before then, “all street lamps were gas,” says Anna Adamek, who curates the energy collection at the Canada Science and Technology Museum, which includes about 2,000 light bulbs. “Wealthy people could afford gas lamps for interior lighting, but most would use kerosene, oil, or candles.” In 1882, the Canada Cotton Co., in Cornwall, Ont., became the first Canadian company to install electric lights. “Edison personally supervised the installation,” she says. In 1884, the lights went on in the Parliament buildings and, by 1905, the lighting of Canadian cities was well under way. Electric light changed the way people spent their evenings, and the way businesses operated — allowing people to work around the clock. Once electric wiring was installed, manufacturers were spurred to make all sorts of new gadgets and appliances for the home, from electric irons to refrigerators.
As the ban approached, many fretted over the cost of replacing their household lights with CFLs and LEDs, as well as the small amount of mercury inside fluorescents — not to mention the loss of pleasant-coloured lighting at home. Traditionalists have responded by stockpiling their beloved bulbs. In the U.K., the Daily Mail carried a story of a 62-year-old pensioner, who hoarded enough to see her “into the grave.” Riffing on the old joke, Freedom Light Bulb, a U.S. blog, asked: “How many politicians or bureaucrats should it take to change a light bulb?” The answer: “None.” On Jan. 1, 2014, Canada’s new regulations will be phased in. Stores will sell through existing inventory; not long after, that warm familiar glow will be gone for good.
September 21, 2013
Robert Bryce explains why — no matter how much we might want it to be so — alternate forms of energy like wind and solar power cannot cover our demands:
That 32 percent increase in global carbon dioxide emissions reflects the central tension in any discussion about cutting the use of coal, oil and natural gas: Developing countries — in particular, fast-growing economies such as Vietnam, China and India — simply cannot continue to grow if they limit the use of hydrocarbons. Those countries’ refusal to enact carbon taxes or other restrictions illustrates what Roger Pielke Jr., a professor of environmental studies at the University of Colorado, calls the “iron law of climate policy”: Whenever policies “focused on economic growth confront policies focused on emissions reduction, it is economic growth that will win out every time.”
Over the past 10 years, despite great public concern, carbon dioxide emissions have soared because some 2.6 billion people still live in dire energy poverty. More than 1.3 billion have no access to electricity at all.
Now to the second number: 1. That’s the power density of wind in watts per square meter. Power density is a measure of the energy flow that can be harnessed from a given area, volume or mass. Six different analyses of wind (one of them is my own) have all arrived at that same measurement.
Wind energy’s paltry power density means that enormous tracts of land must be set aside to make it viable. And that has spawned a backlash from rural and suburban landowners who don’t want 500-foot wind turbines near their homes. To cite just one recent example, in late July, some 2,000 protesters marched against the installation of more than 1,000 wind turbines in Ireland’s Midlands Region.
Consider how much land it would take for wind energy to replace the power the U.S. now gets from coal. In 2011, the U.S. had more than 300 billion watts of coal-fired capacity. Replacing that with wind would require placing turbines over about 116,000 square miles, an area about the size of Italy. And because of the noise wind turbines make — a problem that has been experienced from Australia to Ontario — no one could live there.
In 2012, the contribution from wind, solar, geothermal and biomass combined amounted to about 4.8 million barrels of oil equivalent per day, or roughly one-half of a Saudi Arabia. Put another way, we get about 50 times as much energy from all other sources — coal, oil, natural gas, nuclear and hydropower — as we do from those renewables.
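The land-area figure quoted a couple of paragraphs up follows directly from the ~1 watt-per-square-metre power density cited earlier. A minimal sketch of the arithmetic (the square-mile conversion factor is mine, not from the article):

```python
# Back-of-envelope check of the claim that replacing US coal-fired
# capacity with wind would need an area roughly the size of Italy.
coal_capacity_watts = 300e9        # 300 billion watts of US coal capacity (2011)
wind_power_density = 1.0           # watts per square metre, the figure Bryce cites
area_m2 = coal_capacity_watts / wind_power_density
area_sq_miles = area_m2 / 2.59e6   # ~2.59 million square metres per square mile
print(round(area_sq_miles))        # roughly 116,000 square miles
```

Under those assumptions the numbers check out: about 116,000 square miles, matching the article’s figure.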
July 13, 2013
The official US inflation rate is around 1% annually. That doesn’t seem quite right to a lot of people who seem to be spending more money for the same goods:
… what Bernanke will never admit is that the official inflation rate is a total sham. The way that inflation is calculated has changed more than 20 times since 1978, and each time it has been changed the goal has been to make it appear to be lower than it actually is.
If the rate of inflation was still calculated the way that it was back in 1980, it would be about 8 percent right now and everyone would be screaming about the fact that inflation is way too high.
But instead, Bernanke can get away with claiming that inflation is “too low” because the official government numbers back him up.
Of course many of us already know that inflation is out of control without even looking at any numbers. We are spending a lot more on the things that we buy on a regular basis than we used to.
For example, when Barack Obama first entered the White House, the average price of a gallon of gasoline was $1.84. Today, the average price of a gallon of gasoline has nearly doubled. It is currently sitting at $3.49, but when I filled up my vehicle yesterday I paid nearly $4.00 a gallon.
And of course the price of gasoline influences the price of almost every product in the entire country, since almost everything that we buy has to be transported in some manner.
But that is just one example.
Our monthly bills also seem to keep growing at a very brisk pace.
Electricity bills in the United States have risen faster than the overall rate of inflation for five years in a row, and according to USA Today water bills have actually tripled over the past 12 years in some areas of the country.
No inflation there, eh?
July 5, 2013
A short video of Kirk Sorensen taking us through the benefits of Liquid Fluoride Thorium Reactors, a revolutionary liquid-fuelled reactor that runs not on uranium but on thorium. These work and have been built before; search for either LFTRs or Molten Salt Reactors (MSRs).
The main downsides to this technology are politics, corrosion, and fear of nuclear radiation. Liquid Fluoride Thorium Reactors were created 50 years ago by an American chap named Alvin Weinberg, but the American government realised you can’t weaponise the by-products, and so it wasn’t interested.
Another point: yes, it WAS corrosive, but those reactor tests were 50 years ago, and our materials technology has definitely improved since then, so the leap to building this reactor shouldn’t be too hard.
And nuclear fear is extremely common in the average person, rather irrational though it may be: more people have died from fossil fuels, and even from hydroelectric power, than from nuclear power. I added this video for a project regarding Liquid Fluoride Thorium Reactors; watch and enjoy.
No, it would not collapse the economy… just like the use of uranium reactors didn’t, and neither did coal. This is because you wouldn’t have an instant transition from coal, oil and everything else to thorium; we couldn’t do that, simply due to the engineering. Give it 50 years and we might be using thorium instead of coal and oil (too late in terms of global warming, but that’s another debate completely), but we certainly won’t destroy the earth’s economy. Duh.
And yes, he said we’d never run out. Not strictly true… bloody skeptics… LFTRs can harness 3.5 million kWh per kg of thorium! That’s 70 times greater than uranium, and 10,000 times greater than oil… and there are over 2.6 million tonnes of it on earth… Anyone with a calculator, or a brain, will understand that is a lot of energy!!
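Speaking of calculators: here is a quick sketch using the figures quoted above. The world-consumption number is my own round assumption (roughly 150,000 TWh of primary energy per year), not from the video, but it shows why “never run out” is indeed not strictly true:

```python
# Back-of-envelope total for the thorium figures quoted above.
kwh_per_kg = 3.5e6                  # 3.5 million kWh per kg of thorium
thorium_tonnes = 2.6e6              # estimated tonnes of thorium on earth
total_kwh = kwh_per_kg * thorium_tonnes * 1000   # 1000 kg per tonne
total_twh = total_kwh / 1e9         # 1 TWh = one billion kWh

world_twh_per_year = 150_000        # rough world primary energy use (my assumption)
print(total_twh)                        # about 9.1 million TWh in total
print(total_twh / world_twh_per_year)   # on the order of 60 years' worth
```

A lot of energy, certainly, but finite: at current world consumption it works out to decades, not millennia, under these assumptions.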
H/T to Rob Fisher for the link.