August 3, 2015
Why Do Governments Enact Price Controls?
Published on 25 Feb 2015
If price controls have negative consequences, why do governments enact them? Let’s revisit our example of President Nixon’s wage and price controls in the 1970s. These price controls were popular, as is demonstrated by Nixon being re-elected after they went into effect. The public didn’t think that the price controls were to blame for things such as long lines at the fuel pump. Without knowledge of the economics behind price controls, the public blamed foreign oil cartels and oil companies for the shortages.
In this video we’ll also address questions such as: do price controls — like rent controlled apartments and the minimum wage — help the poor? Are there better ways to help the poor? If so, what are they? Let’s find out.
July 22, 2015
Price Floors: Airline Fares
Published on 25 Feb 2015
In this video, we cover how price floors lead to wasteful increases in quality and a misallocation of resources. Using the real-world example of airline regulations from 1938-1978, we show how price floors can be used to restrict entry and reduce competition within an industry. When the Civil Aeronautics Board regulated airline fares, airlines couldn’t compete on price so they instead had to compete by increasing quality. This may sound like a good thing, but we’ll show how this actually created quality waste since the cost of that quality was higher than the value to the customers. Price floors also lead to the misallocation of resources by preventing competition and responsiveness to consumer demand. In this video, we’ll show you how consumers are negatively affected by price floors.
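To put rough numbers on the quality-waste argument, here is a minimal sketch in Python; the fares and dollar figures are invented for illustration and are not taken from the video or from CAB-era data:

```python
# Illustrative sketch of "quality waste" under a price floor. All numbers are
# invented for the example, not drawn from the video or from CAB-era data.
competitive_fare = 200.0   # fare that open price competition would settle at
regulated_fare   = 300.0   # CAB-style price floor; airlines may not undercut it

# Unable to cut the fare, airlines spend the premium per seat on extras instead.
cost_of_extras_to_airline    = 100.0  # meals, lounges, more frequent half-empty flights
value_of_extras_to_passenger = 40.0   # what those extras are actually worth to a flyer

quality_waste_per_ticket = cost_of_extras_to_airline - value_of_extras_to_passenger
print(f"Premium over the competitive fare: ${regulated_fare - competitive_fare:.0f}")
print(f"Value destroyed per ticket:        ${quality_waste_per_ticket:.0f}")
# $100 of real resources buys only $40 of passenger value: $60 per seat is pure waste.
```

The numbers are arbitrary, but the mechanism is the one described above: when the fare cannot fall, competition keeps pushing costs up until the regulated premium is dissipated on quality that customers value at less than it costs to provide.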
July 10, 2015
A new and exciting (if you’re a lawyer) aspect of photography
As a casual photographer, I think very little about taking a photo of a building or landscape visible from the sidewalk or other public place. This casual attitude may become a relic of the past if EU regulators have their way, as Brian Micklethwait explains:
Basically, some EU-ers are talking about making it illegal to profit without permission by taking a photo, in public, of a publicly visible building or work of art, and then posting it on any “profitable” blog or website. The nasty small print being to the effect that the definition of “profitable” is very inclusive. For the time being, it would exclude my personal blog, because my blog has no income of any kind. But does Samizdata get any cash, however dribblesome, from any adverts, “sponsorships”, and so forth? If so, then me placing the above photo of the Shard at Samizdata might, any year now, become illegal, unless Samizdata has filled in a thousand forms begging the owners of the Shard, and for that matter of all the buildings that surround it, to allow this otherwise terrible violation of their property rights, or something.
“Might” because you never really know with the EU. At present this restriction applies in parts of the EU. It seems that a rather careless MEP tried to harmonise things by making the whole of the EU as relaxed about this sort of thing as parts of it are now, parts that now include the UK. But, the EU being the EU, other EU-ers immediately responded by saying, no, the way to harmonise things is to make the entire EU more restrictive. Now the MEP who kicked all this off is fighting a defensive battle against the very restriction she provoked. Or, she is grandstanding about nothing, which is very possible.
Being pessimistic about all this, what if the restriction does spread? And how long, then, before the definition of “for profit” is expanded to include everything you do, because if it wasn’t profitable for you, why would you do it? At that point, even my little hobby blog would be in the cross hairs, if I ever dared to take and post further pictures of London’s big buildings.
Some better news for me is that if this scheme proceeds as far as it eventually might, my enormous archive of photographs of people taking photographs will maybe acquire a particular poignancy. It will become a record of a moment in social history, which arrived rather suddenly, and then vanished. Like smoking in public.
July 1, 2015
QotD: The CRTC, Canada’s most fascistic government body
The CRTC is an even more odious organization. Back in the 1920s both the Canadian and American governments declared the broadcast spectrum to be public property. So a technology pioneered and commercialized by the private sector, in both countries, was essentially nationalized by the state. Since it was a new industry it lacked the ability to effectively lobby Washington and Ottawa. The result has been that a large and important sector of our modern economy now lives and dies at the whim of an unelected government agency: The CRTC.
Of all the organs of Canadian government the CRTC has always struck me as the most fascistic. You could rationalize socialized health care, public education and government financed infrastructure as doing useful things in a terribly statist way. The CRTC is an exercise in make-work at best. At worst it’s an attempt to impose indirect censorship on the Canadian people. Beneath the reams of government drafted euphemisms the blunt truth behind the CRTC is that we mere Canadians are not clever enough, not patriotic enough or sufficiently sensible to watch and listen to the right things in the right way.
The existence of the CRTC explains much of the timorousness of Canadian broadcasting. The Americans did away with the Fairness Doctrine in 1987, thereby triggering the explosion in talk radio in the early 1990s. While Canada never had an exact equivalent, the regulations surrounding who could and could not receive or retain a license were sufficiently vague to make such a rule unnecessary. A nod and a wink from the right people at the right time was enough to indicate what type of broadcasting would or would not be acceptable.
The result was an insufferable group think that could no more be defined than challenged. There were unwritten rules of etiquette that forbade serious discussion from taking place on a whole host of issues: Abortion, capital punishment, race relations, linguistic issues and any frank discussions of our socialized health care system. It wasn’t that these discussions didn’t take place in a public forum, the newspapers and magazines were largely unregulated, but broadcasting was the late twentieth century’s pre-eminent mass media. It’s where ordinary people got their news and opinions.
Richard Anderson, “And All Must Have Prizes”, The Gods of the Copybook Headings, 2014-09-24.
June 30, 2015
Extending the ADA to the web
Amy Alkon discusses why the notion of expanding the Americans with Disabilities Act to cover the internet would be a terrible idea:
So few people understand how laws passed can be used — and easily misused. Stretched into something they were never supposed to be (or not what they were said to be about, anyway).
For example, Title IX was supposed to be about allowing girls equal participation in school sports. The Obama admin has turned it into a system of campus kangaroo courts removing due process from men accused of sexual assault.
Next in line for strrretching is the Americans with Disabilities Act.
[…]
Bader gives some examples from Walter Olson, from his testimony to Congress, of awful changes that would ensue, like that amateur publishing would become “more of a legal hazard.” They’d go after websites like mine, that make a few shekels from Amazon links and a few more from Google ads. I need this money to supplement the money that’s fallen out of newspaper writing; also, I love the people who comment here and the discussion that goes on. It’s what keeps my eyes pried open at 11 p.m. when I need to post a blog item half an hour after I should have gone to bed for my 5 a.m. book- and column-writing wakeup time.
Also, added in the morning, after waking up worrying about this all night — making something “accessible” for a tiny minority could ruin it for everyone.
And what sort of understanding do we really owe people? I don’t do well with complex physics and I have limited attention for things I don’t understand that don’t grab my interest enough to figure them out. Should physics websites dumb themselves down for Amy Alkon’s brain? How many scientific websites will be brought down by disabled people going around to them like the quadriplegic lawyer in the wheelchair filing profit-making suits and closing classic hamburger stands and other businesses in California over ADA claims?
June 29, 2015
Price Ceilings: Shortages and Quality Reduction
Published on 25 Feb 2015
Price ceilings result in five major unintended consequences, and in this video we cover two of them. Using the supply and demand curve, we show how price ceilings lead to a shortage of goods and to low quality goods. Prices are signals that indicate to suppliers how much is being demanded, but when prices are kept artificially low with price ceilings, suppliers have no way of knowing how many goods they should produce and sell, leading to a shortage of goods. Quality also decreases under price controls. Do you ever wonder why the quality of customer service at Starbucks is generally better than at the DMV? The answer lies in incentives and price ceilings. We’ll discuss further in this video.
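For readers who want to see the shortage mechanism in numbers rather than on a supply and demand diagram, here is a toy linear model; the curves and the ceiling level are made up for illustration and do not come from the video:

```python
# Toy linear supply/demand model showing how a price ceiling creates a shortage.
# All numbers are illustrative.

def quantity_demanded(price):
    return max(0.0, 100.0 - 2.0 * price)   # demand falls as price rises

def quantity_supplied(price):
    return max(0.0, -20.0 + 4.0 * price)   # supply rises with price

# Market-clearing price: 100 - 2p = -20 + 4p  ->  p = 20, q = 60
equilibrium_price = 20.0
print("At equilibrium:", quantity_demanded(equilibrium_price), "demanded,",
      quantity_supplied(equilibrium_price), "supplied")

ceiling = 15.0   # legal maximum price, set below the equilibrium
shortage = quantity_demanded(ceiling) - quantity_supplied(ceiling)
print(f"At the ceiling of {ceiling}: {quantity_demanded(ceiling)} demanded, "
      f"{quantity_supplied(ceiling)} supplied, shortage of {shortage}")
```

Push the ceiling further below the market-clearing price and the gap between quantity demanded and quantity supplied only widens, which is exactly the shortage the video describes.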
More on the “self-driving truck” issue
In the comments to this post, Tom Kelley provided a worthwhile digression on the topic that I felt deserved a wider audience, so with his permission, here’s Tom’s response:
Given that the trucking industry has been my sandbox for quite some time, I can safely extend Megan’s prognosis to also include the low long-term risk of job losses due to self-driving vehicles.
Frankly, I have to be wary of any “expert” who can’t even get the name of his source (the American Trucking Associations — yes, plural — not the American Trucker Association) transcribed correctly.
Apart from the myriad technical issues standing in the way of driverless trucks, the insurmountable barrier is anti-competitive trucking regulations passed on behalf of the government’s favorite white elephant, the rail industry. Invariably, these regulations are tarted up under some guise of safety (Let’s see, was it a truck or a train that blew the town of Lac-Mégantic off the map??? Hmm).
The bottom line is that any change that would have the slightest possibility of making trucking more productive is quickly met with massive dis-information campaigns, and even more massive lobbying from the rail industry. Even the most minor dimensional changes designed to reflect the current realities of truck freight transportation stand little if any chance of making it past regulators with a permanent disdain for free enterprise.
We can’t have electronically actuated brakes on trucks because the regulators have no grasp of brakes or electronics, and somebody wants to replace the driver with electronics? Seriously? Of course these same folks seem to have no problem flying cross-country at 500 MPH in a commercial jetliner that is literally flown by wire.
And even if the government types were perfect actors in this little tale, then you have the American tort law system, run/regulated by, for, and about the trial lawyers. Even with professional truck drivers who can deftly avoid putting incompetent car drivers on their way to a Darwin award, hundreds of four-wheeler drivers still manage to commit suicide-by-truck every year, followed quickly by their otherwise destitute estates suing innocent trucking companies for millions.
Can’t you just hear the jury summation now: “The eeevvilll trucking company wanted to save a few pennies by outsourcing the driver’s job to a microchip! They must be punished! My client, a fourth cousin of the homeless man who jumped off a bridge in front of a truck MUST be awarded $10 million for the pain and suffering from losing a relative he never met. No justice, no peace!”
No insurance company in their right mind would insure a driverless truck for real-world operation.
There’s no question that the technology is available to make the concept work; I was on board numerous autonomous vehicles of all sizes back in 1997.
It will take several major societal shifts before any serious degree of autonomy makes it into real world trucking operations.
June 27, 2015
QotD: The corporate tax game
You can think of corporate taxation as a sort of long chess match: The government makes a move. Corporations move in response — sometimes literally, to another country where the tax burden is less onerous. This upsets the government greatly, and the Barack Obama administration in particular. Treasury Secretary Jack Lew has written a letter to Congress, urging it to make it stop by passing rules that make it harder to execute these “inversions.”
I’ve got a better idea: What if we made our tax system so attractive to corporations that they would have no interest in moving themselves abroad?
The problem with this extended chess game is that every move is very costly. First, it adds to the complexity of the tax code. With every new rule — no matter how earnestly said rule attempts to close a “loophole” — it becomes harder to know whether you are in compliance with the law. This is true on both sides; corporate tax law has now passed well beyond the point where it is possible for a single expert to be familiar with its ins and outs. This makes it harder to plan business expansions, harder to forecast government revenue, and it requires both sides to hire more experts in order to determine whether corporations are compliant. It also means more lawsuits, and longer ones, as both sides wrangle over how this morass of laws should be applied to real-world situations.
You can think of it this way: Every new law has possible intersections with every other tax law in existence. As the number of laws grows, the number of possible intersections grows even faster. And each of those intersections represents both a possible way to avoid taxes and a potential for unintended consequences that inadvertently outlaw something Congress never intended to touch. This growing complexity makes it more and more difficult for either companies or lawmakers to forecast the ultimate effects of new tax laws.
Megan McArdle, “We Don’t Need a Corporate Income Tax”, Bloomberg View, 2014-07-16.
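To put rough numbers on McArdle’s point about intersections: with n rules on the books, the pairwise interactions alone number n(n-1)/2, so doubling the rulebook roughly quadruples the combinations a tax department has to reason about. A quick sketch (the rule counts are arbitrary):

```python
# Pairwise interactions among n tax rules grow as n*(n-1)/2, i.e. roughly with
# the square of n, so each new rule has more existing rules to collide with
# than the rule that came before it.
for n in (100, 200, 400, 800):
    print(f"{n:4d} rules -> {n * (n - 1) // 2:7d} possible pairwise intersections")
```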
June 24, 2015
QotD: Surge pricing
New York just killed every economist’s favorite thing about Uber: surge pricing. Sure, many economists also love convenient car service at the touch of a button. But black-car services have been around for a long time. Explicit surge pricing — which both creates new supply and rations demand — has not, but it’s long been a core feature of Uber Technologies Inc.’s business model. While it can be annoying at times (during a recent rainstorm, I noticed a sudden epidemic of drivers canceling rides, which I suspect was due to the rapidly rising surge price), it also allows you to be sure that you will be able to get a taxi on New Year’s Eve or during a rainstorm as long as you’re willing to pay extra.
Sadly, no one else loves surge pricing as much as economists do. Instead of getting all excited about the subtle, elegant machinery of price discovery, people get all outraged about “price gouging.” No matter how earnestly economists and their fellow travelers explain that this is irrational madness — that price gouging actually makes everyone better off by ensuring greater supply and allocating the supply to (approximately) those with the greatest demand — the rest of the country continues to view marking up generators after a hurricane, or similar maneuvers, as a pretty serious moral crime.
Megan McArdle, “Uber Makes Economists Sad”, Bloomberg View, 2014-07-09.
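A crude way to see the “creates new supply and rations demand” point is to treat the surge multiplier as a knob that gets turned up until ride requests and available drivers meet. The demand and supply functions below are invented for illustration, not anything Uber publishes:

```python
# Toy model: a rainstorm spikes ride requests; a surge multiplier both draws
# more drivers onto the road and trims marginal requests until the two match.
# All parameters are made up for illustration.

def rides_requested(multiplier, base_demand=1000.0):
    return base_demand / multiplier          # higher prices ration demand

def drivers_available(multiplier, base_supply=500.0):
    return base_supply * multiplier          # higher payouts pull in more drivers

multiplier = 1.0
while rides_requested(multiplier) > drivers_available(multiplier):
    multiplier += 0.1                        # raise the surge factor a notch

print(f"Market clears near a {multiplier:.1f}x surge: "
      f"{rides_requested(multiplier):.0f} requests, "
      f"{drivers_available(multiplier):.0f} drivers")
```

Cap the multiplier at 1.0 and the two curves never meet: requests permanently exceed drivers, which is the New Year’s Eve taxi shortage in miniature.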
June 18, 2015
Nutrition … what we thought we knew is wrong, again
At Real Clear Science, Ross Pomeroy explains how historical “expert knowledge” and government cheerleading pointed in exactly the opposite direction of today’s experts and government regulators:
For decades, the federal government has been advising Americans on what to eat. Those recommendations have been subject to the shifting sands of dietary science. And have those sands ever been shifting. At first, fat and cholesterol were vilified, while sugar was mostly let off the hook. Now, fat is fine (saturated fat is still evil, though), cholesterol is back, and sugar is the new bogeyman.
Why the sizable shift? The answer may be “bad science.”
Every five years, the Dietary Guidelines Advisory Committee, composed of nutrition and health experts from around the country, convenes to review the latest scientific and medical literature. From their learned dissection, they form the dietary guidelines.
But according to a new editorial published in Mayo Clinic Proceedings, much of the science they review is fundamentally flawed. Unlike experiments in the hard sciences of chemistry, physics, and biology, which rely on direct observational evidence, most diet studies are based on self-reported data. Study subjects are examined for height, weight, and health, then are questioned about what they eat. Their dietary choices are subsequently linked to health outcomes — cancer, mortality, heart disease, etc.
That’s a poor way of doing science, says Edward Archer, a research fellow with the Nutrition Obesity Research Center at the University of Alabama, and lead author of the report.
“The assumption that human memory can provide accurate or precise reproductions of past ingestive behavior is indisputably false,” he and his co-authors write.
Two of the largest studies on nutritional intake in the United States, the CDC’s NHANES and “What We Eat,” are based on asking subjects to recall precisely what and how much they usually eat.
But despite all of the steps that NHANES examiners take to aid recall, such as limiting the recall period to the previous 24 hours and even offering subjects measuring guides to help them report accurate data, the information received is wildly inaccurate. An analysis conducted by Archer in 2013 found that most of the 60,000+ NHANES subjects report eating a lower amount of calories than they would physiologically need to survive, let alone to put on all the weight that Americans have gained in the past few decades.
May 16, 2015
Charles Murray and Jonah Goldberg on civil disobedience in America
Published on 11 May 2015
The American ideal of limited government is on life support. Is it time for civil disobedience? Charles Murray says yes. Murray has been writing on government overreach for more than 30 years. His new book, By The People, is a blueprint for taking back American liberty. Jonah Goldberg sits down with Murray to discuss civil unrest in Baltimore, the scope of the government, and why bureaucrats should wear body cameras.
According to AEI scholar, acclaimed social scientist, and bestselling author Charles Murray, American liberty is under assault. The federal government has unilaterally decided that it can and should tell us how to live our lives. If we object, it threatens, “Fight this, and we’ll ruin you.” How can we overcome regulatory tyranny and live free once again? In his new book, By the People: Rebuilding Liberty Without Permission (Crown Forum, May 2015), Murray offers provocative solutions.
QotD: The true nature of government
The governments of these United States, from the federal to the local level, have managed to insinuate themselves between citizens and their property at every point of significance. In that, our governments are very much like most other governments, liberal and illiberal, democratic and undemocratic. We have allowed ourselves to be in effect converted from a nation of owners to a nation of renters. But while medieval serfs had only the one landlord, we have a rogue’s gallery of them: the local school board, the criminals at the IRS, the vehicle-registry office, etc. Never-ending property taxes ensure that as a matter of economic function, you never really own your house — you rent it from the government. Vehicle registration fees and, in some jurisdictions, outright taxes on automobile ownership ensure in precisely the same way that you never really own your car: You rent it from the government. Stock portfolio? Held at the sufferance of politicians. A profitable business? You’ll keep what income they decide you can keep. Your own body? Not yours — not if you use it for profitable labor.
A Who down in Whoville? You should be so lucky: Welcome to Whomville, peon.
Kevin D. Williamson, “Property and Peace”, National Review, 2014-07-20.
May 15, 2015
This is why California’s water shortage is really a lack of accurate pricing
David Henderson explains:
Of the 80 million acre feet a year of water use in California, only 2.8 million acre feet are used for toilets, showers, faucets, etc. That’s only 3.5 percent of all water used.
One crop, alfalfa, by contrast, uses 5.3 million acre feet. Assuming a linear relationship between the amount of water used to grow alfalfa and the amount of alfalfa grown, if we cut the amount of alfalfa by only 10 percent, that would free up 0.53 million acre feet of water, which means we wouldn’t need to cut our use by the approximately 20 percent that Jerry Brown wants us to.
What is the market value of the alfalfa crop? Alexander quotes a study putting it at $860 million per year. So, assuming, for simplicity, a horizontal demand curve for alfalfa, a cut of 10% would reduce alfalfa revenue by $86 million. (With a more-realistic downward-sloping demand for alfalfa, alfalfa farmers would lose less revenue but consumers would pay more.) With a California population of about 38 million, each person could pay $2.26 to alfalfa growers not to grow that 10%. Given that the alfalfa growers use other resources besides water, they would be much better off taking the payment.
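Henderson’s arithmetic is easy to reproduce; the figures below are the ones quoted above:

```python
# Reproducing the back-of-the-envelope numbers quoted above.
total_water_use_maf = 80.0    # million acre-feet per year, statewide
residential_use_maf = 2.8     # toilets, showers, faucets, etc.
alfalfa_use_maf     = 5.3     # water used to grow alfalfa
alfalfa_revenue     = 860e6   # market value of the alfalfa crop, dollars per year
population          = 38e6    # approximate California population

print(f"Residential share of water use: {residential_use_maf / total_water_use_maf:.1%}")

cut = 0.10                                 # cut alfalfa production by 10%
water_freed_maf = cut * alfalfa_use_maf    # assumes water use scales linearly with the crop
revenue_lost    = cut * alfalfa_revenue    # assumes a horizontal demand curve, as above
print(f"Water freed: {water_freed_maf:.2f} million acre-feet")
print(f"Cost per Californian to compensate growers: ${revenue_lost / population:.2f}")
```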
May 13, 2015
Why are railroads dragging their feet over more efficient braking systems?
Fred Frailey discusses the U.S. Department of Transportation mandate that all crude oil trains longer than 69 cars must be equipped with electronic brakes by 2021, or the speed of oil trains will be restricted to 30 MPH at all times. The current standard braking system for railroads in North America is pneumatic, which has worked well for decades but has inherent problems as modern trains have gotten longer and heavier. One of the biggest problems is that pneumatic brakes have a relatively long activation time — when the engineer operates the brake in the lead locomotive, it takes quite some time for that to propagate all the way through the train. This creates situations which can cause derailments as the lead cars begin to slow down while the rest of the train is still travelling at full speed.
The preferred replacements are called electronically controlled pneumatic brakes (ECP), where instead of the brakes operating by pressure changes in the air line, the brakes would be controlled by a separate electronic circuit that would allow simultaneous brake application in all cars in the train.
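To get a feel for why that propagation delay matters, here is a rough sketch; the train length, signal propagation speed, and running speed are illustrative assumptions of mine, not figures from Frailey’s article:

```python
# Rough sketch of the lag between the engineer's brake application and the last
# car responding. All numbers below are illustrative assumptions.

car_length_ft     = 60.0    # roughly typical freight car length
cars              = 100     # a long unit train
signal_speed_ftps = 500.0   # assumed speed of the pressure drop along the brake pipe
train_speed_ftps  = 73.3    # about 50 mph

train_length_ft   = cars * car_length_ft
pneumatic_delay_s = train_length_ft / signal_speed_ftps
print(f"Pneumatic: last car starts braking about {pneumatic_delay_s:.0f} s after the head end,")
print(f"having covered roughly {pneumatic_delay_s * train_speed_ftps:.0f} ft at full speed.")

# With ECP, the brake command is an electrical signal that reaches every car at
# effectively the same instant, so the whole train brakes together and the
# run-in of slack from still-unbraked rear cars largely disappears.
```

The exact numbers do not matter much; the point is that with a pneumatic system the rear of a long train keeps shoving forward for several seconds after the front has begun to slow, and ECP removes that lag.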
It seems electronic braking has no friends in the railroad industry. I find this puzzling. Research I’ve read suggests there is both a safety and business case to be made. One explanation for the bum’s rush being given ECP comes from someone whose career was immersed in railroad technology: “The mechanical departments say the ECP brakes don’t save enough on wheels and brake shoes to justify implementation. The track departments say that ECP brakes don’t reduce rail wear enough to justify implementation. Transportation departments say that ECP brakes don’t save enough fuel to justify implementation. And improved train running times, improved train dynamics, and improved engineer performance are all soft-dollar savings which don’t count. No one ever bothers to sum up total benefits.” Silos, in other words.
So I’ll make the case for ECP. (By the way, the standards were developed two decades ago by the same AAR that now vigorously opposes their implementation.) A train equipped with electronic braking is hard-wired, allowing instant communication from the airbrake handle in the locomotive to every brake valve on the cars. The principal advantages are that all brakes instantly apply and release at the same time, the air supply is continually charged, engineers can gradually release and reapply brakes, and undesired emergency braking (dynamiters, they’re called) virtually disappears. In-train forces, such as slack roll-in and roll-out, are greatly reduced, and that lessens the risk of derailment. Moreover, stopping distance is reduced 40 to 60 percent, permitting higher train speeds and higher speeds approaching restricting signals. Longer trains are possible. Longer trains running at higher speeds increase the capacity of the railroad network. Because air is always charging, braking power is inexhaustible; plus, a train can stop and instantly restart. Brakes, draft gear, wheels, and bearings require less maintenance. Existing federal regulations would allow train inspections every 5,000 miles instead of the present 1,500 or 1,000 miles.
Those are a lot of advantages. In a report commissioned by the Federal Railroad Administration in 2005, the consulting company Booz Allen Hamilton estimated the cost of full implementation of ECP at $6 billion and the measurable savings (not including added network capacity) at $650 million a year. Booz recommended that ECP conversion begin with coal trains loaded in Wyoming’s Powder River Basin, then to other types of unit trains (presumably including intermodal trains), and finally the rest of the car fleet — all in a 15-year time frame. “As applied to western coal service,” its report stated, “the business case is substantial,” with a recovery of all costs within three years.
[…]
Several things are going on here. Silos are one. Nobody is looking at the big picture, just his or her little piece of it. The boys in the Mechanical Silo could care less about increased network capacity. The occupants of the Finance Silo don’t want to divert cash flow away from share buybacks, their favorite toy. Most of those in the CEO Silo didn’t come up on the operating side and are probably bored by the subject. In a conservative, mature business like railroading, risk taking and even forward thinking are not rewarded. And the cost of hard-wiring the car fleet would primarily be borne by shippers, who own most of the equipment, whereas railroads would reap the benefits. How to share the benefits with car-owning shippers leads to very difficult negotiations.
QotD: Mono-culture banking
One of the factors in the financial crisis of 2007-2009 that is mentioned too infrequently is the role of banking capital sufficiency standards and exactly how they were written. Folks have said that capital requirements were somehow deregulated or reduced. But in fact the intention had been to tighten them with the Basle II standards and US equivalents. The problem was not some notional deregulation, but in exactly how the regulation was written.
In effect, capital sufficiency standards declared that mortgage-backed securities and government bonds were “risk-free” in the sense that they were counted at 100% of their book value in assessing capital sufficiency. Most other sorts of financial instruments and assets had to be discounted in making these calculations. This created a land rush by banks for mortgage-backed securities, since they tended to have better returns than government bonds and still counted as 100% safe.
Without the regulation, one might imagine banks to have a risk-reward tradeoff in a portfolio of more and less risky assets. But the capital standards created a new decision rule: find the highest returning assets that could still count for 100%. They also helped create what in biology we might call a mono-culture. One might expect banks to have varied investment choices and favorites, such that a problem in one class of asset would affect some but not all banks. Regulations helped create a mono-culture where all banks had essentially the same portfolio stuffed with the same one or two types of assets. When just one class of asset sank, the whole industry went into the tank.
Well, we found out that mortgage-backed securities were not in fact risk-free, and many banks and other financial institutions found they had a huge hole blown in their capital.
Warren Meyer, “When Regulation Makes Things Worse — Banking Edition”, Coyote Blog, 2014-07-07.
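Meyer’s induced decision rule, “find the highest returning assets that could still count for 100%”, can be sketched in a few lines; the yields and haircuts below are invented to mirror the rule and are not actual Basle II numbers:

```python
# Minimal sketch of the decision rule described above. Yields and the fraction
# of book value counted toward capital sufficiency are invented for illustration.

assets = {
    # name: (annual yield, fraction of book value counted toward capital sufficiency)
    "government bonds":           (0.03, 1.00),
    "mortgage-backed securities": (0.06, 1.00),
    "corporate loans":            (0.07, 0.80),
    "equities":                   (0.09, 0.50),
}

# The regulation-induced rule: among assets counted at full book value,
# pick whichever yields the most.
full_credit = {name: y for name, (y, counted) in assets.items() if counted == 1.00}
favourite = max(full_credit, key=full_credit.get)
print("Asset every bank piles into under the rule:", favourite)
# Every bank applying the same rule holds the same asset class: the mono-culture
# that turned one bad asset class into an industry-wide hole in bank capital.
```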