Published on 25 Feb 2014
Featuring the author Megan McArdle, Columnist, Bloomberg View; with comments by Brink Lindsey, Vice President for Research, Cato Institute; moderated by Dalibor Rohac, Policy Analyst, Center for Global Liberty and Prosperity, Cato Institute.
Nobody likes to fail, yet failure is a ubiquitous element of our lives. According to Megan McArdle, failing often — and well — is an important source of learning for individuals, organizations, and governments. Although failure is critical in coping with complex environments, our cognitive biases often keep us from drawing the correct lessons and adjusting our behavior. Our psychological aversion to failure can compound its undesirable effects, McArdle argues, and transform failures into catastrophes.
Video produced by Blair Gwaltney.
February 26, 2014
February 22, 2014
In Maclean’s, Colby Cosh explains that the future of classical music may well lie in the ballpark:
The Colorado Rockies have commissioned and recorded a theme song from composer Charles Denler, creator of introductory music for Oprah and NBC’s Dateline. According to the Denver Business Journal, the new Rockies theme, “Take the Field”, will come with multiple versions for particular game situations.
Denler, who has a trunkful of TV and film soundtracks to his credit, said some 80 members of [the Colorado Symphony] recorded “a big ‘Star Wars’-y variation and a very serious, pensive, we’re-going-to-make-it-through-this variation, and the main theme, which is very upbeat and very aggressive in a good sportsman kind of way.”
It is hard to hear of this idea without reflecting on the fact that orchestral and big-band music is a killer app of Western civilization, but one whose frontline practitioners, in the form of regional orchestras, are said to be in a state of permanent crisis. Sports fans love Sam Spence’s lumbering NFL Films soundtracks and still wriggle orgiastically at the sound of “Brass Bonanza”. There would appear to be space for creative enterprise here: I wonder, for example, if Mr. Denler’s contract would allow him to sell a full-on three-movement Rockies Symphony once his main theme becomes familiar to fans. Offering different variations for different game situations is a good idea, but perhaps only a first step; maybe each inning should have its own theme? Individual players represented by their own Wagnerian motifs?
February 19, 2014
Even as a child, I was vaguely annoyed by the LEGO kits that allowed you to recreate something you’d seen on TV or in the movies. The greatest thing about LEGOs is that you can use them to build anything your imagination can create. Castles, cars, airplanes, you name it: If you had the blocks and a mild spark of ingenuity, you could do just about any damn thing you pleased.
But the LEGOification of every aspect of popular culture is, in many ways, the exact opposite of the triumph of imagination. This ideal asks you to take something endlessly changeable and shove it into a tiny mental space already dominated by every other facet of popular culture. It’s a perversion of the LEGO ideal, a slap in the face of everyone who grew up tinkering with their building blocks in the hope of creating something new and exciting, something just for themselves or their friends.
Also, if you could get off my lawn, that would be great.
Sonny Bunch, “Knock It Off with the LEGOs, Jerks”, Washington Free Beacon, 2014-02-19
February 17, 2014
Strategy Page discusses the problems of predicting the future … which isn’t just a task for science fiction writers:
How will warfare change in the next 30 years? Military leaders, and the people they protect, are always trying to figure this out. There’s an easy way to get some good insight on the future. Simply go back 120 years (1894) and note the state of warfare and military technology at the time, then advance, 30 years at a time, until you reach 2014. At that point, making an educated guess at what 2044 will be like will be, if not easy, at least a lot less daunting.
In 1894, many infantry were still using single shot black powder rifles. Change was in the air though, and the United States had just begun to adopt the newfangled smokeless powder, a few years after it became widely available. In 1894 American troops were still replacing their black powder rifles with a smokeless powder model (the Krag-Jorgensen). The modern machine-gun had been invented in 1883, but armies took about two decades to begin adopting it on a large scale. Most artillery was still short ranged, not very accurate, and could only fire at targets the crew could see. Horses pulled or carried stuff and the infantry marched a lot when they were not being moved long distances by railroad or steamships. But the modern, quick-firing artillery had recently been introduced and was still unproven in battle. Communications still relied on the telegraph, a half-century-old invention that had revolutionized, in only a few decades, the way commanders could talk to each other over long distances. They could now do it in minutes. This was a big change for warfare. Very big. At this time telephones were all local and not portable. Cavalry was still important for scouting, although less useful for charging infantry (a trend that began when infantry got muskets with bayonets two centuries earlier).
So what does this portend for 2044? Faster and deadlier, for sure. Information war will be more than a buzzword by then because better sensors and data processing technology will make situational awareness (knowing where you and your enemy are, knowing it first, and acting on it before the other guy does) more decisive than ever.
If the expected breakthrough in batteries (or fuel cells) arrives as reliably and cheaply as predicted, the 2040s infantryman will be something of a cyborg. In addition to carrying several computers and sensor systems, he might wear body armor that also provides air conditioning. Satellite communications, of course, and two-way video. Exoskeletons are already in the works and may mature by then. A lot depends on breakthroughs in battery tech, although engineers are also finding ways to do more with just a little juice. Historians tend to constantly underestimate the cleverness of engineers and inventors in general.
But the big new development will be the continued evolution of robotic weapons. The World War II acoustic torpedo (used by the Germans and the Allies, from subs as well as the air) was the first truly robotic weapon. You turned it loose and it would hunt down its prey and attack. There may be a lot of public uproar over land based systems that have sensors, can use them to hunt, and have weapons that can be used without human intervention. But those systems will be easy and cheap to build by 2044, and as soon as one nation builds them others will have to follow. By 2044, machines will be fighting other machines more often than they will be looking for the stray human on the battlefield.
But there will be other developments that are more difficult to anticipate. In 1894 most of the 1924 technologies were already known in a theoretical sense. Same with the 1954 technologies in 1924 and so on. What is most difficult to predict is exactly how new tech will be employed. There will be imagination and ingenuity involved there, and that sort of thing is, by its very nature, resistant to prediction.
February 10, 2014
At Ace of Spades HQ, Monty gives an introduction to Say’s Law:
Jean-Baptiste Say, an 18th-century economist and follower of Adam Smith, recognized one of the most fundamental laws in all economics: the entirely common-sense observation that consumption requires production. This axiom is called Say’s Law of Markets.
However, this axiom is often mis-stated as “production creates its own demand”. This is incorrect — production is necessary for consumption to take place, but production anticipates demand, it does not cause it. Production is speculative in this sense. The simple act of producing some good or service does not, in and of itself, create demand for that good or service. (This is true even for basic commodities.)
What Say’s Law really says is that production is the source of wealth. Market-driven production creates value and provides choice to consumers. Inventors and innovators bring new products to market, and as consumers are exposed to these new products, demand rises with the utility or desirability of these new products. New markets are opened by innovators who are able to tap into needs and wants that consumers didn’t even know they had until a new product or service is offered.
And he explains why money is not wealth:
So what is “wealth”, really? (I could write a whole book on the difference between “wealth” and “money”, but I’ll try to boil it down.) Wealth is options. Wealth is choice. Wealth is variety. Wealth is agency — being able to do what you want to do when you want to do it. Wealth is surfeit — having more than the essentials of life. It is comfort, leisure, ease — or at least the agency and option (those words again) to avail oneself of leisure. Simply put, wealth is stored value that can be drawn down in various ways, only some of which involve the exchange of money for goods and services. And how is wealth created? Through production, because production must necessarily precede consumption.
Money correlates with wealth because money is a medium of exchange and a store of value. Rich people have a lot of money because they are wealthy, not the other way around. Wealth allows us to buy a bigger house or better car or nicer furniture. It pays for a nice dinner for two at an upscale restaurant. Note well: wealth buys these things, not money per se. Consumption is the draw-down of wealth, not the simple expenditure of money.
Money is the oil in the machine of an economy, but money is not in and of itself wealth. If I am stranded on a desert island with a thousand gold coins, I am just as poor as if I were a homeless vagrant living in an alleyway somewhere, because I cannot exchange my gold for things I want or need. It does not give me options or variety or comfort. My gold facilitates neither production nor consumption absent a market mechanism that makes use of it.
February 6, 2014
Megan McArdle discusses the past, present, and potential future for the e-cigarette industry:
In its simplest form, an e-cigarette is a cartridge filled with a nicotine solution and a battery powering a coil that heats the solution into vapor, which one sucks in and exhales like smoke. Typically, it looks like a regular cigarette, except the tip, embedded with an LED, often glows blue instead of red. The active ingredient in e-cigarettes is the same nicotine found in cigarettes and nicotine patches.
The effects of inhaling nicotine vapor are not totally understood, but there is no evidence to date that it causes cancer. Experts and logic seem to agree that it’s a lot better than setting chopped-up tobacco leaves on fire and inhaling the nicotine along with thousands of combustion byproducts, some of which are definitely carcinogenic. Because cancer is the main drawback of smoking for a lot of people, the delivery of nicotine without lighting a cigarette is very attractive. And because it produces a wispy vapor instead of acrid smoke, an e-cigarette lets you bring your smoking back indoors, where lighting up in an enclosed space is no longer socially, or legally, acceptable.
A primitive, battery-operated “smokeless non-tobacco cigarette” was patented as early as 1963 and described in Popular Mechanics in 1965. Thomas Schelling, a Nobel prize-winning economist who helped start the Institute for the Study of Smoking Behavior and Policy at Harvard University’s Kennedy School in the 1980s, recalls that people in the 1960s were talking about a charcoal-based vaporizer that would heat some sort of nicotine solution. While those early versions might have been safer than a regular cigarette, they were too expensive and cumbersome to become a substitute for a pack of Camels in a country where, as Schelling notes, “you’re never more than 5 or 10 minutes away from a smoke.”
In a way, electronic cigarettes were made possible by cell phones. The drive to make phones smaller and lengthen their battery life led to the development of batteries and equipment small enough to fit in a container the size and shape of a cigarette. There’s some dispute over who invented the modern e-cigarette, but the first commercially marketed device was created by a Chinese pharmacist, Hon Lik, and introduced to the Chinese market as a smoking cessation device in 2004.
In the same way that alcohol comes in various guises (many carefully crafted to appeal to beginners: sweet as soda pop, for example), e-cigarettes are available in many different flavours:
E-cigarette cartridges come in classic tobacco and menthol flavors — Verleur’s company even offers V2 Red, Sahara, and Congress, clearly aimed at loyal smokers of Marlboros, Camels, and Parliaments. But most companies also have less conventional flavors. Blu offers Peach Schnapps, Java Jolt, Vivid Vanilla, Cherry Crush, and Piña Colada, presumably for people who don’t just like a drink with a cigarette, but in one.
January 27, 2014
In the Boston Globe, Katherine C. Epstein makes a strong case for the origin of the military-industrial complex not being the era that President Eisenhower warned about, but actually in the run-up to the First World War:
The phrase [Eisenhower] popularized to describe the emerging system — the “military-industrial complex” — has since become a watchword, and Eisenhower’s account of its rise has struck most observers as accurate: It was a product of an immense war effort and the new attitudes spawned in the aftermath.
But what if Eisenhower — and others — had the origin story wrong? Although the military-industrial complex unquestionably became far larger and more deeply entrenched as a result of World War II and the Cold War, a closer reading of the history suggests that its essential dynamics were actually decades older. An armaments industry in close collaboration with the military — coping with global and national arms markets, sophisticated technology, intense geopolitical rivalries, and a government prone to expand its power in the name of national security — had its roots in the way geopolitics, industrialization, and globalization collided at the turn of the 20th century. And one key innovation that helped to tip the United States over into the national security regime that we recognize today was, of all things, the torpedo.
The torpedo didn’t just threaten to change naval warfare. It was a sophisticated new weapon so important to the US Navy that it forced the government to form a novel relationship with industry — and to introduce the trump card of national security as a rationale for demanding secrecy from private companies. The policy that developed along with the torpedo set the terms for the efforts to control information in the name of national security that we’re seeing now. To appreciate just how far back that policy runs — back to a time not of war, but of peace — gives us a new lens on our current struggles over the military-industrial complex, and perhaps a different reason to worry.
January 25, 2014
Jim Dunnigan wrote this short piece as a proposal for a longer work to follow on from his 2003 book The Perfect Soldier.
There are many new trends producing dramatic changes in warfare. Many of these changes are missed by the media, and even by many military analysts, because so much has changed so quickly. The new technologies and trends include:
Robots. Combat robots have actually been around for over a century. Naval mines and torpedoes are robotic weapons that proved themselves in the early years of the 20th century. There were some more robotic weapons in World War II (cruise and ballistic missiles plus the first “smart shells”), but the momentum for combat robots really didn’t get going until the late 20th century, when smaller, cheaper and more reliable microprocessors and similar electronics made it possible to create inexpensive, “smart”, dependable and useful battle droids. Combat robots have sneaked into the military without many people, in or out of uniform, paying a lot of attention. That’s still the case, especially because the media and even many senior military and political leaders don’t fully understand the technology or how it is implemented. One example of this confusion can be seen with the constant reference to UAVs (unmanned aerial vehicles) as “drones” or “robots”. They are neither; they are simply remotely controlled aircraft, something that’s been around for over half a century. But these UAVs are being given more and more robotic (operating autonomously) capabilities. This isn’t new either, as torpedoes have had this ability for over 60 years and missiles for over 50 years.
Battlefield Internet. The Internet appeared as a mass-market product in the mid-1990s just as a generation of PC-savvy officers were rising up the chain of command. These guys had encountered the first PCs as teenagers and then had access to the pre-World Wide Web Internet in college. PCs and the web were not mysteries, but tools they were familiar with. By 2001 these men and women were majors and colonels, the people the generals turn to when they want something done, or explained. When the World Wide Web showed up in the mid-1990s, generals turned to the majors and colonels for an update and were told, “no problem sir, good stuff. We can use it.” There followed a scramble to create a workable “battlefield Internet”. But another trend was also at work: the 1980s effort to implement “information technology”. As the ideas merged with workable and affordable hardware and software, sparks began to fly. Unlike earlier ventures into new technology, this was not just a case of the troops being given new gadgets and shown how to use them. With Internet stuff, and Internet-savvy troops, a lot of the new technology was being invented by the users. This has created high-speed development, putting new stuff through development, testing and into use much faster than ever before.
Commandos. These specialists have always been around. Think of the “Knights of the Round Table” or any legendary super warrior. During the 20th century, methods were developed to produce commando class troops at will. This was not possible in the past. While commandos are specialist troops that are only useful in certain situations, when you can use them, they often have a devastating effect. Those nations with large commando forces (the US, Britain, Russia, etc.) have a military advantage that is often the margin of victory.
Off the Shelf Mentality. Since the 1980s, the military has increasingly looked to commercial companies for the latest combat equipment. This recognizes that military procurement has become too slow, and technological advances too rapid, to get the latest gear into the hands of troops before it becomes obsolete. In most cases, civilian equipment works fine, as is, for the military. This is because over half the troops work at jobs that never take them out of shops or offices indistinguishable from the workplaces civilians use. But even the combat troops can find a lot of equipment that is rugged enough for the battlefield. Soldiers have long noted that civilian camping equipment is superior to most of the stuff they are issued, and many soldiers have supplemented, or replaced, issued equipment with better off-the-shelf gear. In the last decade, it’s been common for combat troops to bring civilian electronics gear with them: everything from laser range finders to GPS units. All of these are issued, but the official stuff tends to be heavier and less capable.
January 24, 2014
Developments like this should be of great interest to the Royal Canadian Navy:
… constrained budgets in America and Europe are prompting leading nations to reconsider future needs and explore whether new ships should be tailored for what they do every day, rather than what they might have to do once over decades.
The solution: extreme flexibility at an affordable price for construction and operation.
Here the Danes have emerged as a clear leader by developing two classes of highly innovative ships designed to operate as they will actually be used: carrying out coalition operations while equipped to swing from high-end to low-end missions.
The three Iver Huitfeldt frigates and two Absalon flexible support ships share a common, large, highly efficient hull to yield long-range, efficient but highly flexible ships that come equipped with considerable capabilities — from large cargo and troop volumes and ample helo decks for sea strike and anti-submarine warfare — in a package that’s cheap to buy and operate. The ships come with built-in guns, launch tubes for self-defense and strike weapons, and hull-mounted sonar gear, and they can accept mission modules in hours to expand or tailor capabilities. The three Huitfeldts cost less than $1 billion.
The ships also are coveted during coalition operations for their 9,000-mile range at 15 knots, excellent sea-keeping qualities and command-and-control gear, plus spacious accommodations for command staffs. That’s why the Esbern Snare, the second of two Absalon support ships, is commanding the international flotilla in the Eastern Mediterranean that will destroy Syria’s chemical weapons.
Wikipedia has this image of the HDMS Iver Huitfeldt:
The class is built on the experience gained from the Absalon-class support ships, and by reusing the basic hull design of the Absalon class the Royal Danish Navy have been able to construct the Iver Huitfeldt class considerably more cheaply than comparable ships. The frigates are compatible with the Danish Navy’s StanFlex modular mission payload system used in the Absalons, and are designed with slots for six modules. Each of the four StanFlex positions on the missile deck is able to accommodate either the Mark 141 8-cell Harpoon launcher module or the 12-cell Mark 56 ESSM VLS.
While the Absalon-class ships are primarily designed for command and support roles, with a large ro-ro deck, the three new Iver Huitfeldt-class frigates will be equipped for an air defence role with Standard Missiles, and the potential to use Tomahawk cruise missiles, a first for the Danish Navy.
For contrast, here is the HDMS Esbern Snare, the second ship in the Absalon class:
That’s not to say that these particular ships would be a good fit for the RCN, but that the approach does seem to be viable (sharing common hull configurations and swappable mission modules). However, the efficiencies that could be achieved by following this practice would almost certainly be swamped by the political considerations to spread the money out over as many federal ridings as possible…
H/T to The Armourer for the link.
January 20, 2014
BBC News Magazine has an article by Dan Snow discussing some commonly held beliefs about the First World War:
3. Men lived in the trenches for years on end
Front-line trenches could be a terribly hostile place to live. Often wet, cold and exposed to the enemy, units would quickly lose their morale if they spent too much time in them.
As a result, the British army rotated men in and out continuously. Between battles, a unit spent perhaps 10 days a month in the trench system, and of those, rarely more than three days right up on the front line. It was not unusual to be out of the line for a month.
During moments of crisis, such as big offensives, the British could occasionally spend up to seven days on the front line but were far more often rotated out after just a day or two.
4. The upper class got off lightly
Although the great majority of casualties in WW1 were from the working class, the social and political elite was hit disproportionately hard by WW1. Their sons provided the junior officers whose job it was to lead the way over the top and expose themselves to the greatest danger as an example to their men.
Some 12% of the British army’s ordinary soldiers were killed during the war, compared with 17% of its officers. Eton alone lost more than 1,000 former pupils – 20% of those who served. UK wartime Prime Minister Herbert Asquith lost a son, while future Prime Minister Andrew Bonar Law lost two. Anthony Eden lost two brothers, another brother of his was terribly wounded and an uncle was captured.
7. Tactics on the Western Front remained unchanged despite repeated failure
Never have tactics and technology changed so radically in four years of fighting. It was a time of extraordinary innovation. In 1914 generals on horseback galloped across battlefields as men in cloth caps charged the enemy without the necessary covering fire. Both sides were overwhelmingly armed with rifles. Four years later, steel-helmeted combat teams dashed forward protected by a curtain of artillery shells.
They were now armed with flame throwers, portable machine-guns and grenades fired from rifles. Above them, planes that in 1914 would have appeared unimaginably sophisticated duelled in the skies, some carrying experimental wireless radio sets and reporting real-time reconnaissance.
Huge artillery pieces fired with pinpoint accuracy — using only aerial photos and maths they could score a hit on the first shot. Tanks had gone from the drawing board to the battlefield in just two years, also changing war forever.
January 6, 2014
In The Register, Tim Worstall explains why the notion of patents was introduced to the law and why we need to fix it now:
Having decided that the patent problem is an attempt to solve a public goods problem, as we did in part 1, let’s have a look at the specific ways that we put our oar into those perfect and competitive free markets.
It’s worth just noting that patents and copyright are not, absolutely not, the product of some fevered free market dreams. Rather, they’re an admission that “all markets all the time” does not solve all problems. That is exactly why we create patents.
Given that people find it very difficult to make money from the production of public goods, we think that we probably get too few of them. Innovation, the invention of new things for us to enjoy, is one of those public goods. It’s a hell of a lot easier to copy something you know can already be done than it is to come up with an invention yourself. So, if new inventions can be copied easily then we think that too few people will invent new things. We’re not OK with this idea. Thus we create a property right in that new invention. The inventor can now make money out of the invention and thus we get more new things.
And if it were only that simple, then of course we’d all be for patenting everything for ever. However it isn’t that simple. For not only do we want people to invent new things, we also want people to be able to adapt, extend, play with, improve those new things. Or apply them to areas the original inventor had no thought about. In the jargon, we want not just new inventions but also derivative ones. So we want to balance the ability of inventors to protect their inventions with the ability of others to do the deriving. And that’s probably what is actually wrong with our patent system today.
Have a look at Tabarrok’s curve:
If we have no protection of originality, then we get too little innovation. But if we have too strong a protection, then we get too little of the derivative stuff. There’s a sweet spot and the argument is that we’re not at it at present and are thus missing out on some goodies as a result. Perhaps some tweaks to the system would help?
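The shape of that argument can be sketched numerically. In this toy model (the functional form is my own illustrative assumption, not Tabarrok's), original invention rises with the strength of patent protection while derivative innovation falls, so total innovation is an inverted U with an interior sweet spot:

```python
# Illustrative sketch of Tabarrok's curve (assumed functional form).
def innovation(s):
    """Total innovation at patent strength s in [0, 1] (toy model)."""
    original = s            # stronger patents -> more original invention
    derivative = 1.0 - s    # stronger patents -> less derivative innovation
    return original * derivative

# Scan a coarse grid of patent strengths for the sweet spot.
grid = [i / 100 for i in range(101)]
best = max(grid, key=innovation)
print(best)  # 0.5 -- some protection, but well short of maximal
```

The point of the model is only that the optimum is interior: both zero protection and maximal protection yield zero total innovation, so the policy question is where on the curve we currently sit, not whether patents are good or bad in the abstract.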
January 3, 2014
I’ve heard the term many, many times (and used it more than a few times as well), but as Virginia Postrel points out, it didn’t just happen by chance that there are “first world problems” we can mock-sympathize over:
Third world conditions are defined not merely by economic misery but by unreliable services. “At the age of fourteen I had experienced a miracle,” writes Suketu Mehta in Maximum City, his critically acclaimed 2009 book on Mumbai. “I turned on a tap, and clean water came gushing out. This was in the kitchen of my father’s studio apartment in Jackson Heights [New York]. It had never happened to me before. In Bombay, the tap, when it worked, was always the first step of a process” taking at least 24 hours to produce drinkable water. Mehta’s family lived an affluent life but with third world problems.
By contrast, in a developed country, barring a major natural disaster, you can count on uninterrupted electricity, hot and cold running water, sewage disposal, garbage pickup, heat (and in hot climates, air conditioning), telephone service, Internet access and television. The roads and bridges will be in decent repair; the elevators will work; the ATMs will have cash; and you’ll be able to find a decent public toilet when you need one.
These things aren’t necessarily free, but they’re cheap enough for pretty much everyone to enjoy them. Most significantly, they’re ubiquitous and reliable. Even when natural disasters strike, we can expect heroic efforts to get things back to normal. Under normal circumstances, we can depend on these services to be there consistently and to work as promised. We can make plans accordingly. That’s a first world privilege.
It took years of sustained efforts by online retailers and delivery services to make overnight orders realistic. It also took dissatisfaction: insanely demanding companies working to please insanely demanding customers — or, in some cases, to offer customers services they hadn’t even thought to ask for — as each improvement revealed new sources of discontent.
“Form follows failure” is what Henry Petroski, the civil engineering professor and prolific popular writer, calls the process. Every step forward begins with a complaint about what already exists. “This principle governs all invention, innovation, and ingenuity; it is what drives all inventors, innovators, and engineers,” he writes. “And there follows a corollary: Since nothing is perfect, and, indeed, since even our ideas of perfection are not static, everything is subject to change over time.”
Rising expectations aren’t a sign of immature “entitlement.” They’re a sign of progress — and the wellspring of future advances. The same ridiculous discontent that says Starbucks ought to offer vegan pumpkin lattes created Starbucks in the first place. Two centuries of refusing to be satisfied produced the long series of innovations that turned hunger from a near-universal human condition into a “third world problem.”
November 15, 2013
One of the big problems facing everyone in the US is the cost of healthcare: it’s expensive and getting more so. Obamacare is supposed to be an attempt to lower the overall cost of healthcare, but by approaching it from the “insurance” angle, it’s likely to make the situation worse rather than better. The Anti-Gnostic reposted an extended comment from Steve Sailer’s blog explaining why misunderstanding the purpose of insurance is a big problem:
1) Most people lose money on insurance, because most of the time insurance doesn’t pay out more than it takes in.
2) Thus, a “good” policy is a catastrophic-coverage-only, high-deductible policy, where most payments are out of pocket. This is a policy that protects you against the downside risk, but where you lose a lot less on average.
3) This is because the purpose of insurance is to protect yourself from *catastrophe*, not to make routine purchases.
4) For example, if you went to Best Buy and whipped out your home insurance card to get a new flat screen TV, everyone would look at you as a crazy man. “Don’t you know that home insurance is only for fires and floods, and not for routine purchases?”
5) And so it should be with health insurance, because you’ll actually — *provably* — pay less with a high deductible plan for all but catastrophic conditions.
6) Indeed, the most innovative and technologically advanced areas of medicine are ambulatory areas in which people feel that markets are “ok”. These are paradoxically the most trivial areas: lasik, plastic surgery, dermatology, dentistry, even veterinary medicine.
7) Why are these areas so advanced? Because people pay cash money, because they choose based on quality, and because they are *able* to choose — i.e. they aren’t being wheeled up to the hospital in a gurney in a no choice scenario.
8) Moreover, with every technology ever, from cars to cell phones to air travel to computers, things that start out expensive become cheaper when enough people demand them. With medicine it seems to bite more that money means differences in care. But at the end of the day doctors, patients, nurses, drugs, ambulances…all that stuff means real resources, and a refusal to do explicit computations just results in massive waste as costs are shunted to a place where no one looks at them.
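The claim in point 5 — that a high-deductible plan costs less on average while still protecting against catastrophe — is just expected-value arithmetic. A minimal sketch, using entirely made-up premiums, deductibles, and probabilities (real plan pricing is far more complicated):

```python
# Illustrative comparison of expected annual cost under two hypothetical
# health plans. All dollar figures and probabilities below are invented
# for the sake of the arithmetic; they are not real plan data.

def expected_annual_cost(premium, deductible, scenarios):
    """Expected total cost = premium + expected out-of-pocket spending.

    scenarios: list of (probability, medical_bills) pairs.
    In this simplified model, you pay bills up to the deductible and
    the insurer pays everything above it.
    """
    expected_oop = sum(p * min(bills, deductible) for p, bills in scenarios)
    return premium + expected_oop

# A hypothetical year: 70% chance of ~$500 in routine care, 25% chance
# of ~$3,000, and a 5% chance of a $50,000 catastrophe.
scenarios = [(0.70, 500), (0.25, 3_000), (0.05, 50_000)]

comprehensive = expected_annual_cost(premium=6_000, deductible=500,
                                     scenarios=scenarios)
high_deductible = expected_annual_cost(premium=2_400, deductible=5_000,
                                       scenarios=scenarios)

print(f"Comprehensive plan:   ${comprehensive:,.0f} expected per year")
print(f"High-deductible plan: ${high_deductible:,.0f} expected per year")
```

With these assumed numbers the comprehensive plan runs about $6,500 a year in expected cost against roughly $3,750 for the high-deductible plan — yet both cap your exposure to the $50,000 catastrophe, which is the scenario insurance actually exists for.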
At the Independent Institute blog, John Graham points out that — in the few places that government allows free markets to operate — prices tend to drop over time even while services or features improve:
It has taken a long time, but the price of hearing aids is in the process of falling dramatically. How has this happened? Technological innovation, of course, but there is more. There’s no shortage of technological innovation in U.S. health care. However, because third-party payers, that is, health insurers and governments, determine prices, there is no mechanism for customers to signal value to providers.
This is not the case for hearing aids: Although some states have mandated insurance coverage for hearing aids, this is usually limited to disabled children. The big market for hearing aids is seniors, and Medicare does not cover hearing aids.
This is another case of a phenomenon observed elsewhere by Devon Herrick of the National Center for Policy Analysis [PDF]: Where patients pay directly for medical care, prices fall like they do in every other market.
Seniors who want highly personalized service from an audiologist in his own practice can get it, and they will pay for it. Those who want to order online can save money by doing that. Those who want to get their old hearing aids repaired can make that choice. And the most adventurous seniors, who don’t mind running an earpiece into an iPhone, can get a functional hearing aid almost for free.
We are on the verge of enjoying universal access to hearing aids — but only because the government restrained itself from interfering, and let the market operate.
October 18, 2013
September 13, 2013
The Register’s Andrew Orlowski speculates that we’ve hit PEAK SMARTPHONE:
Apple’s keynotes seem to command more mainstream front-page press attention than ever before — but each time, there’s less and less to report. Is the modern smartphone era limping to a close?
Apple’s announcements on Tuesday about the iPhone 5S and 5C were wearily predictable. Cupertino just doesn’t seem to be where the action is any more.
It is almost as if Apple and its arch-rival Samsung have exhausted themselves by suing each other around the world — and now look like two very knackered boxers agreeing to shuffle their way through the remaining rounds to the bell, rather than risk throwing big punches.
But the warning signs are there. Samsung reportedly held “crisis talks” after sales of the Galaxy S4 failed to meet its expectations, Apple iPhone sales have declined for the past three quarters, and, well, “Peak Apple”.
Samsung piled on gimmicky and slightly creepy features like eyeball tracking, simply because it could. Apple’s user-facing innovation (the A7 64-bit chip is the real star of the show) entails building in a fingerprint scanner — a commodity laptop part for the past 10 years. Indeed, the only “radical” moves by Apple are adding colours to a slightly cheaper (but certainly not cheap) iPhone and rejecting NFC (or “Not F*cking Connecting”, as it’s known around here), which is a technology flop. Not so radical, then.
The stark truth is that smartphones, like computers, were only ever a means to an end — and once the services and apps markets matured, the smartphone itself became less … important. It didn’t really matter what access device you were carrying. The PC reached a point where the devices became beige boxes competing on price, and the smartphone era is drawing to the point where it doesn’t really matter what black rectangle you’re carrying — provided it accesses the services and apps you want. Fetishising the access devices is as strange as thanking LG or Panasonic for creating BBC2. No wonder both Samsung and Apple are looking at new higher-margin peripherals such as watches.