Quotulatiousness

November 6, 2012

Adam Smith’s “invisible hand”

Filed under: Books, Economics, History, Liberty — Tags: , , , — Nicholas @ 00:01

From LearnLiberty.org

Why are some countries wealthy while other nations are poor? Prof. James Otteson, using the ideas of Adam Smith, explains how the division of labor is a necessary and crucial element of wealthy nations. Additionally, Otteson explains Smith’s idea of the invisible hand, which explains how human beings acting to satisfy their own self interest often unintentionally benefit others.

August 12, 2012

Wendy McElroy on the Myth of the Greater Good

Filed under: Liberty, Randomness — Tags: , , — Nicholas @ 09:12

Have you been punked by your philosophy professor?

In entry-level philosophy class, a professor will often present a scenario that seems to challenge the students’ perspective on morality.

The argument runs something like this: “The entire nation of France will drop dead tomorrow unless you kill your neighbor who has only one day to live. What do you do?”

Or “You could eliminate cancer by pressing a button that also kills one healthy person. Do you do so?”

The purpose is to create a moral dilemma. The questions pit your moral rejection of murder against your moral guilt for not acting to save millions of lives.

In reality, the questions are a sham that cannot be honestly answered. They postulate a parallel world in which the rules of reality, like cause and effect, have been dramatically changed so that pushing a button cures cancer. The postulated world seems to operate more on magic than reality.

Because my moral code is based on the reality of the existing world, I don’t know what I would do if those rules no longer operated. I presume my morality would be different, so my actions would be as well.

As absurd as they are, these are considered to be the “tough” moral questions. In grappling with them, some students come to believe that being true to morality requires the violation of morality in a profound manner; after all, there is no greater violation than the deliberate murder of another human being.

But how can the life of one outweigh those of millions in your hands? At this point, morality becomes a numbers game, a matter of cost-benefit analysis, rather than of principle. This is not an expansion of morality, as the professor claims, but the manufacture of a conflict that destroys morality. In its place is left a moral gray zone, a vacuum into which utilitarianism rushes.

June 11, 2012

An epitaph for the original Arts and Crafts movement

Filed under: Economics, History — Tags: , , — Nicholas @ 12:42

Colby Cosh has an interesting slant on William Morris and the original Arts and Crafts movement (for the record, I’m quite a fan of a lot of A&C artifacts, if not quite so much of their philosophy):

In the 19th century, William Morris preached a social revolution in which exploitative “useless toil” would be replaced by “useful work”. He dreamt of a world that would reject shoddy mass-produced goods in favour of objects made with care and craftsmanship. Any business that sells “artisanal” goods, whether the goods be curtains or crumpets, is essentially quoting Morris and referring to his promise.

That promise, of course, failed spectacularly. It did not even survive Morris’s own time. His “libertarian socialism” of crafted objects and honest work found itself drowned out at every turn by leftist alternatives which, more sensibly, accepted the power and inevitability of mass production. 20th-century Marxism wasn’t opposed to factories; it worshipped them, fetishized them. The fatal problem with Morris’s appeal is that he was just plain wrong about mass-produced objects necessarily being unlovely junk. We have been to Ikea; we know better.

Morris felt very strongly about this, and from his own historical standpoint, he was certainly on to something. It’s impossible for us to imagine what kind of things factories suppurated into the marketplace before things like statistical control charts were invented, or before items like micrometers were themselves mass-produced to a consistent high standard. Morris lived in a world where individual masons and cabinetmakers and weavers really were losing their livelihoods to a tide of undifferentiated, undistinguished banality; his feelings of alarm now seem fussy when we read him, but that is because only the better-made Victorian objects have physically survived destruction or disposal and reached our time.

Soon enough, however, the art of industrial design would come to the rescue. If Morris could have lived long enough to see the Studebaker Commander or the IBM Selectric II or, yes, the furshlugginer iPhone, he would have packed in the Arts and Crafts talk and gone straight to work designing pickle-jar labels. (Morris was not too consistent when it came to the ultimate logical consequences of a world made by hand, anyway. The influential Kelmscott Press he founded in 1891 favoured early printing techniques and letterforms, but it was, at any rate, a press; unlike his spiritual ancestor William Blake, he didn’t set out to mimic the appearance of illuminated manuscripts by the actual method implied in the etymology of the term “manuscript”.)

While I picked this section of the article to quote, you really should read the whole thing. It’s some of the most thought-provoking writing I’ve seen in months.

March 25, 2012

Bryan Caplan: John Stuart Mill was over-rated

Filed under: Books, History, Liberty — Tags: , — Nicholas @ 09:32

Mill isn’t one of my favourite philosophers: I read On Liberty as a teenager, but most of it didn’t stick with me (probably more a reflection of my age than the work itself, I agree). Bryan Caplan makes a case for him being far more famous than he deserves:

One especially cringeworthy example: In the span of two pages in On Liberty, Mill names one “ultimate” principle and one “absolute” principle. His Ultimate Principle:

    It is proper to state that I forego any advantage which could be derived to my argument from the idea of abstract right, as a thing independent of utility. I regard utility as the ultimate appeal on all ethical questions…

His Absolute Principle:

    The object of this Essay is to assert one very simple principle, as entitled to govern absolutely the dealings of society with the individual in the way of compulsion and control, whether the means used be physical force in the form of legal penalties, or the moral coercion of public opinion.

You might think that Mill would argue that his Ultimate Principle implies his Absolute Principle — or at least that the two principles never conflict. That would be silly and dogmatic, but consistent.

[. . .]

Unfortunately for Mill, neither his Ultimate nor Absolute Principles leaves any role for mere “capability.” You could say, “If free and equal discussion will improve a person, you should respect his liberty.” When words work, there’s no reason to resort to beatings. But after free and equal discussion fails to open the eyes of a person capable of free and equal discussion, why not try coercion? No matter what a person’s “capabilities,” Mill’s Ultimate Principle commands coercion and his Absolute Principle forbids it.

January 28, 2012

How a long-dead activist’s ideas influenced Barack Obama

Filed under: Government, Media, Politics, USA — Tags: , — Nicholas @ 11:52

In an article from 2009, Jim Geraghty traces the influence of Saul Alinsky (who died before Obama went to high school) on the President’s early days in office:

Alinsky died in 1972, when Obama was 11 years old. But three of Obama’s mentors from his Chicago days studied at a school Alinsky founded, and they taught their students the philosophy and methods of one of the first “community organizers.” Ryan Lizza wrote a 6,500-word piece on Alinsky’s influence on Obama for The New Republic, noting, “On his campaign website, one can find a photo of Obama in a classroom teaching students Alinskian methods. He stands in front of a blackboard on which he has written ‘Relationships Built on Self Interest,’ an idea illustrated by a diagram of the flow of money from corporations to the mayor.”

In a letter to the Boston Globe, Alinsky’s son wrote that “the Democratic National Convention had all the elements of the perfectly organized event, Saul Alinsky style. . . . Barack Obama’s training in Chicago by the great community organizers is showing its effectiveness. It is an amazingly powerful format, and the method of my late father always works to get the message out and get the supporters on board. When executed meticulously and thoughtfully, it is a powerful strategy for initiating change and making it really happen. Obama learned his lesson well.”

As a tool for understanding the thinking of Obama, Alinsky’s most famous book, Rules for Radicals, is simultaneously edifying and worrisome. Some passages make Machiavelli’s Prince read like a Sesame Street picture book on manners.

He also took advantage of the innumeracy of many people:

When Obama announced a paltry $100 million in budget cuts, and insisted this was part of a budget-trimming process that would add up to “real money,” he clearly understood that the public processes these numbers very differently from the way budget wonks do. Alinsky wrote: “The moment one gets into the area of $25 million and above, let alone a billion, the listener is completely out of touch, no longer really interested, because the figures have gone above his experience and almost are meaningless. Millions of Americans do not know how many million dollars make up a billion.”
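Alinsky’s point about scale is easy to make concrete. As a rough back-of-the-envelope check (the $3.5 trillion figure for the FY2010 federal budget is approximate), here is what a $100 million cut actually amounts to:

```python
# Back-of-the-envelope: how big is a $100 million cut
# relative to a roughly $3.5 trillion federal budget?
cut = 100e6      # $100 million in proposed cuts
budget = 3.5e12  # approximate FY2010 US federal budget

share = cut / budget
print(f"{share:.4%}")  # about 0.003% of the budget

# And Alinsky's million-vs-billion point:
print(1_000_000_000 // 1_000_000)  # 1000 millions make a billion
```

A cut three orders of magnitude smaller than one percent of spending is exactly the kind of number Alinsky said the listener can no longer process.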

That’s the same sense that Mark Steyn captured recently.

Alinsky sneered at those who would accept defeat rather than break their principles: “It’s true I might have trouble getting to sleep because it takes time to tuck those big, angelic, moral wings under the covers.” He assured his students that no one would remember their flip-flops, scoffing, “The judgment of history leans heavily on the outcome of success or failure; it spells the difference between the traitor and the patriotic hero. There can be no such thing as a successful traitor, for if one succeeds he becomes a founding father.” If you win, no one really cares how you did it.

[. . .]

Moderates thought they were electing a moderate; liberals thought they were electing a liberal. Both camps were wrong. Ideology does not have the final say in Obama’s decision-making; an Alinskyite’s core principle is to take any action that expands his power and to avoid any action that risks his power.

January 24, 2012

Robert Fulford: Nietzsche’s inescapable shadow

Filed under: History, Politics — Tags: , , , — Nicholas @ 12:07

Writing in the National Post, Robert Fulford traces all the ways we still live with a long-dead madman:

Friedrich Nietzsche is one of those philosophers you just can’t kill.

He’s been in his grave since 1900, having been silenced by insanity many years before. In 1898, The New York Times ran an article headed, “Interesting Revolutionary Theories from a Writer Now in the Madhouse.” He’s read, as he was then, only by a small minority, many of whom it would be flattering to call eccentric.

Nevertheless, he runs through our social bloodstream. Francis Fukuyama’s remark has the sound of truth: Whether we like it or not, “We continue to live within the intellectual shadow cast by Nietzsche.”

[. . .]

We don’t know it but Nietzsche scripted many of our conversations, putting words in our mouths. When we talk about culture (the culture of this, the culture of that) we echo him. Anyone who discusses “values” (instead of, say, ethics) is talking Nietzsche-talk.

People who claim to be in a state of “becoming” are Nietzscheans, knowingly or otherwise. He believed (now everyone believes) that we are all constantly reconstructing ourselves. In Nietzsche there’s no such thing as a permanently stable personality.

He was the original culture warrior. He laid the foundation for the struggle between traditionalism and modernism, an enduring battle. The more important a tradition, the more he wanted to see it challenged.

November 4, 2011

The libertarian subtext to . . . Harold and Kumar?

Filed under: Humour, Liberty, Media — Tags: , , , — Nicholas @ 11:05

David Boaz reviews the philosophical and economic underpinnings of the Harold and Kumar movies:

Escaping persecution, poverty, and hunger . . . to find ample food and unlimited choices . . . the pursuit of happiness . . . the American Dream. Yes, I think writers Jon Hurwitz and Hayden Schlossberg were on to something.

And then in the sequel, Harold & Kumar Escape from Guantanamo Bay, after another improbable road trip, the fugitive youths literally dropped in on George W. Bush’s Texas ranch. In the increasingly fantastic plot, the president invited them to join him in hiding from the scary Cheney, shared his pot with them, and then promised to clear up the unfortunate misunderstanding that landed them in Guantanamo Bay. An uninhibited but still skeptical Kumar said, “I’m not sure I trust our government any more, sir.” And President Bush delivered this ringing libertarian declaration:

    Hey, I’m in the government, and I don’t even trust it. You don’t have to trust your government to be a patriot. You just have to trust your country.

Harold & Kumar: more wisdom than a month of right-wing talk radio. Hurwitz and Schlossberg get what America is about.

Not having seen any of the movies, that certainly sounds like the kindest treatment George W. Bush has ever received from Hollywood.

October 31, 2011

QotD: Economics is not a “hard science”

Filed under: Economics, Quotations, Science — Tags: , , , — Nicholas @ 13:20

The problem at base is that economics is not a branch of mathematics or statistics, no matter how much economists wish it was. Never forget that the economics equations you see, the pretty graphs and charts, are just educated guesses that are wrong more often than not — economists love the gloss of the hard sciences, but the truth is that the field is firmly placed among the philosophical and sociological disciplines. Economics is a study of human behavior more than anything else, with all the uncertainties and confusion that entails.

“Monty”, “DOOM: I like that Doom Doom Pow”, Ace of Spades H.Q., 2011-10-31

May 3, 2011

How to referee a philosophical discussion

Filed under: Humour, Randomness — Tags: , — Nicholas @ 09:48

Brilliant and (potentially) useful post from davidad:

H/T to Alex Tabarrok for the link.

March 16, 2011

Guest post: Virginia Postrel and the “magic” iPad

Filed under: Technology — Tags: , , , , — Nicholas @ 11:16

This was written by Jon, my former virtual landlord, in an email to me earlier today. I’ve asked his permission to post it on the blog.

Did you see this Wall Street Journal post?

When Apple introduced the iPad last year, it added a new buzzword to technology marketing. The device, it declared, was not just “revolutionary,” a tech-hype cliché, but “magical.” Skeptics rolled their eyes, and one Apple fan even started an online petition against such superstitious language.

But the company stuck with the term. When Steve Jobs appeared on stage last week to unveil the iPad 2, which hit stores Friday, he said, “People laughed at us for using the word ‘magical,’ but, you know what, it’s turned out to be magical.”

I’m not sure what she’s on about when she gets to magic and disses “makers” and hackers for their disdain of such. More on that later.

Sadly, I think love for the iPad is explained in much simpler terms: it is a shiny object, and people like shiny objects.

The thing is well proportioned (I’ve not looked at the specs, but I suspect that golden ratio proportions are present in its design), it has a polished surface, the display is bright and vivid — and people simply dig that sort of thing. I admit that I find the things attractive, but not attractive enough to overcome what are, for me, wallet-crushing limitations:

  • No ROI. I cannot be measurably productive on an iPad — I could not write or code or draw on the thing — so I’m never going to make back its cost. I’ve been able to pay for all of my computers by being productive on them, but that would not happen on the iPad. For that to happen, I would have to devote far more time than I have to, say, learning how to program for the thing — and that’s not likely to happen. Your mileage will, of course, vary on this: if you can measure and assign a dollar value to the time saved by having a portable internet access point around the office, plant, home, or on the road, then you’ll see more of a return here. At present, though, I don’t need that — at least not in a way that can be represented by income or cashflow.
  • Hyper-accelerated planned obsolescence. Apple is notorious for this — the next generation of device typically makes the earlier generation either less desirable or downright useless. My first — and only — Mac taught me this lesson, and I won’t fall for it again — at least not with an Apple product. The device becoming less desirable may not be an issue for most people, unless they are stylish hipsters who buy the device simply for its value as a fashion accessory. The reduced functionality, lack of updates, and lack of development support might be a real problem for people who bought the things for measurable productivity. So again, as ever and always, your mileage will vary.

Another thing that keeps me from buying one of these is that I can see that they are not going to age well. A portable device is going to get beat up, and the iPad will lose much of its Jobs-gizz-polished luster as the screen gets greasy and smudged, the case gets dinged and pitted, and then, finally — horror of horrors — the screen gets a deep corner-to-corner gouge after you read about the next generation device, drop the thing face down in shock, accidentally kick it into the next stall, and the hobo there picks it up and does who knows what with it before passing it back to you under the cubicle wall. Something as precious as the iPad just will not weather that sort of abuse. And even if it did, would you really want it back after that?

Postrel continues:

Even the “maker ethic” of do-it-yourself hobbyists depends on having the right ingredients and tools, from computers, lasers and video cameras to plywood, snaps and glue. Extraordinarily rare even among the most accomplished seamstresses, chefs and carpenters are those who spin their own fibers, thresh their own wheat or trim their own lumber — all once common skills. Rarer still is the Linux hacker who makes his own chips. Who among us can reproduce from scratch every component of a pencil or a pencil skirt? We don’t notice their magic — or the wonder of electricity or eyeglasses, anesthesia or aspirin — only because we’re used to them.

I’m not sure what to make of that. It sounds like she’s saying that hackers should revere the iPad simply because they could not make one themselves from scratch. By that logic, I should revere a shipping pallet because I could not make one from scratch — and I’m thinking beyond my lack of woodworking skills here. To Postrel, the shipping pallet should be seen as magic because I did not plant the acorn that grew into the oak that I cut down with the axe that I forged myself from ore that . . . oh, screw it, you know where I’m going with this and have better things to do with your time than to follow me there.

Postrel is missing the fact that clever people have commoditized magic: they’ve found methods to manufacture tedious or complicated things in ways that make them commonplace and disposable. It’s true that your average hacker could not build an iPad from scratch, starting from raw silicon and copper and gold and dead plankton transmogrified into petrochemicals. I mean, really, who has the time to farm plankton, wait for them to die, settle to the bottom of the ocean, be covered by sediment, be compressed through the build-up of rock strata over geological epochs — sorry, I’m doing it again. While your average hacker is not going to build an iPad from raw materials, your average hacker could probably build a world-changing application for a popular platform if that platform were open.

The article throws out the old groan about any sufficiently advanced technology being indistinguishable from magic. To those who don’t think too much about how that technology works, it certainly must seem like magic. What’s truly magical, though, is when such magic is commoditized and becomes commonplace. It goes from being a flashy-bangy trick to something that’s actually useful. Sadly, Apple is not building magic — they are building a captive audience.

Damnit. I’ve been letting this stew for a couple of days, and I can see that it’s just going to boil down to some lame bromide about how free markets and free access to products that one actually owns after paying for them are what is truly magical, but I’m just not going to go there. So I’m going to consider this done and send it off.

October 1, 2010

QotD: Principles versus positions

Filed under: Liberty, Media, Politics, Quotations — Tags: , , , — Nicholas @ 17:56

As I was explaining to an attractive young woman the other day, most of my views — my basic political commitments — have not changed in twenty years: I support freedom of expression, equality of opportunity, equal rights for women, etc. and so forth.

Twenty years ago my views were called left wing and these days my views are called fascist.

Nicholas Packwood, “True Colours”, Ghost of a Flea, 2010-10-01

September 28, 2010

Atheists and agnostics know more about religion than believers

Filed under: Religion, USA — Tags: , , — Nicholas @ 12:31

A report in the Los Angeles Times has set some tongues wagging:

Atheists, agnostics most knowledgeable about religion, survey says
Report says nonbelievers know more, on average, about religion than most faithful. Jews and Mormons also score high on the U.S. Religious Knowledge Survey.

Apparently, this is some kind of surprise. I’m not sure how, unless a lot of people really don’t know any professed atheists or agnostics.

If you want to know about God, you might want to talk to an atheist.

Heresy? Perhaps. But a survey that measured Americans’ knowledge of religion found that atheists and agnostics knew more, on average, than followers of most major faiths. In fact, the gaps in knowledge among some of the faithful may give new meaning to the term “blind faith.”

A majority of Protestants, for instance, couldn’t identify Martin Luther as the driving force behind the Protestant Reformation.

The cynic in me wonders how many of them thought the question was about Martin Luther King.

Stephen Prothero, a professor of religion at Boston University and author of “Religious Literacy: What Every American Needs to Know — And Doesn’t,” served as an advisor on the survey. “I think in general the survey confirms what I argued in the book, which is that we know almost nothing about our own religions and even less about the religions of other people,” he said.

He said he found it significant that Mormons, who are not considered Christians by many fundamentalists, showed greater knowledge of the Bible than evangelical Christians.

[Going for the cheap laughs] That’s because most Mormons can read.

The Rev. Adam Hamilton, a Methodist minister from Leawood, Kan., and the author of “When Christians Get it Wrong,” said the survey’s results may reflect a reluctance by many people to dig deeply into their own beliefs and especially into those of others.

“I think that what happens for many Christians is, they accept their particular faith, they accept it to be true and they stop examining it. Consequently, because it’s already accepted to be true, they don’t examine other people’s faiths. . . . That, I think, is not healthy for a person of any faith,” he said.

I think it’s rather that people who are brought up in a faith rarely examine it at all — your parents tell you it’s true, the religious leaders tell you it’s true, and there’s rarely any advantage to be had from opposing or questioning authority early in life. By the time you’re ready to start examining things for yourself, your religious faith is “part of you”, not something external to you. It’s such a deeply rooted part of your view of the world that most people never even consider the possibility of questioning it.

For comparison purposes, the survey also asked some questions about general knowledge, which yielded the scariest finding: 4% of Americans believe that Stephen King, not Herman Melville, wrote “Moby Dick.”

I have to assume that the writer of this article hasn’t seen very many surveys of this type: in any large number of people you can usually find 5-10% who believe in far more amazing things than mis-attributed works of popular fiction.

H/T to Cory Doctorow for the link.

August 16, 2010

Cory Doctorow on the new Robert Heinlein biography

I finished reading the first volume last night, and I can’t wait for volume two. Cory Doctorow summarizes John Clute’s review with his own observations (Clute compared Heinlein’s work to Doctorow’s):

Heinlein was notoriously recalcitrant about his early life and the two wives he was married to before his epic marriage to Virginia Heinlein. He repeatedly burned correspondence and other writings that related to that period. Clute suggests that this is partly driven by Heinlein’s desire to be Robert A Heinlein, titan of the field, without having to cope with his youthful embarrassments. It’s a good bet — lots of the stuff that drives young people to write science fiction also makes them a pain in the ass to be around until they work some of the kinks out of their system (I wholeheartedly include myself in this generalization).

It’s interesting to see his own growth, from his early priggishness (he was nicknamed “the boy general” as a plebe at the Naval Academy) which undoubtedly was not helped by his health issues and tendency to stammer. He was in the shadow of his older brother Rex Ivar for most of his youth, even following him to the Academy two years later. Rex Ivar was the favourite child in the family and Robert never seemed to be able to do as well in his parents’ eyes as the older boy.

Robert Heinlein was probably a pretty toxic individual as a teenager, based on the evidence Patterson presents — it’s pretty clear even after most of the information was sanitized by Heinlein’s third wife Virginia. Patterson never met Heinlein, and by the time he took on the biography, most of the people who knew Heinlein were fading from the scene. I think he did a very good job with the information available to him, but the biography definitely improves after the Academy years.

Patterson also puts forward a pretty comprehensive case for the idea that Heinlein’s fiction generally conveys Heinlein’s own political beliefs. This is widely acknowledged among Heinlein fans, save for a few who seem distressed by the idea that the blatant racism and sexism (especially in the earlier works) are the true beliefs of the writer at the time of writing and would prefer to believe that Heinlein didn’t write himself into his works. I got into a pretty heated debate with one such person at the Heinlein panel at the 2007 Comicon, who maintained the absurd position that Heinlein’s views could never be divined by reading his fiction — after all, his characters espouse all manner of contradictory beliefs! (To which I replied: “Yes, but the convincing arguments are always for the same set of beliefs, and the characters who challenge those beliefs are beaten in the argument.”) Not that I fault Heinlein for this — it’s an honorable tradition in SF and the mainstream of literature, and I find Heinlein’s beliefs to be nuanced and complex, anything but the reactionary caricature with which he is often dismissed.

It should be no surprise to anyone over 30 that Robert Heinlein’s political and philosophical views changed over his lifetime. This is discussed in some depth in the book, frequently from Heinlein’s own letters to friends at various points. He lost his religious views very early on (if he ever really had them, other than for conforming to familial expectations), and after leaving the Navy he was deeply involved in Upton Sinclair’s EPIC movement.

His belief in world government must have been hard to sustain, given that he had a great deal of experience of the political process, both in Kansas City during the Pendergast years, and in California with EPIC. Corruption, dirty dealing, and backroom bargaining were the way things got done, and it would be hard to believe that things would be better with a single world-wide government.

What seems to have gotten him involved in EPIC was his first-hand experience of poverty and seeing the plight of the “Okies” who’d come to California after the dust bowl wiped out so many farms in the central states. There were not enough jobs for them even after they displaced the Mexican migrant labourers, and they were ineligible for state assistance until after they’d been in California for a year. Sinclair appeared to be the only politician with any plan other than oppressing the Okies enough to force them to move on.

July 23, 2010

Define, or be defined

Filed under: Economics, Liberty, Media — Tags: , , — Nicholas @ 22:28

Jesse Walker looks at efforts to take the notion of “capitalism” and wrap it up in the more user-friendly term “free enterprise”:

[T]here’s an effort afoot to rebrand “capitalism” as “free enterprise.” On the face of it, I like the idea. Capital is going to be a central part of any modern economic system, whether or not there’s a lot of government intervention. By contrast, the phrase “free enterprise” implies economic liberty.

Unfortunately, MSNBC identifies the chief force behind the idea as the U.S. Chamber of Commerce, a group whose commitment to economic liberty is so strong that it came out for TARP, the Detroit bailout, and the 2009 stimulus. If the Chamber were more honest about its outlook, it would reject “free enterprise” for a more frank label, like “corporate welfare.” But I suspect that wouldn’t be good branding.

In the same way we had to give up the historical meaning of the word “liberal” to folks who used it to imply almost the opposite, we should probably abandon the word “capitalism”. For a start, the word was popularized by that great pamphlet writer Karl Marx, and it has a pejorative connotation to most people who hear it used. “Capitalists” are folks in top hats who ride in chauffeured limousines and have no sympathy or respect for “the working man”. Try subbing in “Plutocracy” or “Rich F*cking Bastards” and you’ll get close to the popular image of the current term.

In any argument where you try using terms that have been appropriated by your opponents, you’re already ceding the high ground. “Capitalism” is a word that comes pre-loaded with all the negativity your opponents delight in — don’t play their game by their rules!

June 28, 2010

Marketing secrets revealed

Filed under: Humour, Randomness, Technology — Tags: , , , , — Nicholas @ 09:16

An absolutely brilliant post at The Secret Diary of Steve Jobs tells you all about the reality of marketing:

It’s a pretty safe assumption that if you’re reading this blog, you’ve seen “The Matrix.” And you may or may not remember the scene where a kid explains to Neo that the trick to bending a spoon with your mind is simply to remember that, “There is no spoon.”

So it is with marketing. One thing I learned very early in life, thanks to intentional overuse of psychedelic drugs, is that there is no reality. As a guy at the commune once put it: “The reality is, there is no reality.”

So some guy says his iPhone 4 is having reception issues. I say there is no reception issue. Now it’s his reality against my reality. Which one of us is living in the real reality?

There’s a two-part answer: 1, there is no real reality, and 2, it doesn’t matter.

The only thing that matters is which reality our customers will choose to adopt as their own.

[. . .]

What I realized many years ago — and honestly, it still amazes me — is that most people are so unsure of themselves that they will think whatever we tell them to think.

So we tell people that this new phone is not just an incremental upgrade, but rather is the biggest breakthrough since the original iPhone in 2007. We say it’s incredible, amazing, awesome, mind-blowing, overwhelming, magical, revolutionary. We use these words over and over.

It’s all patently ridiculous, of course. But people believe it.

H/T to Chris Anderson for the link.
