Many years ago, when I was more reckless intellectually than I am today, I proposed the application of Haeckel’s biogenetic law — to wit, that the history of the individual rehearses the history of the species — to the domain of ideas. So applied, it leads to some superficially startling but probably quite sound conclusions, for example, that an adult poet is simply an individual in a state of arrested development — in brief, a sort of moron. Just as all of us, in utero, pass through a stage in which we are tadpoles, and almost indistinguishable from the tadpoles which afterward become frogs, so all of us pass through a stage, in our nonage, when we are poets. A youth of seventeen who is not a poet is simply a donkey: his development has been arrested even anterior to that of the tadpole. But a man of fifty who still writes poetry is either an unfortunate who has never developed, intellectually, beyond his teens, or a conscious buffoon who pretends to be something that he isn’t — something far younger and juicier than he actually is […] Something else, of course, may enter into it. The buffoonery may be partly conscious and deliberate, and partly Freudian. Many an aging man keeps on writing poetry simply because it gives him the illusion that he is still young. For the same reason, perhaps, he plays tennis, wears green cravats, and tries to convince himself that he is in love.
H.L. Mencken, “The Nature of Faith”, Prejudices, Fourth Series, 1924.
February 9, 2015
Last month, in his Times column, Matt Ridley explained why — until we discover a treatment for aging itself — rising cancer rates are a weird form of good news:
If we could prevent or cure all cancer, what would we die of? The new year has begun with a war of words over whether cancer is mostly bad luck, as suggested by a new study from Johns Hopkins School of Medicine, and over whether it’s a good way to die, compared with the alternatives, as suggested by Dr Richard Smith, a former editor of the BMJ.
It is certainly bad luck to be British and get cancer, relatively speaking. As The Sunday Times reported yesterday, survival rates after cancer diagnosis are lower here than in most developed and some developing countries, reflecting the National Health Service’s chronic problems with rationing treatment by delay. In Japan, survival rates for lung and liver cancer are three times higher than here.
Cancer is now the leading cause of death in Britain even though it is ever more survivable, with roughly half of people who contract it living long enough to die of something else. But what else? Often another cancer.
In the western world we’ve conquered most of the causes of premature death that used to kill our ancestors. War, smallpox, homicide, measles, scurvy, pneumonia, gangrene, tuberculosis, stroke, typhoid, heart disease and cholera are all much rarer, strike much later in life or are more survivable than they were fifty or a hundred years ago.
The mortality rate in men from coronary heart disease, for instance, has fallen by an amazing 80 per cent since 1968 — for all age groups. Mortality rates from stroke in both sexes have halved in 20 years. Cancer’s growing dominance of the mortality tables is not because it’s getting worse but because we are avoiding other causes of death and living longer.
It is worth remembering that some scientists and anti-pesticide campaigners in the 1960s were convinced that by now lifespans would be much shorter because of cancer caused by pesticides and other chemicals in the environment.
In the 1950s Wilhelm Hueper — a director of the US National Cancer Institute and mentor to Rachel Carson, the environmentalist author of Silent Spring — was so concerned that pesticides were causing cancer that he thought the theory that lung cancer was caused by smoking was a plot by the chemical industry to divert attention from its own culpability: “Cigarette smoking is not a major factor in the causation of lung cancer,” he insisted.
In fact it turns out that pollution causes very little cancer and cigarettes cause a lot. But aside from smoking, most cancers are indeed bad luck. The Johns Hopkins researchers found that tissues that replicate their stem cells most run the highest risk of cancer: basal skin cells do ten trillion cell divisions in a lifetime and have a million times more cancer risk than pelvic bone cells which do about a million cell divisions. Random DNA copying mistakes during cell division are “the major contributors to cancer overall, often more important than either hereditary or external environmental factors”, say the US researchers.
To sum it up, until or unless medical research finds a way to stop the bodily effects of aging, cancer becomes the most likely way for all of us to die. Cancer is a generic rather than a specific term — it’s what we use to describe the inevitable breakdown of the cellular division process that happens millions or even trillions of times over our lifetime. As Ridley puts it, “even if everybody lived in the healthiest possible way, we would still get a lot of cancer.” I’m not a scientist and I don’t even play one on TV, but I suspect that the solution to cancers of all kinds is to boost our immune systems to more quickly identify aberrant cells in our bodies before they start reproducing beyond the capability of the immune system to handle. The short- to medium-term solution to cancer may be to make us all a little bit cyborg…
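The logic of the quoted study — that cancer risk scales with the number of lifetime cell divisions because each division carries a small, roughly constant chance of a harmful copying error — can be sketched with a back-of-envelope calculation. The per-division error probability below is purely illustrative (the study doesn’t supply one); only the division counts come from the excerpt above:

```python
import math

def cumulative_error_risk(p: float, n: float) -> float:
    """P(at least one harmful copying error in n divisions),
    assuming a constant per-division probability p."""
    # log1p/expm1 keep this numerically stable when p is tiny
    return -math.expm1(n * math.log1p(-p))

p = 1e-14  # illustrative per-division error probability, not a measured value
for tissue, n in [("basal skin cells", 1e13), ("pelvic bone cells", 1e6)]:
    print(f"{tissue}: lifetime risk ~ {cumulative_error_risk(p, n):.4%}")
```

Whatever value you pick for `p`, the ten-million-fold gap in division counts between the two tissues produces an enormous gap in cumulative risk — which is the whole “bad luck” argument in miniature.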
January 25, 2015
January 3, 2015
At the wonderfully named Worthwhile Canadian Initiative blog, Frances Woolley looks at some of the ordinary human cussedness that prevents wonderfully clear and understandable economic theories from working quite as efficiently as their formulators expect:
1. Economies grow when people buy stuff.
2. Over time, people accumulate more and more stuff.
3. People can only handle so much stuff. Sock drawers get full of socks. Cupboards get full of cups. Bookshelves get full of books.
4. It’s hard to get rid of stuff. Economic models typically assume disposing of unwanted things costs nothing. But life isn’t like that. Sorting out stuff that can be tossed from stuff that is worth keeping takes time and effort.
5. People are “loss averse”. Throwing things away — clothes that don’t fit, vinyl LPs — hurts psychologically.
6. There’s no need to replace perfectly good stuff. True some stuff, like mobile phones, only lasts a year or three. But other stuff, like cast-iron frying pans, lasts for decades.
Taken together, observations 2 through 6 imply that, as people get older, they buy less and less stuff. Combined with observation 1, these observations explain why countries with aging populations experience lower rates of economic growth.
My only quibble is with the final sentence of point 3: bookshelves don’t get full … you just run out of immediate book storage options. Bookshelves are never really full, they’re just temporarily over-booked.
December 2, 2014
On his blog, Charles Stross talks about the mundane irritations and accumulated friction of a life lived past age 50 or so:
Beyond the obvious (gross physiological deterioration and pathologies of senescence), what are the psychological symptoms of ageing?
I tend to be somewhat impatient or short-tempered these days. Examples: getting worked up about people obstructing a sidewalk in front of me, or carelessly blowing smoke over their shoulder and into my face, walking while texting … you know the drill. This I put down largely to the chronic low-grade pain of the middle-aged body: joints that creak and pop, muscles that need an extra stretch, sore feet. […]
My memory, as previously noted, is a sieve. Partly I find myself living in a cluttered cognitive realm: I have so much context to apply to any new piece of incoming data. If middle-aged people seem slow at times it may not be because they’re stupid (although stupidity is a non-ageist affliction) but because they’re processing a lot more data than a young mind has on hand to digest. That shop window display? You’re not just looking at this season’s clothing fashions, but integrating changes in fashion across multiple decades and recognizing when this stuff was last new. (And if fashion is your thing, you’re trying to remember how far back in the wardrobe you hung it last time you wore it, all those years ago.) A side-effect of this: when experiencing something familiar through long repetition you forget it — you don’t remember it as a new experience but merely as an instance of a familiar one and (eventually) as nothing at all. (For those of you with a workday routine, this can cut in quite early: how well do you remember your last commute to work? If you do remember it, do you remember it only because it was exceptional — a truck nearly t-boning you, for example?)
An intersecting effect of the aches and pains and the difficulty retrieving information is that you have to focus hard on tasks — it’s hard to execute a day with six or seven distinct non-routine activities in it, because that requires planning and planning requires lots of that difficult mental integration. Planning is exhausting. Instead you focus on maintaining routines (get up, brush teeth, take meds, shave, use toilet, make coffee … check. Go to gym: check. Eat lunch: check. Work at desk: check …) and scheduling one or two exceptional tasks. Mental checklists help a lot, but you run into the sieve-shaped memory problem again: this is where digital prostheses (or an overflowing filofax) come in handy.
Your perspective on current events changes. Take the news media. Everything new is old after a time: you see the large-scale similarities across decades even without becoming a student of history. Today’s invasion or oil crisis is just like the one before last. Our current political leadership are stuck in the same ideological monkey’s-paw trap as their predecessors the last time their party was in power. And so on. So you tend to discount current events and lose interest in the news until something new happens. (If you’re wondering why I’m obsessively interested in the Scottish independence thing this year, it’s because it’s a disruptive event: nothing like it has happened in UK politics for a very long time indeed. It’s fresh.)
September 19, 2014
I recall, in the very early days of the personal computer, articles, in magazines like Personal Computer World, which expressed downright opposition to the idea of technological progress in general, and progress in personal computers in particular. There was apparently a market for such notions, in the very magazines that you would think would be most gung-ho about new technology and new computers. Maybe the general atmosphere of gung-ho-ness created a significant enough minority of malcontents that the editors felt they needed to nod regularly towards it. I guess it does make sense that the biggest grumbles about the hectic pace of technological progress would be heard right next to the places where it is happening most visibly.
Whatever the reasons were for such articles being in computer magazines, I distinctly remember their tone. I have recently, finally, got around to reading Virginia Postrel’s The Future and Its Enemies, and she clearly identifies the syndrome. The writers of these articles were scared of the future and wanted that future prevented, perhaps by law but mostly just by a sort of universal popular rejection of it, a universal desire to stop the world and to get off it. “Do we really need” (the words “we” and “need” cropped up in these PCW pieces again and again), faster central processors, more RAM, quicker printers, snazzier and bigger and sharper and more colourful screens, greater “user friendliness”, …? “Do we really need” this or that new programme that had been reported in the previous month’s issue? What significant and “real” (as opposed to frivolous and game-related) problems could there possibly be that demanded such super-powerful, super-fast, super-memorising and of course, at that time, super-expensive machines for their solution? Do we “really need” personal computers to develop, in short, in the way that they have developed, since these grumpy anti-computer-progress articles first started being published in computer progress magazines?
The usual arguments in favour of fast and powerful, and now mercifully far cheaper, computers concern the immensity of the gobs of information that can now be handled, quickly and powerfully, by machines like the ones that we have now, as opposed to what could be handled by the first wave of personal computers, which could manage a small spreadsheet or a short text file or a very primitive computer game, but very little else. And of course that is true. I can now shovel vast quantities of photographs (a particular enthusiasm of mine) hither and thither, processing the ones I feel inclined to process in ways that only Hollywood studios used to be able to do. I can make and view videos (although I mostly stick to viewing). And I can access and even myself add to that mighty cornucopia that is the internet. And so on. All true. I can remember when even the most primitive of photos would only appear on my screen after several minutes of patient or not-so-patient waiting. Videos? Dream on. Now, what a world of wonders we can all inhabit. In another quarter of a century, what wonders will there then be, all magicked in a flash into our brains and onto our desks, if we still have desks. The point is, better computers don’t just mean doing the same old things a bit faster; they mean being able to do entirely new things as well, really well.
Brian Micklethwait, “Why fast and powerful computers are especially good if you are getting old”, Samizdata, 2014-09-17.
September 7, 2014
It is my conviction that no normal man ever fell in love, within the ordinary meaning of the word, after the age of thirty. He may, at forty, pursue the female of his species with great assiduity, and he may, at fifty, sixty or even seventy, “woo” and marry a more or less fair one in due form of law, but the impulse that moves him in these follies at such ages is never the complex of outlandish illusions and hallucinations that poets describe as love. This complex is quite natural to all males between adolescence and the age of, say, twenty-five, when the kidneys begin to disintegrate. For a youth to reach twenty-one without having fallen in love in an abject and preposterous manner would be for doubts to be raised as to his normalcy. But if he does it after his wisdom teeth are cut, it is no more than a sign that they have been cut in vain — that he is still in his teens, whatever his biological and legal age. Love, so-called, is based upon a view of women that is impossible to any man who has any experience of them. Such a man may, to the end of his life, enjoy the charm of their society, and even respect them and admire them, but, however much he respects and admires them, he nevertheless sees them more or less clearly, and seeing them clearly is fatal to true romance. Find a man of forty-five who heaves and moans over a woman, however amiable and lovely, in the manner of a poet and you will behold either a man who ceased to develop intellectually at twenty-four or thereabout, or a fraud who has his eye on the lands, tenements and hereditaments of the lady’s deceased first husband. Or upon her talents as nurse, or cook, amanuensis and audience. This, no doubt, is what George Bernard Shaw meant when he said that every man over forty is a scoundrel.
H.L. Mencken, “The Nature of Faith”, Prejudices, Fourth Series, 1924.
August 17, 2014
Of the many problems discussed and solved in this work, it is proper that the question of retirement should be left to the last. It has been the subject of many commissions of inquiry but the evidence heard has always been hopelessly conflicting and the final recommendations muddled, inconclusive, and vague. Ages of compulsory retirement are fixed at points varying from 55 to 75, all being equally arbitrary and unscientific. Whatever age has been decreed by accident and custom can be defended by the same argument. Where the retirement age is fixed at 65 the defenders of this system will always have found, by experience, that the mental powers and energy show signs of flagging at the age of 62. This would be a most useful conclusion to have reached had not a different phenomenon been observed in organizations where the age of retirement has been fixed at 60. There, we are told, people are found to lose their grip, in some degree, at the age of 57. As against that, men whose retiring age is 55 are known to be past their best at 52. It would seem, in short, that efficiency declines at the age of R minus 3, irrespective of the age at which R has been fixed. This is an interesting fact in itself but not directly helpful when it comes to deciding what the R age is to be.
C. Northcote Parkinson, “Pension Point, Or The Age Of Retirement”, Parkinson’s Law (and other studies in administration), 1957.
August 14, 2014
The widespread perception that almost everyone else was a moron — why, just look at the things people post and say on the Internet! — would facilitate a certain philosophy of narcissism; we would have people walking around convinced they’re much smarter, and much more sophisticated and enlightened, than everyone else.
Marinating in the perception that most people are stupid, hateful, sick, and needlessly cruel would undoubtedly alter people’s aspirations and ambitions in life. Why strive to create a new invention, miracle cure, remarkable technology, or wondrous innovation to help the masses? It would be pearls before swine, a gift to a thoroughly undeserving population that had earned its miserable circumstances. The hopeless ignorance and hateful philosophies of the great unwashed might, however, spur quiet calls for the restoration of a properly thinking aristocracy to help steer society in the correct direction.
If we wanted to build a society designed to promote depression, we would want to make children seem like a burden. Children are a smaller, slightly altered version of ourselves; Christopher Hitchens described parenthood as “realizing that your heart is running around in somebody else’s body.” To hate life, you have to hate children. If they are a form of immortality — half of our genetic code and half of our habits, good and ill, walking around a generation later — then a depressive society would condition its members to hate the possibilities of their future.
If we wanted to build a society designed to promote depression, we would want to make old age seem to be a horrible fate. (It is the only alternative to death!) Our depressive society would want to not merely celebrate youth, but we would want to constantly reinforce the sense that one is approaching mental and physical obsolescence. A celebrity who appeared much younger than her years would be celebrated and everyone would openly demand to know her secret. The unspoken expectation would be that anyone could achieve the same result if she simply tried hard enough. We would exclaim, “Man, he’s getting old!” in response to those who didn’t look the same as when we first saw them.
We would want to make sure that appearances not merely counted, but that attractiveness is preeminent. That anonymous and yet public realm of the Internet would ensure that anyone in the world could safely mock the appearance of others to a public audience and then return to picking Cheetos out of his chest hair.
Jim Geraghty, “Robin Williams and Our Strange Times: Does our society set the stage for depression?”, National Review, 2014-08-12.
May 11, 2014
It is not, naturally and generally, the happy who are most anxious either for prolongation of the present life or for a life hereafter; it is those who never have been happy. Those who have had their happiness can bear to part with existence, but it is hard to die without ever having lived.
John Stuart Mill, Three Essays on Religion, 1874
April 28, 2014
I’m interested in life extension … I have no particular hankering to die any time soon, although I admit there is some truth in the aphorism “Many wish for immortality who don’t know how to spend a rainy Sunday afternoon”. Ray Kurzweil wants immortality, and he’s doing what he can to make that happen:
Ray Kurzweil — futurist, inventor, entrepreneur, bestselling author, and now, director of engineering at Google — wants to live forever. He’s working to make it happen. Kurzweil, whose many inventions include the first optical character recognition software (which transforms the written word into data) and the first text-to-speech synthesizer, spoke to Maclean’s for our annual Rethink issue about why we’re on the brink of a technological revolution — one that will improve our health and our lives, even after the robots outsmart us for good.
Q: You say we’re in the midst of a “grand transformation” in the field of medicine. What do you see happening today?
A: Biology is a software process. Our bodies are made up of trillions of cells, each governed by this process. You and I are walking around with outdated software running in our bodies, which evolved in a very different era. We each have a fat insulin receptor gene that says, “Hold on to every calorie.” That was a very good idea 10,000 years ago, when you worked all day to get a few calories; there were no refrigerators, so you stored them in your fat cells. I would like to tell my fat insulin receptor gene, “You don’t need to do that anymore,” and indeed that was done at the Joslin Diabetes Center. They turned off this gene, and the [lab mice] ate ravenously and remained slim. They didn’t get diabetes; they didn’t get heart disease. They lived 20 per cent longer. They’re working with a drug company to bring that to market.
Life expectancy was 20 a thousand years ago, and 37 two hundred years ago. We’re now able to reprogram health and medicine as software, and that [pace is] going to continue to accelerate. We’re treating biology, and by extension health and medicine, as an information technology. Our intuition about how progress will unfold is linear, but information technology progresses exponentially, not linearly. My Android phone is literally several billion times more powerful, per dollar, than the computer I used when I was a student. And it’s also 100,000 times smaller. We’ll do both of those things again in 25 years. It’ll be a billion times more powerful, and will be the size of a blood cell.
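As a quick sanity check on the exponential claim above (my arithmetic, not Kurzweil’s): a billion-fold improvement over 25 years implies a surprisingly short doubling time.

```python
import math

# A billion-fold gain in 25 years: how many doublings, and how often?
factor = 1e9
years = 25

doublings = math.log2(factor)                  # ~30 doublings
months_per_doubling = years * 12 / doublings   # ~10 months per doubling

print(f"{doublings:.1f} doublings, i.e. one every {months_per_doubling:.1f} months")
```

A doubling roughly every ten months is in the neighbourhood of the classic Moore’s-law cadence, which is why the “we’ll do it again in 25 years” claim isn’t as wild as it first sounds.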
April 17, 2014
As your body staggers down the winding road to death, user interfaces that require fighter pilot-grade eyesight, the dexterity of a neurosurgeon, and the mental agility of Derren Brown, are going to screw with you at some point.
Don’t kid yourself otherwise — disability, in one form or another, can strike at any moment.
Given that people are proving ever harder to kill off, you can expect to have decades of life ahead of you — during which you’ll be battling to figure out where on the touchscreen that trendy transdimensional two-pixel wide “OK” button is hiding.
Can you believe, people born today will spend their entire lives having to cope with this crap? The only way I can explain the web design of many Google products today is that some wannabe Picasso stole Larry Page’s girl when they were all 13, and is only now exacting his revenge. Nobody makes things that bad by accident, surely?
Dominic Connor, “Is tech the preserve of the young able-bodied? Let’s talk over a fine dinner and claret”, The Register, 2014-04-17
March 25, 2014
Noam Scheiber examines the fanatic devotion to youth in (some parts of) the high tech culture:
Silicon Valley has become one of the most ageist places in America. Tech luminaries who otherwise pride themselves on their dedication to meritocracy don’t think twice about deriding the not-actually-old. “Young people are just smarter,” Facebook CEO Mark Zuckerberg told an audience at Stanford back in 2007. As I write, the website of ServiceNow, a large Santa Clara–based I.T. services company, features the following advisory in large letters atop its “careers” page: “We Want People Who Have Their Best Work Ahead of Them, Not Behind Them.”
And that’s just what gets said in public. An engineer in his forties recently told me about meeting a tech CEO who was trying to acquire his company. “You must be the token graybeard,” said the CEO, who was in his late twenties or early thirties. “I looked at him and said, ‘No, I’m the token grown-up.’”
Investors have also become addicted to the youth movement:
The economics of the V.C. industry help explain why. Investing in new companies is fantastically risky, and even the best V.C.s fail a large majority of the time. That makes it essential for the returns on successes to be enormous. Whereas a 500 percent return on a $2 million investment (or “5x,” as it’s known) would be considered remarkable in any other line of work, the investments that sustain a large V.C. fund are the “unicorns” and “super-unicorns” that return 100x or 1,000x — the Googles and the Facebooks.
And this is where finance meets what might charitably be called sociology but is really just Silicon Valley mysticism. Finding themselves in the position of chasing 100x or 1,000x returns, V.C.s invariably tell themselves a story about youngsters. “One of the reasons they collectively prefer youth is because youth has the potential for the black swan,” one V.C. told me of his competitors. “It hasn’t been marked down to reality yet. If I was at Google for five years, what’s the chance I would be a black swan? A lot lower than if you never heard of me. That’s the collective mentality.”
Some of the corporate cultures sound more like playgroups than workgroups:
Whatever the case, the veneration of youth in Silicon Valley now seems way out of proportion to its usefulness. Take Dropbox, which an MIT alumnus named Drew Houston co-founded in 2007, after he got tired of losing access to his files whenever he forgot a thumb drive. Dropbox quickly caught on among users and began to vacuum up piles of venture capital. But the company has never quite outgrown its dorm-room vibe, even now that it houses hundreds of employees in an 85,000-square-foot space. Dropbox has a full-service jamming studio and observes a weekly ritual known as whiskey Fridays. Job candidates have complained about being interviewed in conference rooms with names like “The Break-up Room” and the “Bromance Chamber.” (A spokesman says the names were recently changed.)
Once a year, Houston, who still wears his chunky MIT class ring, presides over “Hack Week,” during which Dropbox headquarters turns into the world’s best-capitalized rumpus room. Employees ride around on skateboards and scooters, play with Legos at all hours, and generally tool around with whatever happens to interest them, other than work, which they are encouraged to set aside. “I’ve been up for about forty hours working on Dropbox Jeopardy,” one engineer told a documentarian who filmed a recent Hack Week. “It’s close to nearing insanity, but it feels worth it.”
It’s safe to say that the reigning sensibility at Dropbox has conquered more or less every corner of the tech world. The ping-pong playing can be ceaseless. The sexual mores are imported from college — “They’ll say something like, ‘This has been such a long day. I have to go out and meet some girls, hook up tonight,’ ” says one fortysomething consultant to several start-ups. And the vernacular is steroidally bro-ish. Another engineer in his forties who recently worked at a crowdsourcing company would steel himself anytime he reviewed a colleague’s work. “In programming, you need a throw-away variable,” the engineer explained to me. “So you come up with something quick.” With his co-workers “it would always be ‘dong’ this, ‘dick’ that, ‘balls’ this.”
There’s also the blind spot about having too many youth-focussed firms in the same market:
The most common advice V.C.s give entrepreneurs is to solve a problem they encounter in their daily lives. Unfortunately, the problems the average 22-year-old male programmer has experienced are all about being an affluent single guy in Northern California. That’s how we’ve ended up with so many games (Angry Birds, Flappy Bird, Crappy Bird) and all those apps for what one start-up founder described to me as cooler ways to hang out with friends on a Saturday night.
H/T to Kathy Shaidle for the link.