… Rolling Stone morphed into AARP Magazine so slowly, I hardly even noticed.
Ed Driscoll, “Plutocrat Millionaires Insult Military Veterans”, PJ Media, 2014-11-14.
November 19, 2015
Matt Ridley on recent developments in the search for ways to ameliorate the effects of aging:
Squeezed between falling birth rates and better healthcare, the world population is getting rapidly older. Learning how to deal with that is one of the great challenges of this century. The World Health Organisation has just produced a report on the implications of an ageing population, which — inadvertently — reveals a dismal fatalism we share about the illnesses of old age: that they will always be inevitable.
This could soon be wrong. A new book published in America this week, The Telomerase Revolution, by the doctor and medical researcher Michael Fossel, argues that we now understand enough about the fundamental cause of ageing to be confident that we will eventually be able to reverse it. This would mean curing diseases such as Alzheimer’s, heart disease and osteoporosis, rather than coping with them or treating their symptoms.
Let me show you what I mean about fatalism. The WHO report on ageing and health, for all its talk of the need for “profound changes” to health care for the elderly, actually urges us to stop trying to cure the afflictions of old age and learn to live with them: “The societal response to population ageing will require a transformation of health systems that moves away from disease-based curative models and towards the provision of older-person-centred and integrated care.”
Yet it also subscribes to the somewhat magical hope that illnesses of old age can be “prevented or delayed by engaging in healthy behaviours” and that “physical activity and good nutrition can have powerful benefits for health and wellbeing.” This is largely wishful thinking. There is no evidence that, say, Alzheimer’s can be prevented by a certain diet or activity. A lack of activity and poor nutrition can worsen health at any age, but the underlying chronic diseases of old age are caused by age itself.
When I asked Dr Fossel what he thought of the WHO report, he replied: “In 1950 we could have talked (and did) about ‘active polio’ in the sense of keeping polio victims active rather than giving up, but the very phrase itself implies that one has already given up. I would prefer that we cure the fundamental problem. Why talk about ‘active ageing’, ‘successful ageing’, and ‘healthy ageing’ when we could talk about not ageing?”
November 4, 2015
The older I get, the better I understand the saying “you can’t teach an old dog new tricks.” I used to think this referred to some weakness of the mind or obstinacy, which I rejected as foolish and even cruel.
I’ve come to understand that saying differently. The older you get, the less patience, time, and energy you have for new things. You’ve seen decades of new things and are beginning to tire of their novelty. You only have so much time, and most of it is taken up with the rest of your life. And you have less energy to spend on something new.
In addition, the older you get, the more experience you have. Starting to learn a new operating system at 20 seems like just a matter of picking up some new tricks, but at 50 you realize just how long it’s going to take and how annoying it’s going to be after the previous 5 times through that process. And sometimes it feels like this old Far Side cartoon, where you’ve filled your mind up with 50+ years of stuff like old phone numbers, how to call information on a rotary phone, and the name of that character on Adam-12.
So it’s not so much that you can’t teach an old dog new tricks. It’s that the old dog has been through this once too often and has better things to do.
Christopher Taylor, “OLD DOGS”, Word Around the Net, 2014-10-20.
October 4, 2015
Believe it or not, the end of the seemingly eternal federal election is finally in sight. We’re getting to the wind-up stage of the campaign and we can now expect certain evergreen political topics to be discussed as we wearily struggle down to the wire. Colby Cosh covers one of the biggest “issues” of every federal election:
The parties are running low on ammunition in the election that never ends, and I can sense, like a tracker laying an ear to the ground, the approach of conversations about demographics and the getting-out of the vote. With this campaign sub-season — suitably located in the autumn — will come talk of “gray power”; dread of the Conservative advantage among bigoted, ornery, vote-crazy oldies; and, above all, the suffocating hatred of the young toward the liver-spotted hands that grip our levers of power and ward off change.
I rarely speak of Baby Boomers without a generous helping of contemptuous spittle. But the great equalizers, pain and death and dementia and distraction, are now starting to take them. The people I call Turnout Nerds obsess over youth voting: it seems unnatural to them, even revolting, that fewer than half of people under 35 bother to struggle to the polls, choosing to deny us their breezy new ideas and their orientation toward the future. (Not that I can see much actual evidence of either quality.)
They do not talk much about what happens to voter turnout once Canadians have passed their peak propensity to vote, which arrives, according to the official estimates for the 2011 election, at the age of 67. The graph, it turns out, looks like a skewed triangle. Voters in the age cohorts from 20-25 had less than 40 per cent turnout in 2011. There is a slow linear climb from there; turnout passes 50 per cent in the mid-30s, 60 per cent in the mid-40s, 70 per cent on the cusp of age 60. It rises to above 75 per cent at about the traditional retirement age.
But the dropoff in turnout from there is steeper than the rise — and how else could it be, given arthritis and lumbago and the other cruel facts of late life? And by age 67, according to an insurance man’s icy “life tables,” more than one per cent of the population is dying every year. If you adjust for mortality, and imagine a hypothetical pool of Canadian voters starting out at age 18, the estimated age at which the highest number of the original group will be voting isn’t 67; it’s more like a flat peak between the ages of 59 and 64. After that, coronaries start taking away more voters than enthusiasm is adding.
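Cosh’s adjustment is simple enough to reproduce. Here is a minimal Python sketch, with turnout and mortality curves stylized to fit the rough figures in the quote (turnout climbing from under 40 per cent in the early twenties to the high seventies at 67, annual mortality passing one per cent in the early sixties); the specific numbers are my illustrative assumptions, not the actual Elections Canada estimates or life-table data, but the peak of the product lands in the same flat late-50s-to-mid-60s band Cosh describes.

```python
# Illustrative sketch of the mortality adjustment described above.
# The curves are stylized fits to the rough figures in the quote,
# not actual Elections Canada estimates or life-table data.

def turnout(age):
    """Turnout: slow linear climb to a peak at 67, steeper fall afterward."""
    if age <= 67:
        return 0.38 + (age - 20) * 0.0085   # ~40% in the early 20s, ~78% at 67
    return max(0.0, 0.78 - (age - 67) * 0.02)

def survival(age):
    """Fraction of an 18-year-old cohort still alive at a given age,
    assuming an annual death rate that rises past 1% in the early 60s."""
    alive = 1.0
    for a in range(18, age):
        alive *= 1.0 - min(0.5, 0.001 * 1.06 ** (a - 18))
    return alive

# Effective voting weight of the original cohort at each age:
effective = {a: turnout(a) * survival(a) for a in range(18, 95)}
peak = max(effective, key=effective.get)
print(f"Turnout alone peaks at 67; turnout times survival peaks near {peak}.")
```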
August 25, 2015
Many years ago, when I was more reckless intellectually than I am today, I proposed the application of Haeckel’s biogenetic law — to wit, that the history of the individual rehearses the history of the species — to the domain of ideas. So applied, it leads to some superficially startling but probably quite sound conclusions, for example, that an adult poet is simply an individual in a state of arrested development — in brief, a sort of moron. Just as all of us, in utero, pass through a stage in which we are tadpoles, and almost indistinguishable from the tadpoles which afterward become frogs, so all of us pass through a stage, in our nonage, when we are poets. A youth of seventeen who is not a poet is simply a donkey: his development has been arrested even anterior to that of the tadpole. But a man of fifty who still writes poetry is either an unfortunate who has never developed, intellectually, beyond his teens, or a conscious buffoon who pretends to be something that he isn’t — something far younger and juicier than he actually is […] Something else, of course, may enter into it. The buffoonery may be partly conscious and deliberate, and partly Freudian. Many an aging man keeps on writing poetry simply because it gives him the illusion that he is still young. For the same reason, perhaps, he plays tennis, wears green cravats, and tries to convince himself that he is in love.
H.L. Mencken, “The Nature of Faith”, Prejudices, Fourth Series, 1924.
February 9, 2015
Last month, in his Times column, Matt Ridley explained why — until we discover a treatment for aging itself — rising cancer rates are a weird form of good news:
If we could prevent or cure all cancer, what would we die of? The new year has begun with a war of words over whether cancer is mostly bad luck, as suggested by a new study from Johns Hopkins School of Medicine, and over whether it’s a good way to die, compared with the alternatives, as suggested by Dr Richard Smith, a former editor of the BMJ.
It is certainly bad luck to be British and get cancer, relatively speaking. As The Sunday Times reported yesterday, survival rates after cancer diagnosis are lower here than in most developed and some developing countries, reflecting the National Health Service’s chronic problems with rationing treatment by delay. In Japan, survival rates for lung and liver cancer are three times higher than here.
Cancer is now the leading cause of death in Britain even though it is ever more survivable, with roughly half of people who contract it living long enough to die of something else. But what else? Often another cancer.
In the western world we’ve conquered most of the causes of premature death that used to kill our ancestors. War, smallpox, homicide, measles, scurvy, pneumonia, gangrene, tuberculosis, stroke, typhoid, heart disease and cholera are all much rarer, strike much later in life or are more survivable than they were fifty or a hundred years ago.
The mortality rate in men from coronary heart disease, for instance, has fallen by an amazing 80 per cent since 1968 — for all age groups. Mortality rates from stroke in both sexes have halved in 20 years. Cancer’s growing dominance of the mortality tables is not because it’s getting worse but because we are avoiding other causes of death and living longer.
It is worth remembering that some scientists and anti-pesticide campaigners in the 1960s were convinced that by now lifespans would be much shorter because of cancer caused by pesticides and other chemicals in the environment.
In the 1950s Wilhelm Hueper — a director of the US National Cancer Institute and mentor to Rachel Carson, the environmentalist author of Silent Spring — was so concerned that pesticides were causing cancer that he thought the theory that lung cancer was caused by smoking was a plot by the chemical industry to divert attention from its own culpability: “Cigarette smoking is not a major factor in the causation of lung cancer,” he insisted.
In fact it turns out that pollution causes very little cancer and cigarettes cause a lot. But aside from smoking, most cancers are indeed bad luck. The Johns Hopkins researchers found that tissues that replicate their stem cells most run the highest risk of cancer: basal skin cells do ten trillion cell divisions in a lifetime and have a million times more cancer risk than pelvic bone cells which do about a million cell divisions. Random DNA copying mistakes during cell division are “the major contributors to cancer overall, often more important than either hereditary or external environmental factors”, say the US researchers.
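One back-of-envelope way to see why division counts matter so much (my gloss on the finding, not a formula from the Johns Hopkins paper): suppose each stem-cell division carries some tiny, independent probability \(p\) of a cancer-initiating copying error. Then the chance of at least one such error over \(N\) lifetime divisions is

\[ 1 - (1 - p)^{N} \;\approx\; Np \qquad \text{when } Np \ll 1, \]

so to a first approximation the risk grows in step with the division count itself, which is why high-turnover tissues dominate the statistics before heredity or environment enters the picture at all.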
To sum it up, until or unless medical research finds a way to stop the bodily effects of aging, cancer becomes the most likely way for all of us to die. Cancer is a generic rather than a specific term — it’s what we use to describe the inevitable breakdown of the cellular division process that happens millions or even trillions of times over our lifetimes. As Ridley puts it, “even if everybody lived in the healthiest possible way, we would still get a lot of cancer.” I’m not a scientist and I don’t even play one on TV, but I suspect that the solution to cancers of all kinds is to boost our immune systems to identify aberrant cells more quickly, before they reproduce beyond the immune system’s capacity to cope. The short- to medium-term solution to cancer may be to make us all a little bit cyborg…
January 3, 2015
At the wonderfully named Worthwhile Canadian Initiative blog, Frances Woolley looks at some of the ordinary human cussedness that prevents wonderfully clear and understandable economic theories from working quite as efficiently as their formulators expect:
1. Economies grow when people buy stuff.
2. Over time, people accumulate more and more stuff.
3. People can only handle so much stuff. Sock drawers get full of socks. Cupboards get full of cups. Bookshelves get full of books.
4. It’s hard to get rid of stuff. Economic models typically assume disposing of unwanted things costs nothing. But life isn’t like that. Sorting out stuff that can be tossed from stuff that is worth keeping takes time and effort.
5. People are “loss averse”. Throwing things away — clothes that don’t fit, vinyl LPs — hurts psychologically.
6. There’s no need to replace perfectly good stuff. True, some stuff, like mobile phones, only lasts a year or three. But other stuff, like cast-iron frying pans, lasts for decades.
Taken together, observations 2 through 6 imply that, as people get older, they buy less and less stuff. Combined with observation 1, these observations explain why countries with aging populations experience lower rates of economic growth.
My only quibble is with the final sentence of point 3: bookshelves don’t get full … you just run out of immediate book storage options. Bookshelves are never really full; they’re just temporarily over-booked.
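Woolley’s chain of reasoning also lends itself to a toy calculation. The sketch below is my own illustration, not her model: invent a propensity-to-buy curve that falls with age (observations 2 through 6), weight it by two hypothetical age distributions, and the older population generates visibly less demand for new stuff (observation 1).

```python
# Toy illustration (my own sketch, not Woolley's model): if spending on
# new stuff declines with age, an older population buys less in aggregate.

def spending(age):
    """Stylized propensity to buy new stuff, falling once the shelves fill up."""
    return 1.0 if age < 40 else max(0.2, 1.0 - 0.02 * (age - 40))

# Two hypothetical age distributions (population shares summing to 1):
young_country = {25: 0.40, 45: 0.35, 65: 0.20, 80: 0.05}
old_country = {25: 0.20, 45: 0.30, 65: 0.35, 80: 0.15}

def demand_index(pop):
    """Aggregate demand as the share-weighted average propensity to buy."""
    return sum(share * spending(age) for age, share in pop.items())

print(f"young country demand index: {demand_index(young_country):.2f}")
print(f"old country demand index:   {demand_index(old_country):.2f}")
```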
December 2, 2014
On his blog, Charles Stross talks about the mundane irritations and accumulated friction of a life lived past age 50 or so:
Beyond the obvious (gross physiological deterioration and pathologies of senescence), what are the psychological symptoms of ageing?
I tend to be somewhat impatient or short-tempered these days. Examples: getting worked up about people obstructing a sidewalk in front of me, or carelessly blowing smoke over their shoulder and into my face, walking while texting … you know the drill. This I put down largely to the chronic low-grade pain of the middle-aged body: joints that creak and pop, muscles that need an extra stretch, sore feet. […]
My memory, as previously noted, is a sieve. Partly I find myself living in a cluttered cognitive realm: I have so much context to apply to any new piece of incoming data. If middle-aged people seem slow at times it may not be because they’re stupid (although stupidity is a non-ageist affliction) but because they’re processing a lot more data than a young mind has on hand to digest. That shop window display? You’re not just looking at this season’s clothing fashions, but integrating changes in fashion across multiple decades and recognizing when this stuff was last new. (And if fashion is your thing, you’re trying to remember how far back in the wardrobe you hung it last time you wore it, all those years ago.) A side-effect of this: when experiencing something familiar through long repetition you forget it — you don’t remember it as a new experience but merely as an instance of a familiar one and (eventually) as nothing at all. (For those of you with a workday routine, this can cut in quite early: how well do you remember your last commute to work? If you do remember it, do you remember it only because it was exceptional — a truck nearly t-boning you, for example?)
An intersecting effect of the aches and pains and the difficulty retrieving information is that you have to focus hard on tasks — it’s hard to execute a day with six or seven distinct non-routine activities in it, because that requires planning and planning requires lots of that difficult mental integration. Planning is exhausting. Instead you focus on maintaining routines (get up, brush teeth, take meds, shave, use toilet, make coffee … check. Go to gym: check. Eat lunch: check. Work at desk: check …) and scheduling one or two exceptional tasks. Mental checklists help a lot, but you run into the sieve-shaped memory problem again: this is where digital prostheses (or an overflowing filofax) come in handy.
Your perspective on current events changes. Take the news media. Everything new is old after a time: you see the large-scale similarities across decades even without becoming a student of history. Today’s invasion or oil crisis is just like the one before last. Our current political leadership are stuck in the same ideological monkey’s-paw trap as their predecessors the last time their party was in power. And so on. So you tend to discount current events and lose interest in the news until something new happens. (If you’re wondering why I’m obsessively interested in the Scottish independence thing this year, it’s because it’s a disruptive event: nothing like it has happened in UK politics for a very long time indeed. It’s fresh.)
September 19, 2014
I recall, in the very early days of the personal computer, articles, in magazines like Personal Computer World, which expressed downright opposition to the idea of technological progress in general, and progress in personal computers in particular. There was apparently a market for such notions, in the very magazines that you would think would be most gung-ho about new technology and new computers. Maybe the general atmosphere of gung-ho-ness created a significant enough minority of malcontents that the editors felt they needed to nod regularly towards it. I guess it does make sense that the biggest grumbles about the hectic pace of technological progress would be heard right next to the places where it is happening most visibly.
Whatever the reasons were for such articles being in computer magazines, I distinctly remember their tone. I have recently, finally, got around to reading Virginia Postrel’s The Future and Its Enemies, and she clearly identifies the syndrome. The writers of these articles were scared of the future and wanted that future prevented, perhaps by law but mostly just by a sort of universal popular rejection of it, a universal desire to stop the world and to get off it. “Do we really need” (the words “we” and “need” cropped up in these PCW pieces again and again), faster central processors, more RAM, quicker printers, snazzier and bigger and sharper and more colourful screens, greater “user friendliness”, …? “Do we really need” this or that new programme that had been reported in the previous month’s issue? What significant and “real” (as opposed to frivolous and game-related) problems could there possibly be that demanded such super-powerful, super-fast, super-memorising and of course, at that time, super-expensive machines for their solution? Do we “really need” personal computers to develop, in short, in the way that they have developed, since these grumpy anti-computer-progress articles first started being published in computer progress magazines?
The usual arguments in favour of fast and powerful, and now mercifully far cheaper, computers concern the immensity of the gobs of information that can now be handled, quickly and powerfully, by machines like the ones that we have now, as opposed to what could be handled by the first wave of personal computers, which could manage a small spreadsheet or a short text file or a very primitive computer game, but very little else. And of course that is true. I can now shovel vast quantities of photographs (a particular enthusiasm of mine) hither and thither, processing the ones I feel inclined to process in ways that only Hollywood studios used to be able to do. I can make and view videos (although I mostly stick to viewing). And I can access and even myself add to that mighty cornucopia that is the internet. And so on. All true. I can remember when even the most primitive of photos would only appear on my screen after several minutes of patient or not-so-patient waiting. Videos? Dream on. Now, what a world of wonders we can all inhabit. In another quarter of a century, what wonders will there then be, all magicked in a flash into our brains and onto our desks, if we still have desks. The point is, better computers don’t just mean doing the same old things a bit faster; they mean being able to do entirely new things as well, really well.
Brian Micklethwait, “Why fast and powerful computers are especially good if you are getting old”, Samizdata, 2014-09-17.
September 7, 2014
It is my conviction that no normal man ever fell in love, within the ordinary meaning of the word, after the age of thirty. He may, at forty, pursue the female of his species with great assiduity, and he may, at fifty, sixty or even seventy, “woo” and marry a more or less fair one in due form of law, but the impulse that moves him in these follies at such ages is never the complex of outlandish illusions and hallucinations that poets describe as love. This complex is quite natural to all males between adolescence and the age of, say, twenty-five, when the kidneys begin to disintegrate. For a youth to reach twenty-one without having fallen in love in an abject and preposterous manner would be for doubts to be raised as to his normalcy. But if he does it after his wisdom teeth are cut, it is no more than a sign that they have been cut in vain — that he is still in his teens, whatever his biological and legal age. Love, so-called, is based upon a view of women that is impossible to any man who has any experience of them. Such a man may, to the end of his life, enjoy the charm of their society, and even respect them and admire them, but, however much he respects and admires them, he nevertheless sees them more or less clearly, and seeing them clearly is fatal to true romance. Find a man of forty-five who heaves and moans over a woman, however amiable and lovely, in the manner of a poet and you will behold either a man who ceased to develop intellectually at twenty-four or thereabout, or a fraud who has his eye on the lands, tenements and hereditaments of the lady’s deceased first husband. Or upon her talents as nurse, or cook, amanuensis and audience. This, no doubt, is what George Bernard Shaw meant when he said that every man over forty is a scoundrel.
H.L. Mencken, “The Nature of Faith”, Prejudices, Fourth Series, 1924.
August 17, 2014
Of the many problems discussed and solved in this work, it is proper that the question of retirement should be left to the last. It has been the subject of many commissions of inquiry but the evidence heard has always been hopelessly conflicting and the final recommendations muddled, inconclusive, and vague. Ages of compulsory retirement are fixed at points varying from 55 to 75, all being equally arbitrary and unscientific. Whatever age has been decreed by accident and custom can be defended by the same argument. Where the retirement age is fixed at 65 the defenders of this system will always have found, by experience, that the mental powers and energy show signs of flagging at the age of 62. This would be a most useful conclusion to have reached had not a different phenomenon been observed in organizations where the age of retirement has been fixed at 60. There, we are told, people are found to lose their grip, in some degree, at the age of 57. As against that, men whose retiring age is 55 are known to be past their best at 52. It would seem, in short, that efficiency declines at the age of R minus 3, irrespective of the age at which R has been fixed. This is an interesting fact in itself but not directly helpful when it comes to deciding what the R age is to be.
C. Northcote Parkinson, “Pension Point, Or The Age Of Retirement”, Parkinson’s Law (and other studies in administration), 1957.
August 14, 2014
The widespread perception that almost everyone else was a moron — why, just look at the things people post and say on the Internet! — would facilitate a certain philosophy of narcissism; we would have people walking around convinced they’re much smarter, and much more sophisticated and enlightened, than everyone else.
Marinating in the perception that most people are stupid, hateful, sick, and needlessly cruel would undoubtedly alter people’s aspirations and ambitions in life. Why strive to create a new invention, miracle cure, remarkable technology, or wondrous innovation to help the masses? It would be pearls before swine, a gift to a thoroughly undeserving population that had earned its miserable circumstances. The hopeless ignorance and hateful philosophies of the great unwashed might, however, spur quiet calls for the restoration of a properly thinking aristocracy to help steer society in the correct direction.
If we wanted to build a society designed to promote depression, we would want to make children seem like a burden. Children are a smaller, slightly altered version of ourselves; Christopher Hitchens described parenthood as “realizing that your heart is running around in somebody else’s body.” To hate life, you have to hate children. If they are a form of immortality — half of our genetic code and half of our habits, good and ill, walking around a generation later — then a depressive society would condition its members to hate the possibilities of their future.
If we wanted to build a society designed to promote depression, we would want to make old age seem to be a horrible fate. (It is the only alternative to death!) Our depressive society would want to not merely celebrate youth, but we would want to constantly reinforce the sense that one is approaching mental and physical obsolescence. A celebrity who appeared much younger than her years would be celebrated and everyone would openly demand to know her secret. The unspoken expectation would be that anyone could achieve the same result if she simply tried hard enough. We would exclaim, “Man, he’s getting old!” in response to those who didn’t look the same as when we first saw them.
We would want to make sure that appearances not merely counted, but that attractiveness is preeminent. That anonymous and yet public realm of the Internet would ensure that anyone in the world could safely mock the appearance of others to a public audience and then return to picking Cheetos out of his chest hair.
Jim Geraghty, “Robin Williams and Our Strange Times: Does our society set the stage for depression?”, National Review, 2014-08-12.