Published on 9 Mar 2015
Dr. Barbara Vreede sheds light on the evolutionary mechanisms behind alternative medicine.
BAHFest is the Festival of Bad Ad Hoc Hypotheses, a celebration of well-researched, logically explained, and clearly wrong evolutionary theory.
March 17, 2015
March 10, 2015
March 8, 2015
Maggie McNeill can definitely confirm that your great-grandparents knew more about sex than you give them credit for:
Every generation thinks it invented sex, or at least non-vanilla sex. And I don’t just mean teenagers who are squicked out by the idea of their parents shagging, either; among vanilla folk and/or those outside the demimonde, the delusion seems to persist through life that nearly everybody who lived before a moving line (hovering like a will-o’-the-wisp exactly at the year the believer reached puberty) only had missionary-position sex for the purpose of procreation. Even if the individual is familiar with the Kama Sutra, knows about classical Greek pederasty or has seen the menu of a Victorian brothel, these are likely to be dismissed as islands of kink in a vast sea of unsweetened vanilla custard stretching back into prehistory.

Even doctors quoted in newspaper articles are wont to make incredibly stupid, totally wrong statements like “the concept of having oral sex is something that seems less obscure to you than it did to your parents or grandparents.” Well, my dears, I’m old enough to have given birth to many of you reading this, and I can assure you that oral sex was not remotely “obscure” to us in those long-ago and far-off days of the early ’80s; nor was it “obscure” to any of the older men I trysted with in my late teens, many of whom are now old enough to be your grandfathers; nor was it “obscure” to my own grandparents’ generation, who came of age in the Roaring Twenties; nor to the 5.5% or more of the female population who worked as whores in every large city of the world in the 19th century, nor the 70% or more of the male population who had enjoyed their company at least once; nor to any of the long procession of harlots and clients stretching back to before busybodies invented the idea of policing other people’s sexuality.

Know what else wasn’t “obscure” to them? Anal sex. BDSM. Role-playing. Exhibitionism & voyeurism. Homosexuality. Cuckolding. I could go on and on, but I think you get the idea.
Here’s a hint: most lawmakers have always been pompous ignoramuses too obsessed with telling other people what to do to actually have normal lives, so by the time they get around to banning something it’s a pretty safe bet the majority of everybody else in that culture over the age of 16 already knows about it, and many of them are doing it.
Chief among the popular sex acts that modern mythology pretends were “obscure” is masturbation, at least for women. The common delusion is that because a culture didn’t like to talk about something, it must not have existed; accordingly, the idea has arisen that Victorian girls were somehow so carefully controlled that they never discovered that touching oneself between the legs (or riding rocking horses) feels good. And because many women have difficulty reaching orgasm without some form of masturbation, that must mean that pre-20th century women all went around in a perpetual state of sexual frustration. In the past few years, the ridiculous myth has arisen that Victorian doctors actually gave women orgasms without knowing what they were, and that the vibrator was invented to speed up what they viewed as an odious task.
March 5, 2015
At Mother Jones, Kevin Drum talks about all the things we’ve been told about healthy eating … that just ain’t so:
For several years now I’ve been following the controversy over whether the dietary guidelines that have developed over the past 70 years might be all wrong. And I’ve become tentatively convinced that, in fact, they are wrong. For most people — not all! — salt isn’t a big killer; cholesterol isn’t harmful; and red meat and saturated fat are perfectly OK. Healthy, even. Sugar, on the other hand, really needs to be watched.
Before I go on, a great big caveat: I’m not even an educated amateur on this subject. I’ve read a fair amount about it, but I’ve never dived into it systematically. And the plain truth is that firm proof is hard to come by when it comes to diet. It’s really, really hard to conduct the kinds of experiments that would give us concrete proof that one diet is better than another, and the studies that have been done almost all have defects of some kind.
Randomized trials are the gold standard of dietary studies, but as I said above, they’re really, really hard to conduct properly. You have to find a stable population of people. You have to pick half of them randomly and get them to change their diets. You have to trust them to actually do it. You have to follow them for years, not months. Virtually no trial can ever truly meet this standard.
Nonetheless, as Carroll says, the randomized trials we do have suggest that red meat and saturated fat have little effect on cardiovascular health — and might actually have a positive effect on cancer outcomes.
At the same time, increased consumption of sugars and carbohydrates might be actively bad for us. At the very least they contribute to obesity and diabetes, and there’s some evidence that they aren’t so great for your heart either.
So where does this leave us? As Carroll says, the literature as a whole suggests that we simply don’t know. We’ve been convinced of a lot of things for a long time, and it’s turned out that a lot of what we believed was never really backed by solid evidence in the first place. So now the dietary ship is turning. Slowly, but it’s turning.
His primary take-away from all this: moderation is probably your safest bet, unless you have a condition that requires you to avoid certain foods or types of foods. Oh, and avoid over-indulging in packaged food that uses lots of preservatives. This is certainly one area where the science sure didn’t turn out to be settled, after all.
February 27, 2015
Charlotte Allen discusses how quickly the language used to talk about transsexuality has changed:
In 2012 the board of trustees of the American Psychiatric Association (APA) approved a set of proposed revisions to its Diagnostic and Statistical Manual of Mental Disorders (the new version is the DSM-5), designed to remove the stigma of mental illness from the transgender classification. Earlier versions of the DSM had defined transgenderism as “gender identity disorder,” which seemed to imply illness. The DSM-5 changed that term to “gender dysphoria.” The change paralleled the association’s removal of homosexuality as a mental disorder in 1973. It signaled that whatever problems transgenders might experience were not due to a pathological misconception that their bodies and gender identities were mismatched but to the fact that their bodies and gender identities were mismatched. Hormones, surgery, cosmetics, and different clothes might still be the “cure” (enabling transgenders to qualify for medical reimbursement for a variety of procedures), but the APA was making it clear, as far as it was concerned, that the problem was not inside the transgender’s head.
The medical evidence for a mismatch between brains and bodies is ambiguous. The two studies cited most frequently by transgender activists, published in 1995 and 2000, examined the brains of a total of seven male-to-female transgenders and found that a region of the hypothalamus, an almond-shaped area of the brain that controls the release of hormones by the pituitary gland, was female-typical in those brains. But those studies have been criticized for not controlling for the estrogen—which affects the size of the hypothalamus—that most male-to-female transgenders take daily in order to maintain their feminine appearance.
Accompanying the APA’s change of classification was a change of vocabulary. Ever since the days of Christine Jorgensen (1926-1989), the World War II serviceman whose surgery in Denmark during the early 1950s brought transgenderism under the media spotlight for the first time, the procedure was known in popular parlance as a “sex change operation.” Then in the 1990s, when the idea of one’s “gender” as something distinct from one’s biological sex began to take hold (thanks to the efforts of academic feminists and other postmodernists, who argued that gender is “socially constructed”), the preferred term became “gender reassignment surgery.” Now the preferred phrase seems to be “gender confirmation surgery.” The change in terminology renders more credible transpeople’s claims to have always belonged to the gender to which they have transitioned.
The once commonly used word “transsexual” has thus become passé — even verboten in the most sensitive circles — just during the past decade. For example, Washington Post reporter Abby Ohlheiser issued a severe scolding to news media for using the word “transsexual” in reference to a 27-year-old male-to-female victim of a grisly murder and dismemberment at the hands of her 28-year-old male lover (who subsequently committed suicide) in Brisbane, Australia, in October 2014. “Although some individuals do identify as ‘transsexual,’ the term is often viewed as old-fashioned and not an appropriate umbrella word,” Ohlheiser wrote in a column deriding the coverage of the crime as “transphobic.” Ohlheiser also objected to media describing the victim, Mayang Prasetyo, as a “prostitute” (Prasetyo had been working as an escort before her death) and reproducing photos of Prasetyo’s busty self clad in a tiny swimsuit that she had posted on the Internet. “Many of the articles covering the murder are laden with provocative photographs of the victim in a bikini, as if any story about a trans person is an excuse to view and scrutinize trans bodies,” Ohlheiser wrote.
Americans, prepare to feel angry: After years of watching our cholesterol, sacrificing shellfish and egg yolks and gloriously fatty pork and beef, and enduring day-glo yellow and too-soft tubs of butter substitute, Americans are about to be told by our government diet experts, “Oops … we had it all wrong.”
The Dietary Guidelines Advisory Committee, which is charged with reviewing the government-issued dietary guidelines every five years, is preparing to release its “new and improved” guidelines any day now, and leaks from the deliberations hint at a reversal in the committee’s decades-long guidance that Americans should eat a diet low in cholesterol.
What are Americans to think of this new guidance that says cholesterol doesn’t really matter after all, that it is no longer a “nutrient of concern,” that eating food high in cholesterol may not be connected to heart disease?
Devotees of protein-rich, low-carb diets may see this as validation and reason to celebrate. Others will no doubt feel deflated, confused, and just plain bitter that for years they’ve been fed a lie that cost them, quite literally, the joy of eating delicious food, and possibly better health. Still others will misunderstand this new guidance and think butter and other high-cholesterol foods are now in the healthy column. In reality, those foods still ought to be consumed in moderation — particularly by people with preexisting conditions such as diabetes.
Yet there’s a bigger story here. Government really ought not be in the business of providing nutrition advice in the first place. Nutrition is a personal issue, and what’s best for one person may not be best for another. Moreover, Americans have ample access to information in the private sector on health and nutrition. In other words, Uncle Sam, we don’t need you anymore.
Julie Gunlock, “Government Dieticians Tell Us, Never Mind Our Decades of Bad Advice”, National Review, 2015-02-13.
February 25, 2015
Bobby Stein linked to this column in Psychology Today from last summer, talking about how to deal with sadists, psychopaths, narcissists, and Machiavellians:
There are several personality types that are more likely to harm another than the average person would. Sadists possess an intrinsic motivation to inflict suffering on innocent others, even when this comes at a personal cost. This is because for sadistic personalities, cruelty is pleasurable, generally exciting, and can be sexually stimulating.
In a recent study, Buckels and colleagues examined examples of everyday sadism as part of what they refer to as the “Dark Tetrad,” sadism plus the original members of the “Dark Triad”—psychopathy, narcissism, and Machiavellianism. These personalities have some overlap and are characterized by callous manipulation, self-centeredness, disagreeableness, and exploitation. In their research, the team sought to determine whether everyday sadism could be captured in the laboratory, as well as whether measures of sadistic personality would predict these behaviors beyond already established measures of the Dark Triad. Among the findings were that sadistic personalities were the most likely members of the Dark Tetrad to select the task involving killing from an array of unpleasant tasks. Those sadists who killed more bugs derived greater pleasure from the act than those who killed fewer bugs.
In a second, related study, those high in sadism, psychopathy, and/or narcissism, as well as those low in empathy and perspective-taking, were willing to aggress against an innocent person when aggression was easy. Only sadists increased the intensity of their attack once they realized the person would not fight back, however. Furthermore, sadists, unlike the other “dark personalities,” were the only ones willing to expend additional time and energy (in this case, first completing a boring task) in order to have the opportunity to hurt an innocent person.
Previous research has found that although psychopaths have no qualms about hurting others, they are more likely to do so when it serves a specific purpose. Narcissists are less likely to aggress upon another unless their ego is threatened. Machiavellians will usually aggress upon others only if there are sufficient perceived benefits and the risk to themselves is acceptably low.
February 24, 2015
Sarah Hoyt recently reposted her rant (in her words) about the ongoing struggle between men and women:
I know this goes completely against everything you’ve ever heard and learned. History — and SF — is full of dreamers who are convinced that if women ruled the world it would all be beauty, flowers, and non-aggression. (To these dreamers I say spend a week as a girl in an all-girl school. It will be a rude awakening.)
Dreamers of the Dan Brown stripe posit a peaceful female worship, with yet more beauty and flowers and non-aggression. They ignore the fact that 99% of the goddess-worshiping religions were scary. And don’t tell me that’s patriarchal slander — it’s not. The baby-killing of Astoreth worship has been documented extensively. (Of course, the Phoenicians were equal-opportunity baby killers.) The castrations of Cybele worship were also well documented. Now, I can hardly imagine a female divinity without imagining hormonal episodes requiring appeasement — but that’s because I’m a woman of a certain age, and that’s fodder for another altogether different discussion. Suffice it to say that the maiden and mother usually also had a crone persona who was … er… “not a nice person.”
Anyway — all this to say since I joined the MOB (Mothers Of Boys) the scales about such things as the inherent equality of men and women as far as their brain structure and basic behavior have fallen from my eyes. (Well, the scales that remained. My experience in school notwithstanding, I’d been TAUGHT that females were getting the short end of the stick and that’s a hard thing to overcome. Learned wisdom is so much more coherent than lived wisdom, after all.)
Again — indulge me — I’m going to make a lot of statements I can too back up, but which would take very, very, very long to document — so it will seem like I’m ranting mid air. Stay with it. If I feel up to it later, I’ll post some references.
Yes, women have been horribly oppressed throughout history including the rather disgusting Victorian period that most Americans seem to believe is how ALL of history went. I contend, though, that women were not oppressed by some international conspiracy of males — yes, I know what Women’s Studies professors say. I would however remind you we’re talking of a group of people — men — who a) have issues finding their own socks in the dresser they’ve used for ten years. b) Are so good at communicating as a group that they couldn’t coordinate their way out of a wet paper bag, or to quote my friend Kate, couldn’t organize a bonk in a brothel. (In most large organizations the “social/coordinating” function is performed by females at various levels.) c) That women being oppressed by a patriarchy so thorough it altered history and changed all records of peaceful female religion would require a conspiracy lasting thousands of years and involving almost every male on Earth. If you believe that, I have this bridge in NY that I would like to sell you. — Women were oppressed by their own bodies.
Throughout most of history women had no safe and effective means of stopping pregnancy — and please, spare me the “herbal” remedies. I grew up in a village that had little access to medicine. If there had been an effective means of preventing pregnancy we’d have known it. TRUST me. There are abortifacients, but they endanger the mother as well. However, until the pill there was no safe contraceptive. The herbal contraceptive is a plot device dreamed up by fantasy writers. Also, btw, the People’s Republic of China TESTED all these methods (including swallowing live tadpoles at the full moon). NONE of them worked. SERIOUSLY.
What this meant in practical fact is that most women were pregnant from menarche to menopause, if they were lucky to live that long. I’ve been pregnant. If you haven’t, take it from me it’s not a condition conducive to brilliant discourse or reasoned logic. On top of that, of course, women would suffer the evils of repeated child bearing with no rest. In effect this DID make women frail and not the intellectual equals of men. And it encouraged any male around to “oppress” them. I.e., when the majority of females around you need a minder, you’re going to assume ALL females need a minder. It’s human nature. Note that beyond suffrage, the greatest advance in women’s equality came from the pill. Not a coincidence, that.
However, the people who think that women were oppressed by an international historical cabal rule the establishment. Including the educational establishment. I find it hilarious that in their minds men/boys are so powerful that they must be kept back and are suspected of being criminals just because they have a penis. This is attributing to them god-like powers to rival what any Victorian housewife would believe.
Anyway — these people have decided all efforts must be made to equal male and female performance in school. Since, in practical fact, this is impossible because males and females develop at different paces and favor different areas, they’ve settled for hobbling the all-powerful males.
You see this everywhere from Saturday morning cartoons to kindergarten to all the grades beyond. In cartoons these days, the girls ALWAYS rescue the boys. (They do it while keeping impeccably groomed hair, too. Impressive, that.) And in school all the girls are assumed to be right and all the boys are assumed to be wrong.
February 22, 2015
Megan McArdle on just what externalities are and why we pay attention to them:
For those who might not know the term, “externality” is economist-speak, and it means about what it sounds like: an effect that your action has on others. An externality can be positive or negative, and obviously, we as a society would like to have as many as possible of the former and as few as possible of the latter. In other words, “Your right to swing your fist stops at the end of my nose.”
I’m a libertarian, and libertarians love talking about externalities. They give us a (relatively) clear way to define what are and are not legitimate scopes of public action. Whatever you’re doing in the privacy of your own bedroom with another consenting adult is really none of my business, even if I think you oughtn’t to be doing it. On the other hand, if you’re breeding rats and cockroaches in there, and they’re coming through the shared wall of our respective row houses, then I have the right to get the law involved.
Framing things as “externalities” is therefore a good way to get a libertarian, or someone who leans that way, on your side. And such frames have come up over and over in the debate over Obamacare, which has been variously justified by the cost to the state of emergency room care; the cost to society of free-riding young folks who don’t buy insurance until they get sick; the public health cost of people who don’t go to the doctor and get really, expensively sick; an unhealthy workforce that is less productive; and the cost to friends and relatives who have to chip in to cover uninsured medical expenses.
I didn’t find any of those arguments particularly convincing. The third can just be dispensed with on the grounds of accuracy: In general, preventive medicine does not save money. Oh, it may save money in the particular case of someone whose diabetes or cancer went long undiagnosed. The problem is, you can’t just look at the cost of sick folks who would have been a lot cheaper to treat if their conditions had been caught earlier. You also have to include the cost of all the healthy people you had to screen in order to catch that one case of disease. And with limited exceptions, the cost of screening the healthy generally outweighs the cost of treating the chronically ill. Now, you can certainly argue for preventive care on other grounds — for example, that it makes people healthier (though even then you have to add the cost of unnecessary medical procedures, such as biopsies following a false positive on a blood test, which is why we do not, say, give annual mammograms to every American woman). But it’s not generally a money saver, so this particular externality doesn’t exist.
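McArdle’s screening arithmetic is easy to make concrete with a toy expected-cost model. All the numbers below are hypothetical, chosen only to illustrate the trade-off she describes, not taken from any actual study:

```python
# Back-of-envelope model of the screening argument: catching a rare case
# early saves money on that case, but everyone screened pays for the test.
# All figures are illustrative, not real medical costs.

def net_saving_per_person(prevalence, screen_cost, early_cost, late_cost):
    """Expected saving per person screened: the rare true case caught early
    saves (late_cost - early_cost), but every person pays screen_cost."""
    return prevalence * (late_cost - early_cost) - screen_cost

# Suppose 1 in 1,000 people has the disease, early treatment costs $10,000
# versus $60,000 late, and the screening test costs $100 per person.
saving = net_saving_per_person(0.001, 100, 10_000, 60_000)
print(saving)  # -50.0: the cost of screening the healthy outweighs the saving
```

With these (made-up) numbers, screening loses $50 per person even though it saves $50,000 on each case it catches, which is the shape of the point being made: the result flips only when the disease is common enough, or the test cheap enough, that the per-case saving covers the cost of testing everyone else.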
The rest of the arguments have some weight, but in the end, I don’t think they’re weighty enough. Let me explain.
February 21, 2015
At Ace of Spades H.Q., Ace is underwhelmed by the Washington Post‘s belated acknowledgement that they aided and abetted the CDC in downplaying the seriousness of the Ebola outbreak last year:
Scientists: “There Was Almost a Rush to Assure the Public That We Knew A Lot More Than We Did” About Ebola; Experts Now Concede Ebola May Be Transmitted by Purely Airborne Route
Incidentally, the Washington Post, which is itself an Expert at Writing to whom you should bow and scrape, reported his words as “there was a rush to ensure the public,” which is not what he said, because it’s stupid. And if he did say it, you throw a “(sic)” after it to indicate the error is in the quoted material, not in your own writing.
I assume he is speaking here of a proper airborne transmission, and not the layman’s “airborne” transmission; either way, the experts who so ensuredly ensured us that there was no way to get ebola from the air were wrong.
Not just wrong. Arrogantly and loudly wrong.
See, the media is not particularly bright but they are Bossy and they like pretending they Love Science. So when they see an opportunity to Pretend to Be Scientists and Yell At Their Dumb Readers, they seize upon it, even if they don’t have any idea about what the fuck they are talking. (Note preposition smartly undangled, all expert-like.)
The media were always wrong on this, and the CDC was always deliberately deceptive. This new information about an actual airborne route of transmission is new (ish), but even before, the CDC was falsely suggesting that “no airborne transmission” meant that you could not catch ebola except by direct contact with an infected person or his fluids, like his blood and stool.
They sort of forgot that his “spit” and vapor in his breath counted as “liquids,” so you could in fact catch ebola by what the layman would call an airborne route. (Scientists do not call this path of transmission “airborne” transmission, but rather “spray” transmission or “droplet transmission.”)
The CDC deliberately lied to people, and the demented little Apple Polishers of the media rushed to scream at the rest of the class that you could not possibly get ebola by anything other than direct contact.
February 20, 2015
In Nature, Claire Ainsworth explains why it’s becoming more difficult to discuss sex as a binary:
Sex can be much more complicated than it at first seems. According to the simple scenario, the presence or absence of a Y chromosome is what counts: with it, you are male, and without it, you are female. But doctors have long known that some people straddle the boundary — their sex chromosomes say one thing, but their gonads (ovaries or testes) or sexual anatomy say another. Parents of children with these kinds of conditions — known as intersex conditions, or differences or disorders of sex development (DSDs) — often face difficult decisions about whether to bring up their child as a boy or a girl. Some researchers now say that as many as 1 person in 100 has some form of DSD.
When genetics is taken into consideration, the boundary between the sexes becomes even blurrier. Scientists have identified many of the genes involved in the main forms of DSD, and have uncovered variations in these genes that have subtle effects on a person’s anatomical or physiological sex. What’s more, new technologies in DNA sequencing and cell biology are revealing that almost everyone is, to varying degrees, a patchwork of genetically distinct cells, some with a sex that might not match that of the rest of their body. Some studies even suggest that the sex of each cell drives its behaviour, through a complicated network of molecular interactions. “I think there’s much greater diversity within male or female, and there is certainly an area of overlap where some people can’t easily define themselves within the binary structure,” says John Achermann, who studies sex development and endocrinology at University College London’s Institute of Child Health.
These discoveries do not sit well in a world in which sex is still defined in binary terms. Few legal systems allow for any ambiguity in biological sex, and a person’s legal rights and social status can be heavily influenced by whether their birth certificate says male or female.
“The main problem with a strong dichotomy is that there are intermediate cases that push the limits and ask us to figure out exactly where the dividing line is between males and females,” says Arthur Arnold at the University of California, Los Angeles, who studies biological sex differences. “And that’s often a very difficult problem, because sex can be defined a number of ways.”
February 19, 2015
Published on 18 Feb 2015
Almost every cell in your body has the same DNA sequence. So how come a heart cell is different from a brain cell? Cells use their DNA code in different ways, depending on their jobs, just as an orchestra can perform one piece of music in many different ways. A cell’s combined set of changes in gene expression is called its epigenome. This week Nature publishes a slew of new data on the epigenomic landscape in lots of different cells. Learn how epigenomics works in this video.
February 11, 2015
Well, maybe not everything, but a lot of government advice — which may well have been a major factor in the rise of obesity — was based on very little empirical evidence:
Whenever standard nutritional advice is overturned — as it has been this week by a study which effectively rubbished government guidelines limiting the intake of dietary fat — I am instantly reminded of a scene in the Woody Allen film Sleeper, first released when I was 10. I expect a lot of people my age are.
In the film Allen plays Miles, a cryogenically frozen health food store owner who is revived 200 years later. Two scientists are puzzling over his old-fashioned dietary requirements, unable to comprehend what passed for health food back in 1973. “You mean there was no deep fat?” says one. “No steak or cream pies, or hot fudge?”
“Those were thought to be unhealthy,” says the other scientist. “Precisely the opposite of what we now know to be true.”
This was meant to be a joke rather than a prediction, but it’s beginning to look as if we may not have to wait until 2173 to see it validated.
Of course the new study isn’t comprehensively refuting the association between high saturated fat intake and heart disease; it’s just pointing out that dietary guidelines first adopted in the mid-1970s were not, on reflection, based on any real evidence. In terms of what one should and shouldn’t be eating, I sometimes feel as if I’ve spent the past 30 years in a freezer.
February 9, 2015
Last month, in his Times column, Matt Ridley explained why — until we discover a treatment for aging itself — rising cancer rates are a weird form of good news:
If we could prevent or cure all cancer, what would we die of? The new year has begun with a war of words over whether cancer is mostly bad luck, as suggested by a new study from Johns Hopkins School of Medicine, and over whether it’s a good way to die, compared with the alternatives, as suggested by Dr Richard Smith, a former editor of the BMJ.
It is certainly bad luck to be British and get cancer, relatively speaking. As The Sunday Times reported yesterday, survival rates after cancer diagnosis are lower here than in most developed and some developing countries, reflecting the National Health Service’s chronic problems with rationing treatment by delay. In Japan, survival rates for lung and liver cancer are three times higher than here.
Cancer is now the leading cause of death in Britain even though it is ever more survivable, with roughly half of people who contract it living long enough to die of something else. But what else? Often another cancer.
In the western world we’ve conquered most of the causes of premature death that used to kill our ancestors. War, smallpox, homicide, measles, scurvy, pneumonia, gangrene, tuberculosis, stroke, typhoid, heart disease and cholera are all much rarer, strike much later in life or are more survivable than they were fifty or a hundred years ago.
The mortality rate in men from coronary heart disease, for instance, has fallen by an amazing 80 per cent since 1968 — for all age groups. Mortality rates from stroke in both sexes have halved in 20 years. Cancer’s growing dominance of the mortality tables is not because it’s getting worse but because we are avoiding other causes of death and living longer.
It is worth remembering that some scientists and anti-pesticide campaigners in the 1960s were convinced that by now lifespans would be much shorter because of cancer caused by pesticides and other chemicals in the environment.
In the 1950s Wilhelm Hueper — a director of the US National Cancer Institute and mentor to Rachel Carson, the environmentalist author of Silent Spring — was so concerned that pesticides were causing cancer that he thought the theory that lung cancer was caused by smoking was a plot by the chemical industry to divert attention from its own culpability: “Cigarette smoking is not a major factor in the causation of lung cancer,” he insisted.
In fact it turns out that pollution causes very little cancer and cigarettes cause a lot. But aside from smoking, most cancers are indeed bad luck. The Johns Hopkins researchers found that tissues that replicate their stem cells most run the highest risk of cancer: basal skin cells do ten trillion cell divisions in a lifetime and have a million times more cancer risk than pelvic bone cells which do about a million cell divisions. Random DNA copying mistakes during cell division are “the major contributors to cancer overall, often more important than either hereditary or external environmental factors”, say the US researchers.
To sum it up, until or unless medical research finds a way to stop the bodily effects of aging, cancer is the most likely way for all of us to die. Cancer is a generic rather than a specific term — it’s what we use to describe the inevitable breakdown of the cellular division process that happens millions or even trillions of times over our lifetime. As Ridley puts it, “even if everybody lived in the healthiest possible way, we would still get a lot of cancer.” I’m not a scientist and I don’t even play one on TV, but I suspect that the solution to cancers of all kinds is to boost our immune systems to identify aberrant cells more quickly, before they reproduce beyond the immune system’s capacity to handle them. The short- to medium-term solution to cancer may be to make us all a little bit cyborg…
February 7, 2015
Last month, Scott Alexander surveyed the evidence, pro and con, on whether there is a causal relationship between inflammation and depression:
Start with From inflammation to sickness and depression [PDF], Dantzer et al (2008), who note that being sick makes you feel lousy. Drawing upon evolutionary psychology, they theorize this is an adaptive response to make sick people stay in bed (or cave, or wherever) so the body can focus all of its energy on healing. A lot of sickness behavior – being tired, not wanting to do anything, not eating, not wanting to hang around other people – seems kind of like mini-depression.
All of this stuff is regulated by chemicals called cytokines, which are released by immune cells that have noticed an injury or infection or something. They are often compared to a body-wide “red alert” sending the message “sickness detected, everyone to battle stations”. This response is closely linked to the idea of “inflammation”, the classic example of which is the locally infected area that has turned red and puffy. Most inflammatory cytokines handle the immune response directly, but a few of them – especially interleukin-1β and tumor necrosis factor alpha – cause this depression-like sickness behavior.
Here are some other suspicious facts about depression and inflammation:
– Exercise, good diet and sleep reduce inflammation; they also help depression.
– Stress increases inflammation and is a known trigger for depression.
– Rates of depression are increasing over time, with the condition seemingly very rare in pre-modern non-Westernized societies. This is commonly attributed to the atomization and hectic pace of modern life. But levels of inflammation are also increasing over time, probably because we have a terrible diet that disrupts the gut microbiota that are supposed to be symbioting with the immune system. Could this be another one of the things we think are social that turn out to be biological?
– SSRI antidepressants, like most medications, have about five zillion effects. One of the effects is to reduce the level of inflammatory cytokines in the body. Is it possible that this is why they work, and all of this stuff about serotonin receptors in the brain is a gigantic red herring?
– It’s always been a very curious piece of trivia that treating depression comorbid with heart disease significantly decreases your chances of dying from the heart disease. People just sort of nod their heads and say “You know, mind-body connection”. But inflammation is known to be implicated in cardiovascular disease. If treating depression is a form of lowering inflammation, this would make perfect sense.
– Rates of depression are much higher in sick people. Cancer patients are especially famous for this. No one gets too surprised here, because having cancer is hella depressing. But it’s always been interesting (to me at least) that as far as we can tell, antidepressants treat cancer-induced depression just as well as any other type. Are antidepressants just that good? Or is the link between cancer being sad and cancer causing depression only part of the story, with the other part being that the body’s immune response to cancer causes inflammatory cytokine release, which antidepressants can help manage?
– Along with cancer, depression is common in many other less immediately emotion-provoking illnesses like rheumatoid arthritis and diabetes. The common thread among these illnesses is inflammation.
– Inflammation changes the activity level of the enzyme indoleamine 2,3 dioxygenase. This enzyme produces kynurenines which interact with the NMDA receptor, a neurotransmitter receptor implicated in depression and various other psychiatric diseases (in case your first question upon learning about this pathway is the same as mine: yes, kynurenines got their name because they were first found in dog urine).
– Sometimes doctors treat diseases like hepatitis by injecting artificial cytokines to make the immune system realize the threat and ramp up into action. Cytokine administration treatments very commonly cause depression as a side effect. This depression can be treated with standard antidepressants.
– Also, it turns out we can just check and people with depression have more cytokines.
There’s also some evidence against the theory. People with depression have more cytokines, but it’s one of those wishy-washy “Well, if you get a large enough sample size, you’ll see a trend” style relationships, rather than “this one weird trick lets you infallibly produce depression”.
So in conclusion, I think the inflammatory hypothesis of depression is very likely part of the picture. Whether it’s the main part of the picture or just somewhere in the background remains to be seen, but for now it looks encouraging. Anti-inflammatory drugs do seem to treat depression, which is a point in the theory’s favor, but right now the only one that has strong evidence behind it has side effects that make it undesirable for most people. There’s a lot of room to hope that in the future researchers will learn more about exactly how this cytokine thing works and be able to design antidepressant drugs that target the appropriate cytokines directly. Until then, your best bets are the anti-inflammatory mainstays: good diet, good sleep, plenty of exercise, low stress levels, and all the other things we already know work.