“Research Shows Oreos Are Just As Addictive As Drugs,” says the headline above a recent Connecticut College press release. “…in Lab Rats,” it adds, and I’ll get to that part later. But first note that the study’s findings could just as truthfully be summarized this way: “Drugs Are No More Addictive Than Oreos.” The specific drugs included in the study were cocaine and morphine, which is what heroin becomes immediately after injection. So the headline also could have been: “Research Shows That Heroin and Cocaine Are No More Addictive Than Oreos.” Putting it that way would have raised some interesting questions about the purportedly irresistible power of these drugs, which supposedly justifies using force to stop people from consuming them.
[…]
So what exactly did the rats do? They favored the side of a maze where they were given Oreos to the same extent that they favored that side of the maze when they were given an injection of cocaine or morphine there. Furthermore, when the researchers “used immunohistochemistry to measure the expression of a protein called c-Fos, a marker of neuronal activation, in the nucleus accumbens, or the brain’s ‘pleasure center’,” they found that “the Oreos activated significantly more neurons than cocaine or morphine.” Given the latter finding, perhaps we should credit Connecticut College’s publicity department with restraint for not announcing that Oreos are in fact more addictive than cocaine or heroin. Or to put it another way: Cocaine and heroin are less addictive than Oreos. Which makes you wonder why people go to prison for selling the drugs but not for selling the cookies, especially since Oreos and similar foods “may present even more of a danger.”
The idea that people can take or leave cocaine or heroin in the same way they can take or leave Oreos seems inconsistent with research that supposedly shows how powerfully reinforcing these substances are. Studies published between 1969 and 1985, for instance, found that rats and rhesus monkeys “will prefer cocaine to food” and “will self-administer cocaine until death or near-death,” as Stanton Peele and Richard DeGrandpre note in a 1998 Addiction Research article. But the animals in these studies were isolated from other animals, deprived of interesting stimuli, and prevented from engaging in normal behavior while tethered to catheters providing “an unlimited, direct flow of high concentrations of cocaine at all times at little or no cost” (in terms of effort). Research conducted in more naturalistic settings finds that monkeys and rats are much more apt to consume cocaine and morphine in moderation.
Laboratory animals’ tendency to consume drugs to excess when they are bored and lonely has pretty clear parallels in human behavior. But unlike rats and monkeys, humans are capable of reason and foresight (even if they do not always exercise those faculties) as well as emotions such as guilt and regret. They also have considerable control over their own environments. If the reinforcing power of drugs is not the only factor in addiction among rats and monkeys, it surely is not a complete explanation for why some people get hooked on these substances while most do not.
October 16, 2013
Cocaine and heroin are less addictive than Oreos
October 14, 2013
You can’t make me eat kale
At American Digest, a paean of joy at the thought of being forced to eat more kale:
Of late many self-employed food bullshit artists have concluded that we should eat more kale. Why anyone would want to eat even a little kale is beyond me. Kale, considered dispassionately, is something that you’d want to dry and stuff into a tick mattress if you were out of paint soaked rags and seaweed. Kale is not, strictly speaking, a food.
And yet, and yet, there it is. Oozing in piles of leafy green intestine-cleansing fronds in what can now only be described as the weed section of the produce aisle at your average Whole Foods.
How kale actually got into our national food chain is a mystery almost as deep as how the flavor of pumpkin (backed by “Spice!”) has been infused into foods and beverages starting October 1. Both kale and pumpkin exemplify items from the somewhat vegetable kingdom that would be better going straight from farm to compost without passing through humans.
And yet, and yet, here we are … one more mile down the road to hell courtesy of those post LorenaBobbittized vegans within whom there is not a teaspoon of testosterone in a trainload.
[…]
That’s the wave of the future and it is not an amber wave under spacious skies. Nope. It is a wave of pale and sodden progressively “good-for-you” greens slopped onto your aluminum plate in the prison chow line on Planet Vegan. You remember that putrescent puddle of gurgling spinach guts in spinach water that was once glunked on your plate in the high school cafeteria? This is the same thing only with extra thiocyanate. But hey, it’s KALE!, so count yourself lucky. Think of all the children of the elite and super rich that are going to bed tonight without any.
October 12, 2013
Not news: people under-report calorie intake, invalidating 40 years of federal research
Any study that depends on self-reporting, especially self-reporting of things like how much food people eat, can’t be assumed to be accurate:
Four decades of nutrition research funded by the Centers for Disease Control and Prevention (CDC) may be invalid because the method used to collect the data was seriously flawed, according to a new study by the Arnold School of Public Health at the University of South Carolina.
The study, led by Arnold School exercise scientist and epidemiologist Edward Archer, has demonstrated significant limitations in the measurement protocols used in the National Health and Nutrition Examination Survey (NHANES). The findings, published in PLOS ONE (The Public Library of Science), reveal that a majority of the nutrition data collected by the NHANES are not “physiologically credible,” Archer said.
[…]
The study examined data from 28,993 men and 34,369 women, 20 to 74 years old, from NHANES I (1971–1974) through NHANES (2009–2010), and looked at the caloric intake of the participants and their energy expenditure, predicted by height, weight, age and sex. The results show that — based on the self-reported recall of food and beverages — the vast majority of the NHANES data “are physiologically implausible, and therefore invalid,” Archer said.
In other words, the “calories in” reported by participants and the “calories out” don’t add up, and it would be impossible to survive on most of the reported energy intakes. This misreporting of energy intake varied among participants, and was greatest in obese men and women, who underreported their intake by an average of 25 percent and 41 percent respectively (i.e., 716 and 856 Calories per day).
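The percentages and calorie gaps quoted above pin down the implied intakes with a little back-of-the-envelope arithmetic. A minimal sketch, assuming (the excerpt doesn’t say) that the percentage is a share of true intake, so that reported intake equals true intake minus the gap:

```python
def implied_intakes(underreport_frac, gap_kcal):
    """Back out true and reported daily intake from the underreporting
    fraction and the absolute calorie gap (true * frac == gap)."""
    true_kcal = gap_kcal / underreport_frac
    return true_kcal, true_kcal - gap_kcal

# Figures quoted above: obese men 25% / 716 kcal, obese women 41% / 856 kcal
for label, frac, gap in [("obese men", 0.25, 716), ("obese women", 0.41, 856)]:
    true_kcal, reported_kcal = implied_intakes(frac, gap)
    print(f"{label}: true ~{true_kcal:.0f} kcal/day, reported ~{reported_kcal:.0f} kcal/day")
```

On this reading, obese women were reporting roughly 1,200 kcal a day, well below any plausible maintenance level, which is exactly the “physiologically implausible” pattern Archer describes.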
October 5, 2013
The care and feeding of introverts
Carl King has a list of myths that are somewhat widely believed about that odd class of people, the introverts:
I wrote this list in late 2008. Around that time, I was lucky enough to discover a book called The Introvert Advantage (How To Thrive in an Extrovert World), by Marti Laney, Psy.D. It felt like someone had written an encyclopedia entry on a rare race of people to which I belong. Not only had it explained many of my eccentricities, it helped me to redefine my entire life in a new and productive context.
Sure, anyone who knows me would say, “Duh! Why did it take you so long to realize you’re an Introvert?” It’s not that simple. The problem is that labeling someone as an Introvert is a very shallow assessment, full of common misconceptions. It’s more complex than that.
A section of Laney’s book (page 71 through page 75) maps out the human brain and explains how neurotransmitters follow different dominant paths in the nervous systems of Introverts and Extroverts. If the science behind the book is correct, it turns out that Introverts are people who are over-sensitive to Dopamine, so too much external stimulation overdoses and exhausts them. Conversely, Extroverts can’t get enough Dopamine, and they require Adrenaline for their brains to create it. Extroverts also have a shorter pathway and less blood-flow to the brain. The messages of an Extrovert’s nervous system mostly bypass Broca’s area in the frontal lobe, which is where a large portion of contemplation takes place.
Unfortunately, according to the book, only about 25% of people are Introverts. There are even fewer that are as extreme as I am. This leads to a lot of misunderstandings, since society doesn’t have very much experience with my people. (I love being able to say that.)
H/T to Joey deVilla for the link.
September 19, 2013
QotD: Guns and mental illness
There isn’t much of a culture-war component of discussing mental illness, other than a few folks on the Right who blame the Left for deinstitutionalizing the mentally ill in the 1960s. I suspect that there is no real constituency in favor of the Second Amendment rights of the mentally ill — provided, of course, the definition of “mentally ill” is clear, explicit, and taken seriously. (If you think there’s a stigma to admitting you’re seeing a therapist, a psychologist, or getting mental health treatment now, just wait until some of your legal rights can be restricted because of it.)
Thankfully, I’ve never known anyone who has had violent episodes or threatening mental illness. My sense from reading the coverage and the literature is that people rarely “snap” and become dangerous killers overnight. As you’ve probably found in your research, there are certain common threads: withdrawal from others and lack of a support network; hostile behavior, poor temper control, outbursts, etc. It is maddening to hear friends and acquaintances of past shooters describe behavior that seems, in retrospect, to be a warning sign or red flag.
After Columbine, many school administrators tried to institute a new “If you see something, say something” approach to individuals behaving in a threatening manner. Then we saw at Virginia Tech that many, many students reported the gunman for strange and threatening behavior, including stalking. School administrators ultimately couldn’t do enough to stop him — either from fear of lawsuits or from overall bureaucratic inertia.
[…]
It’s not clear how effective a program like this would be; one would hope that people would already know to report strange, troubling, or threatening behavior to authorities. In past writings, I’ve emphasized that the only authority that can put someone on the federal firearms restriction list is a judge, and so these sorts of concerns are best sent directly to the cops, not to a school administrator or company HR department.
However, a country where more Americans are trained to spot signs of serious, untreated and potentially dangerous mental illness strikes me as a better path than yet another effort to restrict the rights of 40 million gun owners because of the actions of a handful.
Jim Geraghty, “Why Post-Shooting Gun-Control Debates Are So Insufferable”, National Review Online, 2013-09-18
September 18, 2013
Elizabeth Loftus on false memories
The more we discover about the process of memory formation and recall, the more we discover that our memories are more fallible and plastic than we believed. Elizabeth Loftus talks to Alison George about the problem of false memories:
AG: How does this happen? What exactly is going on when we retrieve a memory?
EL: When we remember something, we’re taking bits and pieces of experience — sometimes from different times and places — and bringing it all together to construct what might feel like a recollection but is actually a construction. The process of calling it into conscious awareness can change it, and now you’re storing something that’s different. We all do this, for example, by inadvertently adopting a story we’ve heard — like Romney did.
AG: How did you end up studying false memories?
EL: Early in my career, I had done some very theoretical studies of memory, and after that I wanted to [do] work that had more obvious practical uses. The memory of witnesses to crimes and accidents was a natural place to go. In particular I looked at what happens when people are questioned about their experiences. I would ultimately see those questions as a means by which the memories got contaminated.
AG: You’re known for debunking the idea of repressed memories. Why focus on them?
EL: In the 1990s we began to see these recovered-memory cases. In the first big one, a man called George Franklin was on trial. His daughter claimed she had witnessed her father kill her best friend when she was 8 years old — but had only remembered this 20 years later. And that she had been raped by him and repressed that memory too. Franklin was convicted of the murder, and that started this repressed-memory ball rolling through the legal system. We began to see hundreds of cases where people were accusing others based on claims of repressed memory. That’s what first got me interested.
AG: How did you study the process of creating false memories?
EL: We needed a different paradigm for studying these types of recollections. I developed a method for creating “rich false memories” by using strong suggestion. The first such memory was about getting lost in a shopping mall as a child.
AG: How susceptible are people to having these types of memories implanted?
EL: Depending on the study, you might get as many as 50 percent of people falling for the suggestion and developing a complete or partial false memory.
As I’ve mentioned before, the more we learn about memory, the less comfortable I am with the belief that eyewitness testimony in criminal cases is as dependable as our legal system assumes. There are definitely large numbers of people in prison based on eyewitness accounts … some of which are almost certainly false memories (but believed by the witness to be accurate).
AG: Is there any way to distinguish a false memory from a real one?
EL: Without independent corroboration, little can be done to tell a false memory from a true one.
AG: Could brain imaging one day be used to do this?
EL: I collaborated on a brain imaging study in 2010, and the overwhelming conclusion we reached is that the neural patterns were very similar for true and false memories. We are a long way away from being able to look at somebody’s brain activity and reliably classify an authentic memory versus one that arose through some other process.
AG: Do you think it’s important for people to realize how malleable their memory is?
EL: My work has made me tolerant of memory mistakes by family and friends. You don’t have to call them lies. I think we could be generous and say maybe this is a false memory.
September 17, 2013
The flaw in “nudging”
Coyote Blog looks at the flourishing “nudge” sector of government activity and points out one of the biggest flaws:
The theory that government should nudge (or coerce, as the case may be) us into “better” behavior rests on the idea that many people are bad at delay discounting. In other words, we tend to apply huge discount rates to pain in the future, such that we will sometimes make decisions to avoid small costs today even if that causes us to incur huge costs in the future (e.g. we refuse to walk away from the McDonald’s french fries today, which may cause us to die of obesity later).
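The “bad at delay discounting” claim is usually formalized as hyperbolic rather than exponential discounting. A toy sketch, with purely illustrative parameter values, of the preference reversal hyperbolic discounting produces: the fries look irresistible up close, but from a distance the same person prefers the healthier option.

```python
def exponential(value, delay, rate=0.05):
    # Exponential discounters are time-consistent: rankings never flip.
    return value / (1 + rate) ** delay

def hyperbolic(value, delay, k=0.5):
    # Hyperbolic discounters over-weight the immediate present.
    return value / (1 + k * delay)

small_now, large_later = 10, 15   # e.g. fries today vs. health later
gap = 5                           # the larger reward arrives 5 periods later

# Up close, the small immediate reward wins for the hyperbolic discounter.
print(hyperbolic(small_now, 0) > hyperbolic(large_later, gap))        # True
# Viewed from 10 periods away, the ranking flips: the larger, later
# reward is now preferred. The exponential discounter never flips.
print(hyperbolic(small_now, 10) < hyperbolic(large_later, 10 + gap))  # True
print(exponential(small_now, 10) < exponential(large_later, 10 + gap))
```

The flip between the first two comparisons is the formal version of “I’ll start the diet tomorrow,” which is the behavior nudge proposals claim to correct.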
There are many problems with this theory, not the least of which is that many decisions that may appear to be based on bad delay discounting are actually based on logical and rational premises that outsiders are unaware of.
But the most obvious problem is that people in government, who will supposedly save us from this poor decision-making, are human beings as well and should therefore have the exact same cognitive weaknesses. No one has ever managed to suggest a plausible theory as to how our methods of choosing politicians or staffing government jobs somehow selects for people who have better decision-making abilities.
September 12, 2013
This is rather sinister
At Marginal Revolution, Alex Tabarrok talks about a statistical study which concluded that being left-handed had a serious impact on your lifespan:
In 1991 Halpern and Coren published a famous study in the New England Journal of Medicine which appears to show that left-handed people die at much younger ages than right-handed people. Halpern and Coren had obtained records on 987 deaths in Southern California — we can stipulate that this was a random sample of deaths in that time period — and had then asked family members whether the deceased was right- or left-handed. What they found was stunning: left-handers in their sample had died at an average age of 66, compared to 75 for right-handers. If true, left-handedness would be on the same order of deadliness as a lifetime of smoking. Halpern and Coren argued that this was due mostly to unnatural deaths such as industrial and driving accidents caused by left-handers living in a right-handed world. The study was widely reported at the time and continues to be regularly cited in popular accounts of left-handedness (e.g. Buzzfeed, Cracked).
What is less well known is that the conclusions of the Halpern-Coren study are almost certainly wrong: left-handedness is not a major cause of death. Rather than dramatically lower life expectancy, a more plausible explanation of the HC findings is a subtle and interesting statistical artifact. The problem was pointed out as early as the letters to the editor in the next issue of the NEJM (see the Strang letter) and was also recently noted in an article by Hannah Barnes for BBC News (kudos to the BBC!), but remains much less well known.
The statistical issue is that at a given moment in time a random sample of deaths is not necessarily a random sample of people. I will explain.
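The gist of the artifact can be shown with a quick Monte Carlo sketch. Recorded left-handedness rose over the 20th century as forced right-handed writing died out, so among people dying in any given year the left-handers come disproportionately from younger birth cohorts. The prevalence numbers below are made up for illustration, not historical, and handedness here has no effect whatsoever on lifespan:

```python
import random

random.seed(1)

def lefty_prob(birth_year):
    # Illustrative (not historical) rates: recorded left-handedness
    # rose as forced right-handed writing died out.
    if birth_year < 1910:
        return 0.02
    if birth_year < 1940:
        return 0.06
    return 0.12

SAMPLE_YEAR = 1990
left_ages, right_ages = [], []
for _ in range(200_000):
    lifespan = random.gauss(74, 12)       # identical distribution for everyone
    birth_year = SAMPLE_YEAR - lifespan   # everyone in the sample died in 1990
    if random.random() < lefty_prob(birth_year):
        left_ages.append(lifespan)
    else:
        right_ages.append(lifespan)

mean = lambda xs: sum(xs) / len(xs)
print(f"left-handers:  mean age at death {mean(left_ages):.1f}")
print(f"right-handers: mean age at death {mean(right_ages):.1f}")
```

Even though every simulated person draws a lifespan from the same distribution, the left-handers’ mean age at death comes out several years lower, purely because the oldest decedents were born in cohorts where left-handedness was rarely recorded. A random sample of deaths is not a random sample of people.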
August 21, 2013
Obesity – it’s not just for humans any more
We’re constantly barraged with public service announcements from public figures telling us that we all eat too much, exercise too little, and that as a result society has a (sorry) growing obesity problem. However, as David Berreby points out, it’s not as simple as that:
Consider, for example, this troublesome fact, reported in 2010 by the biostatistician David B Allison and his co-authors at the University of Alabama in Birmingham: over the past 20 years or more, as the American people were getting fatter, so were America’s marmosets. As were laboratory macaques, chimpanzees, vervet monkeys and mice, as well as domestic dogs, domestic cats, and domestic and feral rats from both rural and urban areas. In fact, the researchers examined records on those eight species and found that average weight for every one had increased. The marmosets gained an average of nine per cent per decade. Lab mice gained about 11 per cent per decade. Chimps, for some reason, are doing especially badly: their average body weight had risen 35 per cent per decade. Allison, who had been hearing about an unexplained rise in the average weight of lab animals, was nonetheless surprised by the consistency across so many species. ‘Virtually in every population of animals we looked at, that met our criteria, there was the same upward trend,’ he told me.
It isn’t hard to imagine that people who are eating more themselves are giving more to their spoiled pets, or leaving sweeter, fattier garbage for street cats and rodents. But such results don’t explain why the weight gain is also occurring in species that human beings don’t pamper, such as animals in labs, whose diets are strictly controlled. In fact, lab animals’ lives are so precisely watched and measured that the researchers can rule out accidental human influence: records show those creatures gained weight over decades without any significant change in their diet or activities. Obviously, if animals are getting heavier along with us, it can’t just be that they’re eating more Snickers bars and driving to work most days. On the contrary, the trend suggests some widely shared cause, beyond the control of individuals, which is contributing to obesity across many species.
Such a global hidden factor (or factors) might help to explain why most people gain weight gradually, over decades, in seeming contradiction of Bloomberg’s thermodynamics. This slow increase in fat stores would suggest that they are eating only a tiny bit more each month than they use in fuel. But if that were so, as Jonathan C K Wells, professor of child nutrition at University College London, has pointed out, it would be easy to lose weight. One recent model estimated that eating a mere 30 calories a day more than you use is enough to lead to serious weight gain. Given what each person consumes in a day (1,500 to 2,000 calories in poorer nations; 2,500 to 4,000 in wealthy ones), 30 calories is a trivial amount: by my calculations, that’s just two or three peanut M&Ms. If eliminating that little from the daily diet were enough to prevent weight gain, then people should have no trouble losing a few pounds. Instead, as we know, they find it extremely hard.
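The arithmetic behind that 30-calorie figure is easy to check under the naive static model (the very simplification Wells is criticizing), using the common rule-of-thumb conversion of roughly 7,700 kcal per kilogram of body fat:

```python
KCAL_PER_KG_FAT = 7700  # common rule-of-thumb conversion

def naive_gain_kg(surplus_kcal_per_day, years):
    # Static "calories in minus calories out" model: assumes the surplus
    # stays constant and ignores metabolic adaptation as weight rises.
    return surplus_kcal_per_day * 365 * years / KCAL_PER_KG_FAT

print(f"{naive_gain_kg(30, 1):.1f} kg after one year")   # 1.4 kg
print(f"{naive_gain_kg(30, 10):.1f} kg after a decade")  # 14.2 kg
```

Under that model a two-or-three-M&M daily surplus compounds into serious weight gain over a decade, and trimming it should melt the weight back off just as easily; the article’s point is that real bodies conspicuously fail to behave this way.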
Many other aspects of the worldwide weight gain are also difficult to square with the ‘it’s-just-thermodynamics’ model. In rich nations, obesity is more prevalent in people with less money, education and status. Even in some poor countries, according to a survey published last year in the International Journal of Obesity, increases in weight over time have been concentrated among the least well-off. And the extra weight is unevenly distributed among the sexes, too. In a study published in the Social Science and Medicine journal last year, Wells and his co-authors found that, in a sample that spanned 68 nations, for every two obese men there were three obese women. Moreover, the researchers found that higher levels of female obesity correlated with higher levels of gender inequality in each nation. Why, if body weight is a matter of individual decisions about what to eat, should it be affected by differences in wealth or by relations between the sexes?
August 18, 2013
Down with the “nudgers”
In Reason, Baylen Linnekin discusses the so-called libertarian paternalists:
Even if I were to concede that point, there are plenty of programs that might be called soft or libertarian paternalism and that yield negative outcomes.
For example, federal farm subsidies quietly influence the choices made by farmers and consumers and lead many in both groups to believe they’re better off — a key precept of libertarian paternalism.
Subsidies influence farmers to produce some foods (like corn, soy, dairy, and sugar) to the exclusion of other foods (like arugula, bok choy, and yams). It’s no surprise that the former foods are the ones most farmers grow, and that they’re much more frequent choices among eaters.
The noodgy allure of farm subsidies is that farmers get money and certainty, while consumers get abundant and cheaper food at the grocery.
Another example of libertarian paternalism around food is menu labeling. Its proponents refer to laws mandating calorie counts on fast-food and other restaurant menus as a gentle nudge that requires businesses to provide us with information the government thinks we need, but still allows us to make our own choices. The government’s hope is that we’ll choose items with fewer calories and be better off for exercising that choice. But studies have shown that mandatory restaurant menu labeling does not work in practice. Worse, a recent study showed that mandated menu labeling can actually cause consumers to choose foods with more calories.
So both farm subsidies and mandatory menu labeling present firm empirical evidence that libertarian paternalism doesn’t work, right?
You might think so. But Sunstein’s Nudge writing partner, Richard Thaler, would likely argue that these failures simply call for more testing on the part of government.
“No one knows the answers to every problem, and not every idea works, so it is vital to test,” Thaler said earlier this month.
Of course. Who else but a cadre of bureaucrats who’ve never met you could possibly, through trial and error, determine what’s best for you to eat?
August 8, 2013
Medical marijuana – “We have been terribly and systematically misled for nearly 70 years”
ThinkProgress reports that CNN’s Dr. Sanjay Gupta has changed his position on the medical use of marijuana:
CNN’s Chief Medical Correspondent Dr. Sanjay Gupta reversed his position on marijuana’s health benefits and apologized for his previous stand against it in an article Thursday for CNN. In 2009, Gupta penned an op-ed advocating against marijuana, in which he advised as a doctor that “marijuana isn’t really very good for you.” At the time, he was in the running for an appointment as Surgeon General.
Since then, additional research and his work on a documentary have convinced him otherwise.
“I apologize because I didn’t look hard enough, until now,” he said. “I didn’t look far enough. I didn’t review papers from smaller labs in other countries doing some remarkable research, and I was too dismissive of the loud chorus of legitimate patients whose symptoms improved on cannabis.”
[…]
“We have been terribly and systematically misled for nearly 70 years in the United States, and I apologize for my own role in that.”
August 2, 2013
First it was bulimia, then anorexia, now it might be “orthorexia”
It’s nice to know that people in the richest culture in world history can still manage to make themselves utterly miserable by obsessing about things:
Picture this: After spending the summer indulging in ice cream and cocktails, you decide to embrace healthy eating. You cut out refined sugar and packaged food – the kind of nutrient-free junk on any doctor’s warning list. Wheat and dairy are the next to go.
People compliment you on your weight loss; your energy levels rival those of Jillian Michaels. But soon your innocent health kick takes a strange turn. Certain foods – even fruits and veggies – begin to seem dangerous, even unclean.
Within months, you’ve whittled your list of “acceptable” foods down to almost nothing.
This unhealthy fixation with eating healthfully is called “orthorexia nervosa,” a term coined by Dr. Steven Bratman, a Colorado-based physician, in 1997. Since then, orthorexia rates have spiralled in tandem with society’s insistence upon knowing every last detail about its food.
Orthorexia (derived from the Greek “ortho,” which means “correct”) often begins with a noble impulse – to get fit or eat organic – that grows into a self-destructive obsession where fewer and fewer foods meet the orthorexic’s increasingly high standards.
The result is everything from malnutrition to social anxiety as orthorexics avoid restaurants and their friends’ kitchens. At its most extreme, orthorexia can even act as a gateway to anorexia, says Merryl Bear, director of Toronto’s National Eating Disorder Information Centre.
“The gateway possibility is very real because the principles are so similar,” she explains. “Like anorexics, orthorexics prize being pure and in control above all else.” (Orthorexia is currently classified as a form of disordered eating, not a clinical eating disorder, in the Diagnostic and Statistical Manual of Mental Disorders, published by the American Psychiatric Association.)
Since orthorexics value purity, not weight loss, eating becomes a moral act. “A day filled with wheat grass juice, tofu and quinoa biscuits may come to feel as holy as one spent serving the destitute and homeless,” writes Bratman in his book Health Food Junkies: Overcoming the Obsession With Healthful Eating (2004).
H/T to Nicholas Packwood for the link.
Update: Colby Cosh was quick to send me a link to a piece he did on this topic more than a decade ago:
Since becoming a physician, Dr. Bratman has seen many people like his own young self — and some who are worse off — flirting with disaster by depriving their body of vital nutrients. The fads of his youth, far from disappearing, have survived and grown in number: there are even “Breatharians” who believe food to be wholly unnecessary. A few years ago Dr. Bratman coined the phrase “orthorexia” — merging Greek ortho-, meaning righteous, with the stem familiar from “anorexia” — to describe a pathological attachment to dietary theories.
“I never intended the term to be a serious diagnostic entity; you wouldn’t go to a hospital with ‘orthorexia’,” he says. “It’s informal, like ‘workaholic.’” The idea has nonetheless stirred controversy: a Yale University physician sniffed in one critique that “We’ve never had anybody come to our clinic with orthorexia.” Yet fanatical attachment to dietary theories can indeed be hazardous. Macrobiotic diets caused a string of deaths in the 1960s and had to be modified; “metabolic” treatments for cancer, usually involving fasting, occasionally turn disastrous; and vegetarians and vegans must monitor themselves for certain vitamin and mineral deficiencies. In September, an Armenian couple in Surrey, England, were convicted of starving their nine-month-old daughter to death on a “Fruitarian” fruit-only diet.
“People become orthorexic by falling in love with a dietary theory,” says Dr. Bratman. “They run across an idea like macrobiotics or raw-foodism, and embrace it like a religion. We’re not talking about common-sense rules of healthy eating, but theories which reject whole classes of foods and make spontaneous eating [impossible]…There’s a personality type, an obsessive type of person who is prone to embrace them in a quasi-religious way.” This can result in an enticing sense of moral superiority, sometimes coupled with the euphoria associated with partial starvation. But orthorexia also brings crippling feelings of unworthiness after the inevitable slip-ups, when the true believer succumbs to a cookie or a pizza. “There are similarities with anorexia,” he says. “An important one is that anorexics feel like they’ve done something evil when they gain weight, something morally wrong rather than merely unhealthy.” Similarly, the sure sign of an orthorexic is that he associates unhealthy eating with a sense of sin.