Quotulatiousness

January 13, 2014

The GMO debate – “it is a tale told by an idiot, full of sound and fury, signifying nothing”

Filed under: Environment, Media, Science, Technology — Nicholas Russon @ 09:22

Nathanael Johnson says he has taken more abuse over his articles on genetically modified organisms than anything else in his writing career. And he says he learned something from his research: that it actually doesn’t matter at all.

It’s a little awkward to admit this, after devoting so much time to this project, but I think Beth was right. The most astonishing thing about the vicious public brawl over GMOs is that the stakes are so low.

I know that to those embroiled in the controversy this will seem preposterous. Let me try to explain.

Let’s start off with a thought experiment: Imagine two alternate futures, one in which genetically modified food has been utterly banned, and another in which all resistance to genetic engineering has ceased. In other words, imagine what would happen if either side “won” the debate.

In the GMO-free future, farming still looks pretty much the same. Without insect-resistant crops, farmers spray more broad-spectrum insecticides, which do some collateral damage to surrounding food webs. Without herbicide-resistant crops, farmers spray less glyphosate, which slows the spread of glyphosate-resistant weeds and perhaps leads to healthier soil biota. Farmers also till their fields more often, which kills soil biota, and releases a lot more greenhouse gases. The banning of GMOs hasn’t led to a transformation of agriculture because GM seed was never a linchpin supporting the conventional food system: Farmers could always do fine without it. Eaters no longer worry about the small potential threat of GMO health hazards, but they are subject to new risks: GMOs were neither the first, nor have they been the last, agricultural innovation, and each of these technologies comes with its own potential hazards. Plant scientists will have increased their use of mutagenesis and epigenetic manipulation, perhaps. We no longer have biotech patents, but we still have traditional seed-breeding patents. Life goes on.

In the other alternate future, where the pro-GMO side wins, we see less insecticide, more herbicide, and less tillage. In this world, with regulations lifted, a surge of small business and garage-biotechnologists got to work on creative solutions for the problems of agriculture. Perhaps these tinkerers would come up with some fresh ideas to usher out the era of petroleum-dependent food. But the odds are low, I think, that any of their inventions would prove transformative. Genetic engineering is just one tool in the tinkerer’s belt. Newer tools are already available, and scientists continue to make breakthroughs with traditional breeding. So in this future, a few more genetically engineered plants and animals get their chance to compete. Some make the world a little better, while others cause unexpected problems. But the science has moved beyond basic genetic engineering, and most of the risks and benefits of progress are coming from other technologies. Life goes on.

The point is that even if you win, the payoff is relatively small in the broad scheme of things. Really, why do so many people care?

December 2, 2013

The FDA and 23andMe

Filed under: Bureaucracy, Business, Health, USA — Nicholas Russon @ 11:24

Kyle Smith on the FDA’s sudden interest in shutting down private DNA testing company 23andMe:

… the FDA has the power to regulate medical devices, which is the pretext it is using to stop 23andMe. Ordering it to stop selling its personal genome service, the FDA declared that the tube “is a device within the meaning of section 201(h) of the FD&C Act, 21 U.S.C. 321(h), because it is intended for use in the diagnosis of disease or other conditions or in the cure, mitigation, treatment or prevention of disease, or is intended to affect the structure or function of the body.”

It would seem that 23andMe could simply put the words, “not intended for use in the diagnosis, cure, mitigation, treatment or prevention of disease” on its website and satisfy the FDA, but we all know that the motto of today’s federales is “We make it up as we go along.” The FDA seems determined to conduct a lengthy war with 23andMe.

[...]

Using the same reasoning, the FDA might as well shut down WebMd.com because people might type their symptoms into the site, and the response might affect whether or not they choose to go to a doctor. Any computer or iPhone thereby becomes a “medical device” that people can use for the “diagnosis, cure, mitigation, treatment or prevention of disease.”

Come to think of it, that thermometer you use to check your temperature is pretty dangerous too — it might give you either a false positive or a false negative — but why stop there? You exercise to mitigate or prevent disease, don’t you? Maybe the FDA should take your running shoes and your yoga pants away.

November 29, 2013

We’re from the FDA, we’re here to help you

Filed under: Bureaucracy, Government, Health, Science — Nicholas Russon @ 09:19

Nick Gillespie on the mind-numbingly awful exercise of FDA regulatory power in shutting down personal DNA testing company 23andMe:

Personal genetic tests are safe, innovative, and the future of medicine. So why is the most transparent administration ever shutting down a cheap and popular service? Because it can.

In its infinite wisdom, the Food and Drug Administration (FDA) has forbidden the personal genetic testing service 23andMe from soliciting new customers, claiming the company hasn’t proven the validity of its product.

The real reason? Because when it comes to learning about your own goddamn genes, the FDA doesn’t think you can handle the truth. That means the FDA is now officially worse than Oedipus’s parents, Dr. Zaius, and the god of Genesis combined, telling us that there are things that us mere mortals just shouldn’t be allowed to know.

23andMe allows you to get rudimentary information about your genetic makeup, including where your ancestors came from and DNA markers for over 240 different hereditary diseases and conditions (not all of them bad, by the way). Think of it as the H&M version of the haute couture genetic mark-up that Angelina Jolie had done prior to having the proactive mastectomy that she revealed this year.

[...]

Peter Huber of the Manhattan Institute, a conservative think tank, has an important new book out called The Cure in the Code: How 20th Century Law is Undermining 21st Century Medicine. Huber writes that whatever sense current drug-approval procedures once might have had, their day is done. Not only does the incredible amount of time and money — 12 years and $350 million at a minimum — slow down innovation, it’s based on the clearly wrong idea that all humans are the same and will respond the same way to the same drugs.

Given what we already know about small but hugely important variations in individual body chemistry, the FDA’s whole mental map needs to be redrawn. “The search for one-dimensional, very simple correlations — one drug, one clinical effect in all patients — is horrendously obsolete,” Huber told me in a recent interview. And the FDA’s latest action needs to be understood in that context — it’s just one more way in which a government that not only tells us we must buy insurance, but dictates through its bureaucrats the contours of the plans we buy, arbitrarily decides what is best for all of us.

October 6, 2013

Any GMO-labelling compromise is a win for big business and a loss for everyone else

Filed under: Bureaucracy, Business, USA — Nicholas Russon @ 00:06

Baylen Linnekin explains why compromise in the battle over genetically modified food ingredients is likely to be heartily supported by big business — because they can easily cover costs that their smaller competitors will not be able to afford:

Like it or not — and I’m in the not camp — a mandatory, uniform national GMO labeling scheme appears increasingly likely.

[...]

Major players on the business side, including Walmart, America’s leading grocer, and General Mills, which bills itself as “one of the world’s largest food companies,” have publicly tipped their hands that they’d support some sort of mandatory labeling.

As I noted this summer, Walmart held a meeting with FDA officials and others from the food industry earlier this year where, it was alleged, the grocer and other food sellers that have opposed state labeling requirements would push for the federal government to adopt a national GMO labeling standard.

And just last week, Ken Powell, the CEO of General Mills, announced at the company’s annual stockholders’ meeting that the company “strongly support[s] a national, federal labeling solution.”

Powell’s comments are a game changer.

But do they mean that anti-GMO activists and food companies are on the same page? Not by a longshot. Powell made clear in his remarks that the company supports “a national standard that would label foods that don’t have genetically engineered ingredients in them, rather than foods that do.” (emphasis mine)

I suspect that anti-GMO activists would hate that solution because it wouldn’t provide the “information” they want and because all of the significant testing and labeling costs of the mandatory scheme Powell suggests — along with any liability for not testing GMO-free foods or for mislabeling — would be borne by the GMO-free farmers and food producers they frequent (and by their customers, in the form of higher prices).

June 12, 2013

Changing the FDA to meet the new needs of personalized medicine

Filed under: Health, Science — Nicholas Russon @ 08:31

At Marginal Revolution, Alex Tabarrok links to a new paper by Peter Huber:

In a brilliant new paper, Peter Huber draws upon molecular biology, network analysis and Bayesian statistics to make some very important recommendations about FDA policy.

[. . .]

The current regime was built during a time of pervasive ignorance when the best we could do was throw a drug and a placebo against a randomized population and then count noses. Randomized controlled trials are critical, of course, but in a world of limited resources they fail when confronted by the curse of dimensionality. Patients are heterogeneous and so are diseases. Each patient is a unique, dynamic system and at the molecular level diseases are heterogeneous even when symptoms are not. In just the last few years we have expanded breast cancer into first four and now ten different types of cancer and the subdivision is likely to continue as knowledge expands. Match heterogeneous patients against heterogeneous diseases and the result is a high dimension system that cannot be well navigated with expensive, randomized controlled trials. As a result, the FDA ends up throwing out many drugs that could do good:

    Given what we now know about the biochemical complexity and diversity of the environments in which drugs operate, the unresolved question at the end of many failed clinical trials is whether it was the drug that failed or the FDA-approved script. It’s all too easy for a bad script to make a good drug look awful. The disease, as clinically defined, is, in fact, a cluster of many distinct diseases: a coalition of nine biochemical minorities, each with a slightly different form of the disease, vetoes the drug that would help the tenth. Or a biochemical majority vetoes the drug that would help a minority. Or the good drug or cocktail fails because the disease’s biochemistry changes quickly but at different rates in different patients, and to remain effective, treatments have to be changed in tandem; but the clinical trial is set to continue for some fixed period that doesn’t align with the dynamics of the disease in enough patients.

    Or side effects in a biochemical minority veto a drug or cocktail that works well for the majority. Some cocktail cures that we need may well be composed of drugs that can’t deliver any useful clinical effects until combined in complex ways. Getting that kind of medicine through today’s FDA would be, for all practical purposes, impossible.

The alternative to the FDA process is large collections of data on patient biomarkers, diseases and symptoms all evaluated on the fly by Bayesian engines that improve over time as more data is gathered. The problem is that the FDA is still locked in an old mindset when it refuses to permit any drugs that are not “safe and effective” despite the fact that these terms can only be defined for a large population by doing violence to heterogeneity. Safe and effective, moreover, makes sense only when physicians are assumed to be following simple, A to B, drug to disease, prescribing rules and not when they are targeting treatments based on deep, contextual knowledge that is continually evolving.
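
It is easy to see the mechanics Huber describes with a toy simulation. In the sketch below, every number is invented purely for illustration (none comes from the paper): a drug that strongly helps a ten-percent biochemical minority barely shifts the pooled recovery rate of a conventional two-arm trial, while a simple per-subgroup Beta-Binomial update picks out the responders.

    # Toy illustration of Huber's point: heterogeneity hides a good drug in
    # pooled trial results. Every number below is invented for the example.
    import random

    random.seed(42)

    N = 2000                      # patients per arm
    RESPONDER_FRACTION = 0.10     # the "biochemical minority" the drug actually helps
    P_BASELINE = 0.30             # recovery rate without the drug
    P_RESPONDER_ON_DRUG = 0.70    # recovery rate on the drug, responders only

    def run_arm(on_drug):
        """Simulate one trial arm; return a list of (is_responder, recovered) pairs."""
        outcomes = []
        for _ in range(N):
            responder = random.random() < RESPONDER_FRACTION
            p = P_RESPONDER_ON_DRUG if (on_drug and responder) else P_BASELINE
            outcomes.append((responder, random.random() < p))
        return outcomes

    drug_arm, placebo_arm = run_arm(True), run_arm(False)

    def pooled(arm):
        """Overall recovery rate, ignoring subgroups -- the 'count noses' view."""
        return sum(rec for _, rec in arm) / N

    print(f"pooled recovery: drug {pooled(drug_arm):.1%} vs placebo {pooled(placebo_arm):.1%}")

    # Subgroup view: a Beta(1, 1) prior updated per subgroup exposes the responders.
    def posterior_mean(successes, failures):
        return (1 + successes) / (2 + successes + failures)

    for label, flag in (("responders", True), ("non-responders", False)):
        hits = sum(1 for resp, rec in drug_arm if resp == flag and rec)
        misses = sum(1 for resp, rec in drug_arm if resp == flag and not rec)
        print(f"{label}: posterior mean recovery on drug = {posterior_mean(hits, misses):.1%}")

On these made-up numbers the pooled arms differ by only a few percentage points, the sort of result that gets a drug rejected, while the responder subgroup’s posterior sits near the 70 percent recovery rate the drug actually delivers for the patients it was ever going to help.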

May 6, 2013

Genetically modified barley may mean the end of skunky beer

Filed under: Science — Nicholas Russon @ 08:54

The Register‘s Simon Sharwood on an Australian development that might herald new long-life beers:

Researchers at Australia’s University of Adelaide have unlocked the secret to letting beer age without it tasting like old socks.

Doctor Jason Eglington of the university’s School of Agriculture, Food and Wine explained that barley contains an enzyme called “lipoxygenase”. The enzymatic process produces several substances, among them an aroma volatile, catchily named “trans-2-nonenal”. The latter substance, over time, gives old beer a nasty taste and odour.

Eglington, who heads the university’s Barley Program, learned that some ancient strains of barley have a defective version of lipoxygenase.

Some selective breeding later and the booze boffins have produced a new barley with everything a brewer could want — except working lipoxygenase.

May 5, 2013

A second life for the American chestnut tree?

Filed under: Environment, Science, USA — Nicholas Russon @ 10:25

The Economist reports on a trial of genetically modified chestnut trees:

Once upon a time, according to folklore, a squirrel could travel through America’s chestnut forests from Maine to Florida without ever touching the ground. The chestnut population of North America was reckoned then to have been about 4 billion trees. No longer. Axes and chainsaws must take a share of the blame. But the principal culprit is Cryphonectria parasitica, the fungus that causes chestnut blight. In the late 19th century, some infected saplings from Asia brought C. parasitica to North America. By 1950 the chestnut was little more than a memory in most parts of the continent.

American chestnuts may, however, be about to rise again — thanks to genetic engineering. This month three experimental patches will be planted, under the watchful eye of the Department of Agriculture, in Georgia, New York and Virginia. Along with their normal complements of genes, these trees have been fitted with a handful of others that researchers hope will protect them from the fungus.

The project has been organised by the Forest Health Initiative (FHI), a quango set up to look into the idea of using genetic engineering to rescue species of tree whose populations have been devastated by fungal diseases or insect pests. It has sponsored research at several universities, and this month’s trial is the first big field test. If it works, the FHI will ask the government for permission to plant transgenic chestnuts in the wild, with the intention of re-establishing the species in America’s woodlands. And if that goes well, it could provide a model for projects to re-establish elm trees (being devastated by Dutch elm disease, a beetle-borne fungal infection), ash trees (threatened in North America by a beetle called the emerald ash borer, and in Europe by a fungal disease called ash dieback) and a fir tree known, confusingly, as the eastern hemlock (which is plagued by the woolly adelgid, a sap-sucking bug).

March 18, 2013

Mark Lynas and his break with the anti-GMO activists

Filed under: Britain, Environment, Media, Science — Nicholas Russon @ 09:11

Mark Lynas was one of the most prominent activists working against the adoption of genetically modified crops. Over time, he realized he was fighting the wrong battle and publicly recanted his decades-long struggle. He talks about it in an interview with Charlie Gillis in Maclean’s:

Q: You’ve disavowed a cause you were identified with for decades. How are you feeling about your decision?

A: It’s been traumatic, but it’s also been something of a liberation. I’ve obviously been inconsistent in my life, but so are we all. In my view, it’s better to be inconsistent and half-right, than to be consistently wrong. Even the pope doesn’t claim these days to be infallible, yet that’s what most environmental groups do.

Q: Still, you’ve offended your former allies, a lot of whom are now trying to discredit you. Some say you exaggerated your part in founding the anti-GM movement to start with. What’s that been like on a personal level?

A: My whole social scene has been characterized by my environmentalism. I’m in a situation where I can go to a party and I don’t know who’s currently not speaking to me.

Q: On Twitter, Vandana Shiva, a prominent environmentalist in India, likened your calls for farmers to be able to plant GMOs to saying rapists should have the freedom to rape.

A: That was simply astonishing, and frankly, hurtful to people who have actually suffered the trauma of rape. Look, these attacks on me are obviously done in the interests of damage limitation. It’s sort of an emperor’s-new-clothes thing. I have helped expose the fact most people’s concerns about GM foods are based on mythology. Once you can get past the idea that there’s something inherently dangerous about GM foods, it’s a whole different conversation. We actually can tell whether GM foods are safe. They have been extensively tested hundreds and hundreds of times, using different techniques. Many of the tests were conducted independently. The jury is entirely in on this issue.

[. . .]

Q: You argue that opposing GMOs is actually anti-environmental.

A: That was the realization that changed my mind. That recombinant DNA is actually a potentially very powerful technology for designing crop plants that can help humanity tackle our food-supply shortages, and also reduce our environmental footprint. They can help us use less fertilizer, and dramatically reduce pesticide applications. We can reduce our exposure to climate change through drought and heat-tolerant crops. So the potential is enormous.

March 2, 2013

Ethical debates of the very near future: species de-extinction

Filed under: Science — Nicholas Russon @ 12:12

Matt Ridley on the soon-to-be-possible reversal of species extinction:

The founders of Revive and Restore aren’t mainstream scientists, but they’re not people to be taken lightly, either. Stewart Brand and Ryan Phelan are a husband-and-wife team with a track record of starting unusual but successful organizations — in his case, the Whole Earth Catalog and the Global Business Network; in hers, the consumer-focused startups Direct Medical Knowledge and DNA Direct. They’ve attracted the interest of the pioneering Harvard University DNA sequencing and synthesis expert George Church.

Their argument is that it’s time to start tentatively trying de-extinction and thinking through its ethical and ecological implications. There are already projects under way to revive extinct subspecies like the European aurochs (a type of wild cattle) and the Pyrenean ibex, or bucardo. In the latter case, when the last female (Celia) was killed by a falling tree in 2000, her tissue was cloned. At least one fetus survived to term in a surrogate mother goat, but it died soon after birth.

A full species that’s been extinct for decades like the thylacine (Tasmanian tiger) or the passenger pigeon — the last one of which, Martha, died in the Cincinnati Zoo 99 years ago — will be a taller order, since the DNA from long dead specimens is fragmented. Yet Ben Novak, a young researcher working with the ancient-DNA expert Beth Shapiro at the University of California, Santa Cruz, has extracted passenger pigeon DNA from the toe pad of a museum specimen and sequenced it. Dr. Church hopes to use one of the newly invented letter-by-letter gene-replacement techniques, such as Talens or Crispr, to transform the genome of a related species called the band-tailed pigeon into that of a passenger pigeon.

February 13, 2013

The imaginary trade-off between ecology and economics

Filed under: Economics, Environment, Media, Technology — Nicholas Russon @ 09:37

Matt Ridley on the improvements in the environment in the western world:

Extrapolate global average GDP per capita into the future and it shows a rapid rise to the end of this century, when the average person on the planet would have an income at least twice as high as the typical American has today. If this were to happen, an economist would likely say that it’s a good thing, while an ecologist would likely say that it’s a bad thing because growth means using more resources. Therein lies a gap to be bridged between the two disciplines.

The environmental movement has always based its message on pessimism. Population growth was unstoppable; oil was running out; pesticides were causing a cancer epidemic; deserts were expanding; rainforests were shrinking; acid rain was killing trees; sperm counts were falling; and species extinction was rampant. For the green movement, generally, good news is no news. Many environmentalists are embarrassed even to admit that some trends are going in the right direction.

[. . .]

Why are environmental trends mainly positive? In short, the gains are due to “land sparing,” in which technological innovation allows humans to produce more from less land, leaving more land for forests and wildlife. The list of land sparing technologies is long: Tractors, unlike mules and horses, do not need to feed on hay. Advances in fertilizers and irrigation, as well as better storage, transport, and pest control, help boost yields. New genetic varieties of crops and livestock allow people to get more from less. Chickens now grow three times as fast as they did in the 1950s. The yield boost from genetically modified crops is now saving from the plow an area equivalent to 24 percent of Brazil’s arable land.

What is really making a positive dent in the environmental arena is the unintended effects of technology rather than nature reserves or exhortations to love nature. Policy analyst Indur Goklany calculated that if we tried to support today’s population using the methods of the 1950s, we would need to farm 82 percent of all land, instead of the 38 percent we do now. The economist Julian Simon once pointed out that with cheap light, an urban, multi-story hydroponic warehouse the size of Delaware could feed the world, leaving the rest for wilderness.

It is not just food. In fiber and fuel too, we replace natural sources with synthetic, reducing the ecological footprint. Construction uses less and lighter materials. Even CO2 emissions enrich crop yields.
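
Both the income extrapolation and the land-sparing claim are, at bottom, simple arithmetic, and a back-of-envelope sketch shows roughly what they imply. The 2013 income figures below are my own rough round-number assumptions, not figures from Ridley’s piece; the land percentages are the Goklany numbers quoted above.

    # Rough arithmetic behind the two claims quoted above.
    # The 2013 income figures are assumed round numbers; the land-use shares
    # are the Goklany figures quoted in the excerpt.

    # 1. What growth rate would make the average person in 2100 twice as rich
    #    as a typical American today?
    world_gdp_pc_2013 = 14_000     # assumed global average GDP per capita, USD (PPP)
    us_gdp_pc_2013 = 53_000        # assumed US GDP per capita, USD (PPP)
    target_2100 = 2 * us_gdp_pc_2013
    years = 2100 - 2013
    required_growth = (target_2100 / world_gdp_pc_2013) ** (1 / years) - 1
    print(f"required average annual growth: {required_growth:.1%}")

    # 2. What productivity multiplier do Goklany's land-sparing figures imply?
    land_share_1950s_methods = 0.82   # share of all land needed with 1950s methods
    land_share_today = 0.38           # share of all land actually farmed now
    print(f"implied yield multiplier: ~{land_share_1950s_methods / land_share_today:.1f}x")
    print(f"land spared: ~{land_share_1950s_methods - land_share_today:.0%} of all land")

With those assumed starting incomes, the extrapolation needs only a bit over two percent average annual growth, and the land-sparing figures amount to roughly a doubling of average output per unit of land since the 1950s.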

February 4, 2013

University of Leicester confirms that the remains are those of King Richard III

Filed under: Britain, History, Science — Nicholas Russon @ 09:16

BBC News rounds up the details:

A skeleton found beneath a Leicester car park has been confirmed as that of English king Richard III.

Experts from the University of Leicester said DNA from the bones matched that of descendants of the monarch’s family.

Lead archaeologist Richard Buckley, from the University of Leicester, told a press conference to applause: “Beyond reasonable doubt it’s Richard.”

Richard, killed in battle in 1485, will be reinterred in Leicester Cathedral.

Mr Buckley said the bones had been subjected to “rigorous academic study” and had been carbon dated to a period from 1455-1540.

Dr Jo Appleby, an osteo-archaeologist from the university’s School of Archaeology and Ancient History, revealed the bones were of a man in his late 20s or early 30s. Richard was 32 when he died.

Battle wounds

His skeleton had suffered 10 injuries, including eight to the skull, at around the time of death. Two of the skull wounds were potentially fatal.

One was a “slice” removing a flap of bone; the other was caused by a bladed weapon which went through and hit the opposite side of the skull, to a depth of more than 10cm (4in).

Dr Appleby said: “Both of these injuries would have caused an almost instant loss of consciousness and death would have followed quickly afterwards.

“In the case of the larger wound, if the blade had penetrated 7cm into the brain, which we cannot determine from the bones, death would have been instantaneous.”

Other wounds included slashes or stabs to the face and the side of the head.

Update: New Scientist still has concerns that the trail of evidence is not strong enough to constitute proof of identity:

Mitochondrial DNA is passed down the maternal line and has 16,000 base pairs in total. Typically, you might expect to get 50 to 150 fragments from a 500-year-old skeleton, says Ian Barnes at Royal Holloway, University of London, who was not involved in the research. “You’d want to get sequences from lots of those fragments,” he says. “There’s a possibility of mitochondrial mutations arising in the line from Richard III.”

“It’s intriguing to be sure,” says Mark Thomas at University College London. It is right that they used mitochondrial DNA based on the maternal line, he says, since genealogical evidence for the paternal lineage cannot be trusted.

But mitochondrial DNA is not especially good for pinpointing identity. “I could have the same mitochondrial DNA as Richard III and not be related to him,” says Thomas.

The researchers used the two living descendants to “triangulate” the DNA results. The evidence will rest on whether Ibsen and his cousin have sufficiently rare mtDNA to make it unlikely that they both match the dead king by chance.
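
The “sufficiently rare” condition can be made concrete with a small Bayes-rule sketch. The prior and the haplotype frequencies below are placeholders chosen for illustration, not figures from the Leicester team; the point is simply how strongly the conclusion depends on how rare the shared mtDNA type is.

    # A hedged sketch of why haplotype rarity matters for the identification.
    # The prior and frequencies are invented placeholders, not published figures.

    def posterior_identity(prior, haplotype_freq):
        """Posterior probability the skeleton is Richard III, given an mtDNA match.

        Treats the match as evidence with likelihood ratio 1 / haplotype_freq:
        the real Richard matches his maternal-line relatives with probability ~1,
        while an unrelated man matches with probability equal to the haplotype's
        frequency in the population.
        """
        prior_odds = prior / (1 - prior)
        posterior_odds = prior_odds / haplotype_freq
        return posterior_odds / (1 + posterior_odds)

    prior = 0.5   # assumed prior from the non-DNA evidence (location, age, wounds)
    for freq in (0.10, 0.01, 0.001):   # hypothetical haplotype frequencies
        print(f"haplotype frequency {freq:>5}: posterior ~ {posterior_identity(prior, freq):.3f}")

On these made-up numbers, a haplotype shared by one person in ten still leaves a non-trivial chance of coincidence, while one carried by one person in a thousand makes a chance match very hard to credit, which is exactly the question Thomas is raising.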

January 14, 2013

The increasing precision of DNA editing

Filed under: Science, Technology — Nicholas Russon @ 09:56

Matt Ridley looks at the vastly improved editing tools becoming available for DNA manipulation:

Little wonder that precision genetic engineering has taken a while to arrive. In truth, it has been moving steadily toward greater precision for 10,000 years. Early farmers in what’s now Turkey introduced a mutation to wheat plants in the “Q gene” on chromosome 5A, which made the seed-head less brittle and the seed husks easier to harvest efficiently.

They did so unknowingly, of course, by selecting from among random mutations.

Fifty years ago, scientists used a nuclear reactor to fire gamma rays at barley seeds, scrambling some of their genes. The result was “Golden Promise,” a high-yielding, low-sodium barley variety popular with (ironically) organic farmers and brewers. Again, the gene editing was random, the selection afterward nonrandom.

Twenty years ago, scientists inserted specific sequences for four enzymes into rice plants so that they would synthesize vitamin A and relieve a deadly vitamin deficiency, the result being “golden rice.” This time the researchers knew exactly what letters they were putting in but had no idea where they would end up.

December 23, 2012

More copper to fight superbugs

Filed under: Health, Science — Nicholas Russon @ 11:01

Brass and other copper-alloyed metals may have a bright future in doorknobs, handles, and other frequently handled surfaces due to a recent discovery about the metal’s ability to fight bacteria:

Researchers have discovered that copper and alloys made from the metal, including brass, can prevent antibiotic resistance in bacteria from spreading.

Plastic and stainless steel surfaces, which are now widely used in hospitals and public settings, allow bacteria to survive and spread when people touch them.

Even if the bacteria die, DNA that gives them resistance to antibiotics can survive and be passed on to other bacteria on these surfaces. Copper and brass, however, can kill the bacteria and also destroy this DNA.

Professor Bill Keevil, head of the microbiology group at Southampton University, said using copper on surfaces in public places and on public transport could dramatically cut the threat posed by superbugs.

[. . .]

In research published in the journal Molecular Genetics of Bacteria, Professor Keevil and his colleagues found that, compared to stainless steel, bacterial DNA on copper surfaces rapidly degraded at room temperature.

Professor Keevil added: “We live in this new world of stainless steel and plastic, but perhaps we should go back to using brass more instead.”

Tim Worstall points out that much of the stainless steel came in through health and safety regulation:

But isn’t this just great? All that modernity, all that ripping out of the old and replacement with futuristic design actually kills people?

It’s almost as fun as the discovery that the wooden chopping boards, which they made illegal, contain natural antibiotics which the plastic chopping boards, which they made compulsory, do not.

The Man from Whitehall really does not know best. And given that, can we hang them all from the Christmas tree please? It would usher in such a jolly New Year.

December 14, 2012

The revolution will not be revolutionary … soon

Filed under: Health, Media, Science — Nicholas Russon @ 11:09

In the Globe and Mail, Timothy Caulfield explains that we need to be careful not to drink the “healthcare revolution” Kool-Aid:

It has been suggested that this technological advance will usher in a new health-care “revolution.” It will allow us, or so it’s promised, to individualize health-care treatments and preventive strategies — an approach often called “personalized medicine.” It will allow us to become fully aware of our genetic shortcomings and the diseases for which we’re at increased genetic risk, thus providing the impetus to adopt healthier lifestyles.

But will having your personal genome available really revolutionize your health-care world? Will you be able to use this information to significantly improve your chances of avoiding the most common chronic diseases? Not likely.

Tangible benefits will be (and have been) achieved. But, for the most part, these advances are likely to be incremental in nature – which, history tells us, is the way scientific progress usually unfolds.

Why this “we are not in a revolution” message? Overselling the benefits of personal genomics can hurt the science, by creating unrealistic expectations, and distract us from other, more effective areas of health promotion.

The relationship between our genome and disease is far more complicated than originally anticipated. Indeed, the more we learn about the human genome, the less we seem to know. For example, results from a major international initiative to explore all the elements of our genome (the ENCODE project) found that, despite decades-old conventional wisdom that much of our genome was nothing but “junk DNA,” as much as 80 per cent of our genome likely has some biological function. This work hints that things are much more convoluted than expected. So much so that one of ENCODE’s lead researchers, Yale’s Mark Gerstein, was quoted as saying that it’s “like opening a wire closet and seeing a hairball of wires.”

October 28, 2012

Got Milk (mutation)?

Filed under: Environment, Health, History, Science — Nicholas Russon @ 12:33

Lactose intolerance is part of humankind’s genetic inheritance, which is why the mutation that allowed (some) adult humans to digest milk is of great interest to geneticists:

A genetic mutation appeared, somewhere near modern-day Turkey, that jammed the lactase-production gene permanently in the “on” position. The original mutant was probably a male who passed the gene on to his children. People carrying the mutation could drink milk their entire lives. Genomic analyses have shown that within a few thousand years, at a rate that evolutionary biologists had thought impossibly rapid, this mutation spread throughout Eurasia, to Great Britain, Scandinavia, the Mediterranean, India and all points in between, stopping only at the Himalayas. Independently, other mutations for lactose tolerance arose in Africa and the Middle East, though not in the Americas, Australia, or the Far East.

In an evolutionary eye-blink, 80 percent of Europeans became milk-drinkers; in some populations, the proportion is close to 100 percent. (Though globally, lactose intolerance is the norm; around two-thirds of humans cannot drink milk in adulthood.) The speed of this transformation is one of the weirder mysteries in the story of human evolution, more so because it’s not clear why anybody needed the mutation to begin with. Through their cleverness, our lactose-intolerant forebears had already found a way to consume dairy without getting sick, irrespective of genetics.

[. . .]

A “high selection differential” is something of a Darwinian euphemism. It means that those who couldn’t drink milk were apt to die before they could reproduce. At best they were having fewer, sicklier children. That kind of life-or-death selection differential seems necessary to explain the speed with which the mutation swept across Eurasia and spread even faster in Africa. The unfit must have been taking their lactose-intolerant genomes to the grave.

Milk, by itself, somehow saved lives. This is odd, because milk is just food, just one source of nutrients and calories among many others. It’s not medicine. But there was a time in human history when our diet and environment conspired to create conditions that mimicked those of a disease epidemic. Milk, in such circumstances, may well have performed the function of a life-saving drug.
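
To see why “a few thousand years” is plausible once the selection differential is that harsh, here is a minimal population-genetics sketch. The starting frequency, the fitness advantage, and the generation time are my own illustrative assumptions, not numbers from the article, and lactase persistence is treated as a single dominant allele.

    # Minimal sketch: a dominant lactase-persistence allele spreading under
    # strong selection. All parameters are illustrative assumptions.
    START_FREQ = 0.001        # allele essentially absent at the outset
    SELECTION = 0.05          # assumed fitness advantage for milk-drinkers
    GENERATION_YEARS = 25

    p = START_FREQ
    generations = 0
    while True:
        q = 1 - p
        persistent_share = 1 - q * q             # carriers of at least one copy
        if persistent_share >= 0.80:             # "80 percent of Europeans"
            break
        mean_fitness = 1 + SELECTION * (1 - q * q)
        p = p * (1 + SELECTION) / mean_fitness   # standard recurrence for a dominant allele
        generations += 1

    print(f"~{generations} generations, roughly {generations * GENERATION_YEARS:,} years")

With these assumptions the allele goes from vanishingly rare to an 80 percent persistence phenotype in well under two hundred generations, a few thousand years at 25 years per generation, which is exactly the “evolutionary eye-blink” described above; a harsher selection differential shortens the run further.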

H/T to Marginal Revolution for the link.
