Truth, indeed, is something that is believed in completely only by persons who have never tried personally to pursue it to its fastnesses and grab it by the tail. It is the adoration of second-rate men — men who always receive it at second-hand. Pedagogues believe in immutable truths and spend their lives trying to determine them and propagate them; the intellectual progress of man consists largely of a concerted effort to block and destroy their enterprise. Nine times out of ten, in the arts as in life, there is actually no truth to be discovered; there is only error to be exposed. In whole departments of human inquiry it seems to me quite unlikely that the truth ever will be discovered. Nevertheless, the rubber-stamp thinking of the world always makes the assumption that the exposure of an error is identical with the discovery of the truth — that error and truth are simple opposites. They are nothing of the sort. What the world turns to, when it has been cured of one error, is usually simply another error, and maybe one worse than the first one. This is the whole history of the intellect in brief. The average man of to-day does not believe in precisely the same imbecilities that the Greek of the fourth century before Christ believed in, but the things that he does believe in are often quite as idiotic. Perhaps this statement is a bit too sweeping. There is, year by year, a gradual accumulation of what may be called, provisionally, truths — there is a slow accretion of ideas that somehow manage to meet all practicable human tests, and so survive. But even so, it is risky to call them absolute truths. All that one may safely say of them is that no one, as yet, has demonstrated that they are errors. Soon or late, if experience teaches us anything, they are likely to succumb too. The profoundest truths of the Middle Ages are now laughed at by schoolboys. The profoundest truths of democracy will be laughed at, a few centuries hence, even by school-teachers.
H.L. Mencken, “Footnote on Criticism”, Prejudices, Third Series, 1922.
April 28, 2015
April 18, 2015
Tim Harford’s latest column on tobacco, research, and lobby money:
It is said that there is a correlation between the number of storks’ nests found on Danish houses and the number of children born in those houses. Could the old story about babies being delivered by storks really be true? No. Correlation is not causation. Storks do not deliver children but larger houses have more room both for children and for storks.
This much-loved statistical anecdote seems less amusing when you consider how it was used in a US Senate committee hearing in 1965. The expert witness giving testimony was arguing that while smoking may be correlated with lung cancer, a causal relationship was unproven and implausible. Pressed on the statistical parallels between storks and cigarettes, he replied that they “seem to me the same”.
The witness’s name was Darrell Huff, a freelance journalist beloved by generations of geeks for his wonderful and hugely successful 1954 book How to Lie with Statistics. His reputation today might be rather different had the proposed sequel made it to print. How to Lie with Smoking Statistics used a variety of stork-style arguments to throw doubt on the connection between smoking and cancer, and it was supported by a grant from the Tobacco Institute. It was never published, for reasons that remain unclear. (The story of Huff’s career as a tobacco consultant was brought to the attention of statisticians in articles by Andrew Gelman in Chance in 2012 and by Alex Reinhart in Significance in 2014.)
Indisputably, smoking causes lung cancer and various other deadly conditions. But the problematic relationship between correlation and causation in general remains an active area of debate and confusion. The “spurious correlations” compiled by Harvard law student Tyler Vigen and displayed on his website (tylervigen.com) should be a warning. Did you realise that consumption of margarine is strongly correlated with the divorce rate in Maine?
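The storks-and-babies trap is easy to reproduce: any two series that merely share a trend will show a "strong" correlation coefficient. A minimal sketch, with made-up numbers (not Vigen's actual margarine and divorce data):

```python
# Two made-up series that both drift downward over the same decade.
# (Illustrative numbers only -- not Vigen's actual margarine/divorce data.)
margarine_lbs = [8.2, 7.0, 6.5, 5.3, 5.2, 4.0, 4.6, 3.3, 2.7, 2.1]
divorce_rate = [5.0, 4.7, 4.6, 4.4, 4.3, 4.1, 4.2, 4.2, 4.2, 4.1]

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# A shared trend is enough to produce a high r between two
# quantities with no causal link at all.
print(f"r = {pearson_r(margarine_lbs, divorce_rate):.2f}")
```

The coefficient comes out above 0.9 here, which is exactly why a scatter plot and a causal story matter more than the number itself.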
April 3, 2015
According to this story in the Guardian, a typical city of one million people poops out $13 million in (potentially recoverable) precious metals every year:
Sewage sludge contains traces of gold, silver and platinum at levels that would be seen as commercially viable by traditional prospectors. “The gold we found was at the level of a minimal mineral deposit,” said Kathleen Smith, of the US Geological Survey.
Smith and her colleagues argue that extracting metals from waste could also help limit the release of harmful metals, such as lead, into the environment in fertilisers and reduce the amount of toxic sewage that has to be buried or burnt.
“If you can get rid of some of the nuisance metals that currently limit how much of these biosolids we can use on fields and forests, and at the same time recover valuable metals and other elements, that’s a win-win,” she said.
A previous study, by Arizona State University, estimated that a city of 1 million inhabitants flushed about $13m (£8.7m) worth of precious metals down toilets and sewer drains each year.
The task of sifting sewage for microscopic quantities of gold may sound grim, but it could have a variety of unexpected benefits over traditional gold mining. The use of powerful chemicals, called leachates, by the industry to pull metals out of rock is controversial, because these chemicals can be devastating to ecosystems when they leak into the environment. In the controlled setting of a sewage plant, the chemicals could be used liberally without the ecological risks.
March 20, 2015
Epigenetic researchers – “We can double the size of these bugs!” Everyone else – “No, thanks. We’re good.”
Science can be a great source of fascinating experiments. Doubling the size of insects is perhaps not the best way to advertise your particular speciality, however:
Researchers have changed the size of a handful of Florida ants by chemically modifying their DNA, rather than by changing its encoded information. The work is the latest advance from a field known as epigenetics and may help explain how the insects — despite their high degree of genetic similarity — grow into the different varieties of workers needed in a colony.
This discovery “takes the field leaps and bounds forward,” says entomologist Andrew Suarez of the University of Illinois, Urbana-Champaign, who wasn’t connected to the study. “It’s providing a better understanding of how genes interact with the environment to generate diversity.”
Ant nests have division of labor down pat. The queen spends her time pumping out eggs, and the workers, which are genetically similar sisters, perform all the other jobs necessary to keep the colony thriving, such as tending the young, gathering food, and excavating tunnels. Workers in many ant species specialize even further, forming so-called subcastes that look different and have different roles. In Florida carpenter ants (Camponotus floridanus), for example, workers tend to fall into two groups. Minor workers, which can be less than 6 mm long, rear the young and forage for food. Major workers, which can be almost twice as long, use their large jaws to protect the colony from predators.
Genome sequencing and other methods suggest that these physical differences don’t usually stem from genetic differences between individual ants. Instead, environmental factors help push workers to become majors or minors — specifically, the amount of food and coddling that young ants receive. But just how do these factors change the size of ants? A team from McGill University in Montreal, Canada, suspected that the mechanism involves DNA methylation: the addition of a chemical to DNA.
March 5, 2015
At Mother Jones, Kevin Drum talks about all the things we’ve been told about healthy eating … that just ain’t so:
For several years now I’ve been following the controversy over whether the dietary guidelines that have developed over the past 70 years might be all wrong. And I’ve become tentatively convinced that, in fact, they are wrong. For most people — not all! — salt isn’t a big killer; cholesterol isn’t harmful; and red meat and saturated fat are perfectly OK. Healthy, even. Sugar, on the other hand, really needs to be watched.
Before I go on, a great big caveat: I’m not even an educated amateur on this subject. I’ve read a fair amount about it, but I’ve never dived into it systematically. And the plain truth is that firm proof is hard to come by when it comes to diet. It’s really, really hard to conduct the kinds of experiments that would give us concrete proof that one diet is better than another, and the studies that have been done almost all have defects of some kind.
Randomized trials are the gold standard of dietary studies, but as I said above, they’re really, really hard to conduct properly. You have to find a stable population of people. You have to pick half of them randomly and get them to change their diets. You have to trust them to actually do it. You have to follow them for years, not months. Virtually no trial can ever truly meet this standard.
Nonetheless, as Carroll says, the randomized trials we do have suggest that red meat and saturated fat have little effect on cardiovascular health — and might actually have a positive effect on cancer outcomes.
At the same time, increased consumption of sugars and carbohydrates might be actively bad for us. At the very least they contribute to obesity and diabetes, and there’s some evidence that they aren’t so great for your heart either.
So where does this leave us? As Carroll says, the literature as a whole suggests that we simply don’t know. We’ve been convinced of a lot of things for a long time, and it’s turned out that a lot of what we believed was never really backed by solid evidence in the first place. So now the dietary ship is turning. Slowly, but it’s turning.
His primary take-away from all this: moderation is probably your safest bet, unless you have a condition that requires you to avoid certain foods or types of foods. Oh, and avoid over-indulging in packaged food that uses lots of preservatives. This is certainly one area where the science sure didn’t turn out to be settled, after all.
February 23, 2015
At the Magna Carta Project, Professor Nicholas Vincent recounts how he tracked down a previously unknown copy in Sandwich:
Now, I have often found that the most interesting original records of Magna Carta, as of much else, have gone unnoticed precisely because they are assumed either to be copies rather than originals or because they travel with other less famous documents. Cataloguers, assuming that Magna Carta is much too important to have been overlooked, have very frequently assumed that originals are copies, not from any physical evidence of the fact, but simply because the idea of possessing an unknown Magna Carta has appeared to the cataloguer to be as absurd as suddenly stumbling upon an unknown play by Shakespeare or an unknown canvas by Vermeer. The most famous documents are often the documents that, in their natural habitat, have been least studied. Edgar Allan Poe sums up this situation perfectly in his story “The Purloined Letter”. Poe’s plot here turns on the fact that, if you wish to hide something that everybody else assumes hidden, the best place to hide it is in plain view.
I can claim, long before last December, to have found at least three Magna Cartas. All were in plain view. None of them was ‘unknown’, in the sense that they had all previously been listed, albeit in obscure places, either as Magna Cartas or as ‘copies’ of Magna Carta. They were nonetheless ‘unknown’ in the sense that they were either assumed to be ‘copies’ or ‘duplicates’ rather than originals (one of the three 1217 Magna Cartas, and the 1225 Magna Carta in the Bodleian Library in Oxford), or were known locally but without any appreciation that local knowledge had not come to national or international attention (the 1300 Magna Carta preserved in the archives of the borough of Faversham). In one instance (the 1217 Magna Carta now in Hereford Cathedral), it had been catalogued as a royal charter of liberties, but without realizing that these liberties were those otherwise known as ‘Magna Carta’.
I vividly remember phoning Hereford Cathedral, in 1989, and asking if I could go down there the following day to see their Magna Carta (for there could be little doubt from the catalogue entry that Hereford’s ‘Charter of liberties 1217’ was a 1217 Magna Carta). I received a very dusty answer. ‘We have no Magna Carta’, I was told, ‘You must be thinking of Mappa Mundi!’. Ignoring this, and ordering up the document by call number, I found myself, the following morning, greeted on Hereford railway station by the canon librarian and the delightful cathedral archivist, Meryl Jancey. Archivists and canon librarians do not generally go to the railway to greet visiting postgraduate students. Short of playing me up Hereford High Street with a brass band, they could not have expressed more joy. And inevitably, their first question was ‘How much is it worth?’.
[…] One other detail before we pass on. Magna Carta as issued in 1215 promised reform not only of the realm as a whole but of the King’s administration of those parts of England placed under ‘forest law’ (i.e. set aside for the King’s hunting, with severe consequences for land use and the preservation of game). In 1217, to answer this demand for reform, King Henry III not only issued a new version of Magna Carta but, as a companion piece, an entirely distinct and smaller charter known as the ‘Forest Charter’. From 1217 onwards, the Forest Charter travelled in the company of Magna Carta, rather as a pilot fish accompanies a shark. It was in order to distinguish between these two documents, bigger and smaller, that as early as 1217 Magna Carta was first named ‘Magna’ (‘the great’). Thereafter, on each successive reissue of Magna Carta, the Forest Charter was also reissued, in 1225, 1265, 1297 and 1300. The Record Commissioners, in their search for original documents, were much less thorough in their treatment of the Forest Charter than they were in their search for its more famous sibling. Blackstone had found only two original Forest Charters, both of them very late. The Record Commissioners knew of only three. By contrast, we now know that at least twelve survive. Some of these turned up fortuitously at the time of my own search for new manuscripts in 2007. Others had resurfaced even more recently.
So it was that, around 4.30am on the morning of 9 December 2014, I decided that a catalogue entry describing a Forest Charter of 1300 might well merit further investigation. Even in the seven years between 2007 (when I compiled my lists for Sotheby’s) and 2014, when I stumbled on the reference to the borough of Sandwich’s Forest Charter, I had found at least three further original Forest Charters previously misidentified or ignored. The earliest of these, of 1225, came to light amongst the muniments of Ely Cathedral, the most recent, of 1300, in the British Library. An original of 1300 at Oriel College seen by Blackstone, reported missing in 2007, had re-emerged safe and sound.
Thanks to modern technology, from Belfast to Maidstone is a mere click of the mouse. At 4.39 Greenwich Mean Time on the morning of 9 December last year, I sent an email (I have it in front of me) to Dr Mark Bateson. I have known Mark for nearly twenty years, first as an archivist at Canterbury Cathedral (where he was one of those who devised the magnificent catalogue of Canterbury’s medieval charters), and more recently following his transfer to Maidstone. I told him that I had found the reference to a Forest Charter, and as I noted in my email: ‘If this really is the 1300 Sandwich copy of the forest charter, issued under the seal of Edward I, then it is a major find. There are only a handful of such exemplifications still surviving as originals. It would also fundamentally alter our understanding of the way in which the charters of liberties were distributed for the later reissues of Magna Carta. Is there any chance of your taking a sneak preview?’
February 20, 2015
In Nature, Claire Ainsworth explains why it’s becoming more difficult to discuss sex as a binary:
Sex can be much more complicated than it at first seems. According to the simple scenario, the presence or absence of a Y chromosome is what counts: with it, you are male, and without it, you are female. But doctors have long known that some people straddle the boundary — their sex chromosomes say one thing, but their gonads (ovaries or testes) or sexual anatomy say another. Parents of children with these kinds of conditions — known as intersex conditions, or differences or disorders of sex development (DSDs) — often face difficult decisions about whether to bring up their child as a boy or a girl. Some researchers now say that as many as 1 person in 100 has some form of DSD.
When genetics is taken into consideration, the boundary between the sexes becomes even blurrier. Scientists have identified many of the genes involved in the main forms of DSD, and have uncovered variations in these genes that have subtle effects on a person’s anatomical or physiological sex. What’s more, new technologies in DNA sequencing and cell biology are revealing that almost everyone is, to varying degrees, a patchwork of genetically distinct cells, some with a sex that might not match that of the rest of their body. Some studies even suggest that the sex of each cell drives its behaviour, through a complicated network of molecular interactions. “I think there’s much greater diversity within male or female, and there is certainly an area of overlap where some people can’t easily define themselves within the binary structure,” says John Achermann, who studies sex development and endocrinology at University College London’s Institute of Child Health.
These discoveries do not sit well in a world in which sex is still defined in binary terms. Few legal systems allow for any ambiguity in biological sex, and a person’s legal rights and social status can be heavily influenced by whether their birth certificate says male or female.
“The main problem with a strong dichotomy is that there are intermediate cases that push the limits and ask us to figure out exactly where the dividing line is between males and females,” says Arthur Arnold at the University of California, Los Angeles, who studies biological sex differences. “And that’s often a very difficult problem, because sex can be defined a number of ways.”
February 19, 2015
Published on 18 Feb 2015
Almost every cell in your body has the same DNA sequence. So how come a heart cell is different from a brain cell? Cells use their DNA code in different ways, depending on their jobs. Just like orchestras can perform one piece of music in many different ways. A cell’s combined set of changes in gene expression is called its epigenome. This week Nature publishes a slew of new data on the epigenomic landscape in lots of different cells. Learn how epigenomics works in this video.
February 9, 2015
Last month, in his Times column, Matt Ridley explained why — until we discover a treatment for aging itself — rising cancer rates are a weird form of good news:
If we could prevent or cure all cancer, what would we die of? The new year has begun with a war of words over whether cancer is mostly bad luck, as suggested by a new study from Johns Hopkins School of Medicine, and over whether it’s a good way to die, compared with the alternatives, as suggested by Dr Richard Smith, a former editor of the BMJ.
It is certainly bad luck to be British and get cancer, relatively speaking. As The Sunday Times reported yesterday, survival rates after cancer diagnosis are lower here than in most developed and some developing countries, reflecting the National Health Service’s chronic problems with rationing treatment by delay. In Japan, survival rates for lung and liver cancer are three times higher than here.
Cancer is now the leading cause of death in Britain even though it is ever more survivable, with roughly half of people who contract it living long enough to die of something else. But what else? Often another cancer.
In the western world we’ve conquered most of the causes of premature death that used to kill our ancestors. War, smallpox, homicide, measles, scurvy, pneumonia, gangrene, tuberculosis, stroke, typhoid, heart disease and cholera are all much rarer, strike much later in life or are more survivable than they were fifty or a hundred years ago.
The mortality rate in men from coronary heart disease, for instance, has fallen by an amazing 80 per cent since 1968 — for all age groups. Mortality rates from stroke in both sexes have halved in 20 years. Cancer’s growing dominance of the mortality tables is not because it’s getting worse but because we are avoiding other causes of death and living longer.
It is worth remembering that some scientists and anti-pesticide campaigners in the 1960s were convinced that by now lifespans would be much shorter because of cancer caused by pesticides and other chemicals in the environment.
In the 1950s Wilhelm Hueper — a director of the US National Cancer Institute and mentor to Rachel Carson, the environmentalist author of Silent Spring — was so concerned that pesticides were causing cancer that he thought the theory that lung cancer was caused by smoking was a plot by the chemical industry to divert attention from its own culpability: “Cigarette smoking is not a major factor in the causation of lung cancer,” he insisted.
In fact it turns out that pollution causes very little cancer and cigarettes cause a lot. But aside from smoking, most cancers are indeed bad luck. The Johns Hopkins researchers found that tissues that replicate their stem cells most run the highest risk of cancer: basal skin cells do ten trillion cell divisions in a lifetime and have a million times more cancer risk than pelvic bone cells which do about a million cell divisions. Random DNA copying mistakes during cell division are “the major contributors to cancer overall, often more important than either hereditary or external environmental factors”, say the US researchers.
To sum it up, until or unless medical research finds a way to stop the bodily effects of aging, cancer becomes the most likely way for all of us to die. Cancer is a generic rather than a specific term — it’s what we use to describe the inevitable breakdown of the cellular division process that happens millions or even trillions of times over our lifetime. As Ridley puts it, “even if everybody lived in the healthiest possible way, we would still get a lot of cancer.” I’m not a scientist and I don’t even play one on TV, but I suspect that the solution to cancers of all kinds is to boost our immune systems to more quickly identify aberrant cells in our bodies before they start reproducing beyond the capability of the immune system to handle. The short- to medium-term solution to cancer may be to make us all a little bit cyborg…
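The scale of the numbers Ridley cites is easy to sanity-check. If each stem-cell division carries the same small, independent chance of an initiating mistake, lifetime risk scales roughly with the division count. A back-of-the-envelope sketch, using a purely hypothetical per-division probability:

```python
# Back-of-the-envelope check on the stem-cell-division argument.
# Assume every division carries the same small, independent chance of a
# cancer-initiating mistake; the probability below is purely hypothetical.
P_PER_DIVISION = 1e-15

def lifetime_risk(divisions):
    # Chance that at least one division goes wrong over a lifetime.
    return 1 - (1 - P_PER_DIVISION) ** divisions

basal_skin = lifetime_risk(10**13)  # ~ten trillion divisions in a lifetime
pelvic_bone = lifetime_risk(10**6)  # ~a million divisions in a lifetime
print(f"risk ratio ≈ {basal_skin / pelvic_bone:.0e}")  # roughly 1e7
```

Whatever the true per-division probability, the ratio is driven almost entirely by the division counts — which is the study's point: more replication, more chances for random copying errors.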
February 4, 2015
Robyn Arianrhod on the sesquicentennial of one of the most important discoveries that helped create the world we live in today:
It’s hard to imagine life without mobile phones, radio and television. Yet the discovery of the electromagnetic waves that underpin such technologies grew out of an abstract theory that’s 150 years old.
Our knowledge of the existence of such waves is a direct result of James Clerk Maxwell’s theory of electromagnetism which was first published in January 1865.
Electromagnetism itself was discovered physically rather than theoretically. Some time around 1820, the Danish physicist Hans Oersted noticed that when you switch on an electric current, a nearby magnet – such as the needle of a compass – actually jumps, as if the changing electric current was itself a magnet.
Then, in 1831 (the year Maxwell was born in Edinburgh) the English physicist and chemist Michael Faraday discovered that if you move a magnet through a coil of wire, you create an electric current in the wire without the aid of batteries or other electricity supply.
Faraday was so intrigued by the surprising ability of moving magnets to create electricity that he created a tiny prototype of the electric generator. He also created a prototype of the electric motor, but it would take decades before engineers were able to develop working motors and generators.
Nevertheless, basic technologies had begun to flow almost immediately after the phenomenon of electromagnetism was discovered: in particular, the telegraph – the first high-speed global telecommunications system.
February 3, 2015
Tim Worstall on what’s wrong with Senator Warren’s most recent proposal to claw back profits that are derived from government-sponsored research:
The answer being that finding out basic knowledge is something we call a public good. This has a specific economic meaning and it really means that private actors, whether people or companies, will do too little of this whatever it is. Because it’s just simply too difficult to make money out of having done this whatever it is. That’s really what “public good” means. It doesn’t mean goods supplied to the public nor even things that it is good for the public to have.
So, given that private actors won’t do these things but we also think that it would be just great for lots of these things to be done, well, we’ve got to do something about it then. And the answer to that is government. Even the most minarchist of us (although perhaps not the anarchists) would agree that some of the public goods provided by government are pretty good. A military to defend us from the ravening Canadian hordes, a criminal justice system to protect us from crime, a Constitution to protect us from politicians. All seem pretty good to me. The answer really is government in those cases.
The argument gets extended: that basic research is a public good. It’s very difficult to make a profit from it therefore not enough of it gets done in the private sector. So we should get government to go do it for us. Excellent, so, when we get that research being done then we’re getting what we pay our taxes to get government to do. We’ve got our public good.
What both Warren and Mazzucato are arguing is that government should then come back for a second bite of the cherry. They should get some of the profits from that basic research. But there aren’t any profits from that basic research: that’s why we’re getting government to do it because you can’t make a profit from having done the research. If we can make a profit from having done this research then government shouldn’t be doing it because it’s not a public good.
January 2, 2015
The Center for the Study of the Public Domain (at Duke Law), lists some of the better-known works that should have become public domain in the United States this year, except for the extension of copyright terms:
Current US law extends copyright for 70 years after the date of the author’s death, and corporate “works-for-hire” are copyrighted for 95 years after publication. But prior to the 1976 Copyright Act (which became effective in 1978), the maximum copyright term was 56 years — an initial term of 28 years, renewable for another 28 years. Under those laws, works published in 1958 would enter the public domain on January 1, 2015, where they would be “free as the air to common use.” Under current copyright law, we’ll have to wait until 2054. And no published works will enter our public domain until 2019. The laws in other countries are different — thousands of works are entering the public domain in Canada and the EU on January 1.
What books and plays would be entering the public domain if we had the pre-1978 copyright laws? You might recognize some of the titles below.
- Chinua Achebe, Things Fall Apart
- Hannah Arendt, The Human Condition
- Isaac Asimov (writing as Paul French), Lucky Starr and the Rings of Saturn
- Simone de Beauvoir, Mémoires d’une jeune fille rangée (Memoirs of a Dutiful Daughter)
- Michael Bond, A Bear Called Paddington, with illustrations by Peggy Fortnum
- Eugene Burdick and William Lederer, The Ugly American
- Truman Capote, Breakfast at Tiffany’s
- Agatha Christie, Ordeal by Innocence
- John Kenneth Galbraith, The Affluent Society
- Graham Greene, Our Man in Havana
- Dr. Martin Luther King, Jr., Stride Toward Freedom: The Montgomery Story
- Claude Lévi-Strauss, Anthropologie Structurale (Structural Anthropology)
- Mary Renault, The King Must Die
- Dr. Seuss, Yertle the Turtle and Other Stories
- T.H. White, The Once and Future King
What a trove of books — imagine these being freely available to students and educators around the world. You would be free to translate these books into other languages, create Braille or audio versions for visually impaired readers (if you think that publishers wouldn’t object to this, you would be wrong), or adapt them for theater or film. You could read them online or buy cheaper print editions, because others were free to republish them. (Empirical studies have shown that public domain books are less expensive, available in more editions and formats, and more likely to be in print — see here, here, and here.) Imagine a digital Library of Alexandria containing all of the world’s books from 1958 and earlier, where, thanks to technology, you can search, link, annotate, copy and paste. (Google Books has brought us closer to this reality, but for copyrighted books where there is no separate agreement with the copyright holder, it only shows three short snippets, not the whole book.) You could use these books in your own stories — The Once and Future King was free to draw upon Sir Thomas Malory’s Le Morte d’Arthur (a compilation of King Arthur legends) because Malory’s work was in the public domain. One tale inspires another. That is how the public domain feeds creativity. Instead of seeing these literary works enter the public domain in 2015, we will have to wait until 2054.
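The dates in the excerpt follow from simple arithmetic. A sketch of the two regimes (deliberately simplified — real copyright terms have many edge cases around renewal, notice, and the author's death):

```python
def pd_year_old_rules(pub_year):
    # Pre-1978 US rule: a 28-year initial term plus a 28-year renewal;
    # the work enters the public domain the following January 1.
    return pub_year + 28 + 28 + 1

def pd_year_current_rules(pub_year):
    # Simplified current rule for a corporate work-for-hire: 95 years
    # from publication, entering the public domain the next January 1.
    return pub_year + 95 + 1

print(pd_year_old_rules(1958))      # 2015
print(pd_year_current_rules(1958))  # 2054
```

That 39-year gap, applied to every work from 1958, is the whole story of the post.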
December 5, 2014
Michael White says we need to follow up our success in reading our own genetic code by decoding a different one:
There are thousands of mutations that occur in the breast cancer-linked genes BRCA1 and BRCA2. Some of these cause breast or ovarian cancer, while others are harmless. When we design a genetic test for predisposition to breast cancer, we have to know which ones to test for. The same is true of almost any gene that plays a role in disease — you’ll find many mutations in that gene in the general population, only some of which cause health problems. So how do we know which mutations to worry about?
We start by using the genetic code. The genetic code, cracked by scientists in the 1960s, makes it surprisingly easy to “read” our DNA and understand how a particular mutation affects a gene. As genetic testing takes on a bigger role in predicting, diagnosing, and treating disease, we rely on this code to help us make sense of the data. Unfortunately, the genetic code applies to less than two percent of our DNA. In an effort to read the rest, researchers are trying to crack a new genetic code — and this next one is turning out to be much more difficult to solve than the first. In fact, scientists may have to give up the idea that we can use a “code” to “read” the rest of our DNA.
When scientists were working out the original genetic code in the 1950s and ’60s, all sorts of complicated schemes were proposed to explain how information is stored in our genes. The problem they were trying to solve was how a gene, made of DNA, codes the information to make a particular protein — an enzyme, a pump, a piece of cellular scaffolding, or some other critical component of the cell’s working machinery. They were looking for a code that would translate the four-letter DNA alphabet of genes into the 20-letter amino acid alphabet of proteins.
Thanks to its simplicity, the genetic code is a powerful tool in our hunt for mutations that cause disease. Unfortunately, it has also led to the genetic equivalent of a drunk looking for his lost keys under the lamppost. Researchers have put much of their effort into looking for disease mutations in those parts of our genomes that we can read with the genetic code — that is, parts that consist of canonical genes that code for proteins. But these genes make up less than two percent of our DNA; much more of our genetic function is outside of genes in the relatively uncharted “non-coding” portions. We have no idea how many disease-causing mutations are in that non-coding portion — for some types of mutations, it could be as high as 90 percent.
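The simplicity White is talking about can be seen in a toy sketch: the genetic code is just a lookup table mapping three-letter DNA codons to amino acids, which is why protein-coding DNA is so easy to “read”. (This is a minimal illustration, not a bioinformatics tool; only a handful of the 64 codons are included here.)

```python
# Sketch of the genetic code: each three-letter DNA codon maps to one
# of 20 amino acids or to a stop signal. Only a few of the 64 codons
# are listed here, for illustration.
CODON_TABLE = {
    "ATG": "Met",  # also the "start" codon
    "TTT": "Phe", "TTC": "Phe",
    "GGA": "Gly", "GGC": "Gly",
    "TGG": "Trp",
    "TAA": "Stop", "TAG": "Stop", "TGA": "Stop",
}

def translate(dna: str) -> list[str]:
    """Read a DNA sequence three letters at a time, translating each
    codon into an amino acid and stopping at a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE.get(dna[i:i + 3], "?")
        if amino_acid == "Stop":
            break
        protein.append(amino_acid)
    return protein

print(translate("ATGTTTGGATAA"))  # → ['Met', 'Phe', 'Gly']
```

Because the mapping is this mechanical, a mutation inside a gene can be interpreted directly: change one letter of a codon and you can look up exactly which amino acid changes. White’s point is that no comparably simple table exists for the other 98 percent of the genome, which is why non-coding mutations remain so hard to interpret.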
November 25, 2014
Scott Alexander wrote this back in July. I think it’s still relevant as a useful perspective-enhancer:
The year 1969 comes up to you and asks what sort of marvels you’ve got all the way in 2014.
You explain that cameras, which 1969 knows as bulky boxes full of film that takes several days to get developed in dark rooms, are now instant affairs of point-click-send-to-friend that are also much higher quality. Also they can take video.
Music used to be big expensive records, and now you can fit 3,000 songs on an iPod and get them all for free if you know how to pirate or scrape the audio off of YouTube.
Television not only has gone HDTV and plasma-screen, but your choices have gone from “whatever’s on now” and “whatever is in theaters” all the way to “nearly every show or movie that has ever been filmed, whenever you want it”.
Computers have gone from structures filling entire rooms with a few KB of memory and a punchcard-based interface, to small enough to carry in one hand with a few TB of memory and a touchscreen-based interface. And they now have peripherals like printers, mice, scanners, and flash drives.
Lasers have gone from only working in special cryogenic chambers to working at room temperature to fitting in your pocket to being ubiquitous in things as basic as supermarket checkout counters.
Telephones have gone from rotary-dial wire-connected phones that still sometimes connected to switchboards, to cell phones that fit in a pocket. But even better is bypassing them entirely and making video calls with anyone anywhere in the world for free.
Robots now vacuum houses, mow lawns, clean office buildings, perform surgery, participate in disaster relief efforts, and drive cars better than humans. Occasionally if you are a bad person a robot will swoop down out of the sky and kill you.
For better or worse, video games now exist.
Medicine has gained CAT scans, PET scans, MRIs, lithotripsy, liposuction, laser surgery, robot surgery, and telesurgery. Vaccines for pneumonia, meningitis, hepatitis, HPV, and chickenpox. Ceftriaxone, furosemide, clozapine, risperidone, fluoxetine, ondansetron, omeprazole, naloxone, suboxone, mefloquine, and, for that matter, Viagra. Artificial hearts, artificial livers, artificial cochleae, and artificial legs so good that their users can compete in the Olympics. People with artificial eyes can only identify vague shapes at best, but they’re getting better every year.
World population has tripled, in large part due to new agricultural advantages. Catastrophic disasters have become much rarer, in large part due to architectural advances and satellites that can watch the weather from space.
We have a box which you can type something into and it will tell you everything anyone has ever written relevant to your query.
We have a place you can log into from anywhere in the world and get access to approximately all human knowledge, from the scores of every game in the 1956 Roller Hockey World Cup to 85 different side effects of an obsolete antipsychotic medication. It is all searchable instantaneously. Its main problem is that people try to add so much information to it that its (volunteer) staff are constantly busy deleting information that might be extraneous.
We have the ability to translate nearly any major human language to any other major human language instantaneously at no cost with relatively high accuracy.
We have navigation technology that over fifty years has gone from “map and compass” to “you can say the name of your destination and a small box will tell you step by step which way you should be going”.
We have the aforementioned camera, TV, music, videophone, video games, search engine, encyclopedia, universal translator, and navigation system all bundled together into a small black rectangle that fits in your pocket, responds to your spoken natural-language commands, and costs so little that Ethiopian subsistence farmers routinely use one to sell their cows.
But, you tell 1969, we have something more astonishing still. Something even more unimaginable.
“We have,” you say, “people who believe technology has stalled over the past forty-five years.”
1969’s head explodes.
November 21, 2014
In the Washington Post, Justin Moyer talks about Elon Musk’s concern about runaway artificial intelligence:
Elon Musk — the futurist behind PayPal, Tesla and SpaceX — has been caught criticizing artificial intelligence again.
“The risk of something seriously dangerous happening is in the five year timeframe,” Musk wrote in a comment since deleted from the Web site Edge.org, but confirmed to Re/Code by his representatives. “10 years at most.”
The very future of Earth, Musk said, was at risk.
“The leading AI companies have taken great steps to ensure safety,” he wrote. “They recognize the danger, but believe that they can shape and control the digital superintelligences and prevent bad ones from escaping into the Internet. That remains to be seen.”
Musk seemed to sense that these comments might seem a little weird coming from a Fortune 1000 chief executive officer.
“This is not a case of crying wolf about something I don’t understand,” he wrote. “I am not alone in thinking we should be worried.”
Unfortunately, Musk didn’t explain how humanity might be compromised by “digital superintelligences,” “Terminator”-style.
He never does. Yet Musk has been holding forth on and off about the apocalypse artificial intelligence might bring for much of the past year.