Quotulatiousness

October 12, 2014

The unique challenges to UAVs in the Canadian Arctic

Filed under: Cancon, Environment, Science, Technology — Nicholas @ 00:02

Ben Makuch looks at the severe environment of Canada’s Arctic and how UAV design is constrained by those conditions:

The rotary-wing UAV tested, and its view from the sky. Image: DRDC

“A lot of these systems — UAVs particularly, and rotor-wing (that is to say helicopters or quadrotors) — are even more sensitive. They require a good understanding of what they’re heading in. And by heading, that’s kind of the direction you’re facing,” said Monckton.

And because of those difficulties, finding headings for aerial drones in the Arctic requires stronger GPS systems to establish a “line segment” of locational data, ripped, according to Monckton, from a “crown” of satellites hovering on top of Earth.
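
To make the “line segment” idea concrete: a heading can be derived from two successive GPS fixes using the standard initial-bearing formula. This is a generic illustration only, not DRDC’s implementation (the article doesn’t describe one), and the coordinates below are made up.

```python
# Minimal sketch (not DRDC's method): derive a heading from two successive
# GPS fixes using the standard initial-bearing formula.
import math

def heading_from_fixes(lat1, lon1, lat2, lon2):
    """Initial bearing in degrees (0 = true north) from fix 1 to fix 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

# Two made-up fixes a short distance apart in the high Arctic.
print(heading_from_fixes(82.50, -62.35, 82.51, -62.30))
```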

In terms of weather conditions, the extreme sub-zero temperatures are devastating to a UAV when you mix in fog or clouds. While crisp, cool air with clear skies provides excellent flying conditions, once ice fog enters the mix it becomes a major risk to small UAVs.

“The biggest risk in the Arctic is structural icing,” said Monckton who explained that water in the clouds is so cool that when “you strike it, it actually crystallizes on contact.”

At CFS Alert, the Multi-Agent Tactical Sentry (MATS) UGV travels through rough Arctic terrain during an autonomous path-following test without the use of GPS. The Canadian Armed Forces Joint Arctic Experiment (CAFJAE) 2014 tests autonomous technology for Arctic conditions and explores its potential for future concepts of military operations through experiments carried out August 2014 at Canadian Forces Station Alert, Nunavut. CAF and Defence Research and Development Canada’s (DRDC) JAE work will benefit multiple government partners and centers around a fictitious satellite crash with hazard identification, telecommunication and other search and rescue tasks. Image: DRDC

Unsurprisingly, the wings of a drone being enveloped in ice presents “a major impediment to general unmanned air operations,” Monckton said. In part, because “UAVs are too small to carry standard deicing equipment [as used] on a commercial aircraft. So that’s a major problem.”

For the project, DRDC took a previously manned helicopter and modified it into an unmanned vehicle. They had help from Calgary-based Meggitt Canada, a defence and security contractor also responsible for this armed training hexacopter.

As for ground drones, or unmanned ground vehicles, Monckton said weather and temperature were an afterthought. The real challenge was the actual terrain.

“The arctic has a really peculiar surface,” said Monckton, adding that the high Arctic offers mostly marshlands, rocky outcrops, or elevated permafrost that produces spiky formations. “So the UGV was kind of going between easy riding on sloppy stuff and then getting pounded to pieces on the rough frost boils.”

September 16, 2014

When the “best nutrition advice” is a big, fat lie

Filed under: Books, Food, Government, Health, Media, Science — Nicholas @ 10:17

Rob Lyons charts the way our governments and healthcare experts got onboard the anti-fat dietary express, to our long-lasting dietary harm:

… in recent years, the advice to eat a low-fat diet has increasingly been called into question. Despite cutting down on fatty foods, the populations of many Western countries have become fatter. While heart-disease mortality has maintained a steady decline, cases of type-2 diabetes have shot up in recent years. Maybe these changes came in spite of the advice to avoid fat. Maybe they were caused by that advice.

The most notable figure in providing the intellectual ammunition to challenge existing health advice has been the US science writer, Gary Taubes. His 2007 book, Good Calories, Bad Calories, became a bestseller, despite containing long discussions on some fairly complex issues to do with biochemistry, nutrition and medicine. The book’s success triggered a heated debate about what really makes us fat and causes chronic disease.

The move to first discussing and then actively encouraging a low-fat diet was largely due to the work of Dr. Ancel Keys, who is to the low-fat diet movement what Karl Marx is to Communism. His energy, drive, and political savvy helped get the US government and the majority of health experts onboard and pushing his advice. A significant problem with this is that Keys’ advocacy was not statistically backed by even his own data. He drew strong conclusions from tiny, unrepresentative samples, yet managed to persuade most doubters that he was right. A more statistically rigorous analysis might well show that the obesity crisis has actually been driven by the crusading health advisors who have been pushing the low-fat diet all this time … or, as I termed it, “our Woody Allen moment”.

Rob Lyons discussed this with Nina Teicholz, author of the book The Big Fat Surprise:

Once the politically astute Keys had packed the nutrition committee of the AHA and got its backing for the advice to avoid saturated fat, the war on meat and dairy could begin. But a major turning point came in 1977 when the Senate Select Committee on Nutrition, led by Democratic senator George McGovern, held hearings on the issue. The result was a set of guidelines, Dietary Goals for the United States [PDF], which promoted the consumption of ‘complex’ carbohydrates, and reductions in the consumption of fat in general and saturated fat in particular.

By 1980, this report had been worked up into government-backed guidelines — around the same time that obesity appears to have taken off in the US. The McGovern Report inspired all the familiar diet advice around the world that we’ve had ever since, and led to major changes in what food manufacturers offered. Out went fat, though unsaturated fat and hydrogenated oils were deemed less bad than saturated fat, so vegetable oils and margarines became more popular. In came more carbohydrate and more sugar, to give those cardboard-like low-fat ‘treats’ some modicum of flavour.

Yet two recent reviews of the evidence around saturated fat — one led by Ronald Krauss, the other by Rajiv Chowdhury — suggest that saturated fat is not the villain it has been painted as. (The latter paper, in particular, sparked outrage.) As for fat in general, Teicholz tells me: ‘There was no effort until very late in the game to provide evidence for the low-fat diet. It was just assumed that that was reasonable because of the caloric benefit you would see from restricting fat.’

Teicholz also debunks the wonderful reputation of the Mediterranean Diet (“a rose-tinted version of reality tailored to the anti-meat prejudices of American researchers”), points out the role of the olive oil industry in pushing the diet (“Swooning researchers were literally wined and dined into going along with promoting the benefits of olive oil”), and points out that we can’t even blame most of the obesity problem on “Big Food”:

Which leads us to an important third point made by Teicholz: that the blame for our current dietary problems cannot solely, or even mainly, be placed at the door of big food corporations. Teicholz writes about how she discovered that ‘the mistakes of nutrition science could not be primarily pinned on the nefarious interests of Big Food. The source of our misguided dietary advice was in some ways more disturbing, since it seems to have been driven by experts at some of our most trusted institutions working towards what they believed to be the public good.’ Once public-health bureaucracies enshrined the dogma that fat is bad for us, ‘the normally self-correcting mechanism of science, which involves constantly challenging one’s own beliefs, was disabled’.

The war on dietary fat is a terrifying example of what happens when politics and bureaucracy mix with science: provisional conclusions become laws of nature; resources are piled into the official position, creating material as well as intellectual reasons to continue to support it; and any criticism is suppressed or dismissed. As the war on sugar gets into full swing, a reading of The Big Fat Surprise might provide some much-needed humility.

August 18, 2014

Salt studies and health outcomes – “all models need to be taken with a pinch of salt”

Filed under: Food, Health, Science — Nicholas @ 08:41

Colby Cosh linked to this rather interesting BMJ blog post by Richard Lehman, looking at studies of the impact of dietary salt reduction:

601 The usual wisdom about sodium chloride is that the more you take, the higher your blood pressure and hence your cardiovascular risk. We’ll begin, like the NEJM, with the PURE study. This was a massive undertaking. They recruited 102 216 adults from 18 countries and measured their 24 hour sodium and potassium excretion, using a single fasting morning urine specimen, and their blood pressure by using an automated device. In an ideal world, they would have carried on doing this every week for a month or two, but hey, this is still better than anyone has managed before now. Using these single point in time measurements, they found that people with elevated blood pressure seemed to be more sensitive to the effects of the cations sodium and potassium. Higher sodium raised their blood pressure more, and higher potassium lowered it more, than in individuals with normal blood pressure. In fact, if sodium is a cation, potassium should be called a dogion. And what I have described as effects are in fact associations: we cannot really know if they are causal.

612 But now comes the bombshell. In the PURE study, there was no simple linear relationship between sodium intake and the composite outcome of death and major cardiovascular events, over a mean follow-up period of 3.7 years. Quite the contrary, there was a sort of elongated U-shape distribution. The U begins high and is then splayed out: people who excreted less than 3 grams of salt daily were at much the highest risk of death and cardiovascular events. The lowest risk lay between 3 g and 5 g, with a slow and rather flat rise thereafter. On this evidence, trying to achieve a salt intake under 3 g is a bad idea, which will do you more harm than eating as much salt as you like. Moreover, if you eat plenty of potassium as well, you will have plenty of dogion to counter the cation. The true Mediterranean diet wins again. Eat salad and tomatoes with your anchovies, drink wine with your briny olives, sprinkle coarse salt on your grilled fish, lay it on a bed of cucumber, and follow it with ripe figs and apricots. Live long and live happily.

624 It was rather witty, if slightly unkind, of the NEJM to follow these PURE papers with a massive modelling study built on the assumption that sodium increases cardiovascular risk in linear fashion, mediated by blood pressure. Dariush Mozaffarian and his immensely hardworking team must be biting their lips, having trawled through all the data they could find about sodium excretion in 66 countries. They used a reference standard of 2 g sodium a day, assuming this was the point of optimal consumption and lowest risk. But from PURE, we now know it is associated with a higher cardiovascular risk than 13 grams a day. So they should now go through all their data again, having adjusted their statistical software to the observational curves of the PURE study. Even so, I would question the value of modelling studies on this scale: the human race is a complex thing to study, and all models need to be taken with a pinch of salt.
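
Lehman’s complaint about the modelling study boils down to this: if the true sodium-versus-risk relationship is U-shaped, a model built on a linear assumption will badly misjudge risk at low intakes. Here is a toy sketch with invented numbers (nothing from PURE) showing how a straight-line fit misreads a U-shaped curve.

```python
# Toy illustration (made-up numbers, not PURE data): fit a straight line to a
# U-shaped sodium-vs-risk curve and see how it misreads low intakes.
import numpy as np

sodium = np.linspace(1, 10, 50)                 # grams per day
true_risk = 1.0 + 0.15 * (sodium - 4.0) ** 2    # U-shaped, lowest near 4 g/day

slope, intercept = np.polyfit(sodium, true_risk, 1)
linear_risk = slope * sodium + intercept

for grams in (2, 4, 8):
    i = int(np.argmin(np.abs(sodium - grams)))
    print(f"{grams} g/day: true risk {true_risk[i]:.2f}, linear model says {linear_risk[i]:.2f}")
```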

Update: Colby Cosh followed up the original link with this tweet. Ouch!

August 5, 2014

New ways to bug a room

Filed under: Technology — Nicholas @ 08:33

MIT, Adobe and Microsoft have developed a technique that allows conversations to be reconstructed based on the almost invisible vibrations of surfaces in the same room:

Researchers at MIT, Microsoft, and Adobe have developed an algorithm that can reconstruct an audio signal by analyzing minute vibrations of objects depicted in video. In one set of experiments, they were able to recover intelligible speech from the vibrations of a potato-chip bag photographed from 15 feet away through soundproof glass.

In other experiments, they extracted useful audio signals from videos of aluminum foil, the surface of a glass of water, and even the leaves of a potted plant. The researchers will present their findings in a paper at this year’s Siggraph, the premier computer graphics conference.

“When sound hits an object, it causes the object to vibrate,” says Abe Davis, a graduate student in electrical engineering and computer science at MIT and first author on the new paper. “The motion of this vibration creates a very subtle visual signal that’s usually invisible to the naked eye. People didn’t realize that this information was there.”

[…]

Reconstructing audio from video requires that the frequency of the video samples — the number of frames of video captured per second — be higher than the frequency of the audio signal. In some of their experiments, the researchers used a high-speed camera that captured 2,000 to 6,000 frames per second. That’s much faster than the 60 frames per second possible with some smartphones, but well below the frame rates of the best commercial high-speed cameras, which can top 100,000 frames per second.
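
The frame-rate requirement is essentially the familiar sampling constraint: in standard signal-processing terms you need to sample well above the frequencies you want to recover, or they fold back (alias) to lower apparent frequencies. A small sketch (not the MIT algorithm, just a plain sine-wave demonstration) of how a 400 Hz tone looks at 2,000 versus 60 samples per second:

```python
# Minimal sketch: sampling a 400 Hz tone at video-like frame rates.
# At 2,000 fps the tone is captured faithfully; at 60 fps it aliases
# down to a much lower apparent frequency.
import numpy as np

def dominant_frequency(tone_hz, sample_rate, duration=1.0):
    """Return the strongest frequency seen when sampling a pure tone."""
    t = np.arange(0, duration, 1.0 / sample_rate)
    samples = np.sin(2 * np.pi * tone_hz * t)
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), 1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

for fps in (2000, 60):
    print(f"{fps:>4} frames/s sees a 400 Hz tone as ~{dominant_frequency(400, fps):.0f} Hz")
```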

I was aware that you could “bug” a room by monitoring the vibrations of a non-soundproofed window, at least under certain circumstances, but this is rather more subtle. I wonder how long this development has been known to the guys at the NSA…

August 1, 2014

Old and busted – “I cannae change the laws of physics”?

Filed under: Science, Space — Nicholas @ 09:02

Call me an old fogey, but I’ve always believed in the law of conservation of momentum … yet a recent NASA finding — if it holds up — may bring me around:

Nasa is a major player in space science, so when a team from the agency this week presents evidence that “impossible” microwave thrusters seem to work, something strange is definitely going on. Either the results are completely wrong, or Nasa has confirmed a major breakthrough in space propulsion.

British scientist Roger Shawyer has been trying to interest people in his EmDrive for some years through his company SPR Ltd. Shawyer claims the EmDrive converts electric power into thrust, without the need for any propellant, by bouncing microwaves around in a closed container. He has built a number of demonstration systems, but critics reject his relativity-based theory and insist that, according to the law of conservation of momentum, it cannot work.

[…]

    “Test results indicate that the RF resonant cavity thruster design, which is unique as an electric propulsion device, is producing a force that is not attributable to any classical electromagnetic phenomenon and therefore is potentially demonstrating an interaction with the quantum vacuum virtual plasma.”

This last line implies that the drive may work by pushing against the ghostly cloud of particles and anti-particles that are constantly popping into being and disappearing again in empty space. But the Nasa team has avoided trying to explain its results in favour of simply reporting what it found: “This paper will not address the physics of the quantum vacuum plasma thruster, but instead will describe the test integration, test operations, and the results obtained from the test campaign.”

The drive’s inventor, Guido Fetta, calls it the “Cannae Drive”, which he explains as a reference to the Battle of Cannae, in which Hannibal decisively defeated a much stronger Roman army: you’re at your best when you are in a tight corner. However, it’s hard not to suspect that Star Trek‘s Engineer Scott — “I cannae change the laws of physics” — might also be an influence. (It was formerly known as the Q-Drive.)

July 21, 2014

The science of ballistics, the art of war, and the birth of the assault rifle

Filed under: History, Military, Technology, Weapons — Nicholas @ 15:47

Defence With A “C” summarizes the tale of how we got to the current suite of modern military small arms. It’s a long story, but if you’re interested in firearms, it’s a fascinating one.

To understand why we’ve arrived where we are now with the NATO standard 5.56mm calibre round you have to go all the way back to the war of 1939-1945. Much study of this conflict would later inform decision making surrounding the adoption of the 5.56, but for now there was one major change that took place which would set the course for the future.

The German Sturmgewehr 44 is widely accepted as the world’s first true assault rifle. Combining the ability to hit targets out to around 500 yards with individual shots in semi-automatic mode with the ability to fire rapidly in fully automatic mode (almost 600 rounds per minute), the StG 44 represented a bridge between short-ranged sub-machine guns and longer-ranged bolt action rifles.

[…]

After the Second World War the US Army began conducting research to help it learn the lessons of its previous campaigns, as well as preparing it for potential future threats. As part of this effort it began to contract the services of the Operations Research Office (ORO) of the Johns Hopkins University in Baltimore, Maryland, for help in conducting the scientific analysis of various aspects of ground warfare.

On October 1st, 1948, the ORO began Project ALCLAD, a study into the means of protecting soldiers from the “casualty producing hazards of warfare”. In order to determine how best to protect soldiers from harm, it was first necessary to investigate the major causes of casualties in war.

After studying large quantities of combat and casualty reports, ALCLAD concluded that first and foremost the main danger to combat soldiers was from high explosive weapons such as artillery shells, fragments from which accounted for the vast majority of combat casualties. It also determined that casualties inflicted by small arms fire were essentially random.

Allied troops in WW2 had generally been armed with full-sized bolt action rifles (while US troops were being issued the M1 Garand), optimized to be accurate out to 600 yards or more, yet most actual combat took place at much shorter ranges. Accuracy is directly affected by the stress, tension, distraction, and all-around confusion of the battlefield: even at such short ranges, riflemen had to expend many shots in hopes of inflicting a hit on an enemy. The ORO ran a series of tests to simulate battle conditions for both expert and ordinary riflemen and found some unexpected results:

A number of significant conclusions were thus drawn from these tests. Firstly, that accuracy — even for prone riflemen, some of them expert shots, shooting at large static targets — was poor beyond ranges of about 250 yards. Secondly, that under simulated conditions of combat shooting an expert level marksman was no more accurate than a regular shot. And finally that the capabilities of the individual shooters were far below the potential of the rifle itself.

This in turn — along with the analysis of missed shots caught by a screen behind the targets — led to three further conclusions.

First, that any effort to try and make the infantry’s general purpose weapon more accurate (such as expensive barrels) was largely a waste of time and money. The weapon was, and probably always would be, inherently capable of shooting much tighter groups than the human behind it.

Second, that there was a practical limit to the value of marksmanship training for regular infantry soldiers. Beyond a certain basic level of training any additional hours were of limited value*, and the number of hours required to achieve a high level of proficiency would be prohibitive. This was particularly of interest for planning in the event of another mass mobilisation for war.

July 3, 2014

Skeptical reading should be the rule for health news

Filed under: Health, Media, Science — Nicholas @ 08:49

We’ve all seen many examples of health news stories where the headline promised much more than the article delivered: this is why stories have headlines in the first place — to get you to read the rest of the article. This sometimes means the headline writer (who, except on blogs, isn’t the person who wrote the story), knowing less about what went into the piece, grabs a few key statements to come up with an appealing (or appalling) headline.

This is especially true with science and health reporting, where the writer may not be as fully informed on the subject and the headline writer almost certainly doesn’t have a scientific background. The correct way to read any kind of health report in the mainstream media is to read skeptically — and knowing a few things about how scientific research is (or should be) conducted will help you to determine whether a reported finding is worth paying attention to:

Does the article support its claims with scientific research?

Your first concern should be the research behind the news article. If an article touts a treatment or some aspect of your lifestyle that is supposed to prevent or cause a disease, but doesn’t give any information about the scientific research behind it, then treat it with a lot of caution. The same applies to research that has yet to be published.

Is the article based on a conference abstract?

Another area for caution is if the news article is based on a conference abstract. Research presented at conferences is often at a preliminary stage and usually hasn’t been scrutinised by experts in the field. Also, conference abstracts rarely provide full details about methods, making it difficult to judge how well the research was conducted. For these reasons, articles based on conference abstracts should be no cause for alarm. Don’t panic or rush off to your GP.

Was the research in humans?

Quite often, the ‘miracle cure’ in the headline turns out to have only been tested on cells in the laboratory or on animals. These stories are regularly accompanied by pictures of humans, which creates the illusion that the miracle cure came from human studies. Studies in cells and animals are crucial first steps and should not be undervalued. However, many drugs that show promising results in cells in laboratories don’t work in animals, and many drugs that show promising results in animals don’t work in humans. If you read a headline about a drug or food ‘curing’ rats, there is a chance it might cure humans in the future, but unfortunately a larger chance that it won’t. So there is no need to start eating large amounts of the ‘wonder food’ featured in the article.

How many people did the research study include?

In general, the larger a study the more you can trust its results. Small studies may miss important differences because they lack statistical “power”, and are also more susceptible to finding things (including things that are wrong) purely by chance.
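
To see the “power” point in action, here is a quick simulation (my own assumptions: a modest true effect, equal-sized groups, and a crude z-test at the conventional 5% level) of how often small versus larger studies detect the same real effect:

```python
# Quick simulation (assumptions mine): how often does a study detect a
# modest real effect at small versus larger sample sizes?
import random
import statistics

def detection_rate(n, true_effect=0.3, trials=2000):
    """Fraction of simulated studies finding a statistically significant difference."""
    detections = 0
    for _ in range(trials):
        control = [random.gauss(0.0, 1.0) for _ in range(n)]
        treated = [random.gauss(true_effect, 1.0) for _ in range(n)]
        diff = statistics.mean(treated) - statistics.mean(control)
        se = (statistics.pvariance(control) / n + statistics.pvariance(treated) / n) ** 0.5
        if abs(diff) > 1.96 * se:   # crude two-sided z-test at the 5% level
            detections += 1
    return detections / trials

for n in (20, 200):
    print(f"n = {n:>3} per group: real effect detected in {detection_rate(n):.0%} of studies")
```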

[…]

Did the study have a control group?

There are many different types of studies appropriate for answering different types of questions. If the question being asked is about whether a treatment or exposure has an effect or not, then the study needs to have a control group. A control group allows the researchers to compare what happens to people who have the treatment/exposure with what happens to people who don’t. If the study doesn’t have a control group, then it’s difficult to attribute results to the treatment or exposure with any level of certainty.

Also, it’s important that the control group is as similar to the treated/exposed group as possible. The best way to achieve this is to randomly assign some people to be in the treated/exposed group and some people to be in the control group. This is what happens in a randomised controlled trial (RCT) and is why RCTs are considered the ‘gold standard’ for testing the effects of treatments and exposures. So when reading about a drug, food or treatment that is supposed to have an effect, you want to look for evidence of a control group and, ideally, evidence that the study was an RCT. Without either, retain some healthy scepticism.
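
The mechanics of an RCT’s random assignment are simple enough to sketch in a few lines: shuffle the participants, split them into treatment and control, then compare outcomes between the two groups. The sketch below uses placeholder outcome data, purely to show the structure.

```python
# Minimal sketch of randomised assignment in a trial.
import random

participants = [f"participant_{i}" for i in range(100)]
random.shuffle(participants)                 # the randomisation step

treatment_group = participants[:50]
control_group = participants[50:]

# Outcomes would come from following people up in a real trial;
# random placeholders are used here just to show the comparison.
outcomes = {p: random.random() for p in participants}

treated_mean = sum(outcomes[p] for p in treatment_group) / len(treatment_group)
control_mean = sum(outcomes[p] for p in control_group) / len(control_group)
print(f"treated mean {treated_mean:.2f} vs control mean {control_mean:.2f}")
```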

June 28, 2014

Autism and vaccines infographic

Filed under: Health, Media, Science — Nicholas @ 10:44

Click to see full infographic

H/T to Nils Werner for the link.

May 14, 2014

Another area for freedom of choice – the “right to try”

Filed under: Bureaucracy, Health, Science, USA — Nicholas @ 11:32

Amity Shlaes talks about a movement to allow more freedom of choice, but in an unusual and tightly regulated sector:

For decades now the Food and Drug Administration has maintained an onerous and slow approval process that delays the debut of new drugs for fatal diseases, sometimes for years longer than the life span of the patients desperate to try them. Attorneys and scholars at the Goldwater Institute of Arizona have crafted legislation for the states that would allow terminally ill patients to try experimental drugs for cancer or degenerative neurological diseases earlier. These “Right to Try” bills are so scripted that they overcome the usual objection to delivery of such experimental drugs: safety. Under “Right to Try,” only drugs that have passed the crucial Phase 1 of FDA testing could be prescribed, thereby reducing the possibility of a Thalidomide repeat. Second, only patients determined to have terminal cases would be eligible to purchase the drugs, making it harder to maintain that the drug will jeopardize their lives.

Representatives in Colorado, Louisiana, and Missouri approved the “Right to Try” measure unanimously. Citizens of Arizona will vote on the effort to circumvent the FDA process this fall.

Why the popularity? The phrase “Right to Try” appeals especially in a nation that senses all too well the reductions in freedom that come as the Affordable Care Act is implemented. The recent success of Dallas Buyers Club, a film about a man who procured experimental drugs for AIDS patients, also fuels the “Right to Try” impulse. Some of the popularity comes from our culture of choice. In Colorado, where citizens have choice about abortion, and now the choice to use marijuana, they may also get what seems an elemental choice: the choice to try to save their own lives.

But of course “Right to Try” also sails because of the frustration of tragedy. Years ago a man named Frank Burroughs founded the Abigail Alliance after conventional options failed to cure his 21-year-old daughter’s cancer. Abigail’s oncologist tried to get Abigail newer drugs, Erbitux or Iressa from AstraZeneca, the company with which Pfizer hopes to merge. But the drugs were not available in time to save the girl. The Abigail Alliance is attempting on the federal level what Goldwater is trying for states: The federal bill’s name is the Compassionate Care Act. “Those waiting for FDA decisions, mainly dying patients and those who care for them, view the agency as a barrier,” co-founder Steve Walker explained simply. And who can disagree? Many of the supporters of “Right to Try” or the Abigail Alliance are businesspeople or scientists who are motivated to honor those they have lost to illness; others are racing to save sick family members who are still living. Yet others labor for patients in particular or science in general.

May 7, 2014

“I’ve been an oncologist for 20 years, and I have never, ever seen anything like this.”

Filed under: Business, Health, Science — Nicholas @ 15:30

In Forbes, Matthew Herper looks at how Novartis is transforming itself in an attempt to conquer cancer:

“I’ve been an oncologist for 20 years,” says Grupp, “and I have never, ever seen anything like this.” Emily has become the poster child for a radical new treatment that Novartis, the third-biggest drug company on the Forbes Global 2000, is making one of the top priorities in its $9.9 billion research and development budget.

“I’ve told the team that resources are not an issue. Speed is the issue,” says Novartis Chief Executive Joseph Jimenez, 54. “I want to hear what it takes to run this phase III trial and to get this to market. You’re talking about patients who are about to die. The pain of having to turn patients away is such that we are going as fast as we can and not letting resources get in the way.”

A successful trial would prove a milestone in the fight against the demon that has plagued living things since dinosaurs roamed the Earth. Coupled with the exploding capabilities of DNA-sequencing machines that can unlock the genetic code, recent drugs have delivered stunning results in lung cancer, melanoma and other deadly tumors, sometimes making them disappear entirely – albeit temporarily. Just last year the Food & Drug Administration approved nine targeted cancer drugs. It’s big business, too. According to data provider IMS Health, spending on oncology drugs was $91 billion last year, triple what it was in 2003.

But the developments at Penn point, tantalizingly, to something more, something that would rank among the great milestones in the history of mankind: a true cure. Of 25 children and 5 adults with Emily’s disease, ALL, 27 had a complete remission, in which cancer becomes undetectable. “It’s a stunning breakthrough,” says Sally Church, of drug development advisor Icarus Consultants. Says Crystal Mackall, who is developing similar treatments at the National Cancer Institute: “It really is a revolution. This is going to open the door for all sorts of cell-based and gene therapy for all kinds of disease because it’s going to demonstrate that it’s economically viable.”

H/T to Megan McArdle for the link.

April 28, 2014

Kate Lunau talks to Ray Kurzweil

Filed under: Health, Science, Technology — Nicholas @ 15:38

I’m interested in life extension … I have no particular hankering to die any time soon, although I admit there is some truth in the aphorism “Many wish for immortality who don’t know how to spend a rainy Sunday afternoon”. Ray Kurzweil wants immortality, and he’s doing what he can to make that happen:

Ray Kurzweil — futurist, inventor, entrepreneur, bestselling author, and now, director of engineering at Google — wants to live forever. He’s working to make it happen. Kurzweil, whose many inventions include the first optical character recognition software (which transforms the written word into data) and the first text-to-speech synthesizer, spoke to Maclean’s for our annual Rethink issue about why we’re on the brink of a technological revolution — one that will improve our health and our lives, even after the robots outsmart us for good.

Q: You say we’re in the midst of a “grand transformation” in the field of medicine. What do you see happening today?

A: Biology is a software process. Our bodies are made up of trillions of cells, each governed by this process. You and I are walking around with outdated software running in our bodies, which evolved in a very different era. We each have a fat insulin receptor gene that says, “Hold on to every calorie.” That was a very good idea 10,000 years ago, when you worked all day to get a few calories; there were no refrigerators, so you stored them in your fat cells. I would like to tell my fat insulin receptor gene, “You don’t need to do that anymore,” and indeed that was done at the Joslin Diabetes Center. They turned off this gene, and the [lab mice] ate ravenously and remained slim. They didn’t get diabetes; they didn’t get heart disease. They lived 20 per cent longer. They’re working with a drug company to bring that to market.

Life expectancy was 20 a thousand years ago, and 37 two hundred years ago. We’re now able to reprogram health and medicine as software, and that [pace is] going to continue to accelerate. We’re treating biology, and by extension health and medicine, as an information technology. Our intuition about how progress will unfold is linear, but information technology progresses exponentially, not linearly. My Android phone is literally several billion times more powerful, per dollar, than the computer I used when I was a student. And it’s also 100,000 times smaller. We’ll do both of those things again in 25 years. It’ll be a billion times more powerful, and will be the size of a blood cell.
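
Kurzweil’s linear-versus-exponential point is easy to make concrete. A small sketch comparing the two projections over 5, 10 and 25 years (the 18-month doubling time is my own illustrative assumption, not a figure from the interview):

```python
# Minimal sketch: linear intuition vs exponential growth in price-performance.
def linear_projection(start, yearly_gain, years):
    return start + yearly_gain * years

def exponential_projection(start, doubling_years, years):
    return start * 2 ** (years / doubling_years)

start = 1.0   # today's price-performance, normalised
for years in (5, 10, 25):
    lin = linear_projection(start, yearly_gain=1.0, years=years)
    exp = exponential_projection(start, doubling_years=1.5, years=years)
    print(f"{years:>2} years: linear x{lin:,.0f}, exponential x{exp:,.0f}")
```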

March 28, 2014

Opinions, statistics, and sex work

Filed under: Law, Liberty, Media — Nicholas @ 09:04

Maggie McNeill explains why the “sex trafficking” meme has been so relentlessly pushed in the media for the last few years:

Imagine a study of the alcohol industry which interviewed not a single brewer, wine expert, liquor store owner or drinker, but instead relied solely on the statements of ATF agents, dry-county politicians and members of Alcoholics Anonymous and Mothers Against Drunk Driving. Or how about a report on restaurants which treated the opinions of failed hot dog stand operators as the basis for broad statements about every kind of food business from convenience stores to food trucks to McDonald’s to five-star restaurants?

You’d probably surmise that this sort of research would be biased and one-sided to the point of being unreliable. And you’d be correct. But change the topic to sex work, and such methods are not only the norm, they’re accepted uncritically by the media and the majority of those who read the resulting studies. In fact, many of those who represent themselves as sex work researchers don’t even try to get good data. They simply present their opinions as fact, occasionally bolstered by pseudo-studies designed to produce pre-determined results. Well-known and easily-contacted sex workers are rarely consulted. There’s no peer review. And when sex workers are consulted at all, they’re recruited from jails and substance abuse programs, resulting in a sample skewed heavily toward the desperate, the disadvantaged and the marginalized.

This sort of statistical malpractice has always been typical of prostitution research. But the incentive to produce it has dramatically increased in the past decade, thanks to a media-fueled moral panic over sex trafficking. Sex-work prohibitionists have long seen trafficking and sex slavery as a useful Trojan horse. In its 2010 “national action plan,” for example, the activist group Demand Abolition writes, “Framing the Campaign’s key target as sexual slavery might garner more support and less resistance, while framing the Campaign as combating prostitution may be less likely to mobilize similar levels of support and to stimulate stronger opposition.”

March 16, 2014

Alcoholics Anonymous and addiction

Filed under: Health — Nicholas @ 09:27

In Maclean’s, Kate Lunau talks to Dr. Lance Dodes about Alcoholics Anonymous:

Dr. Lance Dodes has spent more than 35 years treating people who are battling addiction, including alcoholism. In his new book (co-written with Zachary Dodes), The Sober Truth: Debunking the Bad Science Behind 12-Step Programs and the Rehab Industry, Dodes takes a hard look at Alcoholics Anonymous, a worldwide organization that describes itself as a “non-professional fellowship of alcoholics helping other alcoholics get and stay sober.” Today, there are more than 5,000 AA groups in Canada alone, which are free and open to anyone. Dodes, a retired assistant clinical professor of psychiatry at Harvard Medical School, argues that some groups — and many for-profit private rehab centres based on the 12-step model — are often ineffective, and can cause further damage to addicts.

Q: How did you come to work on addiction?

A: I first became involved with alcoholism and addiction in the ’70s, when the place I was working, which is now part of Massachusetts General Hospital in Boston, needed to develop an alcoholism treatment unit. I was director of psychiatry, so I said, “I’ll develop it.” Afterward, I became involved in various addiction treatment programs, including running the state’s largest compulsive-gambling program. Over the years, I became very familiar with AA. It became clear that, while AA works for some people, the statistics just didn’t back it up. The real problem is that [doctors] refer 100 per cent of their patients with alcoholism to AA, and that’s the wrong thing to do 90 per cent of the time.

Q: AA has more than two million members around the world. You say its success rate is between five and 10 per cent. How, then, do you account for its enduring popularity?

A: AA is a proselytizing organization. The 12th step is to go out and spread the word, and they do. Because there are so many people in prominent positions who are members of AA, it gets tremendously good press. If AA were simply harmless, then I would agree that a seven per cent success rate is better than zero. But that’s not the case. It can be very destructive. According to AA, AA never fails — you fail. AA says that if you’re not doing well in the program, then it’s you. So you should go back and do the same thing you did before: Do more of the 12 steps, and go to more meetings.

March 13, 2014

It’s amazing how much data can be derived from “mere” metadata

Filed under: Liberty, Media, Technology — Nicholas @ 08:25

Two Stanford grad students conducted a research project to find out what kind of actual data can be derived from mobile phone metadata:

Two Stanford computer science students were able to acquire detailed information about people’s lives just from telephone metadata — the phone number of the caller and recipient, the particular serial number of the phones involved, the time and duration of calls and possibly the location of each person when the call occurred.

The researchers did not do any illegal snooping — they worked with the phone records of 546 volunteers, matching phone numbers against the public Yelp and Google Places directories to see who was being called.

From the phone numbers, it was possible to determine that 57 percent of the volunteers made at least one medical call. Forty percent made a call related to financial services.

The volunteers called 33,688 unique numbers; 6,107 of those numbers, or 18 percent, were isolated to a particular identity.
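
The core of the method, as described, is directory matching: look up each called number in a public business listing and count how many volunteers reached a sensitive category. A toy sketch, with hypothetical numbers and categories standing in for the Yelp and Google Places lookups the researchers actually used:

```python
# Toy sketch of the directory-matching step. The data below is hypothetical;
# the real study matched volunteers' called numbers against public listings.
call_logs = {
    "alice": ["555-0101", "555-0199"],
    "bob":   ["555-0150"],
    "carol": ["555-0101", "555-0170"],
}

directory = {
    "555-0101": "medical",
    "555-0150": "financial services",
    "555-0170": "hardware store",
}

def volunteers_calling(category):
    """Volunteers who called at least one number in the given category."""
    return [person for person, numbers in call_logs.items()
            if any(directory.get(n) == category for n in numbers)]

medical = volunteers_calling("medical")
print(f"{len(medical)} of {len(call_logs)} volunteers made at least one medical call")
```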

[…]

They crowdsourced the data using an Android application and conducted an analysis of individual calls made by the volunteers to sensitive numbers, connecting the patterns of calls to emphasize the detail available in telephone metadata, Mayer said.

“A pattern of calls will, of course, reveal more than individual call records,” he said. “In our analysis, we identified a number of patterns that were highly indicative of sensitive activities or traits.”

For example, one participant called several local neurology groups, a specialty pharmacy, a rare-condition management service, and a pharmaceutical hotline used for multiple sclerosis.

Another contacted a home improvement store, locksmiths, a hydroponics dealer and a head shop.

The researchers initially shared the same hypothesis as their computer science colleagues, Mayer said. They did not anticipate finding much evidence one way or the other.

“We were wrong. Phone metadata is unambiguously sensitive, even over a small sample and short time window. We were able to infer medical conditions, firearm ownership and more, using solely phone metadata,” he said.

February 17, 2014

Looking forward by looking backward – military evolution

Filed under: History, Military, Technology, Weapons — Nicholas @ 11:52

Strategy Page discusses the problems of predicting the future … which isn’t just a task for science fiction writers:

How will warfare change in the next 30 years? Military leaders, and the people they protect, are always trying to figure this out. There’s an easy way to get some good insight on the future. Simply go back 120 years (1894) and note the state of warfare and military technology at the time, then advance, 30 years at a time, until you reach 2014. At that point, making an educated guess at what 2044 will look like will be, if not easy, at least a lot less daunting.

In 1894, many infantry were still using single shot black powder rifles. Change was in the air though, and the United States had just begun to adopt the newfangled smokeless powder, a few years after it became widely available. In 1894 American troops were still replacing their black powder rifles with a smokeless powder model (the Krag-Jorgensen). The modern machine-gun had been invented in 1883, but armies took about two decades to begin adopting it on a large scale. Most artillery was still short ranged, not very accurate, and could only fire at targets the crew could see. Horses pulled or carried stuff and the infantry marched a lot when they were not being moved long distances by railroad or steamships. But modern, quick-firing artillery had recently been introduced and was still unproven in battle. Communications still relied on the telegraph, a half-century-old invention that revolutionized, in only a few decades, the way commanders could talk to each other over long distances. They could now do it in minutes. This was a big change for warfare. Very big. At this time telephones were all local and not portable. Cavalry was still important for scouting, although less useful for charging infantry (a trend that began when infantry got muskets with bayonets two centuries earlier).

[…]

So what does this portend for 2044? Faster and deadlier, for sure. Information war will be more than a buzzword by then because better sensors and data processing technology will make situational awareness (knowing where you and your enemy are, knowing it first, and acting on it before the other guy does) more decisive than ever.

If the expected breakthrough in batteries (fuel cells) evolves as reliably and cheaply as expected, the 2040s infantryman will be something of a cyborg. In addition to carrying several computers and sensor systems, he might wear body armor that also provides air conditioning. Satellite communications, of course, and two-way video. Exoskeletons are already in the works and may mature by then. A lot depends on breakthroughs in battery tech, although engineers are also finding ways to do more with just a little juice. Historians tend to constantly underestimate the cleverness of engineers and inventors in general.

But the big new development will be the continued evolution of robotic weapons. The World War II acoustic torpedo (used by the Germans and the Allies, from subs as well as the air) was the first truly robotic weapon. You turned it loose and it would hunt down its prey and attack. There may be a lot of public uproar over land-based systems that have sensors, can use them to hunt, and have weapons that can be used without human intervention. But those systems will be easy and cheap to build by 2044, and as soon as one nation builds them others will have to follow. By 2044, machines will be fighting other machines more often than they will be looking for the stray human on the battlefield.

But there will be other developments that are more difficult to anticipate. In 1894 most of the 1924 technologies were already known in a theoretical sense. Same with the 1954 technologies in 1924 and so on. What is most difficult to predict is exactly how new tech will be employed. There will be imagination and ingenuity involved there, and that sort of thing is, by its very nature, resistant to prediction.
