Quotulatiousness

September 7, 2012

The debut of energy weapons in the real military world

Filed under: Military, Technology, USA, Weapons — Nicholas @ 00:03

The Economist looks at the long-anticipated introduction of energy weapons. They’re still a long way from matching the fictional capabilities of phasers, blasters, disruptors, or photon torpedoes:

In the late 1970s and early 1980s the idea was revived when American strategists began thinking in earnest about the technologies they would need to shoot down nuclear-armed ballistic missiles. Among the more fanciful ideas taken up by Ronald Reagan’s Strategic Defence Initiative (more commonly known as Star Wars) was the X-ray laser, which aimed to harness the energy of an atomic explosion to generate powerful laser beams. The hassle of having to explode a nuclear bomb every time a beam was needed meant the idea never went anywhere, though it did spur research into high-powered chemical lasers and the sophisticated optics needed to aim and control them.

The main appeal of using an energy beam to shoot things is that it travels at the speed of light, which means, in practice, that it will hit whatever it is aimed at. Trying to shoot down an incoming missile or warhead with a physical projectile, by contrast, is much more difficult. The guidance challenges of trying to “hit a bullet with a bullet” are enormous and are only gradually being solved using complex radars and missiles equipped with expensive sensors. A second attraction of lasers and other energy weapons is that in most cases they cannot run out of ammunition, and can keep firing for as long as they are plugged into a power source. The initial costs may be quite high, but each shot may then cost only a few dollars, compared with a price-tag of $3m or more for the latest missiles used to shoot down aircraft or other missiles.
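The cost argument above can be illustrated with a toy break-even calculation. The laser's capital cost and per-shot cost below are made-up placeholders for illustration; only the $3m-per-missile figure comes from the excerpt.

```python
# Back-of-the-envelope break-even sketch for the cost argument above.
# Capital and per-shot figures are illustrative assumptions; only the
# $3M missile price comes from the article.
def breakeven_shots(laser_capital, laser_per_shot, missile_cost):
    """Return the shot count at which the laser's cumulative cost
    first drops below the cost of firing missiles instead."""
    shots = 0
    laser_total = laser_capital
    missile_total = 0.0
    while laser_total >= missile_total:
        shots += 1
        laser_total += laser_per_shot
        missile_total += missile_cost
    return shots

# e.g. a hypothetical $100M installation at $10 per shot vs $3M missiles:
print(breakeven_shots(100e6, 10.0, 3e6))  # 34
```

Even with a very high up-front cost, the near-zero marginal cost per shot means the laser wins after a few dozen engagements under these assumed numbers.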

[. . .]

The big trend now is to try to scale up three other sorts of laser that are far more compact than chemical lasers and can fire away merrily as long as they have power and don’t get too hot. The first sort is the fibre laser, in which the beam is generated within an optical fibre. Because this is already used in industry for welding and cutting, prices are falling, power output is increasing and reliability has been steadily improving. Industrial lasers can be turned into weapons pretty easily, simply by strapping them to a weapons mount.

But they are not very powerful. The Tactical Laser System being developed for the American navy by BAE Systems, a British firm, has an output of just 10kW, enough to run a few household kettles. Even so, it might be useful for frightening off (or burning holes in) small boats that look threatening but wouldn’t warrant a hail of machinegun fire. A slightly bigger version puts out about 33kW of power and fits neatly on existing turrets that house the rotary cannons used to shoot down incoming anti-ship missiles. It could blind optical or heat-seeking sensors on enemy missiles, or puncture small boats.

September 6, 2012

It’s time to retire the pop-sci term “Junk DNA”

Filed under: Health, Science — Nicholas @ 08:42

In the Wall Street Journal, Gautam Naik and Robert Lee Hotz report on the most recent discoveries about the human genome:

The new insight is the product of Encode, or Encyclopedia of DNA Elements, a vast, multiyear project that aims to pin down the workings of the human genome in unprecedented detail.

Encode succeeded the Human Genome Project, which identified the 20,000 genes that underpin the blueprint of human biology. But scientists discovered that those 20,000 genes constituted less than 2% of the human genome. The task of Encode was to explore the remaining 98% — the so-called junk DNA — that lies between those genes and was thought to be a biological desert.

That desert, it turns out, is teeming with action. Almost 80% of the genome is biochemically active, a finding that surprised scientists.

In addition, large stretches of DNA that appeared to serve no functional purpose in fact contain about 400,000 regulators, known as enhancers, that help activate or silence genes, even though they sit far from the genes themselves.

The discovery “is like a huge set of floodlights being switched on” to illuminate the darkest reaches of the genetic code, said Ewan Birney of the European Bioinformatics Institute in the U.K., lead analysis coordinator for the Encode results.

September 4, 2012

US Army’s JTRS program a poster child for development failure

Filed under: Bureaucracy, Military, Technology, USA — Nicholas @ 09:35

Strategy Page has the details:

It’s been eleven months now since the U.S. Army cancelled its 15-year effort to develop the JTRS (Joint Tactical Radio System). This program cost over $6 billion and has been a major embarrassment for the U.S. Department of Defense. Actually, JTRS still exists on paper, but its goal, to provide better combat radios, has been accomplished by adopting civilian radios that do what the troops needed done and calling the result JTRS. In the time the army spent working on JTRS, some $11 billion was spent on buying more radios using existing designs, along with a lot of off-the-shelf equipment incorporating the capabilities JTRS was supposed to deliver.

JTRS was yet another example of a military development project that got distracted, and bloated, trying to please everyone. There was, in a word, no focus. There’s been a lot of this in the last decade. That’s what killed the Comanche light attack helicopter, the Crusader self-propelled howitzer, FCS (Future Combat System), the Seawolf SSN, the DDG-1000 destroyer, B-2 bomber, F-22 fighters and several military space satellite projects. In all cases some of the technology developed was put to use in cheaper systems and sometimes a few of the cancelled systems were built (three Seawolfs, three DDG-1000s, 21 B-2s and 187 F-22s). These cancellations and cutbacks saved over half a trillion dollars. That goes a long way towards paying for projects that were not cancelled and are nearly half a trillion dollars over budget. But overall these failures were expensive and embarrassing.

JTRS, however, was the poster child of what usually goes wrong and how it impacts the combat troops. After all, radios are something personnel in all services use a lot. The main problem with JTRS was that the troops needed digital (for computer stuff) and analog (traditional radio) communications in one box and it had to be programmable, in order to handle new applications and the need to communicate with other radio types. That’s what JTRS was supposed to do but it never happened. The procurement bureaucracy and government contractors consumed over six billion dollars but never quite got anything useful out the door.

August 31, 2012

Innovative ways to use huge surplus of beetle-blighted lumber

Filed under: Cancon, Environment, Technology — Nicholas @ 09:07

British Columbia has a problem with their trees: too many of them are dead due to a massive increase in the population of the mountain pine beetle. The province is searching for ways to cope with the lumber from all the beetle-killed trees:

When life hands you lemons, goes the old saw, make lemonade. But what if life should hand you 18m hectares (44m acres) of dead trees? That is the problem faced by the province of British Columbia in Canada, which could lose over half its pine trees to the depredations of the fearsome mountain pine beetle. The beetle, no bigger than a grain of rice, is native to the forests of Western North America, where it kills trees by releasing a blue stain fungus that prevents the flow of water and nutrients. While the insect was historically kept in check by spells of cold weather, years of mild winters have unleashed an outbreak whose spread and severity are unlike anything seen previously.

As a result, the province is peppered with billions of dead, grey trees. If they are simply left standing, they will eventually either decay or burn in forest fires. In either case, they will release the carbon dioxide they stored while growing, swelling Canada’s total carbon footprint from 2000 to 2020 by 2%.

[. . .]

Canadian researchers have discovered other uses for beetle-killed pine (BKP). Sorin Pasca, a graduate student at the University of Northern British Columbia, found that rain and snow conveniently wash out sugars and other organic compounds from dead pine trees. By grinding up the dry BKP and adding it to normal cement, he created a hybrid material that is waterproof, fire-resistant and pourable like concrete but that can be worked, cut and nailed or drilled like wood. The material, dubbed Beetlecrete, has already been used to make countertops, benches and planters.

Even more esoteric uses for BKP are on the table. Nanocrystalline cellulose, made up of microscopic needle-like fibres, is a lightweight, ultra-rigid material that can be extracted from wood pulp. Currently used to improve the durability of paints and varnishes, nanocrystalline cellulose promises strong, iridescent films that may find uses in industries ranging from optical computing to cosmetics. And, as a last resort, dead and fallen pine trees can feed British Columbia’s 800MW of bio-mass power plants, which burn pellets of BKP and other waste wood to generate electricity.

August 14, 2012

Anecdotes are not data: Demise of Guys based on anecdotal evidence

Filed under: Media, Randomness — Nicholas @ 09:15

Jacob Sullum on the recent ebook The Demise of Guys: Why Boys Are Struggling and What We Can Do About It, by Philip G. Zimbardo and Nikita Duncan.

Zimbardo’s thesis is that “boys are struggling” in school and in love because they play video games too much and watch too much porn. But he and his co-author, a recent University of Colorado graduate named Nikita Duncan, never establish that boys are struggling any more nowadays than they were when porn was harder to find and video games were limited to variations on Pong. The data they cite mostly show that girls are doing better than boys, not that boys are doing worse than they did before xvideos.com and Grand Theft Auto. Such an association would by no means be conclusive, but it’s the least you’d expect from a respected social scientist like Zimbardo, who oversaw the famous Stanford “prison experiment” that we all read about in Psych 101.

[. . .]

One source of evidence that Zimbardo and Duncan rely on heavily, an eight-question survey of people who watched Zimbardo’s TED talk online, is so dubious that anyone with a bachelor’s degree in psychology (such as Duncan), let alone a Ph.D. (such as Zimbardo), should be embarrassed to cite it without a litany of caveats. The most important one: It seems probable that people who are attracted to Zimbardo’s talk, watch it all the way through, and then take the time to fill out his online survey are especially likely to agree with his thesis and especially likely to report problems related to electronic diversions. This is not just a nonrepresentative sample; it’s a sample bound to confirm what Zimbardo thinks he already knows. “We wanted our personal views to be challenged or validated by others interested in the topic,” the authors claim. Mostly validated, to judge by their survey design.

[. . .]

Other sources of evidence cited by Zimbardo and Duncan are so weak that they have the paradoxical effect of undermining their argument rather than reinforcing it. How do Zimbardo and Duncan know about “the sense of total entitlement that some middle-aged guys feel within their relationships”? Because “a highly educated female colleague alerted us” to this “new phenomenon.” How do they know that “one consequence of teenage boys watching many hours of Internet pornography…is they are beginning to treat their girlfriends like sex objects”? Because of a theory propounded by Daily Mail columnist Penny Marshall. How do they know that “men are as good as their women require them to be”? Because that’s what “one 27-year-old guy we interviewed” said.

Even when more rigorous research is available, Zimbardo and Duncan do not necessarily bother to look it up. How do they know that teenagers “who spend their nights playing video games or texting their friends instead of sleeping are putting themselves at greater risk for gaining unhealthy amounts of weight and becoming obese”? Because an NPR correspondent said so. Likewise, the authors get their information about the drawbacks of the No Child Left Behind Act from a gloss of a RAND Corporation study in a San Francisco Chronicle editorial. This is the level of documentation you’d expect from a mediocre high school student, not a college graduate, let alone a tenured social scientist at a leading university.

August 10, 2012

For you, is no Singularity

Filed under: Science, Technology — Nicholas @ 11:25

Charles Stross linked to this article which points out that we’re not likely to experience the Singularity/Rapture of the Nerds/etc., and for good reasons:

If you are tech-savvy, you have almost certainly come across the idea of the Singularity [1] as defended by futurists like Ray Kurzweil and Vernor Vinge. As a reminder, it is the notion that, when we are at last able to build a smarter-than-human artificial intelligence, this AI will in turn manage to improve its own design, and so on, resulting in an out-of-control loop of “intelligence explosion” [2] with unpredictable technological consequences. (Singularists go on to predict that after this happens we will merge with machines, live forever, upload our minds into computers, etc.)

What’s more, this seemingly far-future revolution would happen within just a few decades (2040 is often mentioned), due to the “exponential” rate of progress of science. That this deadline would arrive just in time to save the proponents of the Singularity from old age is just a weird coincidence that ought to be ignored.

Objection, your honor. As a scientist, I find the claim that scientific progress is exponential to be extremely dubious. If I look at my own field, or at any field that I am vaguely familiar with, I observe roughly linear progress — a rate that has typically held since as far back as the field’s foundation. “Exponential progress” claims are usually supported by the most bogus metrics, such as the number of US patents filed per year [3] (essentially a fashion utterly decorrelated from scientific progress).

And as somebody who does AI research, I find the notion of “intelligence explosion” to make exactly zero sense, for reasons reaching back to the very definition of intelligence. But I am not going to argue about that right now, as it isn’t even necessary to invalidate the notion of the Singularity.

August 2, 2012

Charles Stross: Where Moore’s Law and Koomey’s Law interact

Filed under: Science, Technology — Nicholas @ 09:32

On his blog, Charles Stross explores the long-term implications of Moore’s Law (the number of transistors on a chip doubles roughly every two years) and Koomey’s Law (the energy efficiency of computing doubles roughly every eighteen months):

A couple of basic physical rules underlie the dizzying progress in electronics that we have seen over the past fifty years. Moore’s Law, attributed to Intel co-founder Gordon Moore, postulates that the number of transistors that can be placed on an integrated circuit of constant size doubles approximately every two years. Originally coined in 1965, Moore’s law has held more or less steadily ever since. It can’t continue indefinitely, if only because we’re getting close to the atomic scale; a silicon atom has a Van der Waals radius of around 200 picometres, and to build circuits that mediate electron transport we need discrete atomic-scale structures. It is not obvious that we can build electronics (or other molecular structures) with a resolution below one nanometre. So it’s possible that Moore’s law will expire within another decade.

Having said that, predictions of the imminent demise of Moore’s Law within a decade go back to the 1970s. And if we can’t increase the two-dimensional structure count on an integrated circuit, we may still be able to increase the number of structures by building vertically.

A newer, and more interesting formulation than mere circuit count is Koomey’s Law, proposed by Jonathan Koomey at Stanford University: that the energy efficiency of computers doubles every 18 months.

This efficiency improvement has held true for a long time; today’s high-end microprocessors require far less power per instruction than those of a decade ago, much less two or three decades ago. A regular ARM-powered smartphone, such as an iPhone 4S, is some 12-13 orders of magnitude more powerful as a computing device than a late 1970s-vintage Cray 1 supercomputer, but consumes milliwatts of power for computing (rather than radio) operations, rather than the 115 kilowatts of the Cray.

Taking them together, what do these two laws imply about the not-too-distant future?
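The two laws compound in a simple way: each is just a doubling per fixed period, so projecting them forward is an exponent calculation. A minimal sketch (the 30-year horizon is an arbitrary example, not from Stross's post):

```python
# Projecting Moore's law (doubling every 24 months) and Koomey's law
# (doubling every 18 months) over an arbitrary 30-year horizon.
def doublings(years, period_years):
    """Number of doublings in a span, given one doubling per period."""
    return years / period_years

def growth_factor(years, period_years):
    """Multiplicative growth after the given span."""
    return 2 ** doublings(years, period_years)

years = 30
moore = growth_factor(years, 2.0)    # transistor count: 2^15 = 32,768x
koomey = growth_factor(years, 1.5)   # energy efficiency: 2^20 = ~1,048,576x

print(f"Over {years} years: ~{moore:,.0f}x transistors, "
      f"~{koomey:,.0f}x computations per joule")
```

Because Koomey's doubling period is shorter, efficiency gains outrun raw circuit-count gains: over the same 30 years the sketch gives roughly a million-fold efficiency improvement against a thirty-thousand-fold transistor increase.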

July 19, 2012

Walter Olson: more red flags in the Regnerus study

Filed under: Health, Politics, USA — Nicholas @ 11:00

Some studies provide results that challenge common beliefs and understandings. Others reinforce them. But some studies are designed from the desired results backwards. The Regnerus study on gay parents’ influence on their children appears to be one of the latter. Walter Olson points out that even in its own terms, the study shows something different from what it is intended to:

By now almost everyone has had a whack at the recent Mark Regnerus (University of Texas) study claiming that young adults who report having a gay parent score worse on a range of life-success indicators than children from intact biological families. According to the study, these kids as young adults have lower educational attainment, are arrested more often, and have more trouble in their own relationships, among other problems. Critics have pointed out that the story is mostly one of collapsed heterosexual families, not “same-sex parenting”: The great majority of the kids were born to male-female couples, most of the presumably gay dads and many of the moms didn’t get custody of their kids after their relationships dissolved, and few of the kids were actually raised through long periods by gay couples. LGBT advocates point out that sociologist Mark Regnerus accepted $695,000 from the anti-gay Witherspoon Institute to carry out the study.

But many critics have missed one of Regnerus’ most unexpected findings, one that may illuminate his study’s shortcomings. Specifically, and feeding into pretty much all the other problems, the study diagnoses children of gay parents as having a huge problem with poverty. Here’s Regnerus:

    Sixty-nine (69) percent of LMs [respondents with lesbian mothers] and 57% of GFs [those with gay fathers] reported that their family received public assistance at some point while growing up, compared with 17% of IBFs [those with intact two-parent biological families]; 38% of LMs said they are currently receiving some form of public assistance, compared with 10% of IBFs. Just under half of all IBFs reported being employed full-time at present, compared with 26% of LMs.

Those are big gaps. And of course they’re much at odds with the affluent image of gay families presented in both pro- and anti-gay-parenting literature as well as Modern Family-style popular entertainment. What do they signify?

Probably the biggest single reason is the one cited at the outset: This is mostly a survey of what happens when heterosexual families crack up. (Interestingly, if a married couple stayed together, they were counted as an “IBF,” no matter whether one or both partners pursued same-sex liaisons.) Decades of data indicate that children of family breakup do worse than children whose parents stay together, on many variables related to adult success. One reason, though not the only reason, is that they grow up significantly poorer.

Choice: re-evaluating the notion that too much choice is a bad thing

Filed under: Economics, Liberty, Science — Nicholas @ 09:37

There was a famous study several years ago that supposedly “proved” that providing too many choices to consumers was worse than providing fewer choices. At the time, I thought there must have been something wrong with the study.

The study used free jam samples in a supermarket, varying between offering 24 samples and only six, to test whether people were more likely to purchase the products (they were given a discount coupon in both variants). The result was that people who sampled from the smaller selection were more likely to actually buy the jam than those who had the wider selection to choose from. This was taken to prove that too many choices were a bad thing (and became a regular part of anti-consumer-choice advocacy campaigns).

Tim Harford explores more recent attempts to reproduce the study’s outcome:

But a more fundamental objection to the “choice is bad” thesis is that the psychological effect may not actually exist at all. It is hard to find much evidence that retailers are ferociously simplifying their offerings in an effort to boost sales. Starbucks boasts about its “87,000 drink combinations”; supermarkets are packed with options. This suggests that “choice demotivates” is not a universal human truth, but an effect that emerges under special circumstances.

Benjamin Scheibehenne, a psychologist at the University of Basel, was thinking along these lines when he decided (with Peter Todd and, later, Rainer Greifeneder) to design a range of experiments to figure out when choice demotivates, and when it does not.

But a curious thing happened almost immediately. They began by trying to replicate some classic experiments – such as the jam study, and a similar one with luxury chocolates. They couldn’t find any sign of the “choice is bad” effect. Neither the original Lepper-Iyengar experiments nor the new study appears to be at fault: the results are just different and we don’t know why.

After designing 10 different experiments in which participants were asked to make a choice, and finding very little evidence that variety caused any problems, Scheibehenne and his colleagues tried to assemble all the studies, published and unpublished, of the effect.

July 18, 2012

What is the best way to demonstrate care for the future?

Filed under: Economics, Environment, Government — Nicholas @ 08:29

According to Steven Landsburg, the answer is to cut capital taxes, and he makes a good case:

There are only three things you and I can do to make the future world a better place. First, we can consume less, leaving more resources behind. Second, we can work harder, planting trees, building factories and writing poems that will live on after we’re gone. Third, we can innovate, advancing science and technology so that our children’s children’s children can make better use of the resources they inherit.

As it happens, there’s one key policy variable that drives all three of these things, and that’s the tax rate on capital income (which includes interest, dividends, corporate income and capital gains). Capital taxes are a disincentive to save, and when people don’t save they consume instead. Capital taxes are a disincentive to work and a disincentive to innovate.

This is not a plea for lowering taxes in general, and it’s not a plea for making the tax system either more or less progressive. (If you want to soak the rich, there are plenty of things to tax besides capital.) As a matter of fact, this isn’t even a plea for lowering taxes on capital. It’s simply an observation that if your goal is to leave a better world for our descendants, then your best bet is to support lower capital taxes.

H/T to Tim Harford for the link.

July 9, 2012

The F-35 is “unaffordable and simply unacceptable”

Filed under: Military, Technology, USA — Nicholas @ 12:46

Winslow Wheeler on the near-doubling of the F-35 price (so far):

On June 14 — Flag Day, of all days — the Government Accountability Office released a new oversight report on the F-35: Joint Strike Fighter: DOD Actions Needed to Further Enhance Restructuring and Address Affordability Risks. As usual, it contained some important information on growing costs and other problems. Also as usual, the press covered the new report, albeit a bit sparsely.

Fresh bad news on the F-35 has apparently become so routine that the fundamental problems in the program are plowed right over. One gets the impression, especially from GAO’s own title to its report, that we should expect the bad news, make some minor adjustments, and then move on. But a deeper dive into the report offers a more profound, and disturbing, bottom line.

Notorious for burying its more important findings in the body of a report — I know; I worked there for nearly a decade — GAO understates its own results on acquisition cost growth in its one-page summary, which, sadly, is probably all that most people read to get what they think is the bottom line.

[. . .]

As set in 2001, the total acquisition cost of the F-35 was to be $233.0 billion. Compare that to the current estimate of $395.7 billion: cost growth has been $162.7 billion, or 70% — a lot more than what GAO stated in its summary.

However, the original $233 billion was supposed to buy 2,866 aircraft, not the 2,457 currently planned: making it $162 billion, or 70%, more for 409, or 14%, fewer aircraft. Adjusting for the shrinkage in the fleet, I calculate the cost growth for a fleet of 2,457 aircraft to be $190.8 billion, or 93%.

The cost of the program has almost doubled over the original baseline; it is not an increase of 42%.
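Wheeler's arithmetic is easy to check from the figures quoted above. A minimal sketch: the raw growth and fleet-cut numbers reproduce exactly; the fleet adjustment below uses a simple pro-rata baseline, which is an assumption on my part and gives roughly 98%, in the same ballpark as Wheeler's 93% (his exact adjustment method may differ).

```python
# Checking the F-35 figures quoted in the article (dollars in billions).
baseline_cost, baseline_fleet = 233.0, 2866   # 2001 estimate
current_cost, current_fleet = 395.7, 2457     # current estimate

raw_growth = current_cost - baseline_cost           # $162.7B
raw_growth_pct = raw_growth / baseline_cost         # ~70%
fleet_cut = baseline_fleet - current_fleet          # 409 aircraft
fleet_cut_pct = fleet_cut / baseline_fleet          # ~14%

# One simple way to adjust for the smaller fleet (an assumption, not
# necessarily Wheeler's exact method): scale the 2001 baseline down to
# 2,457 aircraft at the original per-unit cost, then compare.
prorated_baseline = baseline_cost * current_fleet / baseline_fleet
adjusted_growth_pct = (current_cost - prorated_baseline) / prorated_baseline

print(f"Raw growth: ${raw_growth:.1f}B ({raw_growth_pct:.0%})")
print(f"Fleet cut: {fleet_cut} aircraft ({fleet_cut_pct:.0%})")
print(f"Pro-rata adjusted growth: {adjusted_growth_pct:.0%}")
```

Whichever adjustment you pick, the conclusion holds: measured per aircraft actually bought, the program's cost has roughly doubled, not grown 42%.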

July 3, 2012

US Army’s UCP camouflage pattern “makes soldiers more visible, not less”

Filed under: Military, Technology, USA — Nicholas @ 09:06

As I mentioned briefly last week, the US Army is abandoning their most recent camouflage patterned combat uniforms:

The United States military is abandoning its recently-adopted pixelated camouflage uniforms, according to articles this week in The Daily as well as Stars and Stripes.

The drab grey digital pattern, known as the Universal Camouflage Pattern (UCP), will be discarded after only eight years following mounting evidence that the colour scheme makes soldiers more visible, not less.

The articles pull few punches in their appraisal of the move to adopt the pattern in 2004.

“Army brass interfered in the selection process, choosing looks and politics over science,” reports Stars and Stripes, the official newspaper of the United States armed forces.

And while the Pentagon spent $5 billion on the much-heralded uniforms, some of the earliest attempts to conceal soldiers on the battlefield were considerably less expensive.

The This is War blog has a discussion of the development of camouflage over the last century and a half.

June 29, 2012

US Army reluctantly admits USMC did better

Filed under: Military, Technology, USA — Nicholas @ 09:06

In developing camouflage, that is:

The U.S. Army has decided to scrap its digital pattern camouflage combat uniforms for the more effective, but more expensive, MultiCam. In the last decade, both the army and marines adopted new, digital, camouflage pattern field uniforms. But in Afghanistan, U.S. soldiers noted that the marine digital uniforms (called MARPAT, for Marine Pattern) were superior to the army UCP (Universal Camouflage Pattern). There’s been growing dissatisfaction with UCP, and it has become a major issue because all the infantry have access to the Internet, where the constant clamor for something better than UCP forced the army to do something. This is ironic because UCP is a variant of MARPAT, but a poor one, at least according to soldiers who have encountered marines wearing MARPAT. Even more ironic is that MARPAT is based on research originally done by the army. Thus some of the resistance to copying MARPAT comes down to admitting that the marines took the same research on digital camouflage and produced a superior pattern for combat uniforms.

A digital camouflage pattern uses “pixels” (little square or round spots of color, like you will find on your computer monitor if you look very closely), instead of just splotches of different colors. Naturally, this was called “digital camouflage.” This pattern proved considerably more effective at hiding troops than older methods. For example, in tests, it was found that soldiers wearing digital pattern uniforms were 50 percent more likely to escape detection by other troops than if they were wearing standard green uniforms. What made the digital pattern work was the way the human brain processes information. The small “pixels” of color on the cloth make the human brain see vegetation and terrain, not people. One could provide a more technical explanation, but the “brain processing” one pretty much says it all.

June 5, 2012

The US military’s SF research emporium

Filed under: Media, Military, Science, Technology, Weapons — Nicholas @ 08:18

John Turner sent me a link to this amusing little survey of what the US military’s R&D organization is willing to admit they’re working on and how it might be helpful in case of an alien invasion:

As summer blockbuster season kicks into high gear, big-budget action movies like The Avengers, Battleship, and Prometheus remind us that there’s one thing that unites Americans: Our shared fear of an alien attack. They also remind us that when the invading space fleet arrives, humanity is not going to surrender without a fight to our intergalactic invaders. Instead, we will band together to fight off their incredibly advanced weaponry with our … well, with what, exactly? Are we really ready to battle our would-be alien overlords?

Luckily, the Pentagon’s Defense Advanced Research Projects Agency, better known as DARPA, as well as some of the world’s largest weapons manufacturers, are dreaming up the weapons of the future today. With the help of everything from lasers on jets to hypersonic planes to invisibility cloaks, we just might be able to make the battle for Earth a fair fight. You may think we’re joking, but why else would NASA be uploading The Avengers to the International Space Station if not as a training manual? Here’s a look at some of the most space-worthy inventions being cooked up now.

An issue for any unmanned, armed vehicle (whether land, sea or air) is the security of communications from the controller to the vehicle. Recent use of such devices has almost always been in combat against relatively low-tech opponents who did not have jamming or hacking capabilities (although the UAV forced down in Iran may signal the end of the easy period for combat UAVs). Earlier discussions of benefits and drawbacks to unmanned fighters are here, here, and here.

April 17, 2012

Buckyballs: the silver bullet for aging?

Filed under: Health, Science — Nicholas @ 09:34

Well, it’s been shown to almost double normal lifespan … in rats. But in only one study so far:

In the current study researchers fed the molecule dissolved in olive oil to rats and compared outcomes to a control group of rats who got plain olive oil.

The main question they wanted to answer was whether chronic C60 administration had any toxicity; what they discovered actually surprised them.

“Here we show that oral administration of C60 dissolved in olive oil (0.8 mg/ml) at reiterated doses (1.7 mg/kg of body weight) to rats not only does not entail chronic toxicity,” they write, “but it almost doubles their lifespan.”

“The estimated median lifespan (EML) for the C60-treated rats was 42 months while the EMLs for control rats and olive oil-treated rats were 22 and 26 months, respectively,” they write.

Using a toxicity model, the researchers demonstrated that the effect on lifespan seems to be mediated by “attenuation of age-associated increases in oxidative stress.”
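The "almost doubles" claim follows directly from the quoted median lifespans, though it's worth noting the comparison looks less dramatic against the olive-oil control. A quick check of the arithmetic:

```python
# Quick check of "almost doubles" using the estimated median
# lifespans quoted in the study (months).
control, olive_oil, c60 = 22, 26, 42

ratio_vs_control = c60 / control      # ~1.91x: vs plain-diet rats
ratio_vs_olive = c60 / olive_oil      # ~1.62x: vs olive-oil-only rats

print(f"C60 vs control: {ratio_vs_control:.2f}x median lifespan")
print(f"C60 vs olive oil alone: {ratio_vs_olive:.2f}x median lifespan")
```

So "almost doubles" holds against untreated controls (1.91x), while against the olive-oil-only rats the gain is a still-striking 1.62x, suggesting the olive-oil vehicle itself may account for part of the effect.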
