Quotulatiousness

January 28, 2011

US Navy’s “pee antenna”

Filed under: Military, Technology, USA — Nicholas @ 08:46

Warships have been sprouting more and more antennae for the various communication systems on board, with more added every year as new devices roll out of development and into active service. This creates problems, especially on smaller vessels, where the antennae must be spaced far enough apart to avoid mutual interference. The US Navy may have come up with a liquid solution:

With an $80 water pump, a $15 rubber hose and a $20 electrical device called a current probe that was easily plugged into a hand-held radio, [Daniel Tam] produced a spout roughly four metres tall from the waters of San Diego Bay. With this he could send and receive a clear signal. Over the intervening years his invention, dubbed the “pee antenna” by incredulous colleagues, has been tweaked and improved to the point where it can transmit over a distance of more than 50km (30 miles).

To make a seawater antenna, the current probe (an electrical coil roughly the size and shape of a large doughnut) is attached to a radio’s antenna jack. When salt water is squirted through the hole in the middle of the probe, signals are transferred to the water stream by electromagnetic induction. The aerial can be adjusted to the frequency of those signals by lengthening or shortening the spout. To fashion antennae for short-wave radio, for example, spouts between 18 and 24 metres high are about right. To increase bandwidth, and thus transmit more data, such as a video, all you need do is thicken the spout. And the system is economical. The probe consumes less electricity than three incandescent desk lamps.
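The quoted figures line up with basic antenna physics: a vertical aerial resonates when its height is roughly a quarter of the signal's wavelength, which is presumably how the spout is "tuned" by lengthening or shortening it. A minimal sketch of that rule of thumb (the frequencies chosen are illustrative, not SPAWAR's):

```python
# Quarter-wave rule of thumb: a vertical aerial resonates when its
# height is about a quarter of the signal's wavelength: h = c / (4f).

C = 299_792_458  # speed of light, m/s

def quarter_wave_height(freq_hz: float) -> float:
    """Approximate resonant height (metres) of a vertical spout or whip."""
    return C / (4 * freq_hz)

for mhz in (3.5, 4.0, 7.0, 150.0):
    print(f"{mhz:6.1f} MHz -> {quarter_wave_height(mhz * 1e6):6.2f} m")
```

At the low end of the short-wave band (3 to 4 MHz) this gives heights of roughly 19 to 21 metres, squarely within the 18 to 24 metre range quoted above.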

A warship’s metal antennae, which often weigh more than 3½ tonnes apiece, can be damaged in storms or combat. Seawater antennae, whose components weigh next to nothing and are easily stowable, could provide handy backups — and, eventually, more than backups. Not all of a ship’s antennae are used at once, so the spouts could be adjusted continuously to obtain the types needed at a given moment. According to SPAWAR, ten such antennae could replace 80 copper ones.

January 25, 2011

The genomic treasure trove of Quebec

Filed under: Cancon, History, Science — Nicholas @ 07:36

Thanks to relatively thorough genealogical records, the people of Quebec are of great and growing interest to genetic researchers:

One of the great things about the mass personal genomic revolution is that it allows people to have direct access to their own information. This is important for the more than 90% of the human population which has sketchy genealogical records. But even with genealogical records there are often omissions and biases in transmission of information. This is one reason that HAP, Dodecad, and Eurogenes BGA are so interesting: they combine what people already know with scientific genealogy. This intersection can often be very inferentially fruitful.

But what if you had a whole population with rich, robust conventional genealogical records? Combined with the power of the new genomics you could really crank up the level of insight. Where to find these records? A reason that Jewish genetics is so useful and interesting is that there is often a relative dearth of records when it comes to the lineages of American Ashkenazi Jews. Many American Jews even today are often sketchy about the region of the “Old Country” from which their forebears arrived. Jews have been interesting from a genetic perspective because of the relative excess of ethnically distinctive Mendelian disorders within their population.

There happens to be another group in North America with the same characteristic: the French Canadians. And importantly, in the French Canadian population you do have copious genealogical records. The origins of this group lie in the 17th and 18th centuries, and the Roman Catholic Church has often been a punctilious institution when it comes to preserving events under its purview such as baptisms and marriages. The genealogical archives are so robust that last fall a research group input centuries of ancestry for ~2,000 French Canadians, and used it to infer patterns of genetic relationships as a function of geography, as well as long term contribution by provenance.

January 11, 2011

Glass that is “stronger than steel” developed

Filed under: Technology — Nicholas @ 08:59

Arnie Bruce-Cooper reports on a recent development in high-strength glass:

In the world of materials, strength (the amount of force a substance can withstand) and toughness (its capacity to resist fracturing) are not merely different attributes; they’re very difficult to achieve together. Now a collaboration of researchers from Caltech and the Department of Energy’s Lawrence Berkeley National Laboratory has created a form of glass that has both qualities. It’s stronger and tougher than steel or, indeed, any other known material. The material features palladium, a metal whose possible use in glasses was recognized 45 years ago.

[. . .]

The work is outlined in a study published this week in the journal Nature Materials. Marios Demetriou, a professor at Caltech and lead author of the paper, says the work involved finding a particularly strong version of the simplest form of glass, called marginal glass, and then turning it into the even stronger form known as bulk glass.

“What we did here is find a very, very tough marginal glass made of palladium with small fractions of metalloids like phosphorus, silicon, and germanium, which yielded one-millimeter-thick samples. And we just said, let’s add very little of something that will make it bulk without making it brittle,” says Demetriou. By adding 3.5 percent silver to this marginal glass, Demetriou was able to increase the thickness to six millimeters while maintaining its toughness.

H/T to Virginia Postrel for the link.

January 3, 2011

Healthy skepticism about study results

Filed under: Bureaucracy, Media, Science — Nicholas @ 13:30

John Allen Paulos provides some useful mental tools to use when presented with unlikely published findings from various studies:

Ioannidis examined the evidence in 45 well-publicized health studies from major journals appearing between 1990 and 2003. His conclusion: the results of more than one third of these studies were flatly contradicted or significantly weakened by later work.

The same general idea is discussed in “The Truth Wears Off,” an article by Jonah Lehrer that appeared last month in the New Yorker magazine. Lehrer termed the phenomenon the “decline effect,” by which he meant the tendency for replication of scientific results to fail — that is, for the evidence supporting scientific results to seemingly weaken over time, disappear altogether, or even suggest opposite conclusions.

[. . .]

One reason for some of the instances of the decline effect is provided by regression to the mean, the tendency for an extreme value of a random quantity dependent on many variables to be followed by a value closer to the average or mean.
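Regression to the mean is easy to see in a quick simulation: generate two independent noisy measurements of the same underlying quantity (a study and its replication, say), select the studies whose first measurement came out extreme, and watch the second measurement fall back toward the mean. A sketch with made-up numbers:

```python
import random

random.seed(42)
N = 100_000
true_effect = 0.0

# Two independent noisy measurements of the same underlying quantity,
# e.g. an initial study and its replication.
first  = [true_effect + random.gauss(0, 1) for _ in range(N)]
second = [true_effect + random.gauss(0, 1) for _ in range(N)]

# Keep only the studies whose first measurement was in the top 1%.
extreme = sorted(zip(first, second), reverse=True)[: N // 100]
avg_first  = sum(f for f, _ in extreme) / len(extreme)
avg_second = sum(s for _, s in extreme) / len(extreme)

print(f"first measurement of selected studies:  {avg_first:+.2f}")
print(f"second measurement of the same studies: {avg_second:+.2f}")
```

The selected studies look dramatic on the first measurement purely because of noise; on replication their average falls right back to the true value.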

[. . .]

This phenomenon leads to nonsense when people attribute regression to the mean to something real, rather than to the natural behavior of any randomly varying quantity.

[. . .]

In some instances, another factor contributing to the decline effect is sample size. It’s become common knowledge that polls that survey large groups of people have a smaller margin of error than those that canvass a small number. Not just a poll, but any experiment or measurement that examines a large number of test subjects will have a smaller margin of error than one having fewer subjects.

Not surprisingly, results of experiments and studies with small samples often appear in the literature, and these results frequently suggest that the observed effects are quite large — at one end or the other of the large margin of error. When researchers attempt to demonstrate the effect on a larger sample of subjects, the margin of error is smaller and so the effect size seems to shrink or decline.

[. . .]

Publication bias is, no doubt, also part of the reason for the decline effect. That is to say that seemingly significant experimental results will be published much more readily than those that suggest no experimental effect or only a small one. People, including journal editors, naturally prefer papers announcing or at least suggesting a dramatic breakthrough to those saying, in effect, “Ehh, nothing much here.”
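The publication filter can also be put in toy-simulation form: run many small studies of a deliberately small true effect, "publish" only the statistically significant ones, and compare the published average to the truth (all numbers below are invented for illustration):

```python
import random
import statistics

random.seed(0)
true_effect = 0.1          # small but real effect
n, sigma = 25, 1.0         # small studies of 25 subjects each
se = sigma / n ** 0.5      # standard error of each study's estimate

# Simulate 10,000 small studies; "publish" only the significant ones
# (roughly |z| > 1.96, i.e. p < 0.05).
estimates = [random.gauss(true_effect, se) for _ in range(10_000)]
published = [e for e in estimates if abs(e) / se > 1.96]

print(f"true effect:            {true_effect}")
print(f"mean of all studies:    {statistics.mean(estimates):.3f}")
print(f"mean of published only: {statistics.mean(published):.3f}")
```

Because only the extreme estimates clear the significance bar, the published literature overstates the true effect severalfold; later, larger replications then appear to show the effect "declining".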

The availability error, the tendency to be unduly influenced by results that, for one reason or another, are more psychologically available to us, is another factor. Results that are especially striking or counterintuitive or consistent with experimenters’ pet theories also more likely will result in publication.

December 14, 2010

Conservatives still pushing corporate welfare

Filed under: Cancon, Economics, Politics, Technology — Nicholas @ 07:35

Okay, they’re not even pretending to be fiscally conservative any more:

The Conservative government has announced it is loaning aerospace giant Pratt & Whitney Canada $300 million for a $1 billion research project to develop the next generation of aircraft engines.

Industry Minister Tony Clement made the announcement on Monday saying it will create 700 high-skilled jobs in the GTA and more than 2,000 over the 15-year lifespan of the project. He also claimed the firm is in the process of hiring 200 engineers.

[. . .]

‘Create and maintain Canadian jobs’ has been the Conservative mantra during their recent shift to Keynesian economics and massive long-term deficits for the next half decade. The same political party that once decried government largesse and inexplicable corporate subsidies (also known affectionately as corporate welfare) is now a major player in the ‘too big to fail’ macroeconomics game.

This is nothing new: under former minister Maxime Bernier, the current darling of the small-government wing of the Conservative party, Pratt & Whitney got $350 million in corporate welfare just four years ago. That debt hasn’t been repaid.

December 2, 2010

Autism may be linked to faulty mitochondria

Filed under: Health — Nicholas @ 00:05

A report in The Economist looks at a recent study which may explain the cause of autism:

One suggestion that does pop up from time to time is that the process which leads to autism involves faulty mitochondria. The mitochondria are a cell’s powerpacks. They disassemble sugar molecules and turn the energy thus liberated into a form that biochemical machinery can use. Mitochondrial faults could be caused by broken genes, by environmental effects, or by a combination of the two.

Nerve cells have a huge demand for energy, so a failure of the mitochondria would certainly affect them. The question is, could it cause autism? To try to find out Cecilia Giulivi of the University of California, Davis, and her colleagues studied the mitochondria of ten children, aged between two and five years, who had been diagnosed with autism. They have just published their results in the Journal of the American Medical Association.

[. . .]

The children in question were randomly selected from a previous study on autism. They were matched with ten children of similar ages and ethnic backgrounds who were developing normally. Dr Giulivi found that mitochondria from children with autism consumed far less oxygen than those from the control group. That is a sign of lower activity. One important set of enzymes — NADH oxidases — used, on average, only a third as much oxygen in autistic children as they did in non-autists, and eight of the autistic children had significantly lower NADH-oxidase activity than is normal.

The mitochondria of the autistic children also leaked damaging oxygen-rich chemicals such as hydrogen peroxide. These are a normal by-product of mitochondrial activity, but are usually mopped up by special enzymes before they can escape and cause harm — for instance, by damaging a cell’s DNA. The level of hydrogen peroxide in the cells of autistic children was twice that found in non-autists. Such high levels suggest the brains of autistic children are exposed to a lot of oxidative stress, something that would probably cause cumulative damage.

While such a mechanism may allow better treatments to be developed, it also implies that those who already suffer from autism may not benefit as much (or at all) from such treatments, as the cellular damage will be a much tougher challenge to reverse.

On the other hand, if the pattern can be detected early enough, it may allow treatment well in advance of serious damage.

November 14, 2010

Wandering minds or wandered researchers?

Filed under: Health, Randomness — Nicholas @ 11:02

I can’t improve at all on Chris Myrick’s comment on this article:

Harvard psychologists have determined that we are happiest when having sex, exercising, in intense conversations with friends, listening to music or playing. Aside from their use of an iPhone app (http://www.trackyourhappiness.org/) to determine this, I’m missing the news value.

However, I see potential for a follow-up study where researchers can determine whether receiving phone messages during sex, sport or engaging conversation puts a damper on someone’s mood.

November 1, 2010

This will come as a surprise only to drug warriors

Filed under: Britain, Health, Law, Politics — Nicholas @ 07:10

A recent British study totted up the harms, both to individual users and to society, of various legal and illegal drugs:

Alcohol is more harmful than heroin or crack, according to a study published in medical journal the Lancet.

The report is co-authored by Professor David Nutt, the former UK chief drugs adviser who was sacked by the government in October 2009.

It ranks 20 drugs on 16 measures of harm to users and to wider society.

Tobacco and cocaine are judged to be equally harmful, while ecstasy and LSD are among the least damaging.

H/T to DarkWaterMuse, who writes:

An interesting result, no doubt, but one thing the researchers failed to do is to aggregate the harm due to all illicit drugs, or even a handful of drugs frequently abused by the same users. Seems to me this would likely reveal alcohol as relatively benign, though it’s not clear how additive the effects are.

October 21, 2010

Bringing back the American Chestnut tree

Filed under: Environment, Science, USA, Woodworking — Nicholas @ 08:22

The once-common American Chestnut tree fell victim to a blight during the last century, almost wiping out the species. The American Chestnut Foundation is hopeful of a revival:

By interbreeding the American with its Chinese cousin, tree lovers have created an American chestnut with some resistance to Asian blight and have developed a virus that can be injected into affected trees to combat the fungus. It’s a project that shows every sign of promise — with about 25,000 of the new chestnuts planted under the guidance of trained scientists and chestnut devotees.

If the hybrid plantings thrive, some envision huge tracts of strip-mined Appalachia one day being restored with lovely chestnut forests.

“We know we’re interbreeding resistance. Now we have to figure out, does it have enough resistance?” said Bryan Burhans, president of the American Chestnut Foundation, which has led the revival efforts.

He said it will take 75 to 100 years to know whether the tree can be reestablished as a mainstay of Eastern forests. But he said he’s “very optimistic” about the American chestnut’s future.

[. . .]

A fast-growing, hardy tree that thrives on rocky and acidic soil, the American chestnut served as an economic engine for Appalachia. Families fattened livestock with its nuts and used its wood for fuel, railroad ties, fence posts, musical instruments and furniture. It was a fixture along East Coast and Appalachian streets and highways, where its display of fingery white flowers was a springtime delight.

October 7, 2010

Not only is the science not “settled”, we still need to learn far more

Filed under: Science — Nicholas @ 07:49

Counter-intuitively, although the sun has been going through a period of decreased activity over the last few years, a new study in Nature claims that its output of visible light and near-infrared energy has actually increased:

New data indicates that changes in the Sun’s output of energy were a major factor in the global temperature increases seen in recent years. The research will be unwelcome among hardcore green activists, as it downplays the influence of human-driven carbon emissions.

As the Sun has shown decreased levels of activity during the past decade, it had been generally thought that it was warming the Earth less, not more. Thus, scientists considered that temperature rises seen in global databases must mean that human-caused greenhouse gas emissions — in particular of CO2 — must be exerting a powerful warming effect.

Now, however, boffins working at Imperial College in London (and one in Colorado) have analysed detailed sunlight readings taken from 2004 to 2007 by NASA’s Solar Radiation and Climate Experiment (SORCE) satellite. They found that although the Sun was putting out less energy overall than usual, in line with observations showing decreased sunspot activity, it actually emitted more in the key visible-light and near-infrared wavelengths.

These shorter wavelength forms of radiated heat penetrate the atmosphere particularly well to heat up the Earth’s surface — just as the same frequencies get in through car windows to heat up its interior. The hot seats and dashboard — in this case the seas, landmasses etc — then radiate their own increased warmth via conduction, convection and longer-wave infrared, which can’t escape the way the shortwave energy came in. This is why the car, and the planet, become so hot.

Thus the Sun, though it was unusually calm in the back half of the last decade, was actually warming the planet much more strongly than before.

If this research is confirmed, it certainly provides a lot of ammunition to the folks who don’t want to spend huge sums to try to cut down carbon emissions.

September 29, 2010

Transformer TX project initial funding awarded to AAI Corporation

Filed under: Military, Technology — Nicholas @ 12:14

Remember the “Flying Jeep” proposal? It’s still being pursued, as the initial funding for a flying gyrocopter/SUV has been awarded by DARPA:

Transformer TX, as we have previously reported, is intended to produce a vehicle able to drive on the ground with similar performance to a Humvee or other offroad vehicle. It must also be able to take off vertically with 1,000lb of passengers and payload aboard and fly about at altitudes up to 10,000 feet at speeds equivalent to normal light aircraft.

Perhaps best of all, the Transformer TX is also intended to be fully automated, capable of flying itself with only the most basic guidance from its human operator — who would not, therefore, need to be a highly trained pilot.

Admittedly, I know almost nothing about flying, but this sounds like getting something for nothing (that is, aren’t there laws of physics against this?):

The SR/C idea is basically a winged, propeller-driven light aeroplane with a set of free-spinning autogyro rotors on top. It’s not a helicopter: the engine can’t drive the rotors in flight, and a sustained hover isn’t possible. Nonetheless, the CarterCopter can take off vertically as required by Transformer TX rules.

It does this by having weighted rotor tips, meaning that a lot of energy can be stored in the spinning blades (rather as in a flywheel). Sitting on the ground, a small engine-driven “pre-rotator” assembly can gradually spin the rotors up to high speed. The pre-rotator, pleasingly, doesn’t have to transmit a lot of power — thus it is lightweight, cheap and simple compared to a helicopter’s transmission. Nor is the engine required to deliver the massive grunt required to keep chopper blades spinning hard enough to support the aircraft.

Once the rotors are at takeoff speed, the pre-rotator is declutched, the prop engaged and the pitch of the rotors pulled in so that they start to bite air. As they slow down, the energy stored in their whirling weighted tips blasts air down through the disc and the aircraft leaps vertically into the air in a “jump takeoff”.
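The flywheel analogy can be put in rough numbers: treating the tip weights as point masses, the stored energy is E = ½Iω². The figures below are pure guesses chosen to show the scale involved, not CarterCopter specifications:

```python
import math

# Back-of-the-envelope flywheel energy in weighted rotor tips.
# All numbers below are illustrative guesses, not CarterCopter specs.
tip_mass_kg  = 25.0    # weight at each of the two rotor tips
rotor_radius = 5.0     # metres
rpm          = 400.0   # rotor speed after pre-rotator spin-up

omega = rpm * 2 * math.pi / 60                  # angular velocity, rad/s
inertia = 2 * tip_mass_kg * rotor_radius ** 2   # tips as point masses
energy_j = 0.5 * inertia * omega ** 2           # E = 1/2 * I * omega^2

print(f"tip speed:     {omega * rotor_radius:5.0f} m/s")
print(f"stored energy: {energy_j / 1e6:5.2f} MJ")
```

That works out to roughly a megajoule, which is enough in principle (ignoring all losses) to lift a one-tonne aircraft on the order of a hundred metres. That is why a brief "jump" takeoff is plausible even though sustained hover is not.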

Sounds amazingly like pulling yourself into the air by your own bootstraps . . .

Still, I’d like to eventually get that flying car I was promised all those years ago.

July 8, 2010

“Flying jeep” proposal

Filed under: Military, Technology — Nicholas @ 09:09

The Register looks at the “Tyrannos” flying jeep:

Who remembers the “Transformer TX” flying-car project, intended to equip the US Marines with a small four-seat vehicle able to drive about on the ground like a jeep, hover like a helicopter, or fly like a plane? The first team to publicly offer a contending design has now stepped forward.

That design is the “Tyrannos” from Logi Aerospace, allied with other companies and organisations including the South West Research Institute and Californian electric-vehicle firm ZAP.

The Tyrannos is nominally intended to provide Marines with the ability to leapfrog over troublesome roadside bombs, mines, and ambushes while remaining able to drive on the ground as they normally might. However, it promises to be much quieter than the helicopters now in use, and far easier to fly and maintain.

If the Tyrannos can do all its makers claim, it really does have the potential to become the flying car for everyman.

That last sentence really does wrap up the situation: if it can do all that is claimed, it’ll be a fantastic new toy for the military and (eventually) lead to the flying cars we were promised forty years ago. The specs seem hopelessly optimistic, but perhaps I’m just jaded because I don’t have a flying car of my own yet . . .

Reader beware, however. The Transformer TX project is being run not by the Marines themselves but by DARPA, the Pentagon crazytech agency which won’t even touch a project unless it is extremely unlikely to succeed.

“Give us ideas that probably won’t work,” that is DARPA’s motto: and the Tyrannos team assembled their design specifically to DARPA requirements. And, let it be noted, they have yet to satisfy even DARPA’s very relaxed rules on what kind of ideas should get taxpayers’ money spent on them.

June 30, 2010

Here’s an example of a home that’s really a castle

Filed under: Architecture, Europe, France, History — Nicholas @ 12:03

Chateau de Guedelon is a real 13th century castle, or at least, it will be when they finish building it:

The Chateau de Guedelon was started in 1998, after local landowner Michel Guyot wondered whether it would be possible to build a castle from scratch, using only the tools and materials of the period.

Today, the walls are rising gradually from the red Burgundy clay. The great hall is almost finished, with only part of the roof remaining, while the main tower edges past the 15m (50ft) mark.

Builders use sandstone quarried from the very ground from which the castle is emerging.

[. . .]

The Guedelon site was chosen because it contained all the necessary materials: plentiful oak from the forests, as well as clay and water. Stone from the quarry had actually been used in the building of real-life medieval chateaux.

May 13, 2010

A “secret weapon” from WW2 updated for the 21st century

Filed under: History, Military, WW2 — Nicholas @ 07:54

Strategy Page looks at Operations Research in its modern guise:

It all began back in the 1970s, when some CIA analysts discovered a new way to analyze the mountains of information they were receiving. The new tool was predictive analysis. What does this do for intelligence analysts? Predictive analysis was the result of a fortuitous combination of OR (Operations Research), large amounts of data and more powerful computers.

OR is one of the major (and generally unheralded) scientific developments of the early 20th century. OR is basically applying mathematical analysis to problems. It turned out to be a major “weapon” for the Allies during World War II. OR, like radar, was developed in the 1930s, just in time for a major war, when whatever was available was put to work to win the conflict. OR is also, half jokingly, called a merger of math and common sense. It is widely used today in science, industry and, especially, in business (it’s the primary tool of MBAs, where it’s called “management science”).

With predictive analysis, the most important OR tool was the ability to “backtest” (see if the simulation of a situation could accurately predict the outcome of something that had already happened, if the same historical decisions are made). For predictive analysis of contemporary situations, the backtest is, instead, a predictive tool that reveals likely outcomes.

Predictive analysis, like OR in general, creates a framework that points you towards the right questions, and often provides the best answers as well. Like many OR problems, especially in the business world, the simulation framework is often quite rough. But in war, as in commerce, anything that will give you an edge can lead to success over your opponents. A predictive analysis is similar to what engineers call “a 60 percent solution” that can be calculated on the back of an envelope.

The one form of predictive analysis that the general public is aware of is wargames, and these have been increasingly useful in predicting the outbreak, and outcomes, of wars. There have even been commercial manual wargames (played with maps and counters, like chess) that have successfully applied predictive analysis, producing some impressive results when it came to actual wars.

In late 1972 a game (“Year of the Rat”) was published covering the recent (earlier in the year) North Vietnamese invasion of South Vietnam. This game didn’t predict the outcome of the war, but it got the attention of people in the intelligence community, especially those who knew something about wargames, for it was a convincing demonstration of what a manual wargame, using unclassified data, could do in representing a very recently fought campaign. There was even talk that these games could actually predict the outcome, and details, of a future war. The next year, wargames did just that, accurately portraying the outcome of the 1973 Arab-Israeli war. The game (“Sinai”) was about to be published when the war broke out, but some people in the intelligence community knew about it. A member of the Israeli UN delegation had watched the game in development (he was a wargamer), and was assigned to camp out at the publisher’s offices, while the war raged, and report what the game was predicting.

May 2, 2010

Latest trick to play on your drinking buddies

Filed under: Randomness, Science — Nicholas @ 10:13

Roofies are so passé. Now it’s the Girlifier you need to watch for:

German boffins say they have developed a miracle nasal spray which can make men into big girls’ blouses.

Dr René Hurlemann of Bonn Uni’s Klinik für Psychiatrie, working with colleagues in Germany, Arizona and Blighty, has just announced successful tests of the new girlification spray — whose active ingredient is the hormone oxytocin.

[. . .]

Control-group German men who had been given a placebo rather than the soppiness compound reacted normally, either unemotionally or with mild discomfort. But the hapless subjects who had been given the drug showed “significantly higher emotional empathy levels”, according to Hurlemann.

“The males under test achieved levels [of emotion] which would normally only be expected in women,” says a statement from Bonn University, indicating that they had cooed or even blubbed at the sight of the affecting images.

H/T to Chris Taylor, who said, with understatement, “No good can come of this”.

