Quotulatiousness

April 28, 2014

Kate Lunau talks to Ray Kurzweil

Filed under: Health, Science, Technology — Nicholas @ 15:38

I’m interested in life extension … I have no particular hankering to die any time soon, although I admit there is some truth in the aphorism “Many wish for immortality who don’t know how to spend a rainy Sunday afternoon”. Ray Kurzweil wants immortality, and he’s doing what he can to make that happen:

Ray Kurzweil — futurist, inventor, entrepreneur, bestselling author, and now, director of engineering at Google — wants to live forever. He’s working to make it happen. Kurzweil, whose many inventions include the first optical character recognition software (which transforms the written word into data) and the first text-to-speech synthesizer, spoke to Maclean’s for our annual Rethink issue about why we’re on the brink of a technological revolution — one that will improve our health and our lives, even after the robots outsmart us for good.

Q: You say we’re in the midst of a “grand transformation” in the field of medicine. What do you see happening today?

A: Biology is a software process. Our bodies are made up of trillions of cells, each governed by this process. You and I are walking around with outdated software running in our bodies, which evolved in a very different era. We each have a fat insulin receptor gene that says, “Hold on to every calorie.” That was a very good idea 10,000 years ago, when you worked all day to get a few calories; there were no refrigerators, so you stored them in your fat cells. I would like to tell my fat insulin receptor gene, “You don’t need to do that anymore,” and indeed that was done at the Joslin Diabetes Center. They turned off this gene, and the [lab mice] ate ravenously and remained slim. They didn’t get diabetes; they didn’t get heart disease. They lived 20 per cent longer. They’re working with a drug company to bring that to market.

Life expectancy was 20 a thousand years ago, and 37 two hundred years ago. We’re now able to reprogram health and medicine as software, and that [pace is] going to continue to accelerate. We’re treating biology, and by extension health and medicine, as an information technology. Our intuition about how progress will unfold is linear, but information technology progresses exponentially, not linearly. My Android phone is literally several billion times more powerful, per dollar, than the computer I used when I was a student. And it’s also 100,000 times smaller. We’ll do both of those things again in 25 years. It’ll be a billion times more powerful, and will be the size of a blood cell.
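Kurzweil’s growth claim is easy to sanity-check with back-of-envelope arithmetic. The figures below come from the claim itself (“a billion times more powerful” in 25 years), not from any independent source:

```python
import math

# "A billion times more powerful in 25 years" implies roughly 30
# doublings of price-performance, i.e. one doubling every ten months
# or so -- a bit faster than the classic 18-24 month Moore's Law pace.
years = 25
improvement = 1e9                        # "a billion times"
doublings = math.log2(improvement)       # how many doublings that needs
months_per_doubling = years * 12 / doublings
```

Whether hardware actually keeps that schedule is exactly what Kurzweil’s critics dispute; the arithmetic only shows what the claim commits him to.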

April 26, 2014

We’re not “running out of resources”

Filed under: Economics, Environment, Technology — Nicholas @ 10:00

In the Wall Street Journal, Matt Ridley explains that claims that we’re running out of this or that resource are almost always bogus:

How many times have you heard that we humans are “using up” the world’s resources, “running out” of oil, “reaching the limits” of the atmosphere’s capacity to cope with pollution or “approaching the carrying capacity” of the land’s ability to support a greater population? The assumption behind all such statements is that there is a fixed amount of stuff — metals, oil, clean air, land — and that we risk exhausting it through our consumption.

[…]

But here’s a peculiar feature of human history: We burst through such limits again and again. After all, as a Saudi oil minister once said, the Stone Age didn’t end for lack of stone. Ecologists call this “niche construction” — that people (and indeed some other animals) can create new opportunities for themselves by making their habitats more productive in some way. Agriculture is the classic example of niche construction: We stopped relying on nature’s bounty and substituted an artificial and much larger bounty.

Economists call the same phenomenon innovation. What frustrates them about ecologists is the latter’s tendency to think in terms of static limits. Ecologists can’t seem to see that when whale oil starts to run out, petroleum is discovered, or that when farm yields flatten, fertilizer comes along, or that when glass fiber is invented, demand for copper falls.

That frustration is heartily reciprocated. Ecologists think that economists espouse a sort of superstitious magic called “markets” or “prices” to avoid confronting the reality of limits to growth. The easiest way to raise a cheer in a conference of ecologists is to make a rude joke about economists.

[…]

In 1972, the ecologist Paul Ehrlich of Stanford University came up with a simple formula called IPAT, which stated that the impact of humankind was equal to population multiplied by affluence multiplied again by technology. In other words, the damage done to Earth increases the more people there are, the richer they get and the more technology they have.
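The IPAT formula is simple enough to state directly. A minimal sketch, with invented numbers, shows the point Ridley makes next: the technology term can fall, and when it does, a larger and richer population can have a smaller total impact:

```python
def ipat(population, affluence, technology):
    """Ehrlich's IPAT identity: impact = P * A * T, where T is the
    environmental impact per unit of consumption."""
    return population * affluence * technology

# Illustrative figures only: a bigger, richer population using
# cleaner technology (lower T) can come out with less total impact.
poorer = ipat(population=5, affluence=2, technology=1.0)
richer = ipat(population=10, affluence=5, technology=0.1)
```

The disagreement between ecologists and economists is then about whether T reliably falls as A rises, not about the identity itself.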

Many ecologists still subscribe to this doctrine, which has attained the status of holy writ in ecology. But the past 40 years haven’t been kind to it. In many respects, greater affluence and new technology have led to less human impact on the planet, not more. Richer people with new technologies tend not to collect firewood and bushmeat from natural forests; instead, they use electricity and farmed chicken — both of which need much less land. In 2006, Jesse Ausubel calculated that no country with a GDP per head greater than $4,600 has a falling stock of forest (in density as well as in acreage).

April 23, 2014

LibreSSL website – “This page scientifically designed to annoy web hipsters”

Filed under: Technology — Nicholas @ 09:24

Julian Sanchez linked to this Ars Technica piece on a new fork of OpenSSL:

OpenBSD founder Theo de Raadt has created a fork of OpenSSL, the widely used open source cryptographic software library that contained the notorious Heartbleed security vulnerability.

OpenSSL has suffered from a lack of funding and code contributions despite being used in websites and products by many of the world’s biggest and richest corporations.

The decision to fork OpenSSL is bound to be controversial given that OpenSSL powers hundreds of thousands of Web servers. When asked why he wanted to start over instead of helping to make OpenSSL better, de Raadt said the existing code is too much of a mess.

“Our group removed half of the OpenSSL source tree in a week. It was discarded leftovers,” de Raadt told Ars in an e-mail. “The Open Source model depends [on] people being able to read the code. It depends on clarity. That is not a clear code base, because their community does not appear to care about clarity. Obviously, when such cruft builds up, there is a cultural gap. I did not make this decision… in our larger development group, it made itself.”

The LibreSSL code base is on OpenBSD.org, and the project is supported financially by the OpenBSD Foundation and OpenBSD Project. LibreSSL has a bare-bones website that is intentionally unappealing.

“This page scientifically designed to annoy web hipsters,” the site says. “Donate now to stop the Comic Sans and Blink Tags.” In explaining the decision to fork, the site links to a YouTube video of a cover of the Twisted Sister song “We’re not gonna take it.”

April 20, 2014

SpaceX Falcon F9R First Flight Test | 250m

Filed under: Space, Technology — Nicholas @ 11:43

Published on 18 Apr 2014

Video of Falcon 9 Reusable (F9R) taking its first test flight at our rocket development facility. F9R lifts off from a launch mount to a height of approximately 250m, hovers and then returns for landing just next to the launch stand. Early flights of F9R will take off with legs fixed in the down position. However, we will soon be transitioning to liftoff with legs stowed against the side of the rocket and then extending them just before landing.

The F9R testing program is the next step towards reusability following completion of the Grasshopper program last year (Grasshopper can be seen in the background of this video). Future testing, including that in New Mexico, will be conducted using the first stage of an F9R as shown here, which is essentially a Falcon 9 v1.1 first stage with legs. F9R test flights in New Mexico will allow us to test at higher altitudes than we are permitted at our test site in Texas, to do more with unpowered guidance, and to prove out landing cases that are more flight-like.

April 17, 2014

The patented Heath Robinson/Rube Goldberg wine opener

Filed under: Technology, Wine — Nicholas @ 08:12

Certainly the most SteamPunk wine-opener-and-pourer you’ll see this week:

H/T to Roger Henry for the link.

QotD: User interface design for all ages

Filed under: Humour, Media, Technology — Nicholas @ 07:51

As your body staggers down the winding road to death, user interfaces that require fighter pilot-grade eyesight, the dexterity of a neurosurgeon, and the mental agility of Derren Brown are going to screw with you at some point.

Don’t kid yourself otherwise — disability, in one form or another, can strike at any moment.

Given that people are proving ever harder to kill off, you can expect to have decades of life ahead of you — during which you’ll be battling to figure out where on the touchscreen that trendy transdimensional two-pixel wide “OK” button is hiding.

Can you believe, people born today will spend their entire lives having to cope with this crap? The only way I can explain the web design of many Google products today is that some wannabe Picasso stole Larry Page’s girl when they were all 13, and is only now exacting his revenge. Nobody makes things that bad by accident, surely?

Dominic Connor, “Is tech the preserve of the young able-bodied? Let’s talk over a fine dinner and claret”, The Register, 2014-04-17

April 16, 2014

QotD: The wizards of the web

Filed under: Business, Quotations, Technology — Nicholas @ 08:28

You would have thought this would have sunk in by now. The fact that it hasn’t shows what an extraordinary machine the internet is — quite different to any technology that has gone before it. When the Lovebug struck, few of us lived our lives online. Back then we banked in branches, shopped in shops, met friends and lovers in the pub and obtained jobs by posting CVs. Tweeting was for the birds. Cyberspace was marginal. Now, for billions, the online world is their lives. But there is a problem. Only a tiny, tiny percentage of the people who use the internet have even the faintest clue about how any of it works. “SSL”, for instance, stands for “Secure Sockets Layer”.

I looked it up and sort of understood it — for about five minutes. While most drivers have at least a notion of how an engine works (something about petrol exploding in cylinders and making pistons go up and down and so forth) the very language of the internet — “domain names” and “DNS codes”, endless “protocols” and so forth — is arcane, exclusive; it is, in fact, the language of magic. For all intents and purposes the internet is run by wizards.

And the trouble with letting wizards run things is that when things go wrong we are at their mercy. The world spends several tens of billions of pounds a year on anti-malware programs, which we are exhorted to buy lest the walls of our digital castles collapse around us. Making security software is a huge industry, and whenever there is a problem — either caused by viruses or by a glitch like Heartbleed — the internet security companies rush to be quoted in the media. And guess what, their message is never “keep calm and carry on”. As Professor Ross Anderson of Cambridge University says: “Almost all the cost of cybercrime is the cost of anticipation.”

Michael Hanlon, “Relax, Mumsnet users: don’t lose sleep over Heartbleed hysteria”, Telegraph, 2014-04-16

April 14, 2014

SpaceX to test hover capability on next launch

Filed under: Space, Technology — Nicholas @ 08:25

In The Register, Brid-Aine Parnell explains what will be different about the next SpaceX launch to resupply the ISS:

NASA has said that SpaceX’s latest cargo ship launch to the International Space Station will go ahead, despite a critical computer outage on the station, allowing the firm to test the craft’s hovering abilities.

[…]

The booster rocket that’s blasting the Dragon supply capsule into space is going to attempt to make a hovering soft landing after it’s disengaged and dropped back to Earth.

The spruced-up Falcon 9 has its own landing legs, which Elon Musk’s space tech company hopes will eventually make for precise set-downs on the surface of alien worlds. For this test though, the rocket will still be coming down over the ocean, just in case.

The launch is already a month late with its nearly 5,000 pounds of supplies and payloads, including VEGGIE, a new unit capable of growing salad vegetables for the ‘nauts to munch on. The ship was delayed from March after a ground-based radar system at Cape Canaveral was damaged.

April 13, 2014

Ephemeral apps and the NSA

Filed under: Business, Media, Technology — Nicholas @ 11:21

Bruce Schneier on the rising popularity of apps that leave your content visible only briefly and then automatically remove it:

Ephemeral messaging apps such as Snapchat, Wickr and Frankly, all of which advertise that your photo, message or update will only be accessible for a short period, are on the rise. Snapchat and Frankly, for example, claim they permanently delete messages, photos and videos after 10 seconds. After that, there’s no record.

This notion is especially popular with young people, and these apps are an antidote to sites such as Facebook where everything you post lasts forever unless you take it down—and taking it down is no guarantee that it isn’t still available.

These ephemeral apps are the first concerted push against the permanence of Internet conversation. We started losing ephemeral conversation when computers began to mediate our communications. Computers naturally produce conversation records, and that data was often saved and archived.

[…]

At best, the data is recorded, used, saved and then deliberately deleted. At worst, the ephemeral nature is faked. While the apps make the posts, texts or messages unavailable to users quickly, they probably don’t erase them off their systems immediately. They certainly don’t erase them from their backup tapes, if they end up there.
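Schneier’s point, that “deleted for the user” and “deleted from the system” are different things, can be modelled in a few lines. This is a toy sketch, hypothetical and not a description of how Snapchat or any named app actually works:

```python
class EphemeralStore:
    """Toy model of an 'ephemeral' messaging backend: messages vanish
    from the user-visible view after `ttl` seconds, but the underlying
    log (standing in for server copies, backups, and analytics feeds)
    is never purged."""

    def __init__(self, ttl):
        self.ttl = ttl
        self.log = []            # retained indefinitely

    def post(self, msg, now):
        self.log.append((now, msg))

    def visible(self, now):
        # Only this view honours the deletion promise.
        return [m for t, m in self.log if now - t < self.ttl]
```

Anything that can subpoena or compel access to `log` is unaffected by the ten-second promise made at the `visible` layer.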

The companies offering these apps might very well analyze their content and make that information available to advertisers. We don’t know how much metadata is saved. In Snapchat, users can see the metadata even though they can’t see the content, or what it’s used for. And if the government demanded copies of those conversations — either through a secret NSA demand or a more normal legal process involving an employer or school — the companies would have no choice but to hand them over.

Even worse, if the FBI or NSA demanded that American companies secretly store those conversations and not tell their users, breaking their promise of deletion, the companies would have no choice but to comply.

That last bit isn’t just paranoia.

April 12, 2014

The Baaaa-studs 2009 – Extreme LED Sheep Art

Filed under: Britain, Humour, Technology — Nicholas @ 15:38

April 11, 2014

Open source software and the Heartbleed bug

Filed under: Technology — Nicholas @ 07:03

Some people are claiming that the Heartbleed bug proves that open source software is a failure. ESR quickly addresses that idiotic claim:

I actually chuckled when I read the rumor that the few anti-open-source advocates still standing were crowing about the Heartbleed bug, because I’ve seen this movie before after every serious security flap in an open-source tool. The script, which includes a bunch of people indignantly exclaiming that many-eyeballs is useless because bug X lurked in a dusty corner for Y months, is so predictable that I can anticipate a lot of the lines.

The mistake being made here is a classic example of Frederic Bastiat’s “things seen versus things unseen”. Critics of Linus’s Law overweight the bug they can see and underweight the high probability that equivalently positioned closed-source security flaws they can’t see are actually far worse, just so far undiscovered.

That’s how it seems to go whenever we get a hint of the defect rate inside closed-source blobs, anyway. As a very pertinent example, in the last couple months I’ve learned some things about the security-defect density in proprietary firmware on residential and small business Internet routers that would absolutely curl your hair. It’s far, far worse than most people understand out there.

[…]

Ironically enough this will happen precisely because the open-source process is working … while, elsewhere, bugs that are far worse lurk in closed-source router firmware. Things seen vs. things unseen…

Returning to Heartbleed, one thing conspicuously missing from the downshouting against OpenSSL is any pointer to an implementation that is known to have a lower defect rate over time. This is for the very good reason that no such empirically-better implementation exists. What is the defect history on proprietary SSL/TLS blobs out there? We don’t know; the vendors aren’t saying. And we can’t even estimate the quality of their code, because we can’t audit it.

The response to the Heartbleed bug illustrates another huge advantage of open source: how rapidly we can push fixes. The repair for my Linux systems was a push-one-button fix less than two days after the bug hit the news. Proprietary-software customers will be lucky to see a fix within two months, and all too many of them will never see a fix patch.

Update: There are lots of sites offering tools to test whether a given site is vulnerable to the Heartbleed bug, but you need to step carefully there, as there’s a thin line between what’s legal in some countries and what counts as an illegal break-in attempt:

Websites and tools that have sprung up to check whether servers are vulnerable to OpenSSL’s mega-vulnerability Heartbleed have thrown up anomalies in computer crime law on both sides of the Atlantic.

Both the US Computer Fraud and Abuse Act and its UK equivalent the Computer Misuse Act make it an offence to test the security of third-party websites without permission.

Testing to see what version of OpenSSL a site is running, and whether it also supports the vulnerable Heartbeat protocol, would be legal. But doing anything more active — without permission from website owners — would take security researchers onto the wrong side of the law.

And you shouldn’t just rush out and change all your passwords right now (you’ll probably need to do it, but the timing matters):

Heartbleed is a catastrophic bug in widely used OpenSSL that creates a means for attackers to lift passwords, crypto-keys and other sensitive data from the memory of secure server software, 64KB at a time. The mega-vulnerability was patched earlier this week, and software should be updated to use the new version, 1.0.1g. But to fully clean up the problem, admins of at-risk servers should generate new public-private key pairs, destroy their session cookies, and update their SSL certificates before telling users to change every potentially compromised password on the vulnerable systems.
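A purely passive check, the kind that stays on the right side of the law as described above, can be as simple as classifying a version string obtained from a server banner. This is a sketch under that assumption; real banners are often absent, truncated, or wrong:

```python
import re

def heartbleed_vulnerable(version):
    """Classify an OpenSSL version string. Only 1.0.1 through 1.0.1f
    shipped the Heartbleed bug; 1.0.1g fixed it, and the older
    0.9.8 / 1.0.0 lines never contained the Heartbeat code."""
    return re.fullmatch(r"1\.0\.1([a-f]?)", version) is not None
```

Note that a “not vulnerable” answer from a banner proves little: a patched server that has not rotated its keys and certificates may still be exposed, which is exactly the cleanup sequence the quoted paragraph describes.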

April 9, 2014

XKCD on the impact of “Heartbleed”

Filed under: Technology — Nicholas @ 11:00

Update: In case you’re not concerned about the seriousness of this issue, The Register‘s John Leyden would like you to think again.

The catastrophic crypto key password vulnerability in OpenSSL affects far more than web servers, with everything from routers to smartphones also affected.

The so-called “Heartbleed” vulnerability (CVE-2014-0160) can be exploited to extract information from servers running vulnerable versions of OpenSSL, and this includes email servers and Android smartphones as well as routers.

Hackers could potentially gain access to private encryption keys before using this information to decipher the encrypted traffic to and from vulnerable websites.

Web sites including Yahoo!, Flickr and OpenSSL were among the many left vulnerable to the megabug that exposed encryption keys, passwords and other sensitive information.

Preliminary tests suggested 47 of the 1,000 largest sites are vulnerable to Heartbleed, and that’s only among the fewer than half that support SSL or HTTPS at all. Many of the affected sites — including Yahoo! — have since patched the vulnerability. Even so, security experts — such as Graham Cluley — remain concerned.

OpenSSL is a widely used encryption library that is a key component of technology that enables secure (https) website connections.

The bug exists in the OpenSSL 1.0.1 source code and stems from coding flaws in a fairly new feature known as the TLS Heartbeat Extension. “TLS heartbeats are used as ‘keep alive’ packets so that the ends of an encrypted connection can agree to keep the session open even when they don’t have any official data to exchange,” explains security veteran Paul Ducklin in a post on Sophos’ Naked Security blog.
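The coding flaw in that Heartbeat Extension is easy to model. Below is a toy sketch in Python, not OpenSSL’s actual C code: a heartbeat record carries a claimed payload length, and Heartbleed came from echoing back that many bytes without checking the claim against the record’s real size (in C, the `memcpy` then read past the buffer; Python slicing merely truncates, so the check is modelled explicitly):

```python
def parse_heartbeat(record):
    """Toy model of a TLS heartbeat request per RFC 6520:
    1-byte type, 2-byte big-endian payload length, then payload."""
    claimed = int.from_bytes(record[1:3], "big")
    return record[0], claimed, record[3:]

def safe_echo(record):
    """The fixed behaviour: discard any heartbeat whose claimed
    payload length exceeds the bytes actually received, instead of
    trusting the attacker-controlled length field."""
    _, claimed, payload = parse_heartbeat(record)
    if claimed > len(payload):
        return None              # silently drop, as the patch does
    return payload[:claimed]
```

An attacker’s over-claimed request (say, 16 KB claimed against a 2-byte payload) is exactly what leaked 64KB windows of server memory from unpatched servers.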

The Heartbleed vulnerability in the OpenSSL cryptographic library might be exploited to reveal contents of secured communication exchanges. The same flaw might also be used to lift SSL keys.

This means that sites could still be vulnerable to attacks after installing the patches in cases where a private key has been stolen. Sites therefore need to revoke exposed keys, reissue new keys, and invalidate all session keys and session cookies.

Bruce Schneier:

“Catastrophic” is the right word. On the scale of 1 to 10, this is an 11.

Half a million sites are vulnerable, including my own. Test your vulnerability here.

The bug has been patched. After you patch your systems, you have to get a new public/private key pair, update your SSL certificate, and then change every password that could potentially be affected.

At this point, the probability is close to one that every target has had its private keys extracted by multiple intelligence agencies. The real question is whether or not someone deliberately inserted this bug into OpenSSL, and has had two years of unfettered access to everything. My guess is accident, but I have no proof.

April 7, 2014

US government data security failures

Filed under: Bureaucracy, Government, Technology — Nicholas @ 09:02

David Gewirtz says that the press has totally mis-reported the scale of government security breaches:

Summary: This is one of those articles that spoils your faith in mankind. Not only are government security incidents fully into holy-cow territory, the press is reporting numbers three orders of magnitude too low because someone misread a chart and everyone else copied that report.

You might think this was an April Fool’s gag, except it was published on April 2nd, not April 1st.

According to testimony given by Gregory C. Wilshusen [PDF], Director of Information Security Issues for the Government Accountability Office, to the United States Senate Committee on Homeland Security and Governmental Affairs, and I quote, “most major federal agencies had weaknesses in major categories of information security controls.”

In other words, some government agency data security functions more like a sieve than a lockbox.

Some of the data the GAO presented was deeply disturbing. For example, the number of successful breaches doubled since 2009. Doubled. There’s also a story inside this story, which I’ll discuss later in the article. Almost all of the press reporting on this testimony got the magnitude of the breach wrong. Most reported that government security incidents numbered in the thousands, when, in fact, they numbered in the millions.

Emphasis mine. Here are the actual numbers:

Incidents involving personal identifying information grew from about 10.5 million in 2009 to over 25 million last year. By the way, some press reports on this misread the GAO’s charts. For example, the Washington Free Beacon wrote about this, claiming “25,566 incidents of lost taxpayer data, Social Security numbers, patient health information.” What they missed was the little notation on the chart that says “in thousands,” so when they reported 25,566 incidents, what that really reads as is 25,566 x 1000 incidents.
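The whole misreading reduces to one missed scale factor. A trivial sketch of the correction (the `chart_value_to_count` helper is invented here for illustration):

```python
# The GAO chart plots incident counts "in thousands", so a plotted
# value of 25,566 is really 25.6 million incidents.
def chart_value_to_count(plotted, units="thousands"):
    scale = {"ones": 1, "thousands": 1_000, "millions": 1_000_000}
    return plotted * scale[units]

misreported = 25_566                       # quoted as a raw count
actual = chart_value_to_count(25_566)      # after applying the units
```

Three orders of magnitude is the difference between a bad year and a systemic failure, which is the article’s point.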

2014 GAO analysis of security breaches

This is an example of how the Internet echo chamber can get information very, very wrong. The Chicago Tribune, via Reuters, reported the same incorrect statistic. So did InformationWeek. So did FierceHealthIT. Business Insider picked up the Reuters report and happily repeated the same statistic — which was three orders of magnitude incorrect.

This is why I always try to go to the original source material [PDF] and not just repeat the crap other writers are parroting. It’s more work, but it means the difference between reporting 25 thousand government breaches and 25 million government breaches. 25 thousand is disturbing. 25 million is horrifying.

Big data’s promises and limitations

Filed under: Economics, Science, Technology — Nicholas @ 07:06

In the New York Times, Gary Marcus and Ernest Davis examine the big claims being made for the big data revolution:

Is big data really all it’s cracked up to be? There is no doubt that big data is a valuable tool that has already had a critical impact in certain areas. For instance, almost every successful artificial intelligence computer program in the last 20 years, from Google’s search engine to the I.B.M. Jeopardy! champion Watson, has involved the substantial crunching of large bodies of data. But precisely because of its newfound popularity and growing use, we need to be levelheaded about what big data can — and can’t — do.

The first thing to note is that although big data is very good at detecting correlations, especially subtle correlations that an analysis of smaller data sets might miss, it never tells us which correlations are meaningful. A big data analysis might reveal, for instance, that from 2006 to 2011 the United States murder rate was well correlated with the market share of Internet Explorer: Both went down sharply. But it’s hard to imagine there is any causal relationship between the two. Likewise, from 1998 to 2007 the number of new cases of autism diagnosed was extremely well correlated with sales of organic food (both went up sharply), but identifying the correlation won’t by itself tell us whether diet has anything to do with autism.
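The IE/murder-rate example is easy to reproduce with any two declining series. The numbers below are invented for illustration, not the real murder-rate or browser-share data, which is exactly the point: the strong correlation is an artifact of a shared trend, not of meaning:

```python
def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Two made-up, steadily declining series: any such pair will
# correlate strongly regardless of subject matter.
murder_rate = [5.8, 5.7, 5.4, 5.0, 4.8, 4.7]
ie_share = [80, 75, 68, 60, 55, 52]
r = pearson(murder_rate, ie_share)
```

No amount of data volume fixes this; distinguishing trend-driven correlation from causal structure is the human analysis step the authors say big data cannot replace.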

Second, big data can work well as an adjunct to scientific inquiry but rarely succeeds as a wholesale replacement. Molecular biologists, for example, would very much like to be able to infer the three-dimensional structure of proteins from their underlying DNA sequence, and scientists working on the problem use big data as one tool among many. But no scientist thinks you can solve this problem by crunching data alone, no matter how powerful the statistical analysis; you will always need to start with an analysis that relies on an understanding of physics and biochemistry.

April 3, 2014

ESR reviews Jeremy Rifkin’s latest book

Filed under: Books, Economics, Media, Technology — Nicholas @ 10:46

The publisher sent a copy of The Zero Marginal Cost Society along with a note that Rifkin himself wanted ESR to receive a copy (because Rifkin thinks ESR is a good representative of some of the concepts in the book). ESR isn’t impressed:

In this book, Rifkin is fascinated by the phenomenon of goods for which the marginal cost of production is zero, or so close to zero that it can be ignored. All of the present-day examples of these he points at are information goods — software, music, visual art, novels. He joins this to the overarching obsession of all his books, which are variations on a theme of “Let us write an epitaph for capitalism”.

In doing so, Rifkin effectively ignores what capitalists do and what capitalism actually is. “Capital” is wealth paying for setup costs. Even for pure information goods those costs can be quite high. Music is a good example; it has zero marginal cost to reproduce, but the first copy is expensive. Musicians must own expensive instruments, be paid to perform, and require other capital goods such as recording studios. If those setup costs are not reliably priced into the final good, production of music will not remain economically viable.
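ESR’s setup-cost objection can be put as one line of arithmetic. A sketch with invented figures:

```python
def average_cost(setup, marginal, units):
    """Per-unit cost of a good with a fixed setup cost and a
    constant marginal cost of reproduction."""
    return setup / units + marginal

# Zero marginal cost does not make the setup cost disappear; it only
# spreads it across more units (illustrative numbers only).
one_copy = average_cost(setup=100_000, marginal=0, units=1)
a_million = average_cost(setup=100_000, marginal=0, units=1_000_000)
```

However thin the spread, the setup term never reaches zero, so some pricing mechanism must still recover it if production is to stay viable — which is the gap in the “zero marginal cost society” thesis.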

[…]

Rifkin cites me in his book, but it is evident that he almost completely misunderstood my arguments in two different ways, both of which bear on the premises of his book.

First, software has a marginal cost of production that is effectively zero, but that’s true of all software rather than just open source. What makes open source economically viable is the strength of secondary markets in support and related services. Most other kinds of information goods don’t have these. Thus, the economics favoring open source in software are not universal even in pure information goods.

Second, even in software — with those strong secondary markets — open-source development relies on the capital goods of software production being cheap. When computers were expensive, the economics of mass industrialization and its centralized management structures ruled them. Rifkin acknowledges that this is true of a wide variety of goods, but never actually grapples with the question of how to pull capital costs of those other goods down to the point where they no longer dominate marginal costs.

There are two other, much larger, holes below the waterline of Rifkin’s thesis. One is that atoms are heavy. The other is that human attention doesn’t get cheaper as you buy more of it. In fact, the opposite tends to be true — which is exactly why capitalists can make a lot of money by substituting capital goods for labor.

These are very stubborn cost drivers. They’re the reason Rifkin’s breathless hopes for 3-D printing will not be fulfilled. Because 3-D printers require feedstock, the marginal cost of producing goods with them has a floor well above zero. That ABS plastic, or whatever, has to be produced. Then it has to be moved to where the printer is. Then somebody has to operate the printer. Then the finished good has to be moved to the point of use. None of these operations has a cost that is driven to zero, or near zero at scale. 3-D printing can increase efficiency by outcompeting some kinds of mass production, but it can’t make production costs go away.
