Quotulatiousness

April 23, 2014

LibreSSL website – “This page scientifically designed to annoy web hipsters”

Filed under: Technology — Nicholas Russon @ 09:24

Julian Sanchez linked to this Ars Technica piece on a new fork of OpenSSL:

OpenBSD founder Theo de Raadt has created a fork of OpenSSL, the widely used open source cryptographic software library that contained the notorious Heartbleed security vulnerability.

OpenSSL has suffered from a lack of funding and code contributions despite being used in websites and products by many of the world’s biggest and richest corporations.

The decision to fork OpenSSL is bound to be controversial given that OpenSSL powers hundreds of thousands of Web servers. When asked why he wanted to start over instead of helping to make OpenSSL better, de Raadt said the existing code is too much of a mess.

“Our group removed half of the OpenSSL source tree in a week. It was discarded leftovers,” de Raadt told Ars in an e-mail. “The Open Source model depends [on] people being able to read the code. It depends on clarity. That is not a clear code base, because their community does not appear to care about clarity. Obviously, when such cruft builds up, there is a cultural gap. I did not make this decision… in our larger development group, it made itself.”

The LibreSSL code base is on OpenBSD.org, and the project is supported financially by the OpenBSD Foundation and OpenBSD Project. LibreSSL has a bare bones website that is intentionally unappealing.

“This page scientifically designed to annoy web hipsters,” the site says. “Donate now to stop the Comic Sans and Blink Tags.” In explaining the decision to fork, the site links to a YouTube video of a cover of the Twisted Sister song “We’re not gonna take it.”

April 20, 2014

SpaceX Falcon F9R First Flight Test | 250m

Filed under: Space, Technology — Nicholas Russon @ 11:43

Published on 18 Apr 2014

Video of Falcon 9 Reusable (F9R) taking its first test flight at our rocket development facility. F9R lifts off from a launch mount to a height of approximately 250m, hovers and then returns for landing just next to the launch stand. Early flights of F9R will take off with legs fixed in the down position. However, we will soon be transitioning to liftoff with legs stowed against the side of the rocket and then extending them just before landing.

The F9R testing program is the next step towards reusability following completion of the Grasshopper program last year (Grasshopper can be seen in the background of this video). Future testing, including that in New Mexico, will be conducted using the first stage of an F9R as shown here, which is essentially a Falcon 9 v1.1 first stage with legs. F9R test flights in New Mexico will allow us to test at higher altitudes than we are permitted at our test site in Texas, to do more with unpowered guidance, and to prove out landing cases that are more flight-like.

April 17, 2014

The patented Heath Robinson/Rube Goldberg wine opener

Filed under: Technology, Wine — Nicholas Russon @ 08:12

Certainly the most SteamPunk wine-opener-and-pourer you’ll see this week:

H/T to Roger Henry for the link.

QotD: User interface design for all ages

Filed under: Humour, Media, Technology — Nicholas Russon @ 07:51

As your body staggers down the winding road to death, user interfaces that require fighter pilot-grade eyesight, the dexterity of a neurosurgeon, and the mental agility of Derren Brown are going to screw with you at some point.

Don’t kid yourself otherwise — disability, in one form or another, can strike at any moment.

Given that people are proving ever harder to kill off, you can expect to have decades of life ahead of you — during which you’ll be battling to figure out where on the touchscreen that trendy transdimensional two-pixel wide “OK” button is hiding.

Can you believe, people born today will spend their entire lives having to cope with this crap? The only way I can explain the web design of many Google products today is that some wannabe Picasso stole Larry Page’s girl when they were all 13, and is only now exacting his revenge. Nobody makes things that bad by accident, surely?

Dominic Connor, “Is tech the preserve of the young able-bodied? Let’s talk over a fine dinner and claret”, The Register, 2014-04-17

April 16, 2014

QotD: The wizards of the web

Filed under: Business, Quotations, Technology — Nicholas Russon @ 08:28

You would have thought this would have sunk in by now. The fact that it hasn’t shows what an extraordinary machine the internet is — quite different to any technology that has gone before it. When the Lovebug struck, few of us lived our lives online. Back then we banked in branches, shopped in shops, met friends and lovers in the pub and obtained jobs by posting CVs. Tweeting was for the birds. Cyberspace was marginal. Now, for billions, the online world is their lives. But there is a problem. Only a tiny, tiny percentage of the people who use the internet have even the faintest clue about how any of it works. “SSL”, for instance, stands for “Secure Sockets Layer”.

I looked it up and sort of understood it — for about five minutes. While most drivers have at least a notion of how an engine works (something about petrol exploding in cylinders and making pistons go up and down and so forth) the very language of the internet — “domain names” and “DNS codes”, endless “protocols” and so forth — is arcane, exclusive; it is, in fact, the language of magic. For all intents and purposes the internet is run by wizards.

And the trouble with letting wizards run things is that when things go wrong we are at their mercy. The world spends several tens of billions of pounds a year on anti-malware programs, which we are exhorted to buy lest the walls of our digital castles collapse around us. Making security software is a huge industry, and whenever there is a problem — either caused by viruses or by a glitch like Heartbleed — the internet security companies rush to be quoted in the media. And guess what, their message is never “keep calm and carry on”. As Professor Ross Anderson of Cambridge University says: “Almost all the cost of cybercrime is the cost of anticipation.”

Michael Hanlon, “Relax, Mumsnet users: don’t lose sleep over Heartbleed hysteria”, Telegraph, 2014-04-16

April 14, 2014

SpaceX to test hover capability on next launch

Filed under: Space, Technology — Nicholas Russon @ 08:25

In The Register, Brid-Aine Parnell explains what will be different about the next SpaceX launch to resupply the ISS:

NASA has said that SpaceX’s latest cargo ship launch to the International Space Station will go ahead, despite a critical computer outage on the station, allowing the firm to test the craft’s hovering abilities.

[...]

The booster rocket that’s blasting the Dragon supply capsule into space is going to attempt to make a hovering soft landing after it’s disengaged and dropped back to Earth.

The spruced-up Falcon 9 has its own landing legs, which Elon Musk’s space tech company hopes will eventually make for precise set-downs on the surface of alien worlds. For this test though, the rocket will still be coming down over the ocean, just in case.

The launch is already a month late with its nearly 5,000 pounds of supplies and payloads, including VEGGIE, a new unit capable of growing salad vegetables for the ‘nauts to munch on. The ship was delayed from March after a ground-based radar system at Cape Canaveral was damaged.

April 13, 2014

Ephemeral apps and the NSA

Filed under: Business, Media, Technology — Nicholas Russon @ 11:21

Bruce Schneier on the rising popularity of apps that leave your content visible only briefly and then automatically remove it:

Ephemeral messaging apps such as Snapchat, Wickr and Frankly, all of which advertise that your photo, message or update will only be accessible for a short period, are on the rise. Snapchat and Frankly, for example, claim they permanently delete messages, photos and videos after 10 seconds. After that, there’s no record.

This notion is especially popular with young people, and these apps are an antidote to sites such as Facebook where everything you post lasts forever unless you take it down—and taking it down is no guarantee that it isn’t still available.

These ephemeral apps are the first concerted push against the permanence of Internet conversation. We started losing ephemeral conversation when computers began to mediate our communications. Computers naturally produce conversation records, and that data was often saved and archived.

[...]

At best, the data is recorded, used, saved and then deliberately deleted. At worst, the ephemeral nature is faked. While the apps make the posts, texts or messages unavailable to users quickly, they probably don’t erase them off their systems immediately. They certainly don’t erase them from their backup tapes, if they end up there.

The companies offering these apps might very well analyze their content and make that information available to advertisers. We don’t know how much metadata is saved. In Snapchat, users can see the metadata even though they can’t see the content, and we don’t know what that metadata is used for. And if the government demanded copies of those conversations — either through a secret NSA demand or a more normal legal process involving an employer or school — the companies would have no choice but to hand them over.

Even worse, if the FBI or NSA demanded that American companies secretly store those conversations and not tell their users, breaking their promise of deletion, the companies would have no choice but to comply.

That last bit isn’t just paranoia.
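The core promise of these apps, content that disappears a fixed time after it is seen, is easy to model. Here is a minimal sketch of a client-side time-to-live store; the class name and API are invented for illustration, and, as Schneier notes above, expiry at the client says nothing about copies on servers or backup tapes:

```python
import time

class EphemeralStore:
    """Toy message store: each message expires a fixed number of seconds
    after it is first viewed.  Names and API are hypothetical, not how
    Snapchat, Wickr, or Frankly actually work internally."""

    def __init__(self, ttl_seconds=10, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock          # injectable clock makes testing easy
        self._messages = {}         # id -> (content, first_viewed_at or None)

    def post(self, msg_id, content):
        self._messages[msg_id] = (content, None)

    def view(self, msg_id):
        """Return the content if still alive, else None (and purge it)."""
        entry = self._messages.get(msg_id)
        if entry is None:
            return None
        content, first_viewed = entry
        now = self.clock()
        if first_viewed is None:
            self._messages[msg_id] = (content, now)  # start the clock
            return content
        if now - first_viewed >= self.ttl:
            del self._messages[msg_id]  # "no record" -- client-side only
            return None
        return content
```

Deleting the dictionary entry is the whole trick, which is exactly why the deletion guarantee only covers the copy this code controls.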

April 12, 2014

The Baaaa-studs 2009 – Extreme LED Sheep Art

Filed under: Britain, Humour, Technology — Nicholas Russon @ 15:38

April 11, 2014

Open source software and the Heartbleed bug

Filed under: Technology — Nicholas Russon @ 07:03

Some people are claiming that the Heartbleed bug proves that open source software is a failure. ESR quickly addresses that idiotic claim:

I actually chuckled when I read the rumor that the few anti-open-source advocates still standing were crowing about the Heartbleed bug, because I’ve seen this movie before after every serious security flap in an open-source tool. The script, which includes a bunch of people indignantly exclaiming that many-eyeballs is useless because bug X lurked in a dusty corner for Y months, is so predictable that I can anticipate a lot of the lines.

The mistake being made here is a classic example of Frederic Bastiat’s “things seen versus things unseen”. Critics of Linus’s Law overweight the bug they can see and underweight the high probability that equivalently positioned closed-source security flaws they can’t see are actually far worse, just so far undiscovered.

That’s how it seems to go whenever we get a hint of the defect rate inside closed-source blobs, anyway. As a very pertinent example, in the last couple months I’ve learned some things about the security-defect density in proprietary firmware on residential and small business Internet routers that would absolutely curl your hair. It’s far, far worse than most people understand out there.

[...]

Ironically enough this will happen precisely because the open-source process is working … while, elsewhere, bugs that are far worse lurk in closed-source router firmware. Things seen vs. things unseen…

Returning to Heartbleed, one thing conspicuously missing from the downshouting against OpenSSL is any pointer to an implementation that is known to have a lower defect rate over time. This is for the very good reason that no such empirically-better implementation exists. What is the defect history on proprietary SSL/TLS blobs out there? We don’t know; the vendors aren’t saying. And we can’t even estimate the quality of their code, because we can’t audit it.

The response to the Heartbleed bug illustrates another huge advantage of open source: how rapidly we can push fixes. The repair for my Linux systems was a push-one-button fix less than two days after the bug hit the news. Proprietary-software customers will be lucky to see a fix within two months, and all too many of them will never see a fix patch.

Update: There are lots of sites offering tools to test whether a given site is vulnerable to the Heartbleed bug, but you need to step carefully there, as there’s a thin line between what’s legal in some countries and what counts as an illegal break-in attempt:

Websites and tools that have sprung up to check whether servers are vulnerable to OpenSSL’s mega-vulnerability Heartbleed have thrown up anomalies in computer crime law on both sides of the Atlantic.

Both the US Computer Fraud and Abuse Act and its UK equivalent the Computer Misuse Act make it an offence to test the security of third-party websites without permission.

Testing to see what version of OpenSSL a site is running, and whether it also supports the vulnerable Heartbeat protocol, would be legal. But doing anything more active — without permission from website owners — would take security researchers onto the wrong side of the law.
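The passive check described above, looking at what version a server advertises, can be as simple as inspecting an HTTP Server header you already have. A sketch (the function name is invented, and a match is only a hint, since distributions often backport fixes without changing the advertised version string):

```python
import re

# OpenSSL 1.0.1 through 1.0.1f shipped the vulnerable Heartbeat code;
# 1.0.1g carries the fix, and the 0.9.8/1.0.0 branches never had it.
VULNERABLE = re.compile(r"OpenSSL/1\.0\.1(?![0-9])([a-f])?(?![a-z])")

def banner_suggests_heartbleed(server_header: str) -> bool:
    """Passive check of a Server header string already in hand.
    Returns True if the advertised OpenSSL version falls in the
    vulnerable 1.0.1 to 1.0.1f range."""
    return VULNERABLE.search(server_header) is not None
```

Anything beyond reading banners you already have, such as actually sending a malformed heartbeat, is the "more active" probing the article warns about.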

And you shouldn’t just rush out and change all your passwords right now (you’ll probably need to do it, but the timing matters):

Heartbleed is a catastrophic bug in widely used OpenSSL that creates a means for attackers to lift passwords, crypto-keys and other sensitive data from the memory of secure server software, 64KB at a time. The mega-vulnerability was patched earlier this week, and software should be updated to use the new version, 1.0.1g. But to fully clean up the problem, admins of at-risk servers should generate new public-private key pairs, destroy their session cookies, and update their SSL certificates before telling users to change every potentially compromised password on the vulnerable systems.

April 9, 2014

XKCD on the impact of “Heartbleed”

Filed under: Technology — Nicholas Russon @ 11:00

Update: In case you’re not concerned about the seriousness of this issue, The Register’s John Leyden would like you to think again.

The catastrophic crypto key password vulnerability in OpenSSL affects far more than web servers, with everything from routers to smartphones also affected.

The so-called “Heartbleed” vulnerability (CVE-2014-0160) can be exploited to extract information from servers running vulnerable versions of OpenSSL, and this includes email servers and Android smartphones as well as routers.

Hackers could potentially gain access to private encryption keys before using this information to decipher the encrypted traffic to and from vulnerable websites.

Web sites including Yahoo!, Flickr and OpenSSL were among the many left vulnerable to the megabug that exposed encryption keys, passwords and other sensitive information.

Preliminary tests suggested 47 of the 1000 largest sites are vulnerable to Heartbleed, and that’s counting only the fewer than half of them that support SSL or HTTPS at all. Many of the affected sites – including Yahoo! – have since patched the vulnerability. Even so, security experts – such as Graham Cluley – remain concerned.

OpenSSL is a widely used encryption library that is a key component of technology that enables secure (https) website connections.

The bug exists in the OpenSSL 1.0.1 source code and stems from coding flaws in a fairly new feature known as the TLS Heartbeat Extension. “TLS heartbeats are used as ‘keep alive’ packets so that the ends of an encrypted connection can agree to keep the session open even when they don’t have any official data to exchange,” explains security veteran Paul Ducklin in a post on Sophos’ Naked Security blog.
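Ducklin's keep-alive description points at the flaw: a heartbeat request carries both a payload and a claimed payload length, and the buggy code echoed back the claimed length's worth of memory without checking the claim against the bytes actually received. A toy model of that logic (an illustration only, not OpenSSL's actual C code):

```python
def heartbeat_reply(payload: bytes, claimed_len: int, patched: bool = False) -> bytes:
    """Simulate a TLS heartbeat echo.  Server memory is modeled as the
    received payload with unrelated secrets sitting right after it
    (a toy layout, not OpenSSL's real allocator behaviour)."""
    memory = payload + b"...user passwords...server private key..."
    if patched and claimed_len > len(payload):
        return b""  # the 1.0.1g fix: silently drop mismatched lengths
    # Buggy path: trust the attacker's claimed length and over-read.
    return memory[:claimed_len]
```

Sending a tiny payload with a large claimed length is the whole attack: the reply comes back padded out with whatever happened to be adjacent in memory.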

The Heartbleed vulnerability in the OpenSSL cryptographic library might be exploited to reveal contents of secured communication exchanges. The same flaw might also be used to lift SSL keys.

This means that sites could still be vulnerable to attacks after installing the patches in cases where a private key has been stolen. Sites therefore need to revoke exposed keys, reissue new keys, and invalidate all session keys and session cookies.

Bruce Schneier:

“Catastrophic” is the right word. On the scale of 1 to 10, this is an 11.

Half a million sites are vulnerable, including my own. Test your vulnerability here.

The bug has been patched. After you patch your systems, you have to get a new public/private key pair, update your SSL certificate, and then change every password that could potentially be affected.

At this point, the probability is close to one that every target has had its private keys extracted by multiple intelligence agencies. The real question is whether or not someone deliberately inserted this bug into OpenSSL, and has had two years of unfettered access to everything. My guess is accident, but I have no proof.

April 7, 2014

US government data security failures

Filed under: Bureaucracy, Government, Technology — Nicholas Russon @ 09:02

David Gewirtz says that the press has totally mis-reported the scale of government security breaches:

Summary: This is one of those articles that spoils your faith in mankind. Not only are government security incidents fully into holy-cow territory, the press is reporting numbers three orders of magnitude too low because someone misread a chart and everyone else copied that report.

You might think this was an April Fool’s gag, except it was published on April 2nd, not April 1st.

According to testimony given by Gregory C. Wilshusen [PDF], Director of Information Security Issues for the Government Accountability Office, to the United States Senate Committee on Homeland Security and Governmental Affairs, and I quote, “most major federal agencies had weaknesses in major categories of information security controls.”

In other words, some government agency data security functions more like a sieve than a lockbox.

Some of the data the GAO presented was deeply disturbing. For example, the number of successful breaches doubled since 2009. Doubled. There’s also a story inside this story, which I’ll discuss later in the article. Almost all of the press reporting on this testimony got the magnitude of the breach wrong. Most reported that government security incidents numbered in the thousands, when, in fact, they numbered in the millions.

Emphasis mine. Here are the actual numbers:

Incidents involving personal identifying information grew from about 10.5 million in 2009 to over 25 million last year. By the way, some press reports on this misread the GAO’s charts. For example, the Washington Free Beacon wrote about this, claiming “25,566 incidents of lost taxpayer data, Social Security numbers, patient health information.” What they missed was the little notation on the chart that says “in thousands,” so when they reported 25,566 incidents, what that really reads as is 25,566 x 1000 incidents.
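The arithmetic of the misreading above is worth spelling out, since "in thousands" is exactly a factor of 1,000, three orders of magnitude:

```python
# A chart footnote of "in thousands" means the plotted value must be
# scaled up by 1,000 before reporting it as a raw count.
plotted_value = 25_566          # what the GAO chart shows
actual_incidents = plotted_value * 1_000
assert actual_incidents == 25_566_000   # ~25.6 million, not ~25.6 thousand
```

Reporting the plotted value as the raw count is how "25 million incidents" became "25,566 incidents" across multiple outlets.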

2014 GAO analysis of security breaches

This is an example of how the Internet echo chamber can get information very, very wrong. The Chicago Tribune, via Reuters, reported the same incorrect statistic. So did InformationWeek. So did FierceHealthIT. Business Insider picked up the Reuters report and happily repeated the same statistic — which was three orders of magnitude incorrect.

This is why I always try to go to the original source material [PDF] and not just repeat the crap other writers are parroting. It’s more work, but it means the difference between reporting 25 thousand government breaches and 25 million government breaches. 25 thousand is disturbing. 25 million is horrifying.

Big data’s promises and limitations

Filed under: Economics, Science, Technology — Nicholas Russon @ 07:06

In the New York Times, Gary Marcus and Ernest Davis examine the big claims being made for the big data revolution:

Is big data really all it’s cracked up to be? There is no doubt that big data is a valuable tool that has already had a critical impact in certain areas. For instance, almost every successful artificial intelligence computer program in the last 20 years, from Google’s search engine to the I.B.M. Jeopardy! champion Watson, has involved the substantial crunching of large bodies of data. But precisely because of its newfound popularity and growing use, we need to be levelheaded about what big data can — and can’t — do.

The first thing to note is that although big data is very good at detecting correlations, especially subtle correlations that an analysis of smaller data sets might miss, it never tells us which correlations are meaningful. A big data analysis might reveal, for instance, that from 2006 to 2011 the United States murder rate was well correlated with the market share of Internet Explorer: Both went down sharply. But it’s hard to imagine there is any causal relationship between the two. Likewise, from 1998 to 2007 the number of new cases of autism diagnosed was extremely well correlated with sales of organic food (both went up sharply), but identifying the correlation won’t by itself tell us whether diet has anything to do with autism.
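The murder-rate/Internet Explorer point is easy to reproduce in spirit: any two series that merely trend the same way will score a near-perfect Pearson correlation. A quick sketch with invented numbers (the real 2006-2011 figures differ, but the shape is the point):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Illustrative made-up series, both declining year over year:
murder_rate = [5.8, 5.7, 5.4, 5.0, 4.8, 4.7]       # per 100k, invented
ie_share    = [65.0, 58.0, 48.0, 40.0, 34.0, 28.0]  # percent, invented

r = pearson_r(murder_rate, ie_share)
# Both series simply trend downward, so r lands close to +1:
# strong correlation, zero causation.
```

The statistic cannot distinguish "these move together for a reason" from "these both happened to move"; that judgment has to come from outside the data.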

Second, big data can work well as an adjunct to scientific inquiry but rarely succeeds as a wholesale replacement. Molecular biologists, for example, would very much like to be able to infer the three-dimensional structure of proteins from their underlying DNA sequence, and scientists working on the problem use big data as one tool among many. But no scientist thinks you can solve this problem by crunching data alone, no matter how powerful the statistical analysis; you will always need to start with an analysis that relies on an understanding of physics and biochemistry.

April 3, 2014

ESR reviews Jeremy Rifkin’s latest book

Filed under: Economics, Media, Technology — Nicholas Russon @ 10:46

The publisher sent a copy of The Zero Marginal Cost Society along with a note that Rifkin himself wanted ESR to receive a copy (because Rifkin thinks ESR is a good representative of some of the concepts in the book). ESR isn’t impressed:

In this book, Rifkin is fascinated by the phenomenon of goods for which the marginal cost of production is zero, or so close to zero that it can be ignored. All of the present-day examples of these he points at are information goods — software, music, visual art, novels. He joins this to the overarching obsession of all his books, which are variations on a theme of “Let us write an epitaph for capitalism”.

In doing so, Rifkin effectively ignores what capitalists do and what capitalism actually is. “Capital” is wealth paying for setup costs. Even for pure information goods those costs can be quite high. Music is a good example; it has zero marginal cost to reproduce, but the first copy is expensive. Musicians must own expensive instruments, be paid to perform, and require other capital goods such as recording studios. If those setup costs are not reliably priced into the final good, production of music will not remain economically viable.

[...]

Rifkin cites me in his book, but it is evident that he almost completely misunderstood my arguments in two different ways, both of which bear on the premises of his book.

First, software has a marginal cost of production that is effectively zero, but that’s true of all software rather than just open source. What makes open source economically viable is the strength of secondary markets in support and related services. Most other kinds of information goods don’t have these. Thus, the economics favoring open source in software are not universal even in pure information goods.

Second, even in software — with those strong secondary markets — open-source development relies on the capital goods of software production being cheap. When computers were expensive, the economics of mass industrialization and its centralized management structures ruled them. Rifkin acknowledges that this is true of a wide variety of goods, but never actually grapples with the question of how to pull capital costs of those other goods down to the point where they no longer dominate marginal costs.

There are two other, much larger, holes below the waterline of Rifkin’s thesis. One is that atoms are heavy. The other is that human attention doesn’t get cheaper as you buy more of it. In fact, the opposite tends to be true — which is exactly why capitalists can make a lot of money by substituting capital goods for labor.

These are very stubborn cost drivers. They’re the reason Rifkin’s breathless hopes for 3-D printing will not be fulfilled. Because 3-D printers require feedstock, the marginal cost of producing goods with them has a floor well above zero. That ABS plastic, or whatever, has to be produced. Then it has to be moved to where the printer is. Then somebody has to operate the printer. Then the finished good has to be moved to the point of use. None of these operations has a cost that is driven to zero, or near zero at scale. 3-D printing can increase efficiency by outcompeting some kinds of mass production, but it can’t make production costs go away.

April 2, 2014

Enigma’s 21st century open sourced descendent

Filed under: History, Military, Technology — Nicholas Russon @ 09:51

The Enigma device was used by the German military in World War 2 to encrypt and decrypt communication between units and headquarters on land and at sea. Original Enigma units — the few that are on the market at any time — sell for tens of thousands of dollars. You may not be able to afford an original, but you might be interested in a modern implementation of Enigma using Arduino-based open-source hardware and software:

Actual hand-crafted final design

Enigma machines have captivated everyone from legendary code breaker Alan Turing and the dedicated cryptographers from England’s Bletchley Park to historians and collectors the world over.

But while many history buffs would surely love to get their hands on an authentic Enigma machine used during WWII, the devices aren’t exactly affordable (last year, a 1944 German Enigma machine was available for auction at Bonhams with an estimated worth of up to $82,000). Enter the Open Enigma Project, a kit for building one from scratch.

The idea came to Marc Tessier and James Sanderson from S&T Geotronics by accident.

“We were working on designing and building intelligent Arduino-based open-source geocaching devices to produce a unique interactive challenge at an upcoming Geocaching Mega Event,” Tessier told Crave. “A friend of ours suggested we use an Enigma type encrypting/decrypting machine as the ultimate stage of the challenge and pointed us to an Instructables tutorial that used a kid’s toy to provide some Enigma encoding. We looked all over to buy a real Enigma machine even if we had to assemble it ourselves and realized that there was nothing available at the moment. So we decided to build our own.”

[...]

“Our version is an electronic microprocessor-based machine that is running software which is a mathematical expression of how the historical mechanical machine behaved,” Sanderson told Crave. “Having never touched a real Enigma M4, we built our open version based on what we read online. From what we understand, the real electro-mechanical devices are much heavier and a little bigger.”

They took some design liberties — replacing the physical rotors with LED units and replacing the light bulbs with white LEDs. The replica can be modified by changing the Arduino code and can communicate to any computer via USB. Future versions may include Wi-Fi and/or Bluetooth.
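The "mathematical expression of how the historical mechanical machine behaved" that Sanderson describes is compact enough to sketch. Below is a toy single-rotor Enigma in Python using the historical Rotor I and Reflector B wirings (a real M4 adds more rotors, ring settings, and a plugboard), but the defining properties survive: the same settings decrypt what they encrypted, and no letter ever encrypts to itself:

```python
ALPHABET    = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
ROTOR_I     = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"  # historical Rotor I wiring
REFLECTOR_B = "YRUHQSLDPXNGOKMIEBFZCWVJAT"  # Reflector B (a fixed-point-free involution)

def enigma(text, start=0):
    """Toy one-rotor Enigma for uppercase A-Z.  The rotor steps before
    each letter; the signal passes rotor -> reflector -> rotor inverse.
    Because the reflector is an involution, running the ciphertext back
    through with the same start position recovers the plaintext."""
    out = []
    pos = start
    for ch in text:
        pos = (pos + 1) % 26                                      # step rotor
        c = ALPHABET.index(ch)
        c = (ALPHABET.index(ROTOR_I[(c + pos) % 26]) - pos) % 26  # forward
        c = ALPHABET.index(REFLECTOR_B[c])                        # reflect
        c = (ROTOR_I.index(ALPHABET[(c + pos) % 26]) - pos) % 26  # back out
        out.append(ALPHABET[c])
    return "".join(out)
```

The never-encrypts-to-itself property, a direct consequence of the reflector having no fixed points, was a genuine weakness: Bletchley Park's cribs relied on it.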

People are less inclined to shop or bank online after NSA surveillance reports

Filed under: Business, Government, Technology — Nicholas Russon @ 08:46

Among the side-effects of government surveillance revelations, ordinary people are deciding to be a bit less involved in online activities, according to a new Harris Poll:

Online banking and shopping in America are being negatively impacted by ongoing revelations about the National Security Agency’s digital surveillance activities. That is the clear implication of a recent ESET-commissioned Harris poll which asked more than 2,000 U.S. adults ages 18 and older whether or not, given the news about the NSA’s activities, they have changed their approach to online activity.

Almost half of respondents (47%) said that they have changed their online behavior and think more carefully about where they go, what they say, and what they do online.

When it comes to specific Internet activities, such as email or online banking, this change in behavior translates into a worrying trend for the online economy: over one quarter of respondents (26%) said that, based on what they have learned about secret government surveillance, they are now doing less banking online and less online shopping. This shift in behavior is not good news for companies that rely on sustained or increased use of the Internet for their business model.

[...]

Whether or not we have seen the full extent of the public’s reaction to state-sponsored mass surveillance is hard to predict, but based on this survey and the one we did last year, I would say that, if the NSA revelations continue – and I am sure they will – and if government reassurances fail to impress the public, then it is possible that the trends in behavior we are seeing right now will continue. For example, I do not see many people finding reassurance in President Obama’s recently announced plan to transfer the storage of millions of telephone records from the government to private phone companies. As we will document in our next installment of survey findings, data gathering by companies is even more of a privacy concern for some Americans than government surveillance.

And in case anyone is tempted to think that this is a narrow issue of concern only to news junkies and security geeks, let me be clear: according to this latest survey, 85% of adult Americans are now at least somewhat familiar with the news about secret government surveillance of private citizens’ phone calls, emails, online activity, and so on.
