Quotulatiousness

May 16, 2014

The built-in confusion about net neutrality

While I’ve been following the net neutrality debate, I remain unconvinced that either side has the answers. In a post from 2008, ESR helps to explain why I’m confused:

Let it be clear from the outset that the telcos are putting their case for being allowed to do these things with breathtaking hypocrisy. They honk about how awful it is that regulation keeps them from setting their own terms, blithely ignoring the fact that their last-mile monopoly is entirely a creature of regulation. In effect, Theodore Vail and the old Bell System bribed the Feds to steal the last mile out from under the public’s nose between 1878 and 1920; the wireline telcos have been squatting on that unnatural monopoly ever since as if they actually had some legitimate property right to it.

But the telcos’ crimes aren’t merely historical. They have repeatedly bargained for the right to exclude competitors from their networks on the grounds that if the regulators would let them do that, they’d be able to generate enough capital to deploy broadband everywhere. That promise has been repeatedly, egregiously broken. Instead, they’ve creamed off that monopoly rent as profit or used it to cross-subsidize competition in businesses with higher rates of return. (Oh, and of course, to bribe legislators and buy regulators.)

Mistake #1 for libertarians to avoid is falling for the telcos’ “we’re pro-free market” bullshit. They’re anything but; what they really want is a politically sheltered monopoly in which they have captured the regulators and created business conditions that fetter everyone but them.

OK, so if the telcos are such villainous scum, the pro-network-neutrality activists must be the heroes of this story, right?

Unfortunately, no.

Your typical network-neutrality activist is a good-government left-liberal who is instinctively hostile to market-based approaches. These people think, rather, that if they can somehow come up with the right regulatory formula, they can jawbone the government into making the telcos play nice. They’re ideologically incapable of questioning the assumption that bandwidth is a scarce “public good” that has to be regulated. They don’t get it that complicated regulations favor the incumbent who can afford to darken the sky with lawyers, and they really don’t get it about outright regulatory capture, a game at which the telcos are past masters.

[…]

In short, the “network neutrality” crowd is mainly composed of well-meaning fools blinded by their own statism, and consequently serving mainly as useful idiots for the telcos’ program of ever-more labyrinthine and manipulable regulation. If I were a telco executive, I’d be on my knees every night thanking my god(s) for this “opposition”. Mistake #2 for any libertarian to avoid is backing these clowns.

In the comments, he summarizes “the history of the Bell System’s theft of the last mile”.

May 11, 2014

The NSA worked very hard to set themselves up for the Snowden leaks

Filed under: Government, Liberty, Technology — Nicholas @ 10:30

A few days back, Charles Stross pointed out one of the most ironic points of interest in the NSA scandal … they did it to themselves, over the course of several years’ effort:

I don’t need to tell you about the global surveillance disclosures of 2013 to the present — it’s no exaggeration to call them the biggest secret intelligence leak in history, a monumental gaffe (from the perspective of the espionage-industrial complex) and a security officer’s worst nightmare.

But it occurs to me that it’s worth pointing out that the NSA set themselves up for it by preventing the early internet specifications from including transport layer encryption.

At every step in the development of the public internet the NSA systematically lobbied for weaker security, to enhance their own information-gathering capabilities. The trouble is, the success of the internet protocols created a networking monoculture that the NSA themselves came to rely on for their internal infrastructure. The same security holes that the NSA relied on to gain access to your (or Osama bin Laden’s) email allowed gangsters to steal passwords and login credentials and credit card numbers. And ultimately these same baked-in security holes allowed Edward Snowden — who, let us remember, is merely one guy: a talented system administrator and programmer, but no Clark Kent — to rampage through their internal information systems.

The moral of the story is clear: be very cautious about poisoning the banquet you serve your guests, lest you end up accidentally ingesting it yourself.

May 6, 2014

Reset the Net on June 5th

Filed under: Liberty, Media, Technology — Nicholas @ 09:58

At Wired, Kim Zetter talks about an initiative to reclaim (some measure of) privacy on the internet:

A coalition of nearly two-dozen tech companies and civil liberties groups is launching a new fight against mass internet surveillance, hoping to battle the NSA in much the same way online campaigners pushed back on bad piracy legislation in 2012.

The new coalition, organized by Fight for the Future, is planning a Reset the Net day of action on June 5, the anniversary of the date the first Edward Snowden story broke detailing the government’s PRISM program, based on documents leaked by the former NSA contractor.

“Government spies have a weakness: they can hack anybody, but they can’t hack everybody,” the organizers behind the Reset the Net movement say in their video (above). “Folks like the NSA depend on collecting insecure data from tapped fiber. They depend on our mistakes, mistakes we can fix.”

To that end, the groups are calling on developers to add at least one NSA-resistant feature to mobile apps, and on websites to add security features like SSL (Secure Sockets Layer), HSTS (HTTP Strict Transport Security), and Perfect Forward Secrecy to better secure the communication of users and thwart government man-in-the-middle attacks.
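For the curious, here is what the HSTS part of that advice boils down to in practice. Below is a minimal sketch (Python’s standard library only; the cert.pem and key.pem paths are hypothetical stand-ins for your own certificate) of an HTTPS server that sends the Strict-Transport-Security header, telling browsers to insist on HTTPS from then on. It illustrates the header, not a hardened deployment.

```python
# Minimal sketch: serve HTTPS and send an HSTS header.
# Assumes an existing certificate; "cert.pem" and "key.pem" are
# hypothetical paths, not taken from any of the quoted sources.
import http.server
import ssl

class HSTSHandler(http.server.SimpleHTTPRequestHandler):
    def end_headers(self):
        # Tell browsers to use HTTPS only, for the next year,
        # subdomains included.
        self.send_header("Strict-Transport-Security",
                         "max-age=31536000; includeSubDomains")
        super().end_headers()

httpd = http.server.HTTPServer(("", 4443), HSTSHandler)
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain("cert.pem", "key.pem")
httpd.socket = context.wrap_socket(httpd.socket, server_side=True)
httpd.serve_forever()
```

Perfect Forward Secrecy, by contrast, isn’t a header at all: it’s a property of the negotiated cipher suite (ECDHE or DHE key exchange), so it’s enabled by restricting the server’s cipher list to such suites, which modern TLS stacks prefer by default.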

May 3, 2014

I am not a number!

Filed under: Business, Cancon, Humour — Nicholas @ 07:58

We had a call from Rogers (our ISP/cable provider) last night to discuss our current internet plan (we’ve been bumping up against our data cap lately, even though we increased it from 60GB to 80GB only a few months ago). I pointed out that my son’s internet bill while he was away at university came to about the same as our bill with Rogers, but that his data cap was 250GB. I asked if Rogers could come close to offering me that in Brooklin, since Cogeco is clearly able to turn a profit while offering folks in Peterborough a much higher data cap.

Rogers couldn’t quite match the offer, but for a slightly higher monthly bill we’ll now have a 270GB cap and higher (nominal) upload/download speeds. After this, I got an email that showed I’m not just a number to Rogers … I’m {$/process_data/xmlData/CRCFormatRequest/CustomerInfo/FullName$} instead:

Rogers internet service quote glitch
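For the technically curious, the likely failure mode is mundane: a mail-merge template referenced a customer-record field that the data pipeline never filled in, so the raw placeholder shipped to the customer. A toy sketch of the same bug in Python (the field name is made up; Rogers’ actual XML pipeline obviously isn’t public):

```python
# Toy sketch of a mail-merge glitch: if the customer record is
# missing, a "safe" substitution leaves the raw placeholder in the
# letter instead of failing. "full_name" is a hypothetical field.
from string import Template

letter = Template("Dear $full_name, thank you for contacting Rogers.")

print(letter.substitute(full_name="Nicholas"))  # the intended output
print(letter.safe_substitute({}))  # the glitch: "Dear $full_name, ..."
```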

May 1, 2014

Rethinking Canadian broadcast regulation

Filed under: Bureaucracy, Business, Cancon, Media — Nicholas @ 07:27

On Google+, Michael Geist posted a few thoughts on hitting the reset button in Canadian broadcast regulation:

The Broadcasting Act is a complex statute that lists more than twenty broadcasting policy goals. Yet for decades, Canadian policy has largely boiled down to a single objective: Maximizing the benefits from the broadcasting system for creators, broadcasters, and broadcast distributors such as cable and satellite companies.

Consumers were nowhere to be found in that objective and it showed. Creators benefited from Canadian content requirements and financial contributions that guaranteed the creation of Canadian broadcast content. Broadcasters flourished in a market that permitted simultaneous substitution (thereby enabling big profits from licensing U.S. content) and that kept U.S. giants such as HBO, ESPN, and MTV out of the market for years in favour of Canadian alternatives. Cable and satellite companies became dominant media companies by requiring consumers to purchase large packages filled with channels they did not want in order to access the few they did.

As I mentioned in a conversation last night, the Canadian market for broadcast, telecommunications, and internet providers has been carefully managed by the government to minimize the whole messy “competition” thing and ensure quasi-monopoly conditions in various regions across the country. The regulators prefer a small number of players in the market: it makes it easier to do the “regulation” thing when you can fit all the regulated players around a small table, and it also provides post-civil service career opportunities for former regulators. Having a larger number of competing organizations makes the regulation game much more difficult and reduces the revolving door opportunities for former regulators.

April 30, 2014

What if real life had lag like online games do?

Filed under: Humour, Technology — Nicholas @ 00:01

You wouldn’t accept lag offline, so why accept it online? ume.net, a fiber broadband provider that offers up to 1000 Mbit/s, performed an experiment. Four volunteers got to experience the internet’s biggest disturbance in real life – lag.

H/T to Jeff Sher for the link.

April 25, 2014

Is it science or “science”? A cheat sheet

Filed under: Media, Science — Nicholas @ 08:32

At Lifehacker, Alan Henry links to this useful infographic:

Click to see full-size image at Compound Interest

Science is amazing, but science reporting can be confusing at times and misleading at worst. The folks at Compound Interest put together this reference graphic that will help you pick out good articles from bad ones, and help you qualify the impact of the study you’re reading.

One of the best and worst things about having a scientific background is being able to see when a science story is poorly reported, or a preliminary study published as if it were otherwise. One of the worst things about writing about science is worrying you’ll fall into the same trap. It’s a constant struggle, because there are interesting takeaways even from preliminary studies and small sample sizes, but it’s important to qualify them as such so you don’t misrepresent the research. With this guide, you’ll be able to see when a study’s results are interesting food for thought that’s still developing, versus a relatively solid position that has consensus behind it.

April 23, 2014

LibreSSL website – “This page scientifically designed to annoy web hipsters”

Filed under: Technology — Nicholas @ 09:24

Julian Sanchez linked to this Ars Technica piece on a new fork of OpenSSL:

OpenBSD founder Theo de Raadt has created a fork of OpenSSL, the widely used open source cryptographic software library that contained the notorious Heartbleed security vulnerability.

OpenSSL has suffered from a lack of funding and code contributions despite being used in websites and products by many of the world’s biggest and richest corporations.

The decision to fork OpenSSL is bound to be controversial given that OpenSSL powers hundreds of thousands of Web servers. When asked why he wanted to start over instead of helping to make OpenSSL better, de Raadt said the existing code is too much of a mess.

“Our group removed half of the OpenSSL source tree in a week. It was discarded leftovers,” de Raadt told Ars in an e-mail. “The Open Source model depends [on] people being able to read the code. It depends on clarity. That is not a clear code base, because their community does not appear to care about clarity. Obviously, when such cruft builds up, there is a cultural gap. I did not make this decision… in our larger development group, it made itself.”

The LibreSSL code base is on OpenBSD.org, and the project is supported financially by the OpenBSD Foundation and OpenBSD Project. LibreSSL has a bare-bones website that is intentionally unappealing.

“This page scientifically designed to annoy web hipsters,” the site says. “Donate now to stop the Comic Sans and Blink Tags.” In explaining the decision to fork, the site links to a YouTube video of a cover of the Twisted Sister song “We’re Not Gonna Take It”.

April 19, 2014

Transaction costs, takedown notices, and the DMCA

Filed under: Economics, Law, Media — Nicholas @ 09:59

Mike Masnick reports on an inadvertent natural experiment that just came to light:

We’ve written a few times in the past about research done by Paul Heald on copyright and its impact on the availability of certain content. He’s recently published an interesting new study on how the DMCA’s notice-and-takedown regime facilitates making content available by decreasing transaction costs among parties. As we’ve discussed at length, the entertainment industry’s main focus in the next round of copyright reform is to wipe out the notice-and-takedown provisions of the DMCA. The legacy recording and movie industries want everyone else to act as copyright cops, and hate the idea that notice-and-takedown puts the initial burden on themselves as copyright holders.

However, Heald’s research looks at music on YouTube and concludes that the notice-and-takedown system has actually enabled much greater authorized availability of music, by reducing transaction costs. The idea is pretty straightforward. Without a notice-and-takedown provision, someone who wants to post music to YouTube needs to go out and seek a license. Of course, getting permission from all the various rightsholders is frequently impossible; the transaction costs of getting permission are simply way too high. Yet, with notice-and-takedown, the person can upload the content without permission, and then the copyright holder is given the option of what to do with it. On YouTube, that includes the option of monetizing it, thus “authorizing” the use. That creates a natural experiment for Heald to explore, in which he can see how much content is “authorized” thanks to such a setup. And the result, not surprisingly, is that this system has enabled much greater authorized (and monetized) access to music than an alternative, high transaction cost system, under which uploaders must first seek out permission to upload everything.

April 17, 2014

Online illegal drug sales persist because they’re safer than other channels

Filed under: Britain, Law, Liberty — Nicholas @ 07:34

At the Adam Smith Institute blog, Daniel Pryor discusses the reasons for “Silk Road” continuing despite police crackdowns:

Growing up in Essex has made me appreciate why purchasing illegal drugs online is a far more attractive option. I have experienced the catastrophic effects of drug prohibition first-hand, and it is part of the reason that the issue means a great deal to me. Friends and acquaintances have had terrible experiences due to contamination from unscrupulous dealers with little incentive to raise their drugs’ quality, and every reason to lace their products with harmful additives. The violence associated with buying and selling drugs in person has affected the lives of people close to me.

As a current university student, I now live in an environment populated by many people who use Silk Road regularly, and for a variety of purchases. From prescription-only ‘study drugs’ like modafinil to recreational marijuana and cocaine, fellow students’ experiences with drugs ordered from Silk Road have reinforced my beliefs in the benefits of legalisation. They have no need to worry about aggressive dealers and are more likely to receive safer drugs: meaning chances of an overdose and other health risks are substantially reduced.

Their motivations for using Silk Road rather than street dealers correlate with the Global Drug Survey’s findings. Over 60% of participants cited the quality of Silk Road’s drugs as being a reason for ordering, whilst a significant proportion also used the site as a way to avoid the potential violence of purchasing from the street. Given that payments are made in the highly volatile Bitcoin, it was also surprising to learn that lower prices were a motivation for more than a third of respondents.

April 16, 2014

QotD: The wizards of the web

Filed under: Business, Quotations, Technology — Nicholas @ 08:28

You would have thought this would have sunk in by now. The fact that it hasn’t shows what an extraordinary machine the internet is — quite different to any technology that has gone before it. When the Lovebug struck, few of us lived our lives online. Back then we banked in branches, shopped in shops, met friends and lovers in the pub and obtained jobs by posting CVs. Tweeting was for the birds. Cyberspace was marginal. Now, for billions, the online world is their lives. But there is a problem. Only a tiny, tiny percentage of the people who use the internet have even the faintest clue about how any of it works. “SSL”, for instance, stands for “Secure Sockets Layer”.

I looked it up and sort of understood it — for about five minutes. While most drivers have at least a notion of how an engine works (something about petrol exploding in cylinders and making pistons go up and down and so forth) the very language of the internet — “domain names” and “DNS codes”, endless “protocols” and so forth — is arcane, exclusive; it is, in fact, the language of magic. For all intents and purposes the internet is run by wizards.

And the trouble with letting wizards run things is that when things go wrong we are at their mercy. The world spends several tens of billions of pounds a year on anti-malware programs, which we are exhorted to buy lest the walls of our digital castles collapse around us. Making security software is a huge industry, and whenever there is a problem — either caused by viruses or by a glitch like Heartbleed — the internet security companies rush to be quoted in the media. And guess what, their message is never “keep calm and carry on”. As Professor Ross Anderson of Cambridge University says: “Almost all the cost of cybercrime is the cost of anticipation.”

Michael Hanlon, “Relax, Mumsnet users: don’t lose sleep over Heartbleed hysteria”, Telegraph, 2014-04-16

April 11, 2014

Open source software and the Heartbleed bug

Filed under: Technology — Nicholas @ 07:03

Some people are claiming that the Heartbleed bug proves that open source software is a failure. ESR quickly addresses that idiotic claim:

I actually chuckled when I read the rumor that the few anti-open-source advocates still standing were crowing about the Heartbleed bug, because I’ve seen this movie before after every serious security flap in an open-source tool. The script, which includes a bunch of people indignantly exclaiming that many-eyeballs is useless because bug X lurked in a dusty corner for Y months, is so predictable that I can anticipate a lot of the lines.

The mistake being made here is a classic example of Frédéric Bastiat’s “things seen versus things unseen”. Critics of Linus’s Law overweight the bug they can see and underweight the high probability that equivalently positioned closed-source security flaws they can’t see are actually far worse, just so far undiscovered.

That’s how it seems to go whenever we get a hint of the defect rate inside closed-source blobs, anyway. As a very pertinent example, in the last couple months I’ve learned some things about the security-defect density in proprietary firmware on residential and small business Internet routers that would absolutely curl your hair. It’s far, far worse than most people understand out there.

[…]

Ironically enough this will happen precisely because the open-source process is working … while, elsewhere, bugs that are far worse lurk in closed-source router firmware. Things seen vs. things unseen…

Returning to Heartbleed, one thing conspicuously missing from the downshouting against OpenSSL is any pointer to an implementation that is known to have a lower defect rate over time. This is for the very good reason that no such empirically-better implementation exists. What is the defect history on proprietary SSL/TLS blobs out there? We don’t know; the vendors aren’t saying. And we can’t even estimate the quality of their code, because we can’t audit it.

The response to the Heartbleed bug illustrates another huge advantage of open source: how rapidly we can push fixes. The repair for my Linux systems was a push-one-button fix less than two days after the bug hit the news. Proprietary-software customers will be lucky to see a fix within two months, and all too many of them will never see a patch at all.

Update: There are lots of sites offering tools to test whether a given site is vulnerable to the Heartbleed bug, but you need to step carefully there, as there’s a thin line between what’s legal in some countries and what counts as an illegal break-in attempt:

Websites and tools that have sprung up to check whether servers are vulnerable to OpenSSL’s mega-vulnerability Heartbleed have thrown up anomalies in computer crime law on both sides of the Atlantic.

Both the US Computer Fraud and Abuse Act and its UK equivalent, the Computer Misuse Act, make it an offence to test the security of third-party websites without permission.

Testing to see what version of OpenSSL a site is running, and whether it also supports the vulnerable Heartbeat protocol, would be legal. But doing anything more active — without permission from website owners — would take security researchers onto the wrong side of the law.
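To make the distinction concrete, here is a sketch of the passive end of that spectrum: reading the Server banner a site volunteers in its response headers, which some servers are configured to fill with their OpenSSL build. Anything that actively pokes the Heartbeat extension is where the legal risk begins. (example.com is a stand-in; point this only at sites you’re entitled to examine.)

```python
# Passive check only: request the response headers and read the
# Server banner, in which some sites advertise their OpenSSL build
# (e.g. "Apache/2.2.22 (Ubuntu) ... OpenSSL/1.0.1"). Sending a
# crafted heartbeat to confirm the bug is the active step that may
# be illegal without the site owner's permission.
import urllib.request

def server_banner(url):
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return resp.headers.get("Server")

print(server_banner("https://example.com"))
```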

And you shouldn’t just rush out and change all your passwords right now (you’ll probably need to do it, but the timing matters):

Heartbleed is a catastrophic bug in the widely used OpenSSL library that creates a means for attackers to lift passwords, crypto-keys and other sensitive data from the memory of secure server software, 64KB at a time. The mega-vulnerability was patched earlier this week, and software should be updated to use the new version, 1.0.1g. But to fully clean up the problem, admins of at-risk servers should generate new public-private key pairs, destroy their session cookies, and update their SSL certificates before telling users to change every potentially compromised password on the vulnerable systems.
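A related low-effort check while you patch: your own machine’s Python will report which OpenSSL build it is linked against, and anything from 1.0.1 through 1.0.1f falls in the vulnerable window.

```python
# Which OpenSSL is this Python linked against?
# 1.0.1 through 1.0.1f are vulnerable; 1.0.1g carries the fix.
import ssl

print(ssl.OPENSSL_VERSION)  # e.g. "OpenSSL 1.0.1g 7 Apr 2014"
```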

April 9, 2014

XKCD on the impact of “Heartbleed”

Filed under: Technology — Nicholas @ 11:00

Update: In case you’re not concerned about the seriousness of this issue, The Register‘s John Leyden would like you to think again.

The catastrophic crypto key and password vulnerability in OpenSSL affects far more than web servers: everything from routers to smartphones is also at risk.

The so-called “Heartbleed” vulnerability (CVE-2014-0160) can be exploited to extract information from servers running vulnerable versions of OpenSSL, and this includes email servers and Android smartphones as well as routers.

Hackers could potentially gain access to private encryption keys before using this information to decipher the encrypted traffic to and from vulnerable websites.

Web sites including Yahoo!, Flickr and OpenSSL were among the many left vulnerable to the megabug that exposed encryption keys, passwords and other sensitive information.

Preliminary tests suggested 47 of the 1000 largest sites were vulnerable to Heartbleed, and that’s only among the fewer than half that support SSL or HTTPS at all. Many of the affected sites – including Yahoo! – have since patched the vulnerability. Even so, security experts – such as Graham Cluley – remain concerned.

OpenSSL is a widely used encryption library that is a key component of technology that enables secure (https) website connections.

The bug exists in the OpenSSL 1.0.1 source code and stems from coding flaws in a fairly new feature known as the TLS Heartbeat Extension. “TLS heartbeats are used as ‘keep alive’ packets so that the ends of an encrypted connection can agree to keep the session open even when they don’t have any official data to exchange,” explains security veteran Paul Ducklin in a post on Sophos’ Naked Security blog.

The Heartbleed vulnerability in the OpenSSL cryptographic library might be exploited to reveal contents of secured communication exchanges. The same flaw might also be used to lift SSL keys.

This means that sites could still be vulnerable to attacks after installing the patches in cases where a private key has been stolen. Sites therefore need to revoke exposed keys, reissue new keys, and invalidate all session keys and session cookies.
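To see why a missing length check leaks memory 64KB at a time, here is a toy model of the over-read. This is a sketch only: the real bug lives in OpenSSL’s C heartbeat handler, and the bytearray below merely simulates the process memory sitting next to an incoming payload.

```python
# Toy model of the Heartbleed over-read (illustration, not OpenSSL).
# MEMORY simulates the process memory adjacent to a 4-byte payload.
MEMORY = bytearray(b"ping" + b" ...session=abc123; BEGIN PRIVATE KEY...")

def buggy_heartbeat(claimed_len):
    # The bug: claimed_len comes from the attacker and is never
    # compared against the 4 bytes actually received.
    return bytes(MEMORY[:claimed_len])

def fixed_heartbeat(claimed_len, real_len=4):
    # The fix: discard heartbeats whose claimed length exceeds the
    # payload that actually arrived.
    if claimed_len > real_len:
        return b""
    return bytes(MEMORY[:claimed_len])

print(buggy_heartbeat(64))  # echoes the "secrets" beyond the payload
print(fixed_heartbeat(64))  # returns nothing
```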

Bruce Schneier:

“Catastrophic” is the right word. On the scale of 1 to 10, this is an 11.

Half a million sites are vulnerable, including my own. Test your vulnerability here.

The bug has been patched. After you patch your systems, you have to get a new public/private key pair, update your SSL certificate, and then change every password that could potentially be affected.

At this point, the probability is close to one that every target has had its private keys extracted by multiple intelligence agencies. The real question is whether or not someone deliberately inserted this bug into OpenSSL, and has had two years of unfettered access to everything. My guess is accident, but I have no proof.

April 2, 2014

Dog adoptions – the economics are trickier than you think

Filed under: Economics — Nicholas @ 09:01

In the Harvard Business Review, Paul Oyer explains some of the changes in the market for adopting dogs over the last decade or so:

Lots of people are looking for a canine companion to brighten their lives, and there are always plenty of dogs “on the market” at shelters or through breeders. Yet, too many dogs don’t find homes, and they often pay the ultimate price (especially if they are in Sochi). So what stands in the way of dogs and owners finding one another?

For starters, the supply and demand at any given time in any given area is typically thin and random. Thankfully, online pet boards have thickened the market by enabling potential adopters, especially those who want to rescue a dog, to find a broader range of options rather than just settling for what the shelter happens to have the day they go there. Sites such as petfinder.com lead to many adoptions, many of which cross significant geographic territory.

[…]

A second problem — and this is much harder to solve than the thin-market problem — is there are a lot of duds on both sides of the dog adoption market, and it’s hard to tell exactly who they are. A breeder could describe a bad, Cujo-like dog as “good with children” while potential owners like Michael Vick’s former associates would surely claim they would give a dog a safe home.

Shelters address this issue by thoroughly screening would-be adopters (I have always found it ironic that they give you your baby to take home after it is born with no questions asked, but you have to jump through a lot of hoops to adopt a puppy or kitten that will otherwise be euthanized). But there is no evidence that these screenings are very effective.

People are less inclined to shop or bank online after NSA surveillance reports

Filed under: Business, Government, Technology — Nicholas @ 08:46

Among the side-effects of government surveillance revelations, ordinary people are deciding to be a bit less involved in online activities, according to a new Harris Poll:

Online banking and shopping in America are being negatively impacted by ongoing revelations about the National Security Agency’s digital surveillance activities. That is the clear implication of a recent ESET-commissioned Harris poll which asked more than 2,000 U.S. adults ages 18 and older whether or not, given the news about the NSA’s activities, they have changed their approach to online activity.

Almost half of respondents (47%) said that they have changed their online behavior and think more carefully about where they go, what they say, and what they do online.

When it comes to specific Internet activities, such as email or online banking, this change in behavior translates into a worrying trend for the online economy: over one quarter of respondents (26%) said that, based on what they have learned about secret government surveillance, they are now doing less banking online and less online shopping. This shift in behavior is not good news for companies that rely on sustained or increased use of the Internet for their business model.

[…]

Whether or not we have seen the full extent of the public’s reaction to state-sponsored mass surveillance is hard to predict, but based on this survey and the one we did last year, I would say that, if the NSA revelations continue – and I am sure they will – and if government reassurances fail to impress the public, then it is possible that the trends in behavior we are seeing right now will continue. For example, I do not see many people finding reassurance in President Obama’s recently announced plan to transfer the storage of millions of telephone records from the government to private phone companies. As we will document in our next installment of survey findings, data gathering by companies is even more of a privacy concern for some Americans than government surveillance.

And in case anyone is tempted to think that this is a narrow issue of concern only to news junkies and security geeks, let me be clear: according to this latest survey, 85% of adult Americans are now at least somewhat familiar with the news about secret government surveillance of private citizens’ phone calls, emails, online activity, and so on.
