Quotulatiousness

June 13, 2014

Supreme Court rules unanimously in favour of internet privacy

Filed under: Cancon, Law, Liberty, Technology — Nicholas @ 13:11

Some great news on the privacy front, this time a decision handed down by the Supreme Court of Canada, as reported by Michael Geist:

This morning another voice entered the discussion and completely changed the debate. The Supreme Court of Canada issued its long-awaited R. v. Spencer decision, which examined the legality of voluntary warrantless disclosure of basic subscriber information to law enforcement. In a unanimous decision written by (Harper appointee) Justice Thomas Cromwell, the court issued a strong endorsement of Internet privacy, emphasizing the privacy importance of subscriber information, the right to anonymity, and the need for police to obtain a warrant for subscriber information except in exigent circumstances or under a reasonable law.

I discuss the implications below, but first some of the key findings. First, the Court recognizes that there is a privacy interest in subscriber information. While the government has consistently sought to downplay that interest, the court finds that the information is much more than a simple name and address, particularly in the context of the Internet. As the court states:

    the Internet has exponentially increased both the quality and quantity of information that is stored about Internet users. Browsing logs, for example, may provide detailed information about users’ interests. Search engines may gather records of users’ search terms. Advertisers may track their users across networks of websites, gathering an overview of their interests and concerns. “Cookies” may be used to track consumer habits and may provide information about the options selected within a website, which web pages were visited before and after the visit to the host website and any other personal information provided. The user cannot fully control or even necessarily be aware of who may observe a pattern of online activity, but by remaining anonymous – by guarding the link between the information and the identity of the person to whom it relates – the user can in large measure be assured that the activity remains private.

Given all of this information, the privacy interest is about much more than just name and address.

Second, the court expands our understanding of informational privacy, concluding that there are three conceptually distinct issues: privacy as secrecy, privacy as control, and privacy as anonymity. It is anonymity that is particularly notable, as the court recognizes its importance within the context of Internet usage. Given the importance of the information and the ability to link anonymous Internet activities with an identifiable person, a high level of informational privacy is at stake.

Third, not only is there a significant privacy interest, but there is also a reasonable expectation of privacy by the user. The court examines both PIPEDA and the Shaw terms of use (the ISP in this case) and concludes that PIPEDA must surely be understood within the context of protecting privacy (not opening the door to greater disclosures) and that the ISP agreement was confusing at best and may support the expectation of privacy. With those findings in mind:

    in the totality of the circumstances of this case, there is a reasonable expectation of privacy in the subscriber information. The disclosure of this information will often amount to the identification of a user with intimate or sensitive activities being carried out online, usually on the understanding that these activities would be anonymous. A request by a police officer that an ISP voluntarily disclose such information amounts to a search.

Fourth, having concluded that obtaining subscriber information constituted a search engaging a reasonable expectation of privacy, the court finds that the information was unconstitutionally obtained and the search therefore unlawful. Addressing the impact of the PIPEDA voluntary disclosure clause, the court notes:

    Since in the circumstances of this case the police do not have the power to conduct a search for subscriber information in the absence of exigent circumstances or a reasonable law, I do not see how they could gain a new search power through the combination of a declaratory provision and a provision enacted to promote the protection of personal information.

Update, 7 July: A few weeks later, the US Supreme Court also made a strong pro-privacy ruling, this one mandating a warrant for police to search the contents of a cellphone.

Politico‘s Josh Gerstein has more on the ruling in Riley v. California:

The Supreme Court’s blunt and unequivocal decision Wednesday giving Americans strong protection against arrest-related searches of their cell phones could also give a boost to lawsuits challenging the National Security Agency’s vast collection of phone call data.

Chief Justice John Roberts’s 28-page paean to digital privacy was like music to the ears of critics of the NSA’s metadata program, which sweeps up details on billions of calls and searches them for possible links to terrorist plots.

“This is a remarkably strong affirmation of privacy rights in a digital age,” said Marc Rotenberg of the Electronic Privacy Information Center. “The court found that digital data is different and that has constitutional significance, particularly in the realm of [the] Fourth Amendment…I think it also signals the end of the NSA program.”

Roberts’s opinion is replete with rhetoric warning about the privacy implications of access to data in individuals’ smart phones, including call logs, Web search records and location information. Many of the arguments parallel, or are virtually identical to, the ones privacy advocates have made about the dangers inherent in the NSA’s call metadata program.

June 5, 2014

Living in a post-Snowden world, under the gaze of the Five Eyes

Filed under: Australia, Cancon, Government, Technology, USA — Nicholas @ 07:12

It’s been a year since the name Edward Snowden became known to the world, and it’s been a bumpy ride since then, as we found out that the tinfoil-hat-wearing anti-government conspiracy theorists were, if anything, underestimating the actual level of organized, secret government surveillance. At The Register, Duncan Campbell takes us inside the “FIVE-EYED VAMPIRE SQUID of the internet”, the five-way intelligence-sharing partnership of US/UK/Canada/Australia/New Zealand:

One year after The Guardian opened up the trove of top secret American and British documents leaked by former National Security Agency (NSA) sysadmin Edward J Snowden, the world of data security and personal information safety has been turned on its head.

Everything about the safety of the internet as a common communication medium has been shown to be broken. As with the banking disasters of 2008, the crisis and damage created — not by Snowden and his helpers, but by the unregulated and unrestrained conduct the leaked documents have exposed — will last for years if not decades.

Compounding the problem is the covert network of subornment and control that agencies and collaborators working with the NSA are now revealed to have created in communications and computer security organisations and companies around the globe.

The NSA’s explicit objective is to weaken the security of the entire physical fabric of the net. One of its declared goals is to “shape the worldwide commercial cryptography market to make it more tractable to advanced cryptanalytic capabilities being developed by the NSA”, according to top secret documents provided by Snowden.

Profiling the global machinations of merchant bank Goldman Sachs in Rolling Stone in 2009, journalist Matt Taibbi famously characterized them as operating “everywhere … a great vampire squid wrapped around the face of humanity, relentlessly jamming its blood funnel into anything that smells like money”.

The NSA, with its English-speaking “Five Eyes” partners (the relevant agencies of the UK, USA, Australia, New Zealand and Canada) and a hitherto unknown secret network of corporate and government partners, has been revealed to be a similar creature. The Snowden documents chart communications funnels, taps, probes, “collection systems” and malware “implants” everywhere, jammed into data networks and tapped into cables or onto satellites.

June 4, 2014

Sarcasm-detecting software wanted

Filed under: Media, Technology — Nicholas @ 09:02

Charles Stross discusses some of the second-order effects should the US Secret Service actually get the sarcasm-detection software they’re reportedly looking for:

… But then the Internet happened, and it just so happened to coincide with a flowering of highly politicized and canalized news media channels such that at any given time, whoever is POTUS, around 10% of the US population are convinced that they’re a baby-eating lizard-alien in a fleshsuit who is plotting to bring about the downfall of civilization, rather than a middle-aged male politician in a business suit.

Well now, here’s the thing: automating sarcasm detection is easy. It’s so easy they teach it in first year computer science courses; it’s an obvious application of AI. (You just get your Turing-test-passing AI that understands all the shared assumptions and social conventions that human-human conversation relies on to identify those statements that explicitly contradict beliefs that the conversationalist implicitly holds. So if I say “it’s easy to earn a living as a novelist” and the AI knows that most novelists don’t believe this and that I am a member of the set of all novelists, the AI can infer that I am being sarcastic. Or I’m an outlier. Or I’m trying to impress a date. Or I’m secretly plotting to assassinate the POTUS.)
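
Stross’s “easy” recipe (flag a statement when it contradicts a belief the speaker’s own group implicitly holds) can be caricatured in a few lines of Python. The belief table and group memberships below are invented for illustration; this is emphatically not a real NLP system:

```python
# Toy caricature of contradiction-based "sarcasm detection": a statement is
# flagged when it asserts something the speaker's own group is assumed to
# disbelieve. Beliefs and groups here are made up for illustration.

# Propositions each group is assumed to hold, mapped to the believed truth value.
GROUP_BELIEFS = {
    "novelists": {"earning a living as a novelist is easy": False},
}

def maybe_sarcastic(speaker_groups, proposition, asserted_value=True):
    """True if the speaker asserts a proposition their group rejects."""
    for group in speaker_groups:
        beliefs = GROUP_BELIEFS.get(group, {})
        if proposition in beliefs and beliefs[proposition] != asserted_value:
            return True  # contradicts an implicit belief: sarcasm, or an outlier
    return False

print(maybe_sarcastic(["novelists"], "earning a living as a novelist is easy"))
# -> True
```

Of course, the contradiction only narrows things down to “sarcastic, outlier, showing off, or plotting,” which is exactly the joke.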

Of course, we in the real world know that shaved apes like us never saw a system we didn’t want to game. So in the event that sarcasm detectors ever get a false positive rate of less than 99% (or a false negative rate of less than 1%) I predict that everybody will start deploying sarcasm as a standard conversational gambit on the internet.

Wait … I thought everyone already did?

Trolling the secret service will become a competitive sport, the goal being to not receive a visit from the SS in response to your totally serious threat to kill the resident of 1600 Pennsylvania Avenue. Al Qaida terrrrst training camps will hold tutorials on metonymy, aggressive irony, cynical detachment, and sarcasm as a camouflage tactic for suicide bombers. Post-modernist pranks will draw down the full might of law enforcement by mistake, while actual death threats go encoded as LOLCat macros. Any attempt to algorithmically detect sarcasm will fail because sarcasm is self-referential and the awareness that a sarcasm detector may be in use will change the intent behind the message.

As the very first commenter points out, a problem with this is that a substantial proportion of software developers (as indicated by their position on the Asperger/Autism spectrum) find it very difficult to detect sarcasm in real life…

Bruce Schneier on the human side of the Heartbleed vulnerability

Filed under: Technology — Nicholas @ 07:24

Reposting at his own site an article he did for The Mark News:

The announcement on April 7 was alarming. A new Internet vulnerability called Heartbleed could allow hackers to steal your logins and passwords. It affected a piece of security software that is used on half a million websites worldwide. Fixing it would be hard: It would strain our security infrastructure and the patience of users everywhere.

It was a software insecurity, but the problem was entirely human.

Software has vulnerabilities because it’s written by people, and people make mistakes — thousands of mistakes. This particular mistake was made in 2011 by a German graduate student who was one of the unpaid volunteers working on a piece of software called OpenSSL. The update was approved by a British consultant.

In retrospect, the mistake should have been obvious, and it’s amazing that no one caught it. But even though thousands of large companies around the world used this critical piece of software for free, no one took the time to review the code after its release.
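
For readers wondering why the mistake was so easy to make and so easy to miss, here is a toy model in Python. The real bug was in OpenSSL’s C heartbeat code (a copy that trusted a length field supplied by the peer); the buffer, function names, and “secret” below are invented to illustrate the shape of the flaw, not the actual code:

```python
# Toy model of the Heartbleed class of bug: the server echoes back
# "claimed_len" bytes of its buffer, trusting the length the client claims
# instead of the payload's actual size, so an oversized claim leaks
# whatever happens to sit next to the payload in memory.

def heartbeat_vulnerable(memory, payload_start, claimed_len):
    # BUG: no check that claimed_len matches the real payload length.
    return memory[payload_start:payload_start + claimed_len]

def heartbeat_fixed(memory, payload_start, claimed_len, actual_len):
    # The fix: discard heartbeats whose claimed length exceeds reality.
    if claimed_len > actual_len:
        return b""
    return memory[payload_start:payload_start + claimed_len]

# Pretend process memory: a 4-byte payload followed by unrelated secret data.
memory = b"PINGsecret-session-key"
print(heartbeat_vulnerable(memory, 0, 22))           # leaks the "secret"
print(heartbeat_fixed(memory, 0, 22, actual_len=4))  # returns b""
```

A one-line missing bounds check, in code thousands of companies relied on for free, is the “entirely human” problem Schneier is pointing at.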

The mistake was discovered around March 21, 2014, and was reported on April 1 by Neel Mehta of Google’s security team, who quickly realized how potentially devastating it was. Two days later, in an odd coincidence, researchers at a security company called Codenomicon independently discovered it.

When a researcher discovers a major vulnerability in a widely used piece of software, he generally discloses it responsibly. Why? As soon as a vulnerability becomes public, criminals will start using it to hack systems, steal identities, and generally create mayhem, so we have to work together to fix the vulnerability quickly after it’s announced.

May 27, 2014

Internet privacy advice for kids (who are not “Digital Natives”)

Filed under: Business, Media, Technology — Nicholas @ 13:15

Cory Doctorow sympathizes with young people who have literally grown up with the internet:

The problem with being a “digital native” is that it transforms all of your screw-ups into revealed deep truths about how humans are supposed to use the Internet. So if you make mistakes with your Internet privacy, not only do the companies who set the stage for those mistakes (and profited from them) get off scot-free, but everyone else who raises privacy concerns is dismissed out of hand. After all, if the “digital natives” supposedly don’t care about their privacy, then anyone who does is a laughable, dinosauric idiot, who isn’t Down With the Kids.

“Privacy” doesn’t mean that no one in the world knows about your business. It means that you get to choose who knows about your business.

It’s difficult to explain to people just how open their online “secrets” really are … and that’s not even covering the folks who are specifically targets of active surveillance … just being on Facebook or other social media sites hands over a lot of your personal details without your direct knowledge or (informed) consent. But you can start to take back some of your own privacy online:

If you start using computers when you’re a little kid, you’ll have a certain fluency with them that older people have to work harder to attain. As Douglas Adams wrote:

  1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.
  2. Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.
  3. Anything invented after you’re thirty-five is against the natural order of things.

If I was a kid today, I’d be all about the opsec — the operational security. I’d learn how to use tools that kept my business between me and the people I explicitly shared it with. I’d make it my habit, and get my friends into the habit too (after all, it doesn’t matter if all your email is encrypted if you send it to some dorkface who keeps it all on Google’s servers in unscrambled form where the NSA can snaffle it up).

Here’s some opsec links to get you started:

  • First of all, get a copy of Tails, AKA “The Amnesic Incognito Live System.” This is an operating system that you can use to boot up your computer so that you don’t have to trust the OS it came with to be free from viruses and keyloggers and spyware. It comes with a ton of secure communications tools, as well as everything you need to make the media you want to send out into the world.
  • Next, get a copy of The Tor Browser Bundle, a special version of Firefox that automatically sends your traffic through something called TOR (The Onion Router, not to be confused with Tor Books, who publish my novels). This lets you browse the Web with a much greater degree of privacy and anonymity than you would otherwise get.
  • Learn to use GPG, which is a great way to encrypt (scramble) your emails. There’s a Chrome plugin for using GPG with Gmail, and another version for Firefox.
  • If you like chatting, get OTR, AKA “Off the Record,” a very secure private chat tool that has exciting features like “perfect forward secrecy” (this being a cool way of saying, even if someone breaks this tomorrow, they won’t be able to read the chats they captured today).
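
“Perfect forward secrecy”, the OTR feature in the last bullet, is easier to trust once you see the trick: each session key is derived from ephemeral values that both sides throw away afterwards. A toy Python sketch with deliberately tiny, insecure Diffie-Hellman parameters (real protocols use 2048-bit groups or elliptic curves):

```python
# Toy illustration of perfect forward secrecy via ephemeral Diffie-Hellman:
# each conversation derives its key from fresh secrets that are discarded
# afterwards, so recording today's traffic doesn't unlock past sessions.
# P is far too small to be secure; it is for demonstration only.

import secrets

P = 4294967291  # a small prime (2**32 - 5); real DH uses much larger groups
G = 2           # public generator

def new_session_key():
    a = secrets.randbelow(P - 2) + 1   # Alice's ephemeral secret
    b = secrets.randbelow(P - 2) + 1   # Bob's ephemeral secret
    A, B = pow(G, a, P), pow(G, b, P)  # public values, safe to exchange
    shared_a, shared_b = pow(B, a, P), pow(A, b, P)
    assert shared_a == shared_b        # both sides derive the same key
    return shared_a                    # a and b are now discarded

print(new_session_key())  # a fresh, discardable session key
```

Because the ephemeral secrets never touch disk, there is nothing for a later hack or subpoena to recover; that is the “they won’t be able to read the chats they captured today” property Doctorow describes.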

Once you’ve mastered that stuff, start to think about your phone. Android phones are much, much easier to secure than Apple’s iPhones (Apple tries to lock their phones so you can’t install software except through their store, and because of a 1998 law called the DMCA, it’s illegal to make a tool to unlock them). There are lots of alternative operating systems for Android, of varying degrees of security. The best place to start is Cyanogenmod, which makes it much easier to use privacy tools with your mobile device.

May 19, 2014

Gillespie – Don’t let the FCC ruin the internet!

Filed under: Bureaucracy, Business, Government, Liberty, Media — Nicholas @ 10:51

Nick Gillespie thinks that the uproar about net neutrality may end up with the worst of all possible solutions by letting the FCC control the internet:

Reports of the imminent death of the Internet’s freewheeling ways and utopian possibilities are more wildly exaggerated and full of spam than those emails from Mrs. Mobotu Sese-Seko.

In fact, the real problem isn’t that the FCC hasn’t shown the cyber-cojones to regulate ISPs like an old-school telephone company or “common carrier,” but that it’s trying to increase its regulatory control of the Internet in the first place.

Under the proposal currently in play, the FCC assumes an increased ability to review ISP offerings on a “case-by-case basis” and kill any plan it doesn’t believe is “commercially reasonable.” Goodbye fast-moving innovation and adjustment to changing technology on the part of companies, hello regulatory morass and long, drawn-out bureaucratic hassles.

In 1998, the FCC told Congress that the Internet should properly be understood as an “information service,” which allows for a relatively low level of government interference, rather than as a “telecommunication service,” which could subject it to the sort of oversight that public utilities get (as my Reason colleague Peter Suderman explains, there’s every reason to keep that original classification). The Internet has flourished in the absence of major FCC regulation, and there’s no demonstrated reason to change that now. That’s exactly why the parade of horribles — non-favored video streams slowed to an unwatchable trickle! whole sites blocked! plucky new startups throttled in the crib! — trotted out by net neutrality proponents is hypothetical in a world without legally mandated net neutrality.

Apart from addressing a problem that doesn’t yet exist, if you are going to pin your hopes for free expression and constant innovation on a government agency, the FCC is about the last place to start. For God’s sake, we’re talking about the agency that spent the better part of a decade trying to figuratively cover up Janet Jackson’s tit by fining Viacom and CBS for airing the 2004 Super Bowl.

May 16, 2014

The built-in confusion about net neutrality

While I’ve been following the net neutrality debate, I remained unconvinced that either side had the answers. In a post from 2008, ESR helps to explain why I was confused:

Let it be clear from the outset that the telcos are putting their case for being allowed to do these things with breathtaking hypocrisy. They honk about how awful it is that regulation keeps them from setting their own terms, blithely ignoring the fact that their last-mile monopoly is entirely a creature of regulation. In effect, Theodore Vail and the old Bell System bribed the Feds to steal the last mile out from under the public’s nose between 1878 and 1920; the wireline telcos have been squatting on that unnatural monopoly ever since as if they actually had some legitimate property right to it.

But the telcos’ crimes aren’t merely historical. They have repeatedly bargained for the right to exclude competitors from their networks on the grounds that if the regulators would let them do that, they’d be able to generate enough capital to deploy broadband everywhere. That promise has been repeatedly, egregiously broken. Instead, they’ve creamed off that monopoly rent as profit or used it to cross-subsidize competition in businesses with higher rates of return. (Oh, and of course, to bribe legislators and buy regulators.)

Mistake #1 for libertarians to avoid is falling for the telcos’ “we’re pro-free market” bullshit. They’re anything but; what they really want is a politically sheltered monopoly in which they have captured the regulators and created business conditions that fetter everyone but them.

OK, so if the telcos are such villainous scum, the pro-network-neutrality activists must be the heroes of this story, right?

Unfortunately, no.

Your typical network-neutrality activist is a good-government left-liberal who is instinctively hostile to market-based approaches. These people think, rather, that if they can somehow come up with the right regulatory formula, they can jawbone the government into making the telcos play nice. They’re ideologically incapable of questioning the assumption that bandwidth is a scarce “public good” that has to be regulated. They don’t get it that complicated regulations favor the incumbent who can afford to darken the sky with lawyers, and they really don’t get it about outright regulatory capture, a game at which the telcos are past masters.

[…]

In short, the “network neutrality” crowd is mainly composed of well-meaning fools blinded by their own statism, and consequently serving mainly as useful idiots for the telcos’ program of ever-more labyrinthine and manipulable regulation. If I were a telco executive, I’d be on my knees every night thanking my god(s) for this “opposition”. Mistake #2 for any libertarian to avoid is backing these clowns.

In the comments, he summarizes “the history of the Bell System’s theft of the last mile”.

May 11, 2014

The NSA worked very hard to set themselves up for the Snowden leaks

Filed under: Government, Liberty, Technology — Nicholas @ 10:30

A few days back, Charles Stross pointed out one of the most ironic points of interest in the NSA scandal … they did it to themselves, over the course of several years’ effort:

I don’t need to tell you about the global surveillance disclosures of 2013 to the present — it’s no exaggeration to call them the biggest secret intelligence leak in history, a monumental gaffe (from the perspective of the espionage-industrial complex) and a security officer’s worst nightmare.

But it occurs to me that it’s worth pointing out that the NSA set themselves up for it by preventing the early internet specifications from including transport layer encryption.

At every step in the development of the public internet the NSA systematically lobbied for weaker security, to enhance their own information-gathering capabilities. The trouble is, the success of the internet protocols created a networking monoculture that the NSA themselves came to rely on for their internal infrastructure. The same security holes that the NSA relied on to gain access to your (or Osama bin Laden’s) email allowed gangsters to steal passwords and login credentials and credit card numbers. And ultimately these same baked-in security holes allowed Edward Snowden — who, let us remember, is merely one guy: a talented system administrator and programmer, but no Clark Kent — to rampage through their internal information systems.

The moral of the story is clear: be very cautious about poisoning the banquet you serve your guests, lest you end up accidentally ingesting it yourself.

May 6, 2014

Reset the Net on June 5th

Filed under: Liberty, Media, Technology — Nicholas @ 09:58

At Wired, Kim Zetter talks about an initiative to reclaim (some measure of) privacy on the internet:

A coalition of nearly two-dozen tech companies and civil liberties groups is launching a new fight against mass internet surveillance, hoping to battle the NSA in much the same way online campaigners pushed back on bad piracy legislation in 2012.

The new coalition, organized by Fight for the Future, is planning a Reset the Net day of action on June 5, the anniversary of the date the first Edward Snowden story broke detailing the government’s PRISM program, based on documents leaked by the former NSA contractor.

“Government spies have a weakness: they can hack anybody, but they can’t hack everybody,” the organizers behind the Reset the Net movement say in their video (above). “Folks like the NSA depend on collecting insecure data from tapped fiber. They depend on our mistakes, mistakes we can fix.”

To that end, the groups are calling on developers to add at least one NSA-resistant feature to mobile apps, and on websites to add security features like SSL (Secure Sockets Layer), HSTS (HTTP Strict Transport Security), and Perfect Forward Secrecy to better secure the communication of users and thwart government man-in-the-middle attacks.
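
Of the three measures, HSTS is the simplest to picture: it is a single response header (“Strict-Transport-Security”) telling browsers to refuse plain-HTTP connections to the site for a stated number of seconds. A minimal Python sketch of parsing the header’s two common directives (a hand-rolled illustration, not a library):

```python
# Minimal parser for an HSTS header value such as
# "max-age=31536000; includeSubDomains": max-age is how long (in seconds)
# the browser must insist on HTTPS; includeSubDomains extends the policy
# to every subdomain of the site.

def parse_hsts(header_value):
    policy = {"max_age": None, "include_subdomains": False}
    for directive in header_value.split(";"):
        directive = directive.strip()
        if directive.lower().startswith("max-age="):
            policy["max_age"] = int(directive.split("=", 1)[1].strip('"'))
        elif directive.lower() == "includesubdomains":
            policy["include_subdomains"] = True
    return policy

print(parse_hsts("max-age=31536000; includeSubDomains"))
# -> {'max_age': 31536000, 'include_subdomains': True}
```

Once a browser has seen that header over a valid HTTPS connection, a man in the middle can no longer quietly downgrade the user to plain HTTP for the next year, which is precisely the attack class the coalition is targeting.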

May 3, 2014

I am not a number!

Filed under: Business, Cancon, Humour — Nicholas @ 07:58

We had a call from Rogers (our ISP/cable provider) last night to discuss our current internet plan (we’ve been bumping up against our data cap lately, even though we increased it from 60GB to 80GB only a few months ago). I pointed out that my son’s internet bill while he was away at university came to about the same as our bill with Rogers, but that his data cap was 250GB. I asked if Rogers could come close to offering me that in Brooklin, since Cogeco is clearly able to turn a profit while offering folks in Peterborough a much higher data cap.

Rogers couldn’t quite match the offer, but for a slightly higher monthly bill we’ll now have a 270GB cap and higher (nominal) upload/download speeds. After this, I got an email that showed I’m not just a number to Rogers … I’m {$/process_data/xmlData/CRCFormatRequest/CustomerInfo/FullName$} instead:

Rogers internet service quote glitch

May 1, 2014

Rethinking Canadian broadcast regulation

Filed under: Bureaucracy, Business, Cancon, Media — Nicholas @ 07:27

On Google+, Michael Geist posted a few thoughts on hitting the reset button in Canadian broadcast regulation:

The Broadcasting Act is a complex statute that lists more than twenty broadcasting policy goals. Yet for decades, Canadian policy has largely boiled down to a single objective: Maximizing the benefits from the broadcasting system for creators, broadcasters, and broadcast distributors such as cable and satellite companies.

Consumers were nowhere to be found in that objective and it showed. Creators benefited from Canadian content requirements and financial contributions that guaranteed the creation of Canadian broadcast content. Broadcasters flourished in a market that permitted simultaneous substitution (thereby enabling big profits from licensing U.S. content) and that kept U.S. giants such as HBO, ESPN, and MTV out of the market for years in favour of Canadian alternatives. Cable and satellite companies became dominant media companies by requiring consumers to purchase large packages filled with channels they did not want in order to access the few they did.

As I mentioned in a conversation last night, the Canadian market for broadcast, telecommunications, and internet providers has been carefully managed by the government to minimize the whole messy “competition” thing and ensure quasi-monopoly conditions in various regions across the country. The regulators prefer a small number of players in the market: it makes it easier to do the “regulation” thing when you can fit all the regulated players around a small table, and it also provides post-civil service career opportunities for former regulators. Having a larger number of competing organizations makes the regulation game much more difficult and reduces the revolving door opportunities for former regulators.

April 30, 2014

What if real life had lag like online games do?

Filed under: Humour, Technology — Nicholas @ 00:01

You wouldn’t accept lag offline, so why accept it online? ume.net, a fiber broadband provider offering up to 1000 Mbit/s, ran an experiment: four volunteers got to experience the internet’s biggest disturbance in real life – lag.

H/T to Jeff Sher for the link.

April 25, 2014

Is it science or “science”? A cheat sheet

Filed under: Media, Science — Nicholas @ 08:32

At Lifehacker, Alan Henry links to this useful infographic:

Click to see full-size image at Compound Interest

Science is amazing, but science reporting can be confusing at times and misleading at worst. The folks at Compound Interest put together this reference graphic that will help you pick out good articles from bad ones, and help you qualify the impact of the study you’re reading.

One of the best and worst things about having a scientific background is being able to see when a science story is poorly reported, or a preliminary study published as if it were settled. One of the worst things about writing about science is worrying you’ll fall into the same trap. It’s a constant struggle, because there are interesting takeaways even from preliminary studies and small sample sizes, but it’s important to qualify them as such so you don’t misrepresent the research. With this guide, you’ll be able to see when a study’s results are interesting food for thought that’s still developing, versus a relatively solid position that has consensus behind it.

April 23, 2014

LibreSSL website – “This page scientifically designed to annoy web hipsters”

Filed under: Technology — Nicholas @ 09:24

Julian Sanchez linked to this Ars Technica piece on a new fork of OpenSSL:

OpenBSD founder Theo de Raadt has created a fork of OpenSSL, the widely used open source cryptographic software library that contained the notorious Heartbleed security vulnerability.

OpenSSL has suffered from a lack of funding and code contributions despite being used in websites and products by many of the world’s biggest and richest corporations.

The decision to fork OpenSSL is bound to be controversial given that OpenSSL powers hundreds of thousands of Web servers. When asked why he wanted to start over instead of helping to make OpenSSL better, de Raadt said the existing code is too much of a mess.

“Our group removed half of the OpenSSL source tree in a week. It was discarded leftovers,” de Raadt told Ars in an e-mail. “The Open Source model depends [on] people being able to read the code. It depends on clarity. That is not a clear code base, because their community does not appear to care about clarity. Obviously, when such cruft builds up, there is a cultural gap. I did not make this decision… in our larger development group, it made itself.”

The LibreSSL code base is on OpenBSD.org, and the project is supported financially by the OpenBSD Foundation and OpenBSD Project. LibreSSL has a bare bones website that is intentionally unappealing.

“This page scientifically designed to annoy web hipsters,” the site says. “Donate now to stop the Comic Sans and Blink Tags.” In explaining the decision to fork, the site links to a YouTube video of a cover of the Twisted Sister song “We’re Not Gonna Take It”.

April 19, 2014

Transaction costs, takedown notices, and the DMCA

Filed under: Economics, Law, Media — Nicholas @ 09:59

Mike Masnick reports on an inadvertent natural experiment that just came to light:

We’ve written a few times in the past about research done by Paul Heald on copyright and its impact on the availability of certain content. He’s recently published an interesting new study on how the DMCA’s notice-and-takedown regime facilitates making content available by decreasing transaction costs among parties. As we’ve discussed at length, the entertainment industry’s main focus in the next round of copyright reform is to wipe out the notice-and-takedown provisions of the DMCA. The legacy recording and movie industries want everyone else to act as copyright cops, and hate the idea that notice-and-takedown puts the initial burden on themselves as copyright holders.

However, Heald’s research looks at music on YouTube and concludes that the notice-and-takedown system has actually enabled much greater authorized availability of music by reducing transaction costs. The idea is pretty straightforward. Without a notice-and-takedown provision, someone who wants to post music to YouTube needs to go out and seek a license. Getting permission from all the various rightsholders is frequently impossible, and the transaction costs of even trying are far too high. Yet, with notice-and-takedown, the person can upload the content without permission, and the copyright holder is then given the option of what to do with it. On YouTube, that includes the option of monetizing it, thus “authorizing” the use. That creates a natural experiment for Heald to explore, in which he can see how much content is “authorized” thanks to such a setup. And the result, not surprisingly, is that this system has enabled much greater authorized (and monetized) access to music than an alternative, high-transaction-cost system under which uploaders must first seek out permission for everything.
