Quotulatiousness

May 8, 2015

Quantum Insert

Filed under: Britain, Technology, USA — Nicholas @ 02:00

Kim Zetter talks about some of the NSA’s more sneaky ways of intercepting communications:

Among all of the NSA hacking operations exposed by whistleblower Edward Snowden over the last two years, one in particular has stood out for its sophistication and stealthiness. Known as Quantum Insert, the man-on-the-side hacking technique has been used to great effect since 2005 by the NSA and its partner spy agency, Britain’s GCHQ, to hack into high-value, hard-to-reach systems and implant malware.

Quantum Insert is useful for getting at machines that can’t be reached through phishing attacks. It works by hijacking a browser as it’s trying to access web pages and forcing it to visit a malicious web page, rather than the page the target intended to visit. The attackers can then surreptitiously download malware onto the target’s machine from the rogue web page.

Quantum Insert has been used to hack the machines of terrorist suspects in the Middle East, but it was also used in a controversial GCHQ/NSA operation against employees of the Belgian telecom Belgacom and against workers at OPEC, the Organization of Petroleum Exporting Countries. The “highly successful” technique allowed the NSA to place 300 malicious implants on computers around the world in 2010, according to the spy agency’s own internal documents — all while remaining undetected.

But now security researchers with Fox-IT in the Netherlands, who helped investigate that hack against Belgacom, have found a way to detect Quantum Insert attacks using common intrusion detection tools such as Snort, Bro and Suricata.
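
The detection trick Fox-IT described boils down to spotting the race itself: the victim briefly sees two answers claiming the same position in the TCP stream, one from the real server and one from the attacker’s “shooter”. A minimal sketch of that idea in Python (the packet-tuple format is hypothetical, standing in for what an IDS like Snort or Bro reconstructs from the wire):

```python
import hashlib

def detect_quantum_insert(packets):
    """Flag TCP segments that reuse a sequence number with different content.

    `packets` is an iterable of (stream_id, seq, payload) tuples.
    A benign retransmission repeats the same bytes at the same sequence
    number; a man-on-the-side injection arrives at the same sequence
    number with *different* bytes, because the attacker raced the
    legitimate server.
    """
    seen = {}      # (stream_id, seq) -> digest of the first payload observed
    alerts = []
    for stream_id, seq, payload in packets:
        digest = hashlib.sha256(payload).hexdigest()
        key = (stream_id, seq)
        if key in seen and seen[key] != digest:
            alerts.append(key)        # same position, different content: suspicious
        seen.setdefault(key, digest)  # keep the first-seen digest for this position
    return alerts
```

A retransmission of identical bytes produces no alert; only a content mismatch at the same stream offset does, which is the signature the Fox-IT researchers looked for.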

February 15, 2015

The term “carjacking” may take on a new meaning

Filed under: Law, Technology — Nicholas @ 05:00

Earlier this month, The Register‘s Iain Thomson summarized the rather disturbing report released by Senator Ed Markey (D-MA) on the self-reported security (or lack thereof) in modern automobile internal networks:

In short, as we’ve long suspected, the computers in today’s cars can be hijacked wirelessly by feeding specially crafted packets of data into their networks. There’s often no need for physical contact; no leaving of evidence lying around after getting your hands dirty.

This means, depending on the circumstances, the software running in your dashboard can be forced to unlock doors, or become infected with malware, and records of where you’ve been and how fast you were going may be obtained. The lack of encryption in various models means sniffed packets may be readable.

Key systems to start up engines, the electronics connecting up vital things like the steering wheel and brakes, and stuff on the CAN bus, tend to be isolated and secure, we’re told.

The ability for miscreants to access internal systems wirelessly, cause mischief to infotainment and navigation gear, and invade one’s privacy, is irritating, though.

“Drivers have come to rely on these new technologies, but unfortunately the automakers haven’t done their part to protect us from cyber-attacks or privacy invasions,” said Markey, a member of the Senate’s Commerce, Science and Transportation Committee.

“Even as we are more connected than ever in our cars and trucks, our technology systems and data security remain largely unprotected. We need to work with the industry and cyber-security experts to establish clear rules of the road to ensure the safety and privacy of 21st-century American drivers.”

Of the 17 car makers who replied [PDF] to Markey’s letters (Tesla, Aston Martin, and Lamborghini didn’t), all made extensive use of computing in their 2014 models, with some carrying 50 electronic control units (ECUs) running on a series of internal networks.

BMW, Chrysler, Ford, General Motors, Honda, Hyundai, Jaguar Land Rover, Mazda, Mercedes-Benz, Mitsubishi, Nissan, Porsche, Subaru, Toyota, Volkswagen (with Audi), and Volvo responded to the study. According to the senator’s six-page dossier:

  • Over 90 per cent of vehicles manufactured in 2014 had a wireless network of some kind — such as Bluetooth to link smartphones to the dashboard or a proprietary standard for technicians to pull out diagnostics.
  • Only six automakers have any kind of security software running in their cars — such as firewalls for blocking connections from untrusted devices, or encryption for protecting data in transit around the vehicle.
  • Just five secured wireless access points with passwords, encryption or proximity sensors that (in theory) only allow hardware detected within the car to join a given network.
  • And only models made by two companies can alert the manufacturers in real time if a malicious software attack is attempted — the others wait until a technician checks at the next servicing.

There wasn’t much detail on the security of over-the-air updates for firmware, nor the use of crypto to protect personal data being phoned home from vehicles to an automaker’s HQ.
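
To see why unencrypted in-car traffic is so exposed, consider how little work it takes to read a raw frame off a Linux SocketCAN interface. A sketch of decoding the classic `can_frame` layout (the 0x244 arbitration ID and payload bytes here are made up; real IDs are model-specific):

```python
import struct

# Linux SocketCAN can_frame: u32 can_id, u8 dlc, 3 pad bytes, 8 data bytes
CAN_FRAME = struct.Struct("<IB3x8s")

def parse_can_frame(raw):
    """Decode one raw 16-byte SocketCAN frame into (arbitration_id, data).

    Nothing is decrypted here because nothing on a typical CAN bus is
    encrypted: anyone who can read the wire, or a wireless bridge onto
    it, sees every frame in the clear.
    """
    can_id, dlc, data = CAN_FRAME.unpack(raw)
    return can_id & 0x1FFFFFFF, data[:dlc]

# A frame "captured" off the bus is immediately readable:
raw = CAN_FRAME.pack(0x244, 3, bytes([0x12, 0x34, 0x56]) + b"\x00" * 5)
```

The point of the sketch is the absence of any key material: with no link-layer encryption, sniffing is parsing.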

January 14, 2015

British PM’s latest technological brain fart

Filed under: Britain, Law, Liberty, Technology — Nicholas @ 07:43

Cory Doctorow explains why David Cameron’s proposals are not just dumb, but doubleplus-dumb:

What David Cameron thinks he’s saying is, “We will command all the software creators we can reach to introduce back-doors into their tools for us.” There are enormous problems with this: there’s no back door that only lets good guys go through it. If your Whatsapp or Google Hangouts has a deliberately introduced flaw in it, then foreign spies, crooked police (like those who fed sensitive information to the tabloids who were implicated in the hacking scandal — and like the high-level police who secretly worked for organised crime for years), and criminals will eventually discover this vulnerability. They — and not just the security services — will be able to use it to intercept all of our communications. That includes everything from the pictures of your kids in your bath that you send to your parents to the trade secrets you send to your co-workers.

But this is just for starters. David Cameron doesn’t understand technology very well, so he doesn’t actually know what he’s asking for.

For David Cameron’s proposal to work, he will need to stop Britons from installing software that comes from software creators who are out of his jurisdiction. The very best in secure communications are already free/open source projects, maintained by thousands of independent programmers around the world. They are widely available, and thanks to things like cryptographic signing, it is possible to download these packages from any server in the world (not just big ones like Github) and verify, with a very high degree of confidence, that the software you’ve downloaded hasn’t been tampered with.
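
The verify-from-any-mirror property Doctorow mentions rests on simple mechanics: a digest, usually wrapped in a signed checksum file, is published out-of-band, so the bytes can come from any server. A sketch of the hash-checking half (real release verification layers a signature, e.g. a GPG-signed checksum file, on top of this):

```python
import hashlib
import hmac

def matches_published_digest(file_bytes, published_sha256_hex):
    """Check a downloaded artifact against a digest published out-of-band.

    Tampering with the bytes changes the digest, so it does not matter
    which mirror served them; trust lives in the published digest, not
    in the download channel.
    """
    actual = hashlib.sha256(file_bytes).hexdigest()
    # constant-time comparison, good hygiene even for public digests
    return hmac.compare_digest(actual, published_sha256_hex)
```

Because the check is independent of the server, blocking “big ones like Github” does nothing: the same verifiable bytes can be fetched from anywhere.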

[…]

This, then, is what David Cameron is proposing:

* All Britons’ communications must be easy for criminals, voyeurs and foreign spies to intercept

* Any firms within reach of the UK government must be banned from producing secure software

* All major code repositories, such as Github and Sourceforge, must be blocked

* Search engines must not answer queries about web-pages that carry secure software

* Virtually all academic security work in the UK must cease — security research must only take place in proprietary research environments where there is no onus to publish one’s findings, such as industry R&D and the security services

* All packets in and out of the country, and within the country, must be subject to Chinese-style deep-packet inspection and any packets that appear to originate from secure software must be dropped

* Existing walled gardens (like iOS and games consoles) must be ordered to ban their users from installing secure software

* Anyone visiting the country from abroad must have their smartphones held at the border until they leave

* Proprietary operating system vendors (Microsoft and Apple) must be ordered to redesign their operating systems as walled gardens that only allow users to run software from an app store, which will not sell or give secure software to Britons

* Free/open source operating systems — that power the energy, banking, ecommerce, and infrastructure sectors — must be banned outright

David Cameron will say that he doesn’t want to do any of this. He’ll say that he can implement weaker versions of it — say, only blocking some “notorious” sites that carry secure software. But anything less than the programme above will have no material effect on the ability of criminals to carry on perfectly secret conversations that “we cannot read”. If any commodity PC or jailbroken phone can run any of the world’s most popular communications applications, then “bad guys” will just use them. Jailbreaking an OS isn’t hard. Downloading an app isn’t hard. Stopping people from running code they want to run is — and what’s more, it puts the whole nation — individuals and industry — in terrible jeopardy.

September 20, 2014

Can you trust Apple’s new commitment to your privacy?

Filed under: Business, Technology — Nicholas @ 12:32

David Akin posted a list of questions posed by John Gilmore, challenging the Apple iOS8 cryptography promises:

Gilmore considered what Apple said, how Apple creates its software — a closed, secret, proprietary method — and what coders like him know about the code that Apple says protects our privacy — pretty much nothing — and then wrote the following for distribution on Dave Farber‘s Interesting People listserv. I’m pretty sure neither Farber nor Gilmore will begrudge me reproducing it.

    And why do we believe [Apple]?

    • Because we can read the source code and the protocol descriptions ourselves, and determine just how secure they are?
    • Because they’re a big company and big companies never lie?
    • Because they’ve implemented it in proprietary binary software, and proprietary crypto is always stronger than the company claims it to be?
    • Because they can’t covertly send your device updated software that would change all these promises, for a targeted individual, or on a mass basis?
    • Because you will never agree to upgrade the software on your device, ever, no matter how often they send you updates?
    • Because this first release of their encryption software has no security bugs, so you will never need to upgrade it to retain your privacy?
    • Because if a future update INSERTS privacy or security bugs, we will surely be able to distinguish these updates from future updates that FIX privacy or security bugs?
    • Because if they change their mind and decide to lessen our privacy for their convenience, or by secret government edict, they will be sure to let us know?
    • Because they have worked hard for years to prevent you from upgrading the software that runs on their devices so that YOU can choose it and control it instead of them?
    • Because the US export control bureaucracy would never try to stop Apple from selling secure mass market proprietary encryption products across the border?
    • Because the countries that wouldn’t let Blackberry sell phones that communicate securely with your own corporate servers, will of course let Apple sell whatever high security non-tappable devices it wants to?
    • Because we’re Apple fanboys and the company can do no wrong?
    • Because they want to help the terrorists win?
    • Because NSA made them mad once, therefore they are on the side of the public against NSA?
    • Because it’s always better to wiretap people after you convince them that they are perfectly secure, so they’ll spill all their best secrets?

    There must be some other reason, I’m just having trouble thinking of it.

September 19, 2014

Doctorow – “The time has come to create privacy tools for normal people”

Filed under: Liberty, Technology — Nicholas @ 00:03

In the Guardian, Cory Doctorow says that we need privacy-enhancing technical tools that can be easily used by everyone, not just the highly technical (or highly paranoid) among us:

You don’t need to be a technical expert to understand privacy risks anymore, from the Snowden revelations to the daily parade of internet security horrors around the world – like Syrian and Egyptian checkpoints where your Facebook logins are required in order to weigh your political allegiances (sometimes with fatal consequences), or celebrities having their most intimate photos splashed all over the web.

The time has come to create privacy tools for normal people – people with a normal level of technical competence. That is, all of us, no matter what our level of technical expertise, need privacy. Some privacy measures do require extraordinary technical competence; if you’re Edward Snowden, with the entire NSA bearing down on your communications, you will need to be a real expert to keep your information secure. But the kind of privacy that makes you immune to mass surveillance and attacks-of-opportunity from voyeurs, identity thieves and other bad guys is attainable by anyone.

I’m a volunteer on the advisory board for a nonprofit that’s aiming to do just that: Simply Secure (which launches Thursday at simplysecure.org) collects together some very bright usability and cryptography experts with the aim of revamping the user interface of the internet’s favorite privacy tools, starting with OTR, the extremely secure chat system whose best-known feature is “perfect forward secrecy” which gives each conversation its own unique keys, so a breach of one conversation’s keys can’t be used to snoop on others.
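
The “unique keys per conversation” part of forward secrecy is ordinary ephemeral Diffie-Hellman: both sides generate fresh key pairs, derive a shared session key, and throw the private values away afterwards, so a later key compromise unlocks nothing recorded earlier. A toy sketch of that core idea (the parameters are far too small for real use, and this is not OTR’s actual protocol):

```python
import hashlib
import secrets

P = 2**127 - 1   # a Mersenne prime: toy parameters, nowhere near real-world size
G = 3

def ephemeral_keypair():
    """Fresh key pair per conversation; the private half is never reused."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def session_key(my_priv, their_pub):
    """Both sides compute the same shared secret, then hash it into a key."""
    shared = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(shared).encode()).digest()
```

Because each conversation runs this exchange from scratch, breaching one session key tells an eavesdropper nothing about any other session — the property the excerpt calls perfect forward secrecy.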

More importantly, Simply Secure’s process for attaining, testing and refining usability is the main product of its work. This process will be documented and published as a set of best practices for other organisations, whether they are for-profits or non-profits, creating a framework that anyone can use to make secure products easier for everyone.

July 10, 2014

Throwing a bit of light on security in the “internet of things”

Filed under: Technology — Nicholas @ 07:36

The “internet of things” is coming: more and more of your surroundings are going to be connected in a vastly expanded internet. A lot of attention needs to be paid to security in this new world, as Dan Goodin explains:

In the latest cautionary tale involving the so-called Internet of things, white-hat hackers have devised an attack against network-connected lightbulbs that exposes Wi-Fi passwords to anyone in proximity to one of the LED devices.

The attack works against LIFX smart lightbulbs, which can be turned on and off and adjusted using iOS- and Android-based devices. Ars Senior Reviews Editor Lee Hutchinson gave a good overview here of the Philips Hue lights, which are programmable, controllable LED-powered bulbs that compete with LIFX. The bulbs are part of a growing trend in which manufacturers add computing and networking capabilities to appliances so people can manipulate them remotely using smartphones, computers, and other network-connected devices. A 2012 Kickstarter campaign raised more than $1.3 million for LIFX, more than 13 times the original goal of $100,000.

According to a blog post published over the weekend, LIFX has updated the firmware used to control the bulbs after researchers discovered a weakness that allowed hackers within about 30 meters to obtain the passwords used to secure the connected Wi-Fi network. The credentials are passed from one networked bulb to another over a mesh network powered by 6LoWPAN, a wireless specification built on top of the IEEE 802.15.4 standard. While the bulbs used the Advanced Encryption Standard (AES) to encrypt the passwords, the underlying pre-shared key never changed, making it easy for the attacker to decipher the payload.
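
The lesson generalizes beyond lightbulbs: encryption under a key that is identical in every device and never rotated protects against nobody who can extract that key once. An illustrative sketch (a simplified hash-based stream cipher standing in for the bulbs’ AES, with a hypothetical hard-coded key):

```python
import hashlib

# Hypothetical constant: the same key baked into every unit's firmware
PRESHARED_KEY = b"same-key-burned-into-every-bulb"

def keystream(key, nonce, length):
    """Expand key + nonce into a keystream (stand-in for a real cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key, nonce, plaintext):
    ks = keystream(key, nonce, len(plaintext))
    return bytes(a ^ b for a, b in zip(plaintext, ks))

decrypt = encrypt  # XOR stream cipher: the same operation both ways
```

An attacker who pulls `PRESHARED_KEY` out of one bulb’s firmware can decrypt every credential exchange on every network of bulbs ever sold, which is essentially the weakness the researchers exploited against the mesh traffic.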

May 19, 2014

“Parallel construction” and Godwinizing the NSA

Filed under: Government, History, USA, WW2 — Nicholas @ 09:33

At Popehat, Clark uses an excerpt from a Bruce Schneier post to make a larger point. Here’s what Bruce wrote last year:

This dynamic was vitally important during World War II. During the war, the British were able to break the German Enigma encryption machine and eavesdrop on German military communications. But while the Allies knew a lot, they would only act on information they learned when there was another plausible way they could have learned it. They even occasionally manufactured plausible explanations. It was just too risky to tip the Germans off that their encryption machines’ code had been broken.

And this is Clark’s take:

We know that the NSA collects all sorts of information on American citizens. We know that the FBI and the CIA have full access to this information. We know that the DEA also has full access to that data. And we know that when the DEA busts someone using information gleaned by the electronic panopticon of our internal spy organization, they take pains to hide the source of the information via the subterfuge of parallel construction.

The insight is this: our government is now dealing with the citizenry the same way that the British dealt with the Nazis: treating them as an external existential threat, spying on them, and taking pains to obfuscate the source of the information that they use to target their attacks.

Yeah, Godwin’s law, whatever, whatever. My point is NOT that the NSA is the same as the Nazi party (in fact, my argument has the NSA on the opposite side). My point is that the government now treats ordinary civilians as worthy of the same sort of tactics that they once used against the Nazis.

H/T to Bernard King for the link.

April 23, 2014

LibreSSL website – “This page scientifically designed to annoy web hipsters”

Filed under: Technology — Nicholas @ 09:24

Julian Sanchez linked to this Ars Technica piece on a new fork of OpenSSL:

OpenBSD founder Theo de Raadt has created a fork of OpenSSL, the widely used open source cryptographic software library that contained the notorious Heartbleed security vulnerability.

OpenSSL has suffered from a lack of funding and code contributions despite being used in websites and products by many of the world’s biggest and richest corporations.

The decision to fork OpenSSL is bound to be controversial given that OpenSSL powers hundreds of thousands of Web servers. When asked why he wanted to start over instead of helping to make OpenSSL better, de Raadt said the existing code is too much of a mess.

“Our group removed half of the OpenSSL source tree in a week. It was discarded leftovers,” de Raadt told Ars in an e-mail. “The Open Source model depends [on] people being able to read the code. It depends on clarity. That is not a clear code base, because their community does not appear to care about clarity. Obviously, when such cruft builds up, there is a cultural gap. I did not make this decision… in our larger development group, it made itself.”

The LibreSSL code base is on OpenBSD.org, and the project is supported financially by the OpenBSD Foundation and OpenBSD Project. LibreSSL has a bare bones website that is intentionally unappealing.

“This page scientifically designed to annoy web hipsters,” the site says. “Donate now to stop the Comic Sans and Blink Tags.” In explaining the decision to fork, the site links to a YouTube video of a cover of the Twisted Sister song “We’re not gonna take it.”

April 9, 2014

XKCD on the impact of “Heartbleed”

Filed under: Technology — Nicholas @ 11:00

Update: In case you’re not concerned about the seriousness of this issue, The Register‘s John Leyden would like you to think again.

The catastrophic crypto key password vulnerability in OpenSSL affects far more than web servers, with everything from routers to smartphones also affected.

The so-called “Heartbleed” vulnerability (CVE-2014-0160) can be exploited to extract information from servers running a vulnerable version of OpenSSL, and this includes email servers and Android smartphones as well as routers.

Hackers could potentially gain access to private encryption keys before using this information to decipher the encrypted traffic to and from vulnerable websites.

Web sites including Yahoo!, Flickr and OpenSSL were among the many left vulnerable to the megabug that exposed encryption keys, passwords and other sensitive information.

Preliminary tests suggested 47 of the 1,000 largest sites were vulnerable to Heartbleed, and that’s only among the fewer than half of those sites that support SSL or HTTPS at all. Many of the affected sites – including Yahoo! – have since patched the vulnerability. Even so, security experts – such as Graham Cluley – remain concerned.

OpenSSL is a widely used encryption library that is a key component of technology that enables secure (https) website connections.

The bug exists in the OpenSSL 1.0.1 source code and stems from coding flaws in a fairly new feature known as the TLS Heartbeat Extension. “TLS heartbeats are used as ‘keep alive’ packets so that the ends of an encrypted connection can agree to keep the session open even when they don’t have any official data to exchange,” explains security veteran Paul Ducklin in a post on Sophos’ Naked Security blog.

The Heartbleed vulnerability in the OpenSSL cryptographic library might be exploited to reveal contents of secured communication exchanges. The same flaw might also be used to lift SSL keys.

This means that sites could still be vulnerable to attacks after installing the patches in cases where a private key has been stolen. Sites therefore need to revoke exposed keys, reissue new keys, and invalidate all session keys and session cookies.
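
The bug itself was a missing bounds check: the heartbeat handler echoed back as many bytes as the attacker’s length field claimed, not as many as the payload actually contained, spilling whatever sat in adjacent memory. A simulation of that logic in Python (the “heap” contents are invented for illustration):

```python
def vulnerable_heartbeat(memory, offset, payload, claimed_length):
    """Simulate the buggy echo: trust the attacker-supplied length field.

    `memory` stands in for the server process's heap. The real flaw
    (CVE-2014-0160) was that OpenSSL copied `claimed_length` bytes
    starting at the received payload without checking it against the
    payload's true size.
    """
    # Write the request payload into "heap" memory at `offset`...
    memory = memory[:offset] + payload + memory[offset + len(payload):]
    # ...then echo back claimed_length bytes from that position, unchecked.
    return memory[offset:offset + claimed_length]
```

An honest request echoes only the payload; a malicious one with an inflated length field also receives the neighbouring bytes, which in the real attack could include session cookies, passwords, or the private key itself.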

Bruce Schneier:

“Catastrophic” is the right word. On the scale of 1 to 10, this is an 11.

Half a million sites are vulnerable, including my own. Test your vulnerability here.

The bug has been patched. After you patch your systems, you have to get a new public/private key pair, update your SSL certificate, and then change every password that could potentially be affected.

At this point, the probability is close to one that every target has had its private keys extracted by multiple intelligence agencies. The real question is whether or not someone deliberately inserted this bug into OpenSSL, and has had two years of unfettered access to everything. My guess is accident, but I have no proof.

April 2, 2014

Enigma’s 21st century open sourced descendent

Filed under: History, Military, Technology, WW2 — Nicholas @ 09:51

The Enigma device was used by the German military in World War II to encrypt and decrypt communication between units and headquarters on land and at sea. Original Enigma units — the few that are on the market at any time — sell for tens of thousands of dollars. You may not be able to afford an original, but you might be interested in a modern implementation of Enigma using Arduino-based open-source hardware and software:

Actual hand-crafted Final design

Enigma machines have captivated everyone from legendary code breaker Alan Turing and the dedicated cryptographers from England’s Bletchley Park to historians and collectors the world over.

But while many history buffs would surely love to get their hands on an authentic Enigma machine used during WWII, the devices aren’t exactly affordable (last year, a 1944 German Enigma machine was available for auction at Bonhams with an estimated worth of up to $82,000). Enter the Open Enigma Project, a kit for building one from scratch.

The idea came to Marc Tessier and James Sanderson from S&T Geotronics by accident.

“We were working on designing and building intelligent Arduino-based open-source geocaching devices to produce a unique interactive challenge at an upcoming Geocaching Mega Event,” Tessier told Crave. “A friend of ours suggested we use an Enigma type encrypting/decrypting machine as the ultimate stage of the challenge and pointed us to an Instructables tutorial that used a kid’s toy to provide some Enigma encoding. We looked all over to buy a real Enigma machine even if we had to assemble it ourselves and realized that there was nothing available at the moment. So we decided to build our own.”

[…]

“Our version is an electronic microprocessor-based machine that is running software which is a mathematical expression of how the historical mechanical machine behaved,” Sanderson told Crave. “Having never touched a real Enigma M4, we built our open version based on what we read online. From what we understand, the real electro-mechanical devices are much heavier and a little bigger.”

They took some design liberties — replacing the physical rotors with LED units and replacing the light bulbs with white LEDs. The replica can be modified by changing the Arduino code and can communicate to any computer via USB. Future versions may include Wi-Fi and/or Bluetooth.
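
For a feel of what the Arduino firmware has to compute, here is a minimal one-rotor machine in Python using the historical Rotor I and Reflector B wirings. (The real M4 has four rotors, a plugboard, and ring settings; this is only the reciprocal core, and it is not the Open Enigma Project’s code.)

```python
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
ROTOR_I  = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"   # historical Enigma Rotor I wiring
UKW_B    = "YRUHQSLDPXNGOKMIEBFZCWVJAT"   # historical Reflector B wiring

def enigma(text, start=0):
    """Encrypt or decrypt: the reflector makes the machine reciprocal,
    so the same call with the same start position does both."""
    pos = start
    out = []
    for ch in text:
        pos = (pos + 1) % 26                                        # rotor steps on each key press
        c = ALPHABET.index(ch)
        c = (ALPHABET.index(ROTOR_I[(c + pos) % 26]) - pos) % 26    # forward through the rotor
        c = ALPHABET.index(UKW_B[c])                                # bounce off the reflector
        c = (ROTOR_I.index(ALPHABET[(c + pos) % 26]) - pos) % 26    # back through the rotor
        out.append(ALPHABET[c])
    return "".join(out)
```

Two properties of the real machine fall straight out of the reflector: running ciphertext back through with the same settings recovers the plaintext, and no letter ever encrypts to itself — the statistical flaw Bletchley Park exploited.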

November 14, 2013

How the internet was “weaponized”

Filed under: Government, Technology, USA — Nicholas @ 07:45

In Wired, Nicholas Weaver looks back on the way the internet was converted from a passive network infrastructure to a spy agency wonderland:

According to revelations about the QUANTUM program, the NSA can “shoot” (their words) an exploit at any target it desires as his or her traffic passes across the backbone. It appears that the NSA and GCHQ were the first to turn the internet backbone into a weapon; absent Snowdens of their own, other countries may do the same and then say, “It wasn’t us. And even if it was, you started it.”

If the NSA can hack Petrobras, the Russians can justify attacking Exxon/Mobil. If GCHQ can hack Belgacom to enable covert wiretaps, France can do the same to AT&T. If the Canadians target the Brazilian Ministry of Mines and Energy, the Chinese can target the U.S. Department of the Interior. We now live in a world where, if we are lucky, our attackers may be every country our traffic passes through except our own.

Which means the rest of us — and especially any company or individual whose operations are economically or politically significant — are now targets. All cleartext traffic is not just information being sent from sender to receiver, but is a possible attack vector.

[…]

The only self defense from all of the above is universal encryption. Universal encryption is difficult and expensive, but unfortunately necessary.

Encryption doesn’t just keep our traffic safe from eavesdroppers, it protects us from attack. DNSSEC validation protects DNS from tampering, while SSL armors both email and web traffic.

There are many engineering and logistic difficulties involved in encrypting all traffic on the internet, but it’s a challenge we must overcome if we are to defend ourselves from the entities that have weaponized the backbone.

September 15, 2013

Bruce Schneier on what you can do to stay out of the NSA’s view

Filed under: Liberty, Technology — Nicholas @ 10:44

Other than going completely off the grid, you don’t have the ability to stay completely hidden, but there are some things you can do to decrease your visibility to the NSA:

With all this in mind, I have five pieces of advice:

  1. Hide in the network. Implement hidden services. Use Tor to anonymize yourself. Yes, the NSA targets Tor users, but it’s work for them. The less obvious you are, the safer you are.
  2. Encrypt your communications. Use TLS. Use IPsec. Again, while it’s true that the NSA targets encrypted connections — and it may have explicit exploits against these protocols — you’re much better protected than if you communicate in the clear.
  3. Assume that while your computer can be compromised, it would take work and risk on the part of the NSA — so it probably isn’t. If you have something really important, use an air gap. Since I started working with the Snowden documents, I bought a new computer that has never been connected to the Internet. If I want to transfer a file, I encrypt the file on the secure computer and walk it over to my Internet computer, using a USB stick. To decrypt something, I reverse the process. This might not be bulletproof, but it’s pretty good.
  4. Be suspicious of commercial encryption software, especially from large vendors. My guess is that most encryption products from large US companies have NSA-friendly back doors, and many foreign ones probably do as well. It’s prudent to assume that foreign products also have foreign-installed backdoors. Closed-source software is easier for the NSA to backdoor than open-source software. Systems relying on master secrets are vulnerable to the NSA, through either legal or more clandestine means.
  5. Try to use public-domain encryption that has to be compatible with other implementations. For example, it’s harder for the NSA to backdoor TLS than BitLocker, because any vendor’s TLS has to be compatible with every other vendor’s TLS, while BitLocker only has to be compatible with itself, giving the NSA a lot more freedom to make changes. And because BitLocker is proprietary, it’s far less likely those changes will be discovered. Prefer symmetric cryptography over public-key cryptography. Prefer conventional discrete-log-based systems over elliptic-curve systems; the latter have constants that the NSA influences when they can.
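
Advice #2 is the easiest to act on from code: Python’s ssl module already defaults to certificate and hostname verification when you start from `create_default_context`. A sketch (the host name in the usage is a placeholder):

```python
import socket
import ssl

def strict_client_context():
    """Build a TLS client context with verification left ON.

    create_default_context() already requires a valid certificate chain
    and a matching hostname; the common mistake is switching those off
    to silence errors, which reduces TLS to encryption-without-identity.
    """
    return ssl.create_default_context()

def open_tls_connection(host, port=443):
    """Open a certificate-verified TLS connection to host:port."""
    ctx = strict_client_context()
    sock = socket.create_connection((host, port))
    # server_hostname drives both SNI and hostname verification
    return ctx.wrap_socket(sock, server_hostname=host)
```

Calling `open_tls_connection("example.com")` refuses to complete the handshake against a server whose certificate doesn’t validate, which is exactly the protection you give up by communicating in the clear.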

Since I started working with Snowden’s documents, I have been using GPG, Silent Circle, Tails, OTR, TrueCrypt, BleachBit, and a few other things I’m not going to write about. There’s an undocumented encryption feature in my Password Safe program from the command line; I’ve been using that as well.

I understand that most of this is impossible for the typical Internet user. Even I don’t use all these tools for most everything I am working on. And I’m still primarily on Windows, unfortunately. Linux would be safer.

The NSA has turned the fabric of the Internet into a vast surveillance platform, but they are not magical. They’re limited by the same economic realities as the rest of us, and our best defense is to make surveillance of us as expensive as possible.

Trust the math. Encryption is your friend. Use it well, and do your best to ensure that nothing can compromise it. That’s how you can remain secure even in the face of the NSA.

September 6, 2013

Bruce Schneier on taking back the internet

Filed under: Liberty, Technology, USA — Nicholas @ 08:51

From his article in yesterday’s Guardian:

This is not the internet the world needs, or the internet its creators envisioned. We need to take it back.

And by we, I mean the engineering community.

Yes, this is primarily a political problem, a policy matter that requires political intervention.

But this is also an engineering problem, and there are several things engineers can — and should — do.

One, we should expose. If you do not have a security clearance, and if you have not received a National Security Letter, you are not bound by federal confidentiality requirements or a gag order. If you have been contacted by the NSA to subvert a product or protocol, you need to come forward with your story. Your obligations to your employer don’t cover illegal or unethical activity. If you work with classified data and are truly brave, expose what you know. We need whistleblowers.

We need to know exactly how the NSA and other agencies are subverting routers, switches, the internet backbone, encryption technologies and cloud systems. I already have five stories from people like you, and I’ve just started collecting. I want 50. There’s safety in numbers, and this form of civil disobedience is the moral thing to do.

Two, we can design. We need to figure out how to re-engineer the internet to prevent this kind of wholesale spying. We need new techniques to prevent communications intermediaries from leaking private information.

We can make surveillance expensive again. In particular, we need open protocols, open implementations, open systems — these will be harder for the NSA to subvert.

The Internet Engineering Task Force, the group that defines the standards that make the internet run, has a meeting planned for early November in Vancouver. This group needs to dedicate its next meeting to this task. This is an emergency, and demands an emergency response.

Update: Glenn Greenwald retweeted this, saying it was “not really hard for a rational person to understand why this is newsworthy”.

August 9, 2013

Locking the (electronic) barn door

Filed under: Law, Media, Technology — Tags: , , , — Nicholas @ 08:03

The encrypted email service that was reportedly used by Edward Snowden just announced that it will be shutting down:

Today, Lavabit announced that it would shut down its encrypted email service rather than “become complicit in crimes against the American people.” Lavabit did not say what it had been asked to do, only that it was legally prohibited from sharing the events leading to its decision.

Lavabit was an email provider, apparently used by Edward Snowden along with other privacy-sensitive users, with an avowed mission to offer an “e-mail service that never sacrifices privacy for profits”. It promised to “only release private information if legally compelled by the courts in accordance with the United States Constitution.” It backed up this claim by encrypting all emails on Lavabit servers such that Lavabit did not have the ability to access a user’s email (Lavabit’s white paper), at least without that user’s passphrase, which the email provider did not store.

Given the impressive powers of the government to obtain emails and records from service providers, both with and without legal authority, it is encouraging to see service providers take steps to limit their ability to access user data, as Lavabit had done.

[…]

Lavabit’s post indicates that there was a gag order, and that there is an ongoing appeal before the Fourth Circuit. We call on the government and the courts to unseal enough of the docket to allow, at a minimum, the public to know the legal authority asserted, both for the gag and the substance, and give Lavabit the breathing room to participate in the vibrant and critical public debates on the extent of email privacy in an age of warrantless bulk surveillance by the NSA.
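The storage scheme EFF describes — mail encrypted server-side under a key derived from the user’s passphrase, with the provider keeping only a salt and ciphertext and never the passphrase itself — can be sketched roughly like this. This is a toy illustration of the general idea, not Lavabit’s actual implementation; a real system would use an authenticated cipher such as AES-GCM rather than this hash-based keystream:

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2 with many iterations makes brute-forcing the passphrase
    # expensive; the provider stores the salt but never the passphrase.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def _keystream(key: bytes, length: int) -> bytes:
    # SHA-256 in counter mode, purely for illustration.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(passphrase: str, plaintext: bytes):
    salt = os.urandom(16)
    key = derive_key(passphrase, salt)
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, len(plaintext))))
    # The server keeps (salt, ct); the derived key is discarded,
    # so the provider cannot read the message without the passphrase.
    return salt, ct

def decrypt(passphrase: str, salt: bytes, ct: bytes) -> bytes:
    key = derive_key(passphrase, salt)
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, len(ct))))
```

The point of the design is that a subpoena served on the provider yields only salts and ciphertext; recovering plaintext requires the passphrase, which only the user holds.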

July 20, 2013

UK government inches closer to giving posthumous pardon to Alan Turing

Filed under: Britain, History, Law, WW2 — Tags: , , — Nicholas @ 09:50

In the Guardian, Nicholas Watt updates the news on a private member’s bill that would give Alan Turing a pardon:

Alan Turing, the Enigma codebreaker who took his own life after being convicted of gross indecency under anti-homosexuality legislation, is to be given a posthumous pardon.

The government signalled on Friday that it is prepared to support a backbench bill that would pardon Turing, who died from cyanide poisoning at the age of 41 in 1954 after he was subjected to “chemical castration”.

Lord Ahmad of Wimbledon, a government whip, told peers that the government would table the third reading of the Alan Turing (statutory pardon) bill at the end of October if no amendments are made. “If nobody tables an amendment to this bill, its supporters can be assured that it will have speedy passage to the House of Commons,” Ahmad said.

The announcement marks a change of heart by the government, which declined last year to grant pardons to the 49,000 gay men, now dead, who were convicted under the 1885 Criminal Law Amendment Act. They include Oscar Wilde.

Update: On the other hand, Matt Ridley thinks that there’s a major problem with this approach.

That Turing deserved an apology in his lifetime for this appalling treatment is not in doubt. What will be debated tomorrow is whether a posthumous pardon from today’s Government is right, or may be a further insult to his memory. After all, the word pardon implies that his crime is still a crime, which it is not, and it will do nothing for the victim (especially since he was an atheist), and do nothing to untarnish his reputation, which history has already fully untarnished. Also it could be unfair to other, less famous convicted gay men and may even seem to rewrite history rather than leaving it starkly to reproach us. By rights, Turing should be pardoning the Government, but that’s not possible.

So it is not easy to judge if a pardon is the right thing. For my part, I think a greater matter is at issue — whether we have done enough to recognise Turing’s scientific reputation and how we put that right. It becomes clearer by the day that, irrespective of his tragic end and even of his secret war service, he ranks for the momentous nature of his achievements with the likes of Francis Crick and Albert Einstein in the 20th-century scientific pantheon. This was not just a moderately good scientist made famous by persecution; this was the author of a really big idea.

[…]

When the war broke out, Turing’s genius proved as practical as it had been ethereal in the 1930s. His crucial contributions to three successive computing innovations at Bletchley Park — the “bombe” machines for replicating the settings of the German Enigma encryption machine, the later cracking of the naval Enigma machine enabling U-boat traffic to be read, and finally the Colossus computer that broke the Germans’ “tunny” cipher machine — provided Churchill with the famous “ultra” decrypts that almost certainly shortened the war and saved millions of lives in battlefields, ships and camps.

For this he was appointed OBE, but secrecy shrouded his work until long after his death, so he wasn’t known to be a hero, let alone the man who saved so many lives. He moved to what would become GCHQ, but in the paranoid days after Burgess and Maclean fled east, his homosexuality conviction categorised him as a security risk.
