Quotulatiousness

March 26, 2014

Secret Service upholds (recent) tradition in the Netherlands

Filed under: Europe, Media, USA — Nicholas @ 07:17

BBC News reports that — once again — some of the US Secret Service agents tasked with protecting the President have come to the attention of the press for reasons other than their assigned mission:

Three US Secret Service agents tasked with protecting President Barack Obama in the Netherlands have been sent home for “disciplinary reasons”.

The Washington Post reported that one was found drunk and passed out in the hallway of an Amsterdam hotel.

A Secret Service spokesman declined to give details but said the three had been put on administrative leave pending an investigation.

The service has been trying to rebuild its reputation after previous scandals.

In 2013 two agents were removed from President Obama’s security detail amid allegations of sexual harassment and misconduct.

And in 2012 several agents were dismissed following allegations that they hired prostitutes while in Cartagena, Colombia.

Secret Service spokesman Ed Donovan said the latest incident happened before President Obama’s arrival in the Netherlands on Monday for a nuclear security summit.

He said the three had been sent home for “disciplinary reasons” but declined to elaborate.

Mr Donovan added that the president’s security had not been compromised in any way.

February 9, 2014

“A car is a mini network … and right now there’s no security implemented”

Filed under: Technology — Nicholas @ 11:48

Driving your car anywhere soon? Got anti-hacking gear installed?

Spanish hackers have been showing off their latest car-hacking creation: a circuit board using untraceable, off-the-shelf parts worth $20 that can give wireless access to the car’s controls while it’s on the road.

The device, which will be shown off at next month’s Black Hat Asia hacking conference, uses the Controller Area Network (CAN) ports car manufacturers build into their engines for computer-system checks. Once assembled, the smartphone-sized device can be plugged in under some vehicles, or inside the bonnet of other models, and give the hackers remote access to control systems.

“A car is a mini network,” security researcher Alberto Garcia Illera told Forbes. “And right now there’s no security implemented.”

Illera and fellow security researcher Javier Vazquez-Vidal said that they had tested the CAN Hacking Tool (CHT) successfully on four popular makes of cars and had been able to apply the emergency brakes while the car was in motion, affect the steering, turn off the headlights, or set off the car alarm.

The device currently only works via Bluetooth, but the team says that they will have a GSM version ready by the time the conference starts. This would allow remote control of a target car from much greater distances. More technical details of the CHT will be given out at the conference.
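For readers wondering what “access to the car’s controls” looks like in practice, here is a minimal sketch of injecting a single frame onto a CAN bus from a Linux machine, using the python-can library over SocketCAN. The channel name, arbitration ID, and payload are hypothetical placeholders (real IDs vary by make and model and are not published), and this only illustrates the kind of traffic a device like the CHT could generate, not the researchers’ actual code.

    # Minimal sketch: inject one frame onto a vehicle CAN bus via SocketCAN.
    # Assumes a Linux host with a CAN adapter exposed as "can0" and the
    # python-can package installed.  The arbitration ID and payload are
    # made-up placeholders; real IDs are model-specific and undocumented.
    import can

    bus = can.interface.Bus(channel="can0", bustype="socketcan")

    frame = can.Message(
        arbitration_id=0x1A4,           # hypothetical ID for a body-control message
        data=[0x01, 0x00, 0x00, 0x00],  # hypothetical payload
        is_extended_id=False,
    )

    bus.send(frame)
    print("Frame sent on", bus.channel_info)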

January 21, 2014

Coming soon – ShapeShifter’s “polymorphic” defence against malware

Filed under: Technology — Nicholas @ 11:11

In The Register, John Leyden discusses a new start-up’s plans for defending websites against hackers:

Startup Shape Security is re-appropriating a favourite tactic of malware writers in developing a technology to protect websites against automated hacking attacks.

Trojan authors commonly obfuscate their code to frustrate reverse engineers at security firms. Shape’s founders, former staffers from Google, VMWare and Mozilla (among others), have created a network security appliance which takes a similar approach (dubbed real-time polymorphism) towards defending websites against breaches — by hobbling the capability of malware, bots, and other scripted attacks to interact with web applications.

Polymorphic code was originally used by malicious software to rewrite its own code every time a new machine was infected. Shape has invented patent-pending technology that is able to implement “real-time polymorphism” — or dynamically changing code — on any website. By doing this, it removes the static elements which botnets and malware depend on for their attacks.
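Shape’s appliance is proprietary, but the basic idea (serving markup whose identifiers change on every request, so that scripted attacks have nothing stable to latch onto) can be sketched in a few lines. The field names and session store below are hypothetical; this illustrates the concept, not Shape’s implementation.

    # Toy illustration of "real-time polymorphism": rewrite form field names
    # on every response and translate them back when the form is submitted.
    # A real product would cover much more (JavaScript, cookies, URLs) and
    # would not keep the mapping in a plain in-memory dict.
    import secrets

    SESSION_MAPS = {}  # session_id -> {obfuscated_name: real_name}

    def polymorph_form(session_id, real_fields):
        """Return obfuscated field names valid for this response only."""
        mapping = {"f_" + secrets.token_hex(8): real for real in real_fields}
        SESSION_MAPS[session_id] = mapping
        return list(mapping)

    def demorph_submission(session_id, submitted):
        """Translate an incoming submission back to the real field names."""
        mapping = SESSION_MAPS.pop(session_id, {})
        return {mapping[k]: v for k, v in submitted.items() if k in mapping}

    # A bot that memorised last week's field names ("username", "password")
    # now sees only random identifiers that change on every page load.
    aliases = polymorph_form("sess-1", ["username", "password"])
    print(aliases)
    print(demorph_submission("sess-1", {aliases[0]: "alice", aliases[1]: "hunter2"}))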

January 18, 2014

How “safe” is your safe?

Filed under: History, Military — Nicholas @ 10:44

Safe manufacturers generally ship their products with a factory-standard combination. Many people fail to change it once the safe is in use:

In England many years ago, chatting with a locksmith while he worked, I learned the following thing: One of the country’s leading manufacturers of safes shipped all its products set to a default opening combination of 102030, and a high proportion of customers never reset it.

He: “If I need to open a Chubb safe, it’s the first thing I try. You’d be surprised how often it works.”

This came to mind when I was reading the story about Kennedy-era launch codes for our nuclear missiles:

    …The Strategic Air Command greatly resented [Defense Secretary Robert] McNamara’s presence and almost as soon as he left, the code to launch the missile’s [sic], all 50 of them, was set to 00000000.

I use a random-string generator for my passwords and change them often. I guess safeguarding my Netflix account is more important than preventing a nuclear holocaust.
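Incidentally, a random-string generator of the kind mentioned above needs nothing more exotic than Python’s standard secrets module; the length and character set here are arbitrary choices.

    # Generate a random password with the cryptographically secure secrets
    # module from the standard library.  Length and alphabet are arbitrary.
    import secrets
    import string

    ALPHABET = string.ascii_letters + string.digits + string.punctuation

    def random_password(length=20):
        return "".join(secrets.choice(ALPHABET) for _ in range(length))

    print(random_password())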

December 12, 2013

Paranoid? You’re probably not paranoid enough

Filed under: Technology — Nicholas @ 09:26

Charles Stross has a few adrenaline shots for your paranoia gland this morning:

The internet of things may be coming to us all faster and harder than we’d like.

Reports coming out of Russia suggest that some Chinese domestic appliances, notably kettles, come kitted out with malware — in the shape of small embedded computers that leech off the mains power to the device. The covert computational passenger hunts for unsecured wifi networks, connects to them, and joins a spam and malware pushing botnet. The theory is that a home computer user might eventually twig if their PC is a zombie, but who looks inside the base of their electric kettle, or the casing of their toaster? We tend to forget that the Raspberry Pi is as powerful as an early 90s UNIX server or a late 90s desktop; it costs £25, is the size of a credit card, and runs off a 5 watt USB power source. And there are cheaper, less competent small computers out there. Building them into kettles is a stroke of genius for a budding crime lord looking to build a covert botnet.

But that’s not what I’m here to talk about.

[…]

I’m dozy and slow on the uptake: I should have been all over this years ago.

And it’s not just keyboards. It’s ebook readers. Flashlights. Not your smartphone, but the removable battery in your smartphone. (Have you noticed it running down just a little bit faster?) Your toaster and your kettle are just the start. Could your electric blanket be spying on you? Koomey’s law is going to keep pushing the power consumption of our devices down even after Moore’s law grinds to a halt: and once Moore’s law ends, the only way forward is to commoditize the product of those ultimate fab lines, and churn out chips for pennies. In another decade, we’ll have embedded computers running some flavour of Linux where today we have smart inventory control tags — any item in a shop that costs more than about £50, basically. Some of those inventory control tags will be watching and listening to us; and some of their siblings will, repurposed, be piggy-backing a ride home and casing the joint.

The possibilities are endless: it’s the dark side of the internet of things. If you’ll excuse me now, I’ve got to go wallpaper my apartment in tinfoil …

November 1, 2013

Let’s hope badBIOS is an elaborate Halloween hoax

Filed under: Technology — Nicholas @ 08:05

Dan Goodin posted a scary Halloween tale at Ars Technica yesterday … at least, I’m hoping it’s just a scary story for the season:

In the intervening three years, Ruiu said, the infections have persisted, almost like a strain of bacteria that’s able to survive extreme antibiotic therapies. Within hours or weeks of wiping an infected computer clean, the odd behavior would return. The most visible sign of contamination is a machine’s inability to boot off a CD, but other, more subtle behaviors can be observed when using tools such as Process Monitor, which is designed for troubleshooting and forensic investigations.

Another intriguing characteristic: in addition to jumping “airgaps” designed to isolate infected or sensitive machines from all other networked computers, the malware seems to have self-healing capabilities.

“We had an air-gapped computer that just had its [firmware] BIOS reflashed, a fresh disk drive installed, and zero data on it, installed from a Windows system CD,” Ruiu said. “At one point, we were editing some of the components and our registry editor got disabled. It was like: wait a minute, how can that happen? How can the machine react and attack the software that we’re using to attack it? This is an air-gapped machine and all of a sudden the search function in the registry editor stopped working when we were using it to search for their keys.”

Over the past two weeks, Ruiu has taken to Twitter, Facebook, and Google Plus to document his investigative odyssey and share a theory that has captured the attention of some of the world’s foremost security experts. The malware, Ruiu believes, is transmitted through USB drives to infect the lowest levels of computer hardware. With the ability to target a computer’s Basic Input/Output System (BIOS), Unified Extensible Firmware Interface (UEFI), and possibly other firmware standards, the malware can attack a wide variety of platforms, escape common forms of detection, and survive most attempts to eradicate it.

But the story gets stranger still. In posts here, here, and here, Ruiu posited another theory that sounds like something from the screenplay of a post-apocalyptic movie: “badBIOS,” as Ruiu dubbed the malware, has the ability to use high-frequency transmissions passed between computer speakers and microphones to bridge airgaps.
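Whether or not badBIOS turns out to be real, moving data acoustically between ordinary speakers and microphones is a demonstrated technique. A rough sketch of the transmit side, using simple frequency-shift keying near the top of the audible band and writing the result to a WAV file, might look like this; the carrier frequencies, bit rate, and payload are arbitrary, and none of it is based on whatever protocol Ruiu believes he is observing.

    # Rough sketch of acoustic data transfer (transmit side): encode bits as
    # two near-ultrasonic tones (frequency-shift keying) and write a WAV file
    # that could be played through an ordinary laptop speaker.
    import wave
    import numpy as np

    SAMPLE_RATE = 44100
    BIT_SECONDS = 0.1
    FREQ_ZERO = 18000   # Hz, arbitrary carrier for a 0 bit
    FREQ_ONE = 19000    # Hz, arbitrary carrier for a 1 bit

    def encode(bits):
        t = np.arange(int(SAMPLE_RATE * BIT_SECONDS)) / SAMPLE_RATE
        samples = [np.sin(2 * np.pi * (FREQ_ONE if bit else FREQ_ZERO) * t)
                   for bit in bits]
        signal = np.concatenate(samples)
        return (signal * 32767).astype(np.int16)

    pcm = encode([1, 0, 1, 1, 0, 0, 1, 0])
    with wave.open("chirp.wav", "wb") as out:
        out.setnchannels(1)
        out.setsampwidth(2)          # 16-bit samples
        out.setframerate(SAMPLE_RATE)
        out.writeframes(pcm.tobytes())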

October 29, 2013

What happens when you challenge hackers to investigate you?

Filed under: Law, Technology — Nicholas @ 09:13

Adam Penenberg had himself investigated in the late 1990s and wrote that up for Forbes. This time around, he asked Nick Percoco to do the same thing, and was quite weirded out by the experience:

It’s my first class of the semester at New York University. I’m discussing the evils of plagiarism and falsifying sources with 11 graduate journalism students when, without warning, my computer freezes. I fruitlessly tap on the keyboard as my laptop takes on a life of its own and reboots. Seconds later the screen flashes a message. To receive the four-digit code I need to unlock it I’ll have to dial a number with a 312 area code. Then my iPhone, set on vibrate and sitting idly on the table, beeps madly.

I’m being hacked — and only have myself to blame.

Two months earlier I challenged Nicholas Percoco, senior vice president of SpiderLabs, the advanced research and ethical hacking team at Trustwave, to perform a personal “pen-test,” industry-speak for “penetration test.” The idea grew out of a cover story I wrote for Forbes some 14 years earlier, when I retained a private detective to investigate me, starting with just my byline. In a week he pulled up an astonishing amount of information, everything from my social security number and mother’s maiden name to long distance phone records, including who I called and for how long, my rent, bank accounts, stock holdings, and utility bills.

[…]

A decade and a half later, and given the recent Edward Snowden-fueled brouhaha over the National Security Agency’s snooping on Americans, I wondered how much had changed. Today, about 250 million Americans are on the Internet, and spend an average of 23 hours a week online and texting, with 27 percent of that engaged in social media. Like most people, I’m on the Internet, in some fashion, most of my waking hours, if not through a computer then via a tablet or smart phone.

With so much of my life reduced to microscopic bits and bytes bouncing around in a netherworld of digital data, how much could Nick Percoco and a determined team of hackers find out about me? Worse, how much damage could they potentially cause?

What I learned is that virtually all of us are vulnerable to electronic eavesdropping and are easy hack targets. Most of us have adopted the credo “security by obscurity,” but all it takes is a person or persons with enough patience and know-how to pierce anyone’s privacy — and, if they choose, to wreak havoc on your finances and destroy your reputation.

H/T to Terry Teachout for the link.

October 11, 2013

Creating an “air gap” for computer security

Filed under: Liberty, Technology — Nicholas @ 12:13

Bruce Schneier explains why you’d want to do this … and how much of a pain it can be to set up and work with:

Since I started working with Snowden’s documents, I have been using a number of tools to try to stay secure from the NSA. The advice I shared included using Tor, preferring certain cryptography over others, and using public-domain encryption wherever possible.

I also recommended using an air gap, which physically isolates a computer or local network of computers from the Internet. (The name comes from the literal gap of air between the computer and the Internet; the word predates wireless networks.)

But this is more complicated than it sounds, and requires explanation.

Since we know that computers connected to the Internet are vulnerable to outside hacking, an air gap should protect against those attacks. There are a lot of systems that use — or should use — air gaps: classified military networks, nuclear power plant controls, medical equipment, avionics, and so on.

Osama Bin Laden used one. I hope human rights organizations in repressive countries are doing the same.

Air gaps might be conceptually simple, but they’re hard to maintain in practice. The truth is that nobody wants a computer that never receives files from the Internet and never sends files out into the Internet. What they want is a computer that’s not directly connected to the Internet, albeit with some secure way of moving files on and off.
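Schneier goes into the mechanics in his rules, but the core workflow (encrypt on the offline machine, then carry the ciphertext across on removable media) is simple enough to sketch. The file names and USB mount point below are hypothetical, and this assumes gpg is installed on the air-gapped computer.

    # Sketch of the "encrypt, then sneakernet" step for an air-gapped machine:
    # symmetrically encrypt a file with GPG on the offline computer, then copy
    # the ciphertext onto removable media.  Paths and the mount point are
    # hypothetical; gpg must be installed and on PATH.
    import shutil
    import subprocess

    SOURCE = "notes.txt"
    CIPHERTEXT = "notes.txt.gpg"
    USB_MOUNT = "/media/usb"   # wherever the stick happens to be mounted

    # gpg prompts for a passphrase; --symmetric avoids any key infrastructure.
    subprocess.run(
        ["gpg", "--symmetric", "--cipher-algo", "AES256",
         "--output", CIPHERTEXT, SOURCE],
        check=True,
    )

    shutil.copy(CIPHERTEXT, USB_MOUNT)
    print("Encrypted copy written to", USB_MOUNT)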

He also provides a list of ten rules (or recommendations, I guess) you should follow if you want to set up an air-gapped machine of your own.

October 2, 2013

Bruce Schneier’s TEDx talk “The Battle for Power on the Internet”

Filed under: Media, Technology — Nicholas @ 08:56

Published on 25 Sep 2013

Bruce Schneier gives us a glimpse of the future of the internet, and shares some of the context we should keep in mind, and the insights we need to understand, as we prepare for it. Learn more about Bruce Schneier at https://www.schneier.com and TEDxCambridge at http://www.tedxcambridge.com.

About TEDx, x = independently organized event
In the spirit of ideas worth spreading, TEDx is a program of local, self-organized events that bring people together to share a TED-like experience. At a TEDx event, TEDTalks video and live speakers combine to spark deep discussion and connection in a small group. These local, self-organized events are branded TEDx, where x = independently organized TED event. The TED Conference provides general guidance for the TEDx program, but individual TEDx events are self-organized.* (*Subject to certain rules and regulations)

September 21, 2013

Justin Amash on congressional classified briefings

Filed under: Bureaucracy, Government, USA — Nicholas @ 10:01

In The Atlantic, Garance Franke-Ruta has transcribed some of Representative Justin Amash’s comments on the ins-and-outs of confidential briefings offered to congressmen:

Amash, who has previously butted heads with Intelligence Committee Chairman Mike Rogers and ranking member Dutch Ruppersberger over access to classified documents, recounted what happened during remarks before libertarian activists attending the Liberty Political Action Conference in Chantilly, Virginia, Thursday night. I quote his anecdote in full here, because it’s interesting to hear what it feels like to be one of the activist congressmen trying to rein in National Security Agency surveillance:

    What you hear from the intelligence committees, from the chairmen of the intelligence committees, is that members can come to classified briefings and they can ask whatever questions they want. But if you’ve actually been to one of these classified briefings — which none of you have, but I have — what you discover is that it’s just a game of 20 questions.

    You ask a question and if you don’t ask it exactly the right way you don’t get the right answer. So if you use the wrong pronoun, or if you talk about one agency but actually another agency is doing it, they won’t tell you. They’ll just tell you, no that’s not happening. They don’t correct you and say here’s what is happening.

    So you actually have to go from meeting to meeting, to hearing to hearing, asking questions — sometimes ridiculous questions — just to get an answer. So this idea that you can just ask, just come into a classified briefing and ask questions and get answers is ridiculous.

    If the government — in an extreme hypothetical, let’s say they had a base on the moon. If I don’t know that there’s a base on the moon, I’m not going to go into the briefing and say you have a moonbase. Right? [Audience laughs.] If they have a talking bear or something, I’m not going to say, ‘You guys, you didn’t engineer the talking bear.’

    You’re not going to ask questions about things you don’t know about. The point of the Intelligence Committee is to provide oversight to Congress and every single member of Congress needs information. Each person in Congress represents about 700,000 people. It’s not acceptable to say, ‘Well, the Intelligence Committees get the information, we don’t need to share with the rest of Congress.’ The Intelligence Committee is not one of the branches of government, but that’s how it’s being treated over and over again.

September 18, 2013

The NSA scandal is not about mere privacy

Filed under: Government, Liberty, USA — Nicholas @ 08:19

Last week, Yochai Benkler posted this in the Guardian:

The spate of new NSA disclosures substantially raises the stakes of this debate. We now know that the intelligence establishment systematically undermines oversight by lying to both Congress and the courts. We know that the NSA infiltrates internet standard-setting processes to weaken the security protocols that make surveillance harder. We know that the NSA uses persuasion, subterfuge, and legal coercion to distort software and hardware product design by commercial companies.

We have learned that in pursuit of its bureaucratic mission to obtain signals intelligence in a pervasively networked world, the NSA has mounted a systematic campaign against the foundations of American power: constitutional checks and balances, technological leadership, and market entrepreneurship. The NSA scandal is no longer about privacy, or a particular violation of constitutional or legislative obligations. The American body politic is suffering a severe case of auto-immune disease: our defense system is attacking other critical systems of our body.

First, the lying. The National Intelligence University, based in Washington, DC, offers a certificate program called the denial and deception advanced studies program. That’s not a farcical sci-fi dystopia; it’s a real program about countering denial and deception by other countries. The repeated misrepresentations suggest that the intelligence establishment has come to see its civilian bosses as adversaries to be managed through denial and deception.

[…]

Second, the subversion. Last week, we learned that the NSA’s strategy to enhance its surveillance capabilities was to weaken internet security in general. The NSA infiltrated the social-professional standard-setting organizations on which the whole internet relies, from the National Institute of Standards and Technology to the Internet Engineering Task Force itself, the very institutional foundation of the internet, to weaken the security standards. Moreover, the NSA combined persuasion and legal coercion to compromise the commercial systems and standards that offer the most basic security systems on which the entire internet runs. The NSA undermined the security of the SSL standard critical to online banking and shopping, VPN products central to secure corporate, research, and healthcare provider networks, and basic email utilities.

Serious people with grave expressions will argue that if we do not ruthlessly expand our intelligence capabilities, we will suffer terrorism and defeat. Whatever minor tweaks may be necessary, the argument goes, the core of the operation is absolutely necessary and people will die if we falter. But the question remains: how much of what we have is really necessary and effective, and how much is bureaucratic bloat resulting in the all-too-familiar dynamics of organizational self-aggrandizement and expansionism?

The “serious people” are appealing to our faith that national security is critical, in order to demand that we accept the particular organization of the Intelligence Church. Demand for blind faith adherence is unacceptable.

September 15, 2013

Bruce Schneier on what you can do to stay out of the NSA’s view

Filed under: Liberty, Technology — Nicholas @ 10:44

Short of going completely off the grid, you can’t stay entirely hidden, but there are some things you can do to decrease your visibility to the NSA:

With all this in mind, I have five pieces of advice:

  1. Hide in the network. Implement hidden services. Use Tor to anonymize yourself. Yes, the NSA targets Tor users, but it’s work for them. The less obvious you are, the safer you are.
  2. Encrypt your communications. Use TLS. Use IPsec. Again, while it’s true that the NSA targets encrypted connections — and it may have explicit exploits against these protocols — you’re much better protected than if you communicate in the clear.
  3. Assume that while your computer can be compromised, it would take work and risk on the part of the NSA — so it probably isn’t. If you have something really important, use an air gap. Since I started working with the Snowden documents, I bought a new computer that has never been connected to the Internet. If I want to transfer a file, I encrypt the file on the secure computer and walk it over to my Internet computer, using a USB stick. To decrypt something, I reverse the process. This might not be bulletproof, but it’s pretty good.
  4. Be suspicious of commercial encryption software, especially from large vendors. My guess is that most encryption products from large US companies have NSA-friendly back doors, and many foreign ones probably do as well. It’s prudent to assume that foreign products also have foreign-installed backdoors. Closed-source software is easier for the NSA to backdoor than open-source software. Systems relying on master secrets are vulnerable to the NSA, through either legal or more clandestine means.
  5. Try to use public-domain encryption that has to be compatible with other implementations. For example, it’s harder for the NSA to backdoor TLS than BitLocker, because any vendor’s TLS has to be compatible with every other vendor’s TLS, while BitLocker only has to be compatible with itself, giving the NSA a lot more freedom to make changes. And because BitLocker is proprietary, it’s far less likely those changes will be discovered. Prefer symmetric cryptography over public-key cryptography. Prefer conventional discrete-log-based systems over elliptic-curve systems; the latter have constants that the NSA influences when they can.

Since I started working with Snowden’s documents, I have been using GPG, Silent Circle, Tails, OTR, TrueCrypt, BleachBit, and a few other things I’m not going to write about. There’s an undocumented encryption feature in my Password Safe program from the command line; I’ve been using that as well.

I understand that most of this is impossible for the typical Internet user. Even I don’t use all these tools for most everything I am working on. And I’m still primarily on Windows, unfortunately. Linux would be safer.

The NSA has turned the fabric of the Internet into a vast surveillance platform, but they are not magical. They’re limited by the same economic realities as the rest of us, and our best defense is to make surveillance of us as expensive as possible.

Trust the math. Encryption is your friend. Use it well, and do your best to ensure that nothing can compromise it. That’s how you can remain secure even in the face of the NSA.
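Point 2 on Schneier’s list (“Encrypt your communications. Use TLS.”) is the easiest to act on in code: modern standard libraries will verify certificates for you if you ask. A minimal sketch using Python’s ssl module is below; the host name is just an example.

    # Open a TLS connection with certificate and hostname verification using
    # only the Python standard library.  The host is an arbitrary example.
    import socket
    import ssl

    HOST = "www.example.com"

    context = ssl.create_default_context()   # verifies against the system CA store

    with socket.create_connection((HOST, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            print("Negotiated:", tls.version(), tls.cipher())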

September 12, 2013

QotD: The “never let a crisis go to waste” mentality

Filed under: Law, Liberty, Quotations, USA — Nicholas @ 14:11

The lesson I remember best from my religious instruction as a youth in the Catholic church came from a nun who was explaining the ten commandments. She asked me to explain the prohibition of taking the Lord’s name in vain; I said it meant I should not curse using God’s name. She corrected me — ultimately the commandment means we should not invoke God’s name for our own power or glory or purposes rather than His own, she said.

9/11 — like every great and terrible thing and event that has ever come before it — is invoked to demand and justify a wide array of ends and prove a confusing jumble of conclusions. Many of those ends and conclusions were sought by their advocates well before 9/11. It has ever been so. People will seek power, seek prominence, seek money, seek their religious and ideological goals by invoking events — by trying, as I suggested in #4 above, to blur the line between the thing and our reaction to the thing. This has been a constant theme on this blog: the government has sought more and more power over us, and more and more limitations on our rights, by invoking 9/11, only to use those new powers to fight old fights unrelated to terrorism and to suppress things they didn’t like before 9/11. The PATRIOT ACT was an incoherent jumble of law enforcement wet dreams and wish lists, components of which had been floating about for decades. But though the government’s efforts to use 9/11 have carried the most weight, the invocations have not come only from the government — they’ve come from everywhere, left and right, seeking to use the tragedy to prove preconceptions about America and its foreign policy.

Ken White, “Ten Things I Want My Children To Learn From 9/11”, Popehat, 2011-09-11

August 18, 2013

The real problem facing the NSA and other intelligence organizations

Filed under: Bureaucracy, Government, Liberty — Nicholas @ 10:24

Charles Stross points out that there’s been a vast change in the working world that the NSA and other acronyms didn’t see coming and haven’t prepared themselves to face:

The big government/civil service agencies are old. They’re products of the 20th century, and they are used to running their human resources and internal security processes as if they’re still living in the days of the “job for life” culture; potential spooks-to-be were tapped early (often while at school or university), vetted, then given a safe sinecure along with regular monitoring to ensure they stayed on the straight-and-narrow all the way to the gold watch and pension. Because that’s how we all used to work, at least if we were civil servants or white collar paper pushers back in the 1950s.

[…]

Here’s the problem: they’re now running into outside contractors who grew up in Generation X or Generation Y.

Let’s leave aside the prognostications of sociologists about over-broad cultural traits of an entire generation. The key facts are: Generation X’s parents expected a job for life, but with few exceptions Gen Xers never had that — they’re used to nomadic employment, hire-and-fire, right-to-work laws, the whole nine yards of organized-labour deracination. Gen Y’s parents are Gen X. Gen Y has never thought of jobs as permanent things. Gen Y will stare at you blankly if you talk about loyalty to their employer; the old feudal arrangement (“we’ll give you a job for life and look after you as long as you look out for the Organization”) is something their grandparents maybe ranted about, but it’s about as real as the divine right of kings. Employers are alien hive-mind colony intelligences who will fuck you over for the bottom line on the quarterly balance sheet. They’ll give you a laptop and tell you to hot-desk or work at home so that they can save money on office floorspace and furniture. They’ll dangle the offer of a permanent job over your head but keep you on a zero-hours contract for as long as is convenient. This is the world they grew up in: this is the world that defines their expectations.

To Gen X, a job for life with the NSA was a probably-impossible dream — it’s what their parents told them to expect, but few of their number achieved. To Gen Y the idea of a job for life is ludicrous and/or impossible.

This means the NSA and their fellow swimmers in the acronym soup of the intelligence-industrial complex are increasingly reliant on nomadic contractor employees, and increasingly subject to staff churn. There is an emerging need to security-clear vast numbers of temporary/transient workers … and workers with no intrinsic sense of loyalty to the organization. For the time being, security clearance is carried out by other contractor organizations that specialize in human resource management, but even they are subject to the same problem: Quis custodiet ipsos custodes?

August 4, 2013

Bruce Schneier talks about security and trust

Filed under: Business, Media, Technology — Nicholas @ 12:07

Published on 19 Jun 2013

Human society runs on trust. We all trust millions of people, organizations, and systems every day — and we do it so easily that we barely notice. But in any system of trust, there is an alternative, parasitic, strategy that involves abusing that trust. Making sure those defectors don’t destroy the cooperative systems they’re abusing is an age-old problem, one that we’ve solved through morals and ethics, laws, and all sorts of security technologies. Understanding how these all work — and fail — is essential to understanding the problems we face in today’s increasingly technological and interconnected world.

Bruce Schneier is an internationally renowned security technologist and author. Described by The Economist as a “security guru,” he is best known as a refreshingly candid and lucid security critic and commentator. When people want to know how security really works, they turn to Schneier.

H/T to AVC for the link.
