Quotulatiousness

December 31, 2013

Social networking – your weak contacts may be the most valuable ones

Filed under: Business, Technology — Nicholas @ 10:20

Tim Harford explains why your friends and family are not the most valuable members of your extended social network … at least when it comes to looking for jobs:

This dispiriting stuff reminded me of Mark Granovetter’s work on “the strength of weak ties”, published in 1973. Granovetter, a sociologist, brought together two disparate strands of work: a survey of how people with professional or managerial jobs had found those jobs; and a theoretical analysis of the structure of social networks.

Start with the theoretical observation first: the most irreplaceable social connections, paradoxically, are often rather weak or distant ones. A family group or clique of close friends all tend to know each other and know similar things at similar times. Their social ties are strong but also redundant, in the sense that there are many different paths through which information could pass from one member of that group to another.

By contrast, “weak ties” between one social cluster and another are valuable precisely because the social contact is unusual. Information passed along a weak tie will often be totally new — and if it doesn’t arrive through the weak tie, it is unlikely to arrive at all.
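Granovetter's redundancy argument can be made concrete with a toy graph. In this sketch (an editorial illustration, not anything from his paper), two tight friend groups are each fully connected internally and joined by a single weak tie; removing a strong tie changes nothing, while removing the weak tie cuts the only path between the clusters:

```python
from collections import deque

def reachable(adj, start):
    """Return the set of nodes reachable from start via breadth-first search."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def without_edge(adj, u, v):
    """Copy the graph with the undirected edge u-v removed."""
    return {n: [m for m in nbrs if {n, m} != {u, v}]
            for n, nbrs in adj.items()}

# Two tight clusters (a-b-c and d-e-f), joined by a single weak tie c-d.
edges = [("a", "b"), ("b", "c"), ("a", "c"),   # cluster 1: fully connected
         ("d", "e"), ("e", "f"), ("d", "f"),   # cluster 2: fully connected
         ("c", "d")]                            # the lone bridge between them
adj = {}
for u, v in edges:
    adj.setdefault(u, []).append(v)
    adj.setdefault(v, []).append(u)

# Dropping a strong (redundant) tie changes nothing: a still reaches b via c.
print("f" in reachable(without_edge(adj, "a", "b"), "a"))  # True
# Dropping the weak tie severs the only path between the clusters.
print("f" in reachable(without_edge(adj, "c", "d"), "a"))  # False
```

The weak tie is what graph theorists call a bridge: information that doesn't cross it doesn't cross at all.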

Granovetter then supplemented this theoretical idea with his survey, showing that it was very common for people to find jobs — especially managerial jobs and jobs with which they were satisfied — through personal contacts. The old saw is true: it’s not what you know, it’s who you know. Or as Granovetter put it in his book Finding a Job, what matters most is “one’s position in a social network”.

But this is not because of crude nepotism: the key contacts who helped jobseekers find jobs were typically distant rather than close friends — old college contacts, perhaps, or former colleagues. Granovetter’s analysis made this finding make sense: it’s the more peripheral contacts who tell you things you don’t already know.

This observation has certainly been true for many of my jobs: colleagues from a decade or more in the past suddenly pop up with an interesting position or business opportunity (such contacts are all the more interesting because they’re completely unexpected).

2013 in review

Filed under: Humour, Liberty, Politics — Nicholas @ 10:09

I nearly ran Steve Chapman’s wonderful little squib as a QotD entry: “The course of freedom and democracy in the world is an evolutionary process, though sometimes it proceeds in the wrong direction. Wines have good years and bad years. If 2013 were a wine, you’d use it to kill weeds.”

Looking ahead to 2014, Radley Balko has some Dire Civil Liberties Predictions to ring in the new year:

As we come to the end of a year that saw revelations about massive government spying programs, horrifying stories of police abuse, and brazen violations of the Fourth Amendment, I thought I might offer my own grim predictions about where civil liberties are headed in the coming year. Sure, some of these may seem outlandish. But to borrow from H.L. Mencken, nobody ever went broke underestimating the grade and lubriciousness of the slippery slope.

On a less-depressing note, Nick Mediati rounds up the “top” memes of 2013, including the latest attempt to de-grammaticize the internet:

Doge meme of 2013

After years and years of cats dominating the Internet, dog lovers were finally thrown a bone in 2013 with the emergence of the Doge meme. The meme typically features photos of Shiba Inu dogs with internal thoughts overlaid in brightly colored Comic Sans. And it’s frickin’ awesome. You might find yourself spontaneously speaking in doge. Such language. So words. Very thought. Wow.

November 18, 2013

Lifelogging in 30-second intervals

Filed under: Media, Technology — Nicholas @ 15:38

Jerry Brito is a sousveillance fan and he thinks you should be too:

The Narrative Clip is a digital camera about the size of a postage stamp that clips to one’s breast pocket or shirt collar and takes a photo every thirty seconds of whatever one’s seeing. The photos are uploaded to the cloud and can be accessed on demand with a smartphone app, making it easy to look up any moment in one’s life. When the project to mass-produce these cameras first hit Kickstarter, I knew I had to have one, and with any luck mine will be arriving in a couple of weeks.

The prospect of having a complete photographic record of my life is compelling for many reasons. I have a terrible memory, especially for faces, so it will be interesting to see if this device can help. There are also moments in life that would be great to relive, but that one can’t – or one doesn’t know one should – be photographing. Narrative’s Instagram feed has some good examples of these. But most importantly, I want to help hasten our inevitable sousveillance future.

[…]

Being monitored in everyday life has become inescapable. So, as David Brin points out in The Transparent Society, the question is not whether there should be pervasive monitoring, but who will have access to the data. Will it only be the powerful, who will use the information to control? Or will the rest of us also be able to watch back?

Ideally, perhaps, we would all be left alone to live private lives under no one’s gaze. Short of halting all technological progress, however, that ship has sailed. Mass surveillance is the inevitable result of smaller cameras and microphones, faster processors, and incredibly cheap storage. So if I can’t change that reality, I want to be able to watch back as well.

November 14, 2013

How the internet was “weaponized”

Filed under: Government, Technology, USA — Nicholas @ 07:45

In Wired, Nicholas Weaver looks back on the way the internet was converted from a passive network infrastructure to a spy agency wonderland:

According to revelations about the QUANTUM program, the NSA can “shoot” (their words) an exploit at any target it desires as his or her traffic passes across the backbone. It appears that the NSA and GCHQ were the first to turn the internet backbone into a weapon; absent Snowdens of their own, other countries may do the same and then say, “It wasn’t us. And even if it was, you started it.”

If the NSA can hack Petrobras, the Russians can justify attacking Exxon/Mobil. If GCHQ can hack Belgacom to enable covert wiretaps, France can do the same to AT&T. If the Canadians target the Brazilian Ministry of Mines and Energy, the Chinese can target the U.S. Department of the Interior. We now live in a world where, if we are lucky, our attackers may be every country our traffic passes through except our own.

Which means the rest of us — and especially any company or individual whose operations are economically or politically significant — are now targets. All cleartext traffic is not just information being sent from sender to receiver, but is a possible attack vector.

[…]

The only self defense from all of the above is universal encryption. Universal encryption is difficult and expensive, but unfortunately necessary.

Encryption doesn’t just keep our traffic safe from eavesdroppers, it protects us from attack. DNSSEC validation protects DNS from tampering, while SSL armors both email and web traffic.

There are many engineering and logistical difficulties involved in encrypting all traffic on the internet, but it’s a challenge we must overcome if we are to defend ourselves from the entities that have weaponized the backbone.
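The "armoring" Weaver describes is routine at the application level. Here is a minimal sketch, using only Python's standard library, of wrapping a plain TCP connection in TLS so that everything after the handshake is ciphertext on the backbone (the hostname is illustrative):

```python
import socket
import ssl

def fetch_over_tls(host, port=443):
    """Open a certificate-verified TLS connection and report the protocol version."""
    context = ssl.create_default_context()  # verifies certs against system CAs
    with socket.create_connection((host, port), timeout=10) as raw:
        with context.wrap_socket(raw, server_hostname=host) as tls:
            # Everything written after the handshake is encrypted on the wire;
            # a backbone eavesdropper sees only ciphertext.
            request = b"HEAD / HTTP/1.1\r\nHost: " + host.encode() + b"\r\nConnection: close\r\n\r\n"
            tls.sendall(request)
            return tls.version()  # e.g. "TLSv1.2" or "TLSv1.3"

# print(fetch_over_tls("example.com"))
```

The important default is that `create_default_context()` both requires a valid certificate and checks the hostname, so a backbone attacker can't silently substitute their own endpoint.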

November 9, 2013

Barack Obama on the difference between private enterprise and government

Filed under: Bureaucracy, Business, Government, Technology, USA — Nicholas @ 11:43

Ann Althouse finds it amazing that President Obama clearly understands why his campaign website was so effective and why the Obamacare website fails on so many levels, but can’t generalize that knowledge to the whole public/private sphere:

In yesterday’s interview with Chuck Todd, Obama said:

    You know, one of the lessons — learned from this whole process on the website — is that probably the biggest gap between the private sector and the federal government is when it comes to I.T. …

    Well, the reason is is that when it comes to my campaign, I’m not constrained by a bunch of federal procurement rules, right?

That is, many have pointed out that his campaign website was really good, so why didn’t that mean that he’d be good at setting up a health insurance website? The answer is that the government is bad because the government is hampered by… government!

    And how we write — specifications and — and how the — the whole things gets built out. So part of what I’m gonna be looking at is how do we across the board, across the federal government, leap into the 21st century.

I love the combination of: 1. Barely able to articulate what the hell happens inside these computer systems, and 2. Wanting to leap!

    Because when it comes to medical records for veterans, it’s still done in paper. Medicaid is still largely done on paper.

    When we buy I.T. services generally, it is so bureaucratic and so cumbersome that a whole bunch of it doesn’t work or it ends up being way over cost.

This should have made him sympathetic to the way government burdens private enterprise, but he’s focused on liberating government to take over more of what has been done privately. And yet there’s no plan, no idea about what would suddenly enable government to displace private businesses competing to offer a product people want to buy.

November 4, 2013

QotD: Software quality assurance

Filed under: Business, Government, Quotations, Technology — Nicholas @ 10:13

The fundamental purpose of testing — and, for that matter, of all software quality assurance (QA) deliverables and processes — is to tell you just what you’ve built and whether it does what you think it should do. This is essential, because you can’t inspect a software program the same way you can inspect a house or a car. You can’t touch it, you can’t walk around it, you can’t open the hood or the bedroom door to see what’s inside, you can’t take it out for a spin. There are very few tangible or visible clues to the completeness and reliability of a software system — and so we have to rely on QA activities to tell us how well built the system is.

Furthermore, almost any software system developed nowadays for production is vastly more complex than a house or car — it’s more on the same order of complexity as a large petrochemical processing and storage facility, with thousands of possible interconnections, states, and processes. We would be (rightly) terrified if, say, Exxon built such a sprawling oil refining complex near our neighborhood and then started up production having only done a bare minimum of inspection, testing, and trial operations before, during and after construction, offering the explanation that they would wait until after the plant went into production and then handle problems as they crop up. Yet too often that’s just how large software development projects are run, even though the system in development may well be more complex (in terms of connections, processes, and possible states) than such a petrochemical factory. And while most inadequately tested software systems won’t spew pollutants, poison the neighborhood, catch fire, or explode, they can cripple corporate operations, lose vast sums of money, spark shareholder lawsuits, and open the corporation’s directors and officers to civil and even criminal liability (particularly with the advent of Sarbanes-Oxley).

And that presumes that the system can actually go into production. The software engineering literature and the trade press are replete with well-documented case studies of “software runaways”: large IT re-engineering or development projects that consume tens or hundreds of millions of dollars, or in a few spectacular (government) cases, billions of dollars, over a period of years, before grinding to a halt and being terminated without ever having put a usable, working system into production. So it’s important not to skimp on testing and the other QA-related activities.

Bruce F. Webster, “Obamacare and the Testing Gap”, And Still I Persist…, 2013-10-31
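Webster's point that tests are the only way to "open the hood" on software can be illustrated with a minimal suite. The premium calculator below is a made-up stand-in, not anything from healthcare.gov; the tests are what tell you what the function actually does at its boundaries:

```python
import unittest

def monthly_premium(age, base_rate=200.0):
    """Toy premium rule: base rate plus a 2% surcharge per year of age over 30."""
    if age < 0:
        raise ValueError("age must be non-negative")
    surcharge = max(0, age - 30) * 0.02 * base_rate
    return round(base_rate + surcharge, 2)

class TestMonthlyPremium(unittest.TestCase):
    def test_under_30_pays_base_rate(self):
        self.assertEqual(monthly_premium(25), 200.0)

    def test_surcharge_applies_over_30(self):
        # 10 years over 30 -> 10 * 2% * 200 = 40 surcharge
        self.assertEqual(monthly_premium(40), 240.0)

    def test_negative_age_rejected(self):
        with self.assertRaises(ValueError):
            monthly_premium(-1)

if __name__ == "__main__":
    unittest.main()
```

Three short tests already pin down the base case, the surcharge math, and the error path — the kind of visibility Webster says you can't get by looking at the system from the outside.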

Living in a Surveillance State: Mikko Hypponen at TEDxBrussels

Filed under: Liberty, Technology, USA — Nicholas @ 00:01

October 31, 2013

Reason.tv – Do the Healthcare Mash

Filed under: Government, Health, Humour, USA — Nicholas @ 10:35

Trick or Treatment? Remy channels Bobby “Boris” Pickett for this Healthcare.gov-Halloween mash-up.

Written and performed by Remy. Video by Sean Malone.

[…]

Lyrics:
He was working on his laptop late one night
when his eyes beheld a ghoulish site
He could not log in despite several tries
then suddenly to no one’s surprise

(he did the Mash)
He did the Healthcare Mash
(the Healthcare Mash)
it was a keyboard smash
(he did the Mash)
the website was trash
(he did the Mash)
He did the Healthcare mash

Who could design such a site so flawed and so sloppy?
The code is so ancient, perhaps it was Hammurabi
He’d try to apply but the site would suspend
I’ve seen a eunuch with a more functional front end

(he did the Mash)
He did the Healthcare Mash
(the Healthcare Mash)
it was a keyboard smash
(he did the Mash)
He tried to clear his cache
(he did the Mash)
He did the Healthcare mash

Hundreds of millions of dollars were spent
for a website that has trouble loading
How could the government’s web designers
create a site with such awful coding?

(they did the Mash)
Ahh, they did the Healthcare Mash
(the Healthcare Mash)
it was a keyboard smash
(they did the Mash)
they spent all of our cash
(they did the Mash)
They did the Healthcare Mash

October 29, 2013

Obamacare’s technical issues

Filed under: Government, Technology, USA — Nicholas @ 07:48

A comment at Marginal Revolution deservedly has been promoted to being a guest post, discussing the scale of the problems with the Obamacare software:

The real problems are with the back end of the software. When you try to get a quote for health insurance, the system has to connect to computers at the IRS, the VA, Medicaid/CHIP, various state agencies, Treasury, and HHS. They also have to connect to all the health plan carriers to get pre-subsidy pricing. All of these queries receive data that is then fed into the online calculator to give you a price. If any of these queries fails, the whole transaction fails.

Most of these systems are old legacy systems with their own unique data formats. Some have been around since the 1960s, and the people who wrote the code that runs on them are long gone. If one of these old crappy systems takes too long to respond, the transaction times out.
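The all-or-nothing fan-out the commenter describes can be sketched in a few lines: the quote is assembled from several concurrent back-end lookups, and if any one of them errors out or exceeds its deadline, the whole transaction fails. The agency names and stub lookup functions below are illustrative, not the real healthcare.gov services:

```python
from concurrent.futures import ThreadPoolExecutor

def fan_out_quote(lookups, timeout_s=2.0):
    """Run every back-end lookup concurrently; fail the whole quote if any fails."""
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(fn) for name, fn in lookups.items()}
        results = {}
        for name, fut in futures.items():
            try:
                results[name] = fut.result(timeout=timeout_s)
            except Exception as exc:  # slow legacy system, bad data, outage...
                raise RuntimeError(f"quote failed: {name} backend error") from exc
        return results

# Illustrative stand-ins for the IRS, Medicaid, and carrier-pricing queries:
lookups = {
    "irs_income":   lambda: 52000,
    "medicaid":     lambda: "not eligible",
    "carrier_rate": lambda: 417.50,
}
print(fan_out_quote(lookups))  # all succeed -> a complete quote
```

The structural problem is visible in the `for` loop: success requires every future to resolve in time, so the portal's reliability is the product of the reliability of each legacy system behind it.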

[…]

When you even contemplate bringing an old legacy system into a large-scale web project, you should do load testing on that system as part of the feasibility process before you ever write a line of production code, because if those old servers can’t handle the load, your whole project is dead in the water if you are forced to rely on them. There are no easy fixes for the fact that a 30-year-old mainframe cannot handle thousands of simultaneous queries. And upgrading all the back-end systems is a bigger job than the web site itself. Some of those systems are still there because attempts to upgrade them failed in the past. Too much legacy software, too many other co-reliant systems, etc. So if they aren’t going to handle the job, you need a completely different design for your public portal.
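The feasibility-stage load test the commenter calls for doesn't need to be elaborate. A minimal sketch: hammer the service with concurrent queries and measure what fraction completes within the deadline. Here `legacy_query` is a stand-in that just sleeps; in a real test it would call the actual back end:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def legacy_query(i):
    time.sleep(0.01)  # pretend the mainframe takes 10 ms per lookup
    return i

def load_test(n_requests, n_workers, deadline_s):
    """Return the fraction of requests that finish within deadline_s."""
    def timed(i):
        start = time.monotonic()
        legacy_query(i)
        return time.monotonic() - start <= deadline_s
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        outcomes = list(pool.map(timed, range(n_requests)))
    return sum(outcomes) / n_requests

# If this number is well below 1.0 at realistic concurrency, the portal
# design has to change before a line of production code is written.
print(f"success rate: {load_test(200, 50, deadline_s=0.5):.0%}")
```

The point of running this before development, not after, is exactly the commenter's: a failing number here invalidates the whole architecture, not just one component.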

A lot of focus has been on the front-end code, because that’s the code that we can inspect, and it’s the code that lots of amateur web programmers are familiar with, so everyone’s got an opinion. And sure, it’s horribly written in many places. But in systems like this the problems that keep you up at night are almost always in the back-end integration.

The root problem was horrific management. The end result is a system built incorrectly and shipped without doing the kind of testing that sound engineering practices call for. These aren’t ‘mistakes’, they are the result of gross negligence, ignorance, and the violation of engineering best practices at just about every step of the way.

October 28, 2013

Mark Steyn on the Obamacare software

Filed under: Bureaucracy, Cancon, Government, Technology — Nicholas @ 07:22

Mark Steyn’s weekend column touched on some items of interest to aficionados of past government software fiascos:

The witness who coughed up the intriguing tidbit about Obamacare’s exemption from privacy protections was one Cheryl Campbell of something called CGI. This rang a vague bell with me. CGI is not a creative free spirit from Jersey City with an impressive mastery of Twitter, but a Canadian corporate behemoth. Indeed, CGI is so Canadian their name is French: Conseillers en Gestion et Informatique. Their most famous government project was for the Canadian Firearms Registry. The registry was estimated to cost in total $119 million, which would be offset by $117 million in fees. That’s a net cost of $2 million. Instead, by 2004 the CBC (Canada’s PBS) was reporting costs of some $2 billion — or a thousand times more expensive.

Yeah, yeah, I know, we’ve all had bathroom remodelers like that. But in this case the database had to register some 7 million long guns belonging to some two-and-a-half to three million Canadians. That works out to almost $300 per gun — or somewhat higher than the original estimate for processing a firearm registration of $4.60. Of those $300 gun registrations, Canada’s auditor general reported to parliament that much of the information was either duplicated or wrong in respect to basic information such as names and addresses.

Sound familiar?

Also, there was a 1-800 number, but it wasn’t any use.

Sound familiar?

So it was decided that the sclerotic database needed to be improved.

Sound familiar?

But it proved impossible to “improve” CFIS (the Canadian Firearms Information System). So CGI was hired to create an entirely new CFIS II, which would operate alongside CFIS I until the old system could be scrapped. CFIS II was supposed to go operational on January 9, 2003, but the January date got postponed to June, and 2003 to 2004, and $81 million was thrown at it before a new Conservative government scrapped the fiasco in 2007. Last year, the government of Ontario canceled another CGI registry that never saw the light of day — just for one disease, diabetes, and costing a mere $46 million.

But there’s always America! “We continue to view U.S. federal government as a significant growth opportunity,” declared CGI’s chief exec, in what would also make a fine epitaph for the republic. Pizza and Mountain Dew isn’t very Montreal, and on the evidence of three years of missed deadlines in Ontario and the four-year overrun on the firearms database CGI don’t sound like they’re pulling that many all-nighters. Was the government of the United States aware that CGI had been fired by the government of Canada and the government of Ontario (and the government of New Brunswick)? Nobody’s saying. But I doubt it would make much difference.

October 25, 2013

QotD: The dangers of reading internet comments

Filed under: Humour, Media, Quotations — Nicholas @ 00:01

I joke — hilariously — but there is a serious issue here. At least, I assume there is. Frankly, I can’t remember, because I made the mistake of scrolling down to the reader comments about the visa story. Reading online comments is like letting someone punch your brain in the face with a fistful of stupid. If you doubt this, consider that I’ve been hit with the “fist of stupid” so many times, I now think brains have faces. Kudos, Internet.

Scott Feschuk, “Mexico is ‘really mad’ at us, and it is so a big whoop: Diplomacy should be more like ‘Mean Girls’”, Maclean’s, 2013-09-20

October 11, 2013

Creating an “air gap” for computer security

Filed under: Liberty, Technology — Nicholas @ 12:13

Bruce Schneier explains why you’d want to do this … and how much of a pain it can be to set up and work with:

Since I started working with Snowden’s documents, I have been using a number of tools to try to stay secure from the NSA. The advice I shared included using Tor, preferring certain cryptography over others, and using public-domain encryption wherever possible.

I also recommended using an air gap, which physically isolates a computer or local network of computers from the Internet. (The name comes from the literal gap of air between the computer and the Internet; the word predates wireless networks.)

But this is more complicated than it sounds, and requires explanation.

Since we know that computers connected to the Internet are vulnerable to outside hacking, an air gap should protect against those attacks. There are a lot of systems that use — or should use — air gaps: classified military networks, nuclear power plant controls, medical equipment, avionics, and so on.

Osama Bin Laden used one. I hope human rights organizations in repressive countries are doing the same.

Air gaps might be conceptually simple, but they’re hard to maintain in practice. The truth is that nobody wants a computer that never receives files from the Internet and never sends files out into the Internet. What they want is a computer that’s not directly connected to the Internet, albeit with some secure way of moving files on and off.

He also provides a list of ten rules (or recommendations, I guess) you should follow if you want to set up an air-gapped machine of your own.
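One practical habit when ferrying files across an air gap on removable media (an editorial illustration, not one of Schneier's ten rules): hash the file on the connected machine, carry the hash separately, and verify it on the air-gapped side before opening anything. A minimal sketch with Python's standard library:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 16):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

# On the online machine:   h = sha256_of("docs.tar")
# On the air-gapped side:  verify sha256_of("docs.tar") == h before opening
```

This doesn't stop malware that was in the file all along, but it does detect tampering or corruption in transit across the gap.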

October 2, 2013

Bruce Schneier’s TEDx talk “The Battle for Power on the Internet”

Filed under: Media, Technology — Nicholas @ 08:56

Published on 25 Sep 2013

Bruce Schneier gives us a glimpse of the future of the internet, and shares some of the context we should keep in mind, and the insights we need to understand, as we prepare for it. Learn more about Bruce Schneier at https://www.schneier.com and TEDxCambridge at http://www.tedxcambridge.com.


Now we have the real reason for the decline in newspaper revenue

Filed under: Media, USA — Nicholas @ 08:25

If you guessed “the internet” — particularly the internet sites that ate the classified ad business alive — you’re apparently wrong. The real culprit is … an amazingly old-fashioned racist and sexist stereotype:

For years, we’ve talked about the ridiculousness with which many old school journalists want to blame the internet (or, more specifically Google or Craigslist) for the troubles some in the industry have had lately. It is a ridiculous claim. Basically, newspapers have survived for years on a massive inefficiency in information. What newspapers did marginally well was bring together a local community of interest, take their attention, and then sell that attention. What many folks in the news business still can’t come to terms with is the fact that there are tons of other communities of attention out there now, so they can’t slide by on inefficiencies like they did in the past.

Either way, it’s always nice to see some in the industry recognize that blaming the internet is a mistake. However, the alternative culprit chosen by Chris Powell, managing editor of the Journal Inquirer in Connecticut, doesn’t seem much more on target. Powell, who, it appears, actually does have a journalism job (I can’t fathom how or why), published an opinion piece (found via Mark Hamilton and Mathew Ingram) that puts the blame squarely on… single mothers. Okay, not just any single mothers:

    Indeed, newspapers still can sell themselves to traditional households — two-parent families involved with their children, schools, churches, sports, civic groups, and such. But newspapers cannot sell themselves to households headed by single women who have several children by different fathers, survive on welfare stipends, can hardly speak or read English, move every few months to cheat their landlords, barely know what town they’re living in, and couldn’t afford a newspaper subscription even if they could read. And such households constitute a rising share of the population.

September 28, 2013

Google is “fighting stupid with stupid”

Filed under: Business, Law, Technology — Nicholas @ 11:54

In Maclean’s, Jesse Brown looks at the rather dangerous interpretation of how email works in a recent court decision:

Newsflash: Google scans your email! Whether you have a Gmail account or just send email to people who do, Gmail’s bots automatically read your messages, mostly for the purpose of creating targeted advertising. And if you were reading this in 2005, that might seem shocking.

Today, I think most Internet users understand how free webmail works and are okay with it. But a U.S. federal judge has ruled otherwise. Yesterday, U.S. District Judge Lucy H. Koh ruled that Google’s terms of service and privacy policies do not explicitly spell out that Google will “intercept” users’ email (here’s the ruling).

The word “intercept” is crucial here, because it may put Google in the crosshairs of State and Federal anti-wiretapping laws. After Judge Koh’s ruling, a class-action lawsuit against Google can proceed, whose plaintiffs seek remedies for themselves and for class groups including “all U.S. citizen non-Gmail users who have sent a message to a Gmail user and received a reply…”. Like they say in Vegas, go big or go home.

[…]

An algorithm that scans my messages for keywords like “vacation” in order to offer me cheap flights is not by any stretch of the imagination a wiretap.
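What that kind of scan amounts to can be shown in a few lines. This is a toy sketch of a bot matching message text against advertiser keywords — entirely illustrative; Gmail's actual ad-targeting pipeline is far more involved:

```python
# Hypothetical advertiser keywords mapped to ad copy.
ADS = {
    "vacation": "Cheap flights to sunny places!",
    "mortgage": "Refinance today at low rates!",
}

def pick_ads(message):
    """Return the ads whose keyword appears in the message text."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return [ad for keyword, ad in ADS.items() if keyword in words]

print(pick_ads("Planning a vacation next month, any tips?"))
# -> ['Cheap flights to sunny places!']
```

Nothing here resembles a wiretap: no human reads the message, and nothing leaves the pipeline except an ad selection.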

But Google has taken a different tack in their defence. If, they’ve argued, what Gmail does qualifies as interception, then so does all email, since automated processing is needed just to send the stuff, whether or not advertising algorithms or anti-spam filters are in use. This logic can be extended, I suppose, to all data that passes through the Internet.

You might call it fighting stupid with stupid, but I think it’s a bold bluff: rule us illegal, Google warns the court, and be prepared to deem the Internet itself a wiretap violation.
