This was posted to Google+ the other day, and it’s pretty accurate:
December 11, 2013
Yet for Babbage, the biggest innovation of Doom was something subtler. Video games, then and now, are mainly passive entertainment products, a bit like a more interactive television. You buy one and play it until you either beat it or get bored. But Doom was popular enough that eager users delved into its inner workings, hacking together programs that would let people build their own levels. Drawing something in what was, essentially, a rudimentary CAD program, and then running around inside your own creation, was an astonishing, liberating experience. Like almost everybody else, Babbage’s first custom level was an attempt to reconstruct his own house.
Other programs allowed you to play around with the game itself, changing how weapons worked, or how monsters behaved. For a 12-year-old who liked computers but was rather fuzzy about how they actually worked, being able to pull back the curtain like this was revelatory. Tinkering around with Doom was a wonderful introduction to the mysteries of computers and how their programs were put together. Rather than trying to stop this unauthorised meddling, id embraced it. Its next game, Quake, was designed to actively encourage it.
The modification, or “modding” movement that Doom and Quake inspired heavily influenced the growing games industry. Babbage knows people who got jobs in the industry off the back of their ability to remix others’ creations. (Tim Willits, id’s current creative director, was hired after impressing the firm with his home-brewed Doom maps.) Commercial products — even entire genres of games — exist that trace their roots back to a fascinated teenager playing around in his (or, more rarely, her) bedroom.
But it had more personal effects, too. Being able to alter the game transformed the player from a mere passive consumer of media into a producer in his own right, something that is much harder in most other kinds of media. Amateur filmmakers need expensive kit and a willing cast to indulge their passion. Mastering a musical instrument takes years of practice; starting a band requires like-minded friends. Writing a novel looks easy, until you try it. But creating your own Doom mod was easy enough that anyone could learn it in a day or two. With a bit of practice, it was possible to churn out professional-quality stuff. “User-generated content” was a big buzzword a few years back, but once again, Doom got there first.
December 10, 2013
At The Register, Lucy Orr gets all nostalgic for id Software’s Doom, which turned 20 today:
Doom wasn’t short on story, never mind the gore and gunfire to follow. I particularly enjoyed the fact that my own government had fucked things up by messing where they shouldn’t and opening a portal to hell. Damn, it’s just me left to go ultraviolent and push the legions of hell back into fiery limbo.
Faced with dual chain-gun-wielding, bulked-up Aryans as your foe, Wolfenstein 3D was funny rather than scary. Indeed, I don’t remember being scared by a game until Doom appeared, with its engine capable of dimmed, quivering lights and its repugnant textures. The nihilistic tones of Alien 3 echoed through such levels as the toxic refinery. Like the Alien series, Doom’s dark corners let my imagination run wild and had me considering turning the lights back on.
But Doom had a lot more going for it than a few scary moments, and I don’t just mean those scrambles for the health kit. Being able to carry an army’s worth of firepower is not necessarily realistic, but neither are angry alien demons trying to rip my flesh off. I’m never empty-handed with a chainsaw, a shotgun, a chain-gun, and a rocket launcher at my disposal.
With Doom you were not only introduced to a world of cyber demons but death matches — be sure to have the BFG 9000 on hand for that one shot kill — cooperative gameplay and also a world of player mods including maps and sometimes full remakes.
November 20, 2013
November 4, 2013
The fundamental purpose of testing — and, for that matter, of all software quality assurance (QA) deliverables and processes — is to tell you just what you’ve built and whether it does what you think it should do. This is essential, because you can’t inspect a software program the same way you can inspect a house or a car. You can’t touch it, you can’t walk around it, you can’t open the hood or the bedroom door to see what’s inside, you can’t take it out for a spin. There are very few tangible or visible clues to the completeness and reliability of a software system — and so we have to rely on QA activities to tell us how well built the system is.
Furthermore, almost any software system developed nowadays for production is vastly more complex than a house or car — it’s more on the same order of complexity as a large petrochemical processing and storage facility, with thousands of possible interconnections, states, and processes. We would be (rightly) terrified if, say, Exxon built such a sprawling oil refining complex near our neighborhood and then started up production having done only a bare minimum of inspection, testing, and trial operations before, during, and after construction, offering the explanation that they would wait until after the plant went into production and then handle problems as they cropped up. Yet too often that’s just how large software development projects are run, even though the system in development may well be more complex (in terms of connections, processes, and possible states) than such a petrochemical factory. And while most inadequately tested software systems won’t spew pollutants, poison the neighborhood, catch fire, or explode, they can cripple corporate operations, lose vast sums of money, spark shareholder lawsuits, and open the corporation’s directors and officers to civil and even criminal liability (particularly with the advent of Sarbanes-Oxley).
And that presumes that the system can actually go into production. The software engineering literature and the trade press are replete with well-documented case studies of “software runaways”: large IT re-engineering or development projects that consume tens or hundreds of millions of dollars, or in a few spectacular (government) cases, billions of dollars, over a period of years, before grinding to a halt and being terminated without ever having put a usable, working system into production. So it’s important not to skimp on testing and the other QA-related activities.
Bruce F. Webster, “Obamacare and the Testing Gap”, And Still I Persist…, 2013-10-31
October 29, 2013
A comment at Marginal Revolution deservedly has been promoted to being a guest post, discussing the scale of the problems with the Obamacare software:
The real problems are with the back end of the software. When you try to get a quote for health insurance, the system has to connect to computers at the IRS, the VA, Medicaid/CHIP, various state agencies, Treasury, and HHS. They also have to connect to all the health plan carriers to get pre-subsidy pricing. All of these queries receive data that is then fed into the online calculator to give you a price. If any of these queries fails, the whole transaction fails.
Most of these systems are old legacy systems with their own unique data formats. Some have been around since the 1960s, and the people who wrote the code that runs on them are long gone. If one of these old crappy systems takes too long to respond, the transaction times out.
When you even contemplate bringing an old legacy system into a large-scale web project, you should do load testing on that system as part of the feasibility process before you ever write a line of production code, because if those old servers can’t handle the load, your whole project is dead in the water if you are forced to rely on them. There are no easy fixes for the fact that a 30-year-old mainframe cannot handle thousands of simultaneous queries. And upgrading all the back-end systems is a bigger job than the web site itself. Some of those systems are still there because attempts to upgrade them failed in the past. Too much legacy software, too many other co-reliant systems, etc. So if they aren’t going to handle the job, you need a completely different design for your public portal.
A lot of focus has been on the front-end code, because that’s the code that we can inspect, and it’s the code that lots of amateur web programmers are familiar with, so everyone’s got an opinion. And sure, it’s horribly written in many places. But in systems like this the problems that keep you up at night are almost always in the back-end integration.
The root problem was horrific management. The end result is a system built incorrectly and shipped without doing the kind of testing that sound engineering practices call for. These aren’t ‘mistakes’, they are the result of gross negligence, ignorance, and the violation of engineering best practices at just about every step of the way.
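The all-or-nothing fan-out the commenter describes is easy to sketch. Below is a minimal, hypothetical illustration in Python — the back-end names and timings are invented, and this is emphatically not the actual healthcare.gov code — showing why one slow or failing legacy system sinks the entire transaction: every query must succeed before a price can be computed.

```python
import concurrent.futures
import random
import time

# Hypothetical stand-ins for the legacy back ends the commenter lists
# (IRS, VA, Medicaid/CHIP, Treasury, the carriers). Timings are invented;
# a real legacy mainframe might take far longer, or not answer at all.
def query_backend(name):
    time.sleep(random.uniform(0.05, 0.2))
    return {"source": name, "data": f"record-from-{name}"}

def price_quote(backends, timeout_s=2.0):
    """Fan out one query per back end; if ANY query raises, or the slowest
    one blows the deadline, the whole transaction fails."""
    results = {}
    with concurrent.futures.ThreadPoolExecutor(max_workers=len(backends)) as pool:
        futures = [pool.submit(query_backend, b) for b in backends]
        try:
            for fut in concurrent.futures.as_completed(futures, timeout=timeout_s):
                r = fut.result()          # re-raises any back-end exception
                results[r["source"]] = r["data"]
        except concurrent.futures.TimeoutError:
            return None                   # one slow legacy system kills the quote
    return results

quote = price_quote(["IRS", "VA", "Medicaid", "Treasury", "carrier"])
```

When a portal is forced to rely on back ends like these, the usual mitigations are caching, queueing, or pre-fetching the legacy data so that no user-facing request ever blocks on a thirty-year-old mainframe — which is exactly the “completely different design” the commenter says was needed.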
October 28, 2013
Mark Steyn’s weekend column touched on some items of interest to aficionados of past government software fiascos:
The witness who coughed up the intriguing tidbit about Obamacare’s exemption from privacy protections was one Cheryl Campbell of something called CGI. This rang a vague bell with me. CGI is not a creative free spirit from Jersey City with an impressive mastery of Twitter, but a Canadian corporate behemoth. Indeed, CGI is so Canadian their name is French: Conseillers en Gestion et Informatique. Their most famous government project was for the Canadian Firearms Registry. The registry was estimated to cost in total $119 million, which would be offset by $117 million in fees. That’s a net cost of $2 million. Instead, by 2004 the CBC (Canada’s PBS) was reporting costs of some $2 billion — or a thousand times more expensive.
Yeah, yeah, I know, we’ve all had bathroom remodelers like that. But in this case the database had to register some 7 million long guns belonging to some two-and-a-half to three million Canadians. That works out to almost $300 per gun — or somewhat higher than the original estimate for processing a firearm registration of $4.60. Of those $300 gun registrations, Canada’s auditor general reported to parliament that much of the information was either duplicated or wrong with respect to basic information such as names and addresses.
Also, there was a 1-800 number, but it wasn’t any use.
So it was decided that the sclerotic database needed to be improved.
But it proved impossible to “improve” CFIS (the Canadian Firearms Information System). So CGI was hired to create an entirely new CFIS II, which would operate alongside CFIS I until the old system could be scrapped. CFIS II was supposed to go operational on January 9, 2003, but the January date got postponed to June, and 2003 to 2004, and $81 million was thrown at it before a new Conservative government scrapped the fiasco in 2007. Last year, the government of Ontario canceled another CGI registry that never saw the light of day — just for one disease, diabetes, and costing a mere $46 million.
But there’s always America! “We continue to view U.S. federal government as a significant growth opportunity,” declared CGI’s chief exec, in what would also make a fine epitaph for the republic. Pizza and Mountain Dew isn’t very Montreal, and on the evidence of three years of missed deadlines in Ontario and the four-year overrun on the firearms database CGI don’t sound like they’re pulling that many all-nighters. Was the government of the United States aware that CGI had been fired by the government of Canada and the government of Ontario (and the government of New Brunswick)? Nobody’s saying. But I doubt it would make much difference.
October 25, 2013
Virginia Postrel on the hubris of the Obamacare project team:
The HealthCare.gov website is a disaster — symbolic to Obamacare opponents, disheartening to supporters, and incredibly frustrating to people who just need to buy insurance. Some computer experts are saying the only way to save the system is to scrap the current bloated code and start over.
Looking back, it seems crazy that neither the Barack Obama administration nor the public was prepared for the startup difficulties. There’s no shortage of database experts willing to opine on the complexities of the problem. Plenty of companies have nightmarish stories to tell about much simpler software projects. And reporting by the New York Times finds that the people involved with the system knew months ago that it was in serious trouble. “We foresee a train wreck,” one said back in February.
So why didn’t the administration realize that integrating a bunch of incompatible government databases into a seamless system with an interface just about anyone could understand was a really, really hard problem? Why was even the president seemingly taken by surprise when the system didn’t work like it might in the movies?
We have become seduced by computer glamour.
Whether it’s a television detective instantly checking a database of fingerprints or the ease of Amazon.com’s “1-Click” button, we imagine that software is a kind of magic — all the more so if it’s software we’ve never actually experienced. We expect it to be effortless. We don’t think about how it got there or what its limitations might be. Instead of imagining future technologies as works in progress, improving over time, we picture them as perfect from day one.
October 13, 2013
As a writer, Charles Stross hates, hates, hates, hates, hates Microsoft Word and wants it to DIE:
Microsoft Word is a tyrant of the imagination, a petty, unimaginative, inconsistent dictator that is ill-suited to any creative writer’s use. Worse: it is a near-monopolist, dominating the word processing field. Its pervasive near-monopoly status has brainwashed software developers to such an extent that few can imagine a word processing tool that exists as anything other than a shallow imitation of the Redmond Behemoth. But what exactly is wrong with it?
I’ve been using word processors and text editors for nearly 30 years. There was an era before Microsoft Word’s dominance when a variety of radically different paradigms for text preparation and formatting competed in an open marketplace of ideas. One early and particularly effective combination was the idea of a text file, containing embedded commands or macros, that could be edited with a programmer’s text editor (such as ed or teco or, later, vi or emacs) and subsequently fed to a variety of tools: offline spelling checkers, grammar checkers, and formatters like scribe, troff, and latex that produced a binary page image that could be downloaded to a printer.
These tools were fast, powerful, elegant, and extremely demanding of the user. As the first 8-bit personal computers appeared (largely consisting of the Apple II and the rival CP/M ecosystem), programmers tried to develop a hybrid tool called a word processor: a screen-oriented editor that hid the complex and hostile printer control commands from the author, replacing them with visible highlight characters on screen and revealing them only when the user told the program to “reveal codes”. Programs like WordStar led the way, until WordPerfect took the market in the early 1980s by adding the ability to edit two or more files at the same time in a split screen view.
October 5, 2013
Those of us who are non-believing heathens might prefer porting our minds to robot bodies before the natural expiration date on our organic selves. It’s hard to wrap your head around the idea that a digital representation of your mind, no matter how accurate, is still “you” in some sense. But I think that fear will go away as soon as we see the first robot that thinks and acts exactly like Uncle Bob did before he made the jump. If Uncle Bob the robot acts human enough, we’ll come to see him as the same entity that once inhabited an organic shell. When technology is sufficiently advanced, we’ll get past the magical thinking about spirits and souls and the specialness of having organic parts.
To me, the most interesting possibility for the future involves porting human minds to software that includes entirely simulated realities. Such a program — a digital mind if you will — could live in an entirely artificial reality and experience what seems to be a genuine human life for the rest of eternity, or at least as long as the software keeps running. The freaky part is that if such a thing will someday be possible — and I think it will — then it follows that the time after it happens will be infinitely long whereas the history of time before it happens is finite. So it follows that there is an infinitely greater chance you are already the simulation and not a human who is reading this paragraph and contemplating it. Weird.
If you didn’t already have enough reasons to eat right, exercise, and keep your mind sharp, consider what you might be bringing to your own immortality. I was hoping to get there before the dementia sets in. But I just reread what I wrote and apparently I’m already too late.
September 6, 2013
I moderate a few special interest groups on Yahoo Groups, and I’m subscribed to a couple of dozen others. There’s nothing flashy or exciting about the service: it’s been relatively stable for years, with few changes or disruptions. For most users, this has been ideal. This week Yahoo not only introduced a new logo, they also tossed a stink bomb into the placid Yahoo Groups with a new user interface called “Neo”. They apparently rolled out the changes to a few groups last month, but most users and list owner/moderators hadn’t been given any notice that the change was coming. The Register‘s Kelly Fiveash on the diabolical scheme to annoy long-term users of Yahoo Groups:
‘WTF! MORONS!’ Yahoo! Groups! redesign! traumatises! users!
‘Vile, unfriendly interface’ attacked by world+dog. But format stays
Yahoo! has told thousands of users who are complaining about the Purple Palace’s pisspoor redesign of its Groups service that it will not be rolled back to the old format — despite a huge outcry.
The Marissa Mayer-run company revamped Yahoo! Groups last week, but it was immediately inundated with unhappy netizens who grumbled that the overhaul was glitchy, difficult to navigate and “severely degraded”.
In response, Yahoo! told its users:
We deeply value how much you, our users, care about Yahoo! Groups … we launched our first update to the Groups experience in several years and while these changes are an important step to building a more modern Groups experience, we recognise that this is a considerable change.
We are listening to all of the community feedback and we are actively measuring user feedback so we can continuously make improvements.
But the complaints have continued to flood in since Yahoo! made the tweak by changing its “classic” (read: ancient) interface to one dubbed “neo” that appeared to have been quickly spewed on to the interwebs with little testing before going live.
And — while the company claimed it was listening closely to its users about the new look Yahoo! Groups — it has ignored pleas from thousands of people who want it to reverse the update.
The new design has completely befuddled many users who access Yahoo Groups through the website, hiding functions (and even group names) and making it far more difficult to search for older posts (you reportedly have to search by message number: no other searches are supported). Even for those who only receive email updates, the Neo redesign brought odd and sometimes completely unreadable email formatting, broken links, and other highly irritating issues.
This is the real problem with “free” services: when things go wrong, as a user of the service, you don’t have much leverage to complain or to get things fixed.
August 29, 2013
A major new patent bill, passed in a 117-4 vote by New Zealand’s Parliament after five years of debate, has banned software patents.
The relevant clause of the patent bill actually states that a computer program is “not an invention.” Some have suggested that was a way to get around the wording of the TRIPS intellectual property treaty, which requires patents to be “available for any inventions, whether products or processes, in all fields of technology.”
Processes will still be patentable if the computer program is merely a way of implementing a patentable process. But patent claims that cover computer programs “as such” will not be allowed.
It seems there will be some leeway for computer programs directly tied to improved hardware. The bill includes the example of a better washing machine. Even if the improvements are implemented with a computer program, “the actual contribution is a new and improved way of operating a washing machine that gets clothes cleaner and uses less electricity,” so a patent could be awarded.
July 29, 2013
Wired‘s Ryan Tate sat down to talk to Phil Libin of Evernote:
Evernote is known for its eponymous note-taking app, a seemingly modest piece of software that has brought in a heap of money. Evernote has topped 10 million downloads in the iOS and Android app stores and accumulated more than 65 million users across its mobile, web, and desktop versions.
CEO and serial tech entrepreneur Phil Libin used to bristle when people would refer to Evernote as a digital notebook. He sees the product as an extension of the mind, albeit one that’s only about 5 percent complete. These days, though, he’s learned to embrace the pigeonholing. After all, it was humble note-takers who brought Redwood City, California-based Evernote to profitability in 2011 by upgrading en masse to a premium version that includes optical character recognition (handy for pictures of business cards and receipts) and collaborative note editing (great for workgroups).
This year, Evernote is in the red again as the company scales up to reach Libin’s bigger ambition — becoming something like Microsoft Office for mobile devices. Or, as Libin put it in an hourlong interview with WIRED, “like Nike for your mind.”
Evernote’s staff of 330 is divided into teams of no more than eight members — small enough, as Libin sees it, to sit around a dinner table and have a single conversation. No team project can last more than nine months, and none of the teams share any code, which is something close to sacrilege among the software priests of Silicon Valley. One recent sunny Friday, while programmers behind him raced to rewrite the iPhone and iPad versions of Evernote from scratch, we pelted Libin with questions about the past, present, and future of his company.
July 19, 2013
I use several web browsers every day, including Firefox, Chrome, and even Internet Explorer. I also use Opera for some tasks, and I was less than happy to find out that the most recent release of the browser is a major step back in functionality. I’m clearly not the only disappointed Opera fan:
After 9 years of using Opera almost exclusively, I finally saw the developers betray their own user base. Curse you, Opera 15… ~Frozen
— Dragon Season (@DragonSeasonCom) July 19, 2013
Seriously, who removes useful, popular and working features with no bugs and almost no maintenance needs from their major releases? ~Frozen
— Dragon Season (@DragonSeasonCom) July 19, 2013
Tab pinning, tab thumbnails and tab resizing, some mouse gestures, Opera Mail, bookmarks (not kidding), search box… @Paeroka ~Frozen
— Dragon Season (@DragonSeasonCom) July 19, 2013
@Paeroka When you have a small market share, the obvious best move is to alienate them! ~Frozen
— Dragon Season (@DragonSeasonCom) July 19, 2013
June 15, 2013
Ronald Bailey gathers up some resources you might want to investigate if you’d prefer not to have the NSA or other government agencies watching your online activities:
First, consider not putting so much stuff out there in the first place. Wuergler devised a program he calls Stalker that can siphon off nearly all of your digital information to put together an amazingly complete portrait of your life and pretty much find out where you are at all times. Use Facebook if you must, but realize you’re making it easy for the government to track and find you when they choose to do so.
A second step toward increased privacy is to use a search engine like DuckDuckGo, which does not collect the sort of information — say, your IP address — that can identify you with your Internet searches. Thus, if the government bangs on their doors to find out what you’ve been up to, DuckDuckGo has nothing to hand over. I have decided to make DuckDuckGo my default for general browsing, turning to Google only for items such as breaking news and scholarly articles. (Presumably, the NSA would be able to tap into my searches on DuckDuckGo in real time.)
Third, TOR offers free software and a network of relays that can shield your location from prying eyes. TOR operates by bouncing your emails and files around the Internet through encrypted relays. Anyone intercepting your message once it exits a TOR relay cannot trace it back to your computer and your physical location. TOR is used by dissidents and journalists around the world. On the downside, in my experience it operates more slowly than, say, Google.
Fourth, there is encryption. An intriguing one-stop encryption solution is Silent Circle. Developed by Phil Zimmermann, the inventor of the Pretty Good Privacy encryption system, Silent Circle enables users to encrypt their text messages, video, and phone calls, as well as their emails. Zimmermann and his colleagues claim that neither they nor anyone else can decrypt messages crossing their network, period. As Wuergler warned, this security doesn’t come free. Silent Circle charges $10 per month for its encryption services.
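The relay scheme Bailey describes in his third point is onion routing: the sender wraps the message in one layer of encryption per relay, and each relay peels exactly one layer. The toy sketch below illustrates just the layering idea — it uses a hash-derived XOR keystream as a stand-in for real cryptography, and the relay names and keys are invented. Tor itself uses proper ciphers, key exchange, and circuit construction; do not use anything like this for actual security.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Stretch a key into n pseudo-random bytes (toy stand-in for a real cipher)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor_layer(data: bytes, key: bytes) -> bytes:
    """XOR with a keystream is its own inverse: the same call adds or peels a layer."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The sender wraps the message once per relay, innermost (exit) layer first...
relay_keys = [b"guard-key", b"middle-key", b"exit-key"]   # invented names
message = b"hello from a dissident"
cell = message
for key in reversed(relay_keys):
    cell = xor_layer(cell, key)

# ...and each relay in turn peels exactly one layer. Only the exit relay sees
# the plaintext, and no single relay knows both the sender and the destination.
for key in relay_keys:
    cell = xor_layer(cell, key)
```

The point of the layering is in that last comment: an eavesdropper intercepting the cell between any two relays sees only ciphertext, which is why a message exiting the network can’t be traced back to your machine.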
However, your mobile phone is a beacon that can’t be easily masked or hidden:
Now for some bad news. Telephone metadata of the sort the NSA acquired from Verizon is hard — read: impossible — to hide. As the ACLU’s Soghoian notes, you can’t violate the laws of physics: In order to connect your mobile phone, the phone company necessarily needs to know where you are located. Of course, you can avoid being tracked through your cell phone by removing its batteries (unless you have an iPhone), but once you slot it back in, there you are.
For lots more information on how you might be able to baffle government monitoring agencies, check out the Electronic Frontier Foundation’s Surveillance Self-Defense Web pages.