Quotulatiousness

April 11, 2014

Open source software and the Heartbleed bug

Filed under: Technology — Nicholas Russon @ 07:03

Some people are claiming that the Heartbleed bug proves that open source software is a failure. ESR quickly addresses that idiotic claim:

I actually chuckled when I read the rumor that the few anti-open-source advocates still standing were crowing about the Heartbleed bug, because I’ve seen this movie before after every serious security flap in an open-source tool. The script, which includes a bunch of people indignantly exclaiming that many-eyeballs is useless because bug X lurked in a dusty corner for Y months, is so predictable that I can anticipate a lot of the lines.

The mistake being made here is a classic example of Frederic Bastiat’s “things seen versus things unseen”. Critics of Linus’s Law overweight the bug they can see and underweight the high probability that equivalently positioned closed-source security flaws they can’t see are actually far worse, just so far undiscovered.

That’s how it seems to go whenever we get a hint of the defect rate inside closed-source blobs, anyway. As a very pertinent example, in the last couple months I’ve learned some things about the security-defect density in proprietary firmware on residential and small business Internet routers that would absolutely curl your hair. It’s far, far worse than most people understand out there.

[...]

Ironically enough this will happen precisely because the open-source process is working … while, elsewhere, bugs that are far worse lurk in closed-source router firmware. Things seen vs. things unseen…

Returning to Heartbleed, one thing conspicuously missing from the downshouting against OpenSSL is any pointer to an implementation that is known to have a lower defect rate over time. This is for the very good reason that no such empirically-better implementation exists. What is the defect history on proprietary SSL/TLS blobs out there? We don’t know; the vendors aren’t saying. And we can’t even estimate the quality of their code, because we can’t audit it.

The response to the Heartbleed bug illustrates another huge advantage of open source: how rapidly we can push fixes. The repair for my Linux systems was a push-one-button fix less than two days after the bug hit the news. Proprietary-software customers will be lucky to see a fix within two months, and all too many of them will never see a fix patch.

Update: There are lots of sites offering tools to test whether a given site is vulnerable to the Heartbleed bug, but you need to step carefully there, as there’s a thin line between what’s legal in some countries and what counts as an illegal break-in attempt:

Websites and tools that have sprung up to check whether servers are vulnerable to OpenSSL’s mega-vulnerability Heartbleed have thrown up anomalies in computer crime law on both sides of the Atlantic.

Both the US Computer Fraud and Abuse Act and its UK equivalent the Computer Misuse Act make it an offence to test the security of third-party websites without permission.

Testing to see what version of OpenSSL a site is running, and whether it also supports the vulnerable Heartbeat protocol, would be legal. But doing anything more active — without permission from website owners — would take security researchers onto the wrong side of the law.
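For the curious, the “legal” end of that spectrum is pretty tame. Here is a minimal sketch in Python that only inspects what a server volunteers in its HTTP Server header (many Apache builds advertise their OpenSSL version there; plenty of servers expose nothing at all, and the URL is just a placeholder):

# passive_check.py - a sketch of the passive, "legal" end of Heartbleed checking:
# read the HTTP Server header and note any advertised OpenSSL version.
# Assumes the target voluntarily exposes its version string; many do not.
import urllib.request

def advertised_openssl(url="https://example.com/"):
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req, timeout=10) as resp:
        server = resp.headers.get("Server", "")
    # e.g. "Apache/2.2.22 (Ubuntu) OpenSSL/1.0.1" -- 1.0.1 through 1.0.1f were vulnerable
    for token in server.split():
        if token.startswith("OpenSSL/"):
            return token.split("/", 1)[1]
    return None

if __name__ == "__main__":
    print(advertised_openssl() or "No OpenSSL version advertised")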

And you shouldn’t just rush out and change all your passwords right now (you’ll probably need to do it, but the timing matters):

Heartbleed is a catastrophic bug in widely used OpenSSL that creates a means for attackers to lift passwords, crypto-keys and other sensitive data from the memory of secure server software, 64KB at a time. The mega-vulnerability was patched earlier this week, and software should be updated to use the new version, 1.0.1g. But to fully clean up the problem, admins of at-risk servers should generate new public-private key pairs, destroy their session cookies, and update their SSL certificates before telling users to change every potentially compromised password on the vulnerable systems.
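The “generate new key pairs” step is mundane command-line work. A rough sketch, wrapping the stock openssl tool from Python; the file names and certificate subject below are placeholders, not anyone’s recommended procedure:

# rekey.py - rough sketch of the post-patch cleanup described above: generate a
# fresh private key and a certificate signing request so the old, possibly-leaked
# key can be retired. Paths and subject string are placeholders.
import subprocess

def regenerate_key(key_path="server.key.new", csr_path="server.csr.new",
                   subject="/C=CA/CN=www.example.com"):
    # New 2048-bit RSA private key (never reuse the pre-Heartbleed key)
    subprocess.run(["openssl", "genrsa", "-out", key_path, "2048"], check=True)
    # CSR to send to the CA for a reissued certificate
    subprocess.run(["openssl", "req", "-new", "-key", key_path,
                    "-out", csr_path, "-subj", subject], check=True)

if __name__ == "__main__":
    regenerate_key()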

April 3, 2014

ESR reviews Jeremy Rifkin’s latest book

Filed under: Economics, Media, Technology — Nicholas Russon @ 10:46

The publisher sent a copy of The Zero Marginal Cost Society along with a note that Rifkin himself wanted ESR to receive a copy (because Rifkin thinks ESR is a good representative of some of the concepts in the book). ESR isn’t impressed:

In this book, Rifkin is fascinated by the phenomenon of goods for which the marginal cost of production is zero, or so close to zero that it can be ignored. All of the present-day examples of these he points at are information goods — software, music, visual art, novels. He joins this to the overarching obsession of all his books, which are variations on a theme of “Let us write an epitaph for capitalism”.

In doing so, Rifkin effectively ignores what capitalists do and what capitalism actually is. “Capital” is wealth paying for setup costs. Even for pure information goods those costs can be quite high. Music is a good example; it has zero marginal cost to reproduce, but the first copy is expensive. Musicians must own expensive instruments, be paid to perform, and require other capital goods such as recording studios. If those setup costs are not reliably priced into the final good, production of music will not remain economically viable.

[...]

Rifkin cites me in his book, but it is evident that he almost completely misunderstood my arguments in two different ways, both of which bear on the premises of his book.

First, software has a marginal cost of production that is effectively zero, but that’s true of all software rather than just open source. What makes open source economically viable is the strength of secondary markets in support and related services. Most other kinds of information goods don’t have these. Thus, the economics favoring open source in software are not universal even in pure information goods.

Second, even in software — with those strong secondary markets — open-source development relies on the capital goods of software production being cheap. When computers were expensive, the economics of mass industrialization and its centralized management structures ruled them. Rifkin acknowledges that this is true of a wide variety of goods, but never actually grapples with the question of how to pull capital costs of those other goods down to the point where they no longer dominate marginal costs.

There are two other, much larger, holes below the waterline of Rifkin’s thesis. One is that atoms are heavy. The other is that human attention doesn’t get cheaper as you buy more of it. In fact, the opposite tends to be true — which is exactly why capitalists can make a lot of money by substituting capital goods for labor.

These are very stubborn cost drivers. They’re the reason Rifkin’s breathless hopes for 3-D printing will not be fulfilled. Because 3-D printers require feedstock, the marginal cost of producing goods with them has a floor well above zero. That ABS plastic, or whatever, has to be produced. Then it has to be moved to where the printer is. Then somebody has to operate the printer. Then the finished good has to be moved to the point of use. None of these operations has a cost that is driven to zero, or near zero at scale. 3-D printing can increase efficiency by outcompeting some kinds of mass production, but it can’t make production costs go away.
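ESR’s cost-floor point is easy to make concrete with a toy model. All of the numbers below are invented purely for illustration:

# marginal_cost.py - toy numbers (entirely made up) illustrating why a 3-D printed
# good has a marginal cost floor well above zero, unlike a pure information good.
def unit_cost(feedstock_kg, feedstock_price_per_kg,
              print_hours, operator_rate, inbound_freight, outbound_freight):
    return (feedstock_kg * feedstock_price_per_kg
            + print_hours * operator_rate
            + inbound_freight + outbound_freight)

# A hypothetical ABS widget: none of these terms vanishes at scale.
print(unit_cost(feedstock_kg=0.3, feedstock_price_per_kg=25.0,
                print_hours=2.0, operator_rate=20.0,
                inbound_freight=1.50, outbound_freight=4.00))   # ~53 dollars per unit
# Copying a file, by contrast, costs effectively nothing per extra copy.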

March 25, 2014

Tech culture and ageism

Filed under: Business, Technology, USA — Nicholas Russon @ 07:56

Noam Scheiber examines the fanatic devotion to youth in (some parts of) the high tech culture:

Silicon Valley has become one of the most ageist places in America. Tech luminaries who otherwise pride themselves on their dedication to meritocracy don’t think twice about deriding the not-actually-old. “Young people are just smarter,” Facebook CEO Mark Zuckerberg told an audience at Stanford back in 2007. As I write, the website of ServiceNow, a large Santa Clara–based I.T. services company, features the following advisory in large letters atop its “careers” page: “We Want People Who Have Their Best Work Ahead of Them, Not Behind Them.”

And that’s just what gets said in public. An engineer in his forties recently told me about meeting a tech CEO who was trying to acquire his company. “You must be the token graybeard,” said the CEO, who was in his late twenties or early thirties. “I looked at him and said, ‘No, I’m the token grown-up.’”

Investors have also become addicted to the youth movement:

The economics of the V.C. industry help explain why. Investing in new companies is fantastically risky, and even the best V.C.s fail a large majority of the time. That makes it essential for the returns on successes to be enormous. Whereas a 500 percent return on a $2 million investment (or “5x,” as it’s known) would be considered remarkable in any other line of work, the investments that sustain a large V.C. fund are the “unicorns” and “super-unicorns” that return 100x or 1,000x — the Googles and the Facebooks.

And this is where finance meets what might charitably be called sociology but is really just Silicon Valley mysticism. Finding themselves in the position of chasing 100x or 1,000x returns, V.C.s invariably tell themselves a story about youngsters. “One of the reasons they collectively prefer youth is because youth has the potential for the black swan,” one V.C. told me of his competitors. “It hasn’t been marked down to reality yet. If I was at Google for five years, what’s the chance I would be a black swan? A lot lower than if you never heard of me. That’s the collective mentality.”
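The arithmetic behind that mentality is stark. Here’s a toy sketch with invented numbers: a $200 million fund spread across 100 companies, where even a healthy crop of 5x exits barely returns the fund without a unicorn:

# vc_math.py - invented numbers showing why a big fund "needs" unicorns:
# with most bets going to zero, ordinary 5x wins barely move the needle.
investments = 100                        # companies backed
stake = 2_000_000                        # dollars per company
fund_size = investments * stake          # $200M deployed

zeros   = 80 * 0                         # 80 total write-offs
fives   = 19 * (5 * stake)               # 19 respectable 5x exits  -> $190M
unicorn = 1 * (100 * stake)              # 1 unicorn at 100x        -> $200M

print(f"Without the unicorn: {(zeros + fives) / fund_size:.2f}x the fund")   # 0.95x
print(f"With the unicorn:    {(zeros + fives + unicorn) / fund_size:.2f}x")  # 1.95x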

Some of the corporate cultures sound more like playgroups than workgroups:

Whatever the case, the veneration of youth in Silicon Valley now seems way out of proportion to its usefulness. Take Dropbox, which an MIT alumnus named Drew Houston co-founded in 2007, after he got tired of losing access to his files whenever he forgot a thumb drive. Dropbox quickly caught on among users and began to vacuum up piles of venture capital. But the company has never quite outgrown its dorm-room vibe, even now that it houses hundreds of employees in an 85,000-square-foot space. Dropbox has a full-service jamming studio and observes a weekly ritual known as whiskey Fridays. Job candidates have complained about being interviewed in conference rooms with names like “The Break-up Room” and the “Bromance Chamber.” (A spokesman says the names were recently changed.)

Once a year, Houston, who still wears his chunky MIT class ring, presides over “Hack Week,” during which Dropbox headquarters turns into the world’s best-capitalized rumpus room. Employees ride around on skateboards and scooters, play with Legos at all hours, and generally tool around with whatever happens to interest them, other than work, which they are encouraged to set aside. “I’ve been up for about forty hours working on Dropbox Jeopardy,” one engineer told a documentarian who filmed a recent Hack Week. “It’s close to nearing insanity, but it feels worth it.”

It’s safe to say that the reigning sensibility at Dropbox has conquered more or less every corner of the tech world. The ping-pong playing can be ceaseless. The sexual mores are imported from college—“They’ll say something like, ‘This has been such a long day. I have to go out and meet some girls, hook up tonight,’ ” says one fortysomething consultant to several start-ups. And the vernacular is steroidally bro-ish. Another engineer in his forties who recently worked at a crowdsourcing company would steel himself anytime he reviewed a colleague’s work. “In programming, you need a throw-away variable,” the engineer explained to me. “So you come up with something quick.” With his co-workers “it would always be ‘dong’ this, ‘dick’ that, ‘balls’ this.”

There’s also the blind spot about having too many youth-focussed firms in the same market:

The most common advice V.C.s give entrepreneurs is to solve a problem they encounter in their daily lives. Unfortunately, the problems the average 22-year-old male programmer has experienced are all about being an affluent single guy in Northern California. That’s how we’ve ended up with so many games (Angry Birds, Flappy Bird, Crappy Bird) and all those apps for what one start-up founder described to me as cooler ways to hang out with friends on a Saturday night.

H/T to Kathy Shaidle for the link.

December 11, 2013

I’ve heard all of these responses many, many times

Filed under: Humour, Technology — Nicholas Russon @ 11:08

This was posted to Google+ the other day, and it’s pretty accurate:

[Image: Programmer top 20 replies]

The legacy of id Software’s Doom

Filed under: Gaming, Technology — Nicholas Russon @ 09:10

Following up from yesterday’s post on the 20th anniversary, The Economist also sings the praises of Doom:

Yet for Babbage, the biggest innovation of Doom was something subtler. Video games, then and now, are mainly passive entertainment products, a bit like a more interactive television. You buy one and play it until you either beat it or get bored. But Doom was popular enough that eager users delved into its inner workings, hacking together programs that would let people build their own levels. Drawing something in what was, essentially, a rudimentary CAD program, and then running around inside your own creation, was an astonishing, liberating experience. Like almost everybody else, Babbage’s first custom level was an attempt to reconstruct his own house.

Other programs allowed you to play around with the game itself, changing how weapons worked, or how monsters behaved. For a 12-year-old who liked computers but was rather fuzzy about how they actually worked, being able to pull back the curtain like this was revelatory. Tinkering around with Doom was a wonderful introduction to the mysteries of computers and how their programs were put together. Rather than trying to stop this unauthorised meddling, id embraced it. Its next game, Quake, was designed to actively encourage it.

The modification, or “modding” movement that Doom and Quake inspired heavily influenced the growing games industry. Babbage knows people who got jobs in the industry off the back of their ability to remix others’ creations. (Tim Willits, id’s current creative director, was hired after impressing the firm with his home-brewed Doom maps.) Commercial products — even entire genres of games — exist that trace their roots back to a fascinated teenager playing around in his (or, more rarely, her) bedroom.

But it had more personal effects, too. Being able to alter the game transformed the player from a mere passive consumer of media into a producer in his own right, something that is much harder in most other kinds of media. Amateur filmmakers need expensive kit and a willing cast to indulge their passion. Mastering a musical instrument takes years of practice; starting a band requires like-minded friends. Writing a novel looks easy, until you try it. But creating your own Doom mod was easy enough that anyone could learn it in a day or two. With a bit of practice, it was possible to churn out professional-quality stuff. “User-generated content” was a big buzzword a few years back, but once again, Doom got there first.
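That low barrier to entry is no exaggeration: Doom’s custom content lives in WAD files, a container format simple enough to parse in a dozen lines. Here’s a minimal sketch that lists the lumps (levels, textures, sounds) inside one; the filename is just a placeholder:

# wadlist.py - minimal reader for Doom's WAD container format:
# a 12-byte header ("IWAD"/"PWAD", lump count, directory offset),
# followed by 16-byte directory entries (offset, size, 8-char name).
import struct

def list_lumps(path="doom1.wad"):          # path is a placeholder
    with open(path, "rb") as f:
        ident, numlumps, diroffs = struct.unpack("<4sii", f.read(12))
        print(ident.decode("ascii"), numlumps, "lumps")
        f.seek(diroffs)
        for _ in range(numlumps):
            offset, size, name = struct.unpack("<ii8s", f.read(16))
            print(name.rstrip(b"\0").decode("ascii"), size, "bytes")

if __name__ == "__main__":
    list_lumps()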

December 10, 2013

Twenty years of Doom

Filed under: Gaming, History — Nicholas Russon @ 12:26

At The Register, Lucy Orr gets all nostalgic for id Software’s Doom, which turned 20 today:

Doom wasn’t short on story, never mind the gore and gunfire to follow. I particularly enjoyed the fact my own government had fucked things up by messing where they shouldn’t and opening a portal to hell. Damn, it’s just me left to go ultraviolent and push the legions of hell back into fiery limbo.

Faced with dual chain gun-wielding bulked up Aryans as your foe, Wolfenstein 3D was funny rather than scary. Indeed, I don’t remember being scared by a game until Doom appeared, with its engine capable of dimmed quivering lights and its repugnant textures. The nihilistic tones of Alien 3 echoed through such levels as the toxic refinery. Like the Alien series, Doom’s dark corners allowed my imagination to run wild and consider turning the lights back on.

But Doom had a lot more going for it than a few scary moments, and I don’t just mean those scrambles for the health kit. Being able to carry an army’s worth of gun power is not necessarily realistic, but neither are angry alien demons trying to rip my flesh off. I’m never empty-handed with a chainsaw, a shotgun, a chain-gun, and a rocket launcher at my disposal.

With Doom you were introduced not only to a world of cyberdemons but also to death matches — be sure to have the BFG 9000 on hand for that one-shot kill — cooperative gameplay, and a world of player mods including maps and sometimes full remakes.

[Image: id Software’s Doom, 1993]

November 20, 2013

An app like this may justify the existence of Google Glass

Filed under: Randomness, Technology — Nicholas Russon @ 08:38

I have a terrible memory for people’s names (and no, it’s not just early senility … I’ve always had trouble remembering names). For example, I’ve been a member of the same badminton club for nearly 15 years and there are still folks there whose names just don’t register: not just new members, but people I’ve played with or against on dozens of occasions. I know them … I just can’t remember their names in a timely fashion. David Friedman suggests that Google Glass might be the solution I need:

I first encountered the solution to my problem in Double Star, a very good novel by Robert Heinlein. It will be made possible, in a higher tech version, by Google glass. The solution is the Farley File, named after FDR’s campaign manager.

A politician such as Roosevelt meets lots of people over the course of his career. For each of them the meeting is an event to be remembered and retold. It is much less memorable to the politician, who cannot possibly remember the details of ten thousand meetings. He can, however, create the illusion of doing so by maintaining a card file with information on everyone he has ever met: The name of the man’s wife, how many children he has, his dog, the joke he told, all the things the politician would have remembered if the meeting had been equally important to him. It is the job of one of the politician’s assistants to make sure that, any time anyone comes to see him, he gets thirty seconds to look over the card.

My version will use more advanced technology, courtesy of Google glass or one of its future competitors. When I subvocalize the key word “Farley,” the software identifies the person I am looking at, shows me his name (that alone would be worth the price) and, next to it, whatever facts about him I have in my personal database. A second trigger, if invoked, runs a quick search of the web for additional information.

Evernote has an application intended to do some of this (Evernote Hello), but it still requires the immersion-breaking act of accessing your smartphone to look up your contact information. Something similar in a Google Glass or equivalent environment might be the perfect solution.
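The software half of the Farley file is almost trivial; the hard parts are the face recognition and the heads-up display. A minimal sketch of the lookup side, with invented entries standing in for a real personal database:

# farley.py - the trivial half of Friedman's Farley-file idea: given an identified
# person, pull up your private notes, optionally falling back to a web search.
# All entries here are invented examples.
from urllib.parse import quote_plus

FARLEY_FILE = {
    "jane smith": {"spouse": "Tom", "kids": 2, "dog": "Rex",
                   "last met": "badminton club, spring tournament"},
}

def recall(name, web_fallback=False):
    facts = FARLEY_FILE.get(name.lower())
    if facts:
        return ", ".join(f"{k}: {v}" for k, v in facts.items())
    if web_fallback:
        # the "second trigger": hand the name off to a search engine
        return "https://www.google.com/search?q=" + quote_plus(name)
    return "no notes on file"

print(recall("Jane Smith"))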

November 4, 2013

QotD: Software quality assurance

Filed under: Business, Government, Quotations, Technology — Nicholas Russon @ 10:13

The fundamental purpose of testing — and, for that matter, of all software quality assurance (QA) deliverables and processes — is to tell you just what you’ve built and whether it does what you think it should do. This is essential, because you can’t inspect a software program the same way you can inspect a house or a car. You can’t touch it, you can’t walk around it, you can’t open the hood or the bedroom door to see what’s inside, you can’t take it out for a spin. There are very few tangible or visible clues to the completeness and reliability of a software system — and so we have to rely on QA activities to tell us how well built the system is.

Furthermore, almost any software system developed nowadays for production is vastly more complex than a house or car — it’s more on the same order of complexity as a large petrochemical processing and storage facility, with thousands of possible interconnections, states, and processes. We would be (rightly) terrified if, say, Exxon built such a sprawling oil refining complex near our neighborhood and then started up production having done only a bare minimum of inspection, testing, and trial operations before, during and after construction, offering the explanation that they would wait until after the plant went into production and then handle problems as they cropped up. Yet too often that’s just how large software development projects are run, even though the system in development may well be more complex (in terms of connections, processes, and possible states) than such a petrochemical factory. And while most inadequately tested software systems won’t spew pollutants, poison the neighborhood, catch fire, or explode, they can cripple corporate operations, lose vast sums of money, spark shareholder lawsuits, and open the corporation’s directors and officers to civil and even criminal liability (particularly with the advent of Sarbanes-Oxley).

And that presumes that the system can actually go into production. The software engineering literature and the trade press are replete with well-documented case studies of “software runaways”: large IT re-engineering or development projects that consume tens or hundreds of millions of dollars, or in a few spectacular (government) cases, billions of dollars, over a period of years, before grinding to a halt and being terminated without ever having put a usable, working system into production. So it’s important not to skimp on testing and the other QA-related activities.

Bruce F. Webster, “Obamacare and the Testing Gap”, And Still I Persist…, 2013-10-31

October 29, 2013

Obamacare’s technical issues

Filed under: Government, Technology, USA — Nicholas Russon @ 07:48

A comment at Marginal Revolution has deservedly been promoted to a guest post, discussing the scale of the problems with the Obamacare software:

The real problems are with the back end of the software. When you try to get a quote for health insurance, the system has to connect to computers at the IRS, the VA, Medicaid/CHIP, various state agencies, Treasury, and HHS. They also have to connect to all the health plan carriers to get pre-subsidy pricing. All of these queries receive data that is then fed into the online calculator to give you a price. If any of these queries fails, the whole transaction fails.

Most of these systems are old legacy systems with their own unique data formats. Some have been around since the 1960s, and the people who wrote the code that runs on them are long gone. If one of these old crappy systems takes too long to respond, the transaction times out.
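The failure mode described here is the classic all-or-nothing fan-out. Here’s a sketch of the pattern; the agency names come from the comment itself, but the stubbed back-ends, failure rates, and timeout are purely illustrative:

# fanout.py - sketch of the "any failure sinks the quote" pattern described above.
# Back-end calls are stubbed; names, failure rates, and timeouts are illustrative.
from concurrent.futures import ThreadPoolExecutor, as_completed
import random, time

BACKENDS = ["IRS", "VA", "Medicaid", "Treasury", "HHS", "carrier pricing"]

def query(agency):
    time.sleep(random.uniform(0.1, 3.0))    # stand-in for a slow legacy system
    if random.random() < 0.1:
        raise RuntimeError(f"{agency} returned an error")
    return {agency: "ok"}

def price_quote(timeout=2.0):
    with ThreadPoolExecutor(max_workers=len(BACKENDS)) as pool:
        futures = {pool.submit(query, b): b for b in BACKENDS}
        results = {}
        for fut in as_completed(futures, timeout=timeout):
            results.update(fut.result())     # any back-end exception propagates
        return results                       # reached only if *every* query succeeded

try:
    print(price_quote())
except Exception as exc:
    print("whole transaction fails:", exc)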

[...]

When you even contemplate bringing an old legacy system into a large-scale web project, you should do load testing on that system as part of the feasibility process, before you ever write a line of production code, because if those old servers can’t handle the load and you are forced to rely on them, your whole project is dead in the water. There are no easy fixes for the fact that a 30-year-old mainframe cannot handle thousands of simultaneous queries. And upgrading all the back-end systems is a bigger job than the web site itself. Some of those systems are still there because attempts to upgrade them failed in the past. Too much legacy software, too many other co-reliant systems, etc. So if they aren’t going to handle the job, you need a completely different design for your public portal.
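A first-pass load test of the kind the commenter describes doesn’t need fancy tooling. Something like the sketch below (the URL, request count, and concurrency are placeholders) would already tell you whether a legacy endpoint folds under parallel queries:

# loadtest.py - crude concurrency probe of a single back-end endpoint, in the
# spirit of the feasibility testing described above. URL, request count, and
# concurrency level are placeholders.
from concurrent.futures import ThreadPoolExecutor
import time, urllib.request

URL, REQUESTS, CONCURRENCY = "http://legacy.example.internal/quote", 500, 50

def hit(_):
    start = time.time()
    try:
        urllib.request.urlopen(URL, timeout=5).read()
        return time.time() - start, True
    except Exception:
        return time.time() - start, False

with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    results = list(pool.map(hit, range(REQUESTS)))

latencies = sorted(t for t, ok in results)
errors = sum(1 for _, ok in results if not ok)
print(f"errors: {errors}/{REQUESTS}")
print(f"median latency: {latencies[len(latencies)//2]:.2f}s")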

A lot of focus has been on the front-end code, because that’s the code that we can inspect, and it’s the code that lots of amateur web programmers are familiar with, so everyone’s got an opinion. And sure, it’s horribly written in many places. But in systems like this the problems that keep you up at night are almost always in the back-end integration.

The root problem was horrific management. The end result is a system built incorrectly and shipped without doing the kind of testing that sound engineering practices call for. These aren’t ‘mistakes’, they are the result of gross negligence, ignorance, and the violation of engineering best practices at just about every step of the way.

October 28, 2013

Mark Steyn on the Obamacare software

Filed under: Bureaucracy, Cancon, Government, Technology — Nicholas Russon @ 07:22

Mark Steyn’s weekend column touched on some items of interest to aficionados of past government software fiascos:

The witness who coughed up the intriguing tidbit about Obamacare’s exemption from privacy protections was one Cheryl Campbell of something called CGI. This rang a vague bell with me. CGI is not a creative free spirit from Jersey City with an impressive mastery of Twitter, but a Canadian corporate behemoth. Indeed, CGI is so Canadian their name is French: Conseillers en Gestion et Informatique. Their most famous government project was for the Canadian Firearms Registry. The registry was estimated to cost in total $119 million, which would be offset by $117 million in fees. That’s a net cost of $2 million. Instead, by 2004 the CBC (Canada’s PBS) was reporting costs of some $2 billion — or a thousand times more expensive.

Yeah, yeah, I know, we’ve all had bathroom remodelers like that. But in this case the database had to register some 7 million long guns belonging to some two-and-a-half to three million Canadians. That works out to almost $300 per gun — or somewhat higher than the original estimate of $4.60 to process a firearm registration. Of those $300 gun registrations, Canada’s auditor general reported to parliament that much of the information was duplicated or wrong with respect to basic details such as names and addresses.

Sound familiar?

Also, there was a 1-800 number, but it wasn’t any use.

Sound familiar?

So it was decided that the sclerotic database needed to be improved.

Sound familiar?

But it proved impossible to “improve” CFIS (the Canadian Firearms Information System). So CGI was hired to create an entirely new CFIS II, which would operate alongside CFIS I until the old system could be scrapped. CFIS II was supposed to go operational on January 9, 2003, but the January date got postponed to June, and 2003 to 2004, and $81 million was thrown at it before a new Conservative government scrapped the fiasco in 2007. Last year, the government of Ontario canceled another CGI registry that never saw the light of day — just for one disease, diabetes, and costing a mere $46 million.

But there’s always America! “We continue to view U.S. federal government as a significant growth opportunity,” declared CGI’s chief exec, in what would also make a fine epitaph for the republic. Pizza and Mountain Dew isn’t very Montreal, and on the evidence of three years of missed deadlines in Ontario and the four-year overrun on the firearms database, CGI don’t sound like they’re pulling that many all-nighters. Was the government of the United States aware that CGI had been fired by the government of Canada and the government of Ontario (and the government of New Brunswick)? Nobody’s saying. But I doubt it would make much difference.

October 25, 2013

The glamour of big IT projects

Filed under: Government, Technology, USA — Nicholas Russon @ 00:02

Virginia Postrel on the hubris of the Obamacare project team:

The HealthCare.gov website is a disaster — symbolic to Obamacare opponents, disheartening to supporters, and incredibly frustrating to people who just need to buy insurance. Some computer experts are saying the only way to save the system is to scrap the current bloated code and start over.

Looking back, it seems crazy that neither the Barack Obama administration nor the public was prepared for the startup difficulties. There’s no shortage of database experts willing to opine on the complexities of the problem. Plenty of companies have nightmarish stories to tell about much simpler software projects. And reporting by the New York Times finds that the people involved with the system knew months ago that it was in serious trouble. “We foresee a train wreck,” one said back in February.

So why didn’t the administration realize that integrating a bunch of incompatible government databases into a seamless system with an interface just about anyone could understand was a really, really hard problem? Why was even the president seemingly taken by surprise when the system didn’t work like it might in the movies?

We have become seduced by computer glamour.

Whether it’s a television detective instantly checking a database of fingerprints or the ease of Amazon.com’s “1-Click” button, we imagine that software is a kind of magic — all the more so if it’s software we’ve never actually experienced. We expect it to be effortless. We don’t think about how it got there or what its limitations might be. Instead of imagining future technologies as works in progress, improving over time, we picture them as perfect from day one.

October 13, 2013

Stross – Microsoft Word delenda est

Filed under: Media, Technology — Nicholas Russon @ 10:32

As a writer, Charles Stross hates, hates, hates, hates, hates Microsoft Word and wants it to DIE:

Microsoft Word is a tyrant of the imagination, a petty, unimaginative, inconsistent dictator that is ill-suited to any creative writer’s use. Worse: it is a near-monopolist, dominating the word processing field. Its pervasive near-monopoly status has brainwashed software developers to such an extent that few can imagine a word processing tool that exists as anything other than as a shallow imitation of the Redmond Behemoth. But what exactly is wrong with it?

I’ve been using word processors and text editors for nearly 30 years. There was an era before Microsoft Word’s dominance when a variety of radically different paradigms for text preparation and formatting competed in an open marketplace of ideas. One early and particularly effective combination was the idea of a text file, containing embedded commands or macros, that could be edited with a programmer’s text editor (such as ed or teco or, later, vi or emacs) and subsequently fed to a variety of tools: offline spelling checkers, grammar checkers, and formatters like scribe, troff, and latex that produced a binary page image that could be downloaded to a printer.

These tools were fast, powerful, elegant, and extremely demanding of the user. As the first 8-bit personal computers appeared (largely consisting of the Apple II and the rival CP/M ecosystem), programmers tried to develop a hybrid tool called a word processor: a screen-oriented editor that hid the complex and hostile printer control commands from the author, replacing them with visible highlight characters on screen and revealing them only when the user told the program to “reveal codes”. Programs like WordStar led the way, until WordPerfect took the market in the early 1980s by adding the ability to edit two or more files at the same time in a split screen view.

October 5, 2013

QotD: Immortality, if you want it

Filed under: Humour, Science, Technology — Nicholas Russon @ 00:01

Those of us who are non-believing heathens might prefer porting our minds to robot bodies before the natural expiration date on our organic selves. It’s hard to wrap your head around the idea that a digital representation of your mind, no matter how accurate, is still “you” in some sense. But I think that fear will go away as soon as we see the first robot that thinks and acts exactly like Uncle Bob did before he made the jump. If Uncle Bob the robot acts human enough, we’ll come to see him as the same entity that once inhabited an organic shell. When technology is sufficiently advanced, we’ll get past the magical thinking about spirits and souls and the specialness of having organic parts.

To me, the most interesting possibility for the future involves porting human minds to software that includes entirely simulated realities. Such a program — a digital mind if you will — could live in an entirely artificial reality and experience what seems to be a genuine human life for the rest of eternity, or at least as long as the software keeps running. The freaky part is that if such a thing will someday be possible — and I think it will — then it follows that the time after it happens will be infinitely long whereas the history of time before it happens is finite. So it follows that there is an infinitely greater chance you are already the simulation and not a human who is reading this paragraph and contemplating it. Weird.

If you didn’t already have enough reasons to eat right, exercise, and keep your mind sharp, consider what you might be bringing to your own immortality. I was hoping to get there before the dementia sets in. But I just reread what I wrote and apparently I’m already too late.

Scott Adams, “Choose Your Immortality: Someday you’ll be a robot with a locket holding your last human cells”, Time, 2013-09-18

September 6, 2013

Yahoo goes out of its way to lose more long-term users

Filed under: Business, Media, Technology — Nicholas Russon @ 08:11

I moderate a few special interest groups on Yahoo Groups, and I’m subscribed to a couple of dozen others. There’s nothing flashy or exciting about the service: it’s been relatively stable for years, with few changes or disruptions. For most users, this has been ideal. This week Yahoo not only introduced a new logo, they also tossed a stink bomb into the placid Yahoo Groups with a new user interface called “Neo”. They apparently rolled out the changes to a few groups last month, but most users and list owner/moderators hadn’t been given any notice that the change was coming. The Register‘s Kelly Fiveash on the diabolical scheme to annoy long-term users of Yahoo Groups:

‘WTF! MORONS!’ Yahoo! Groups! redesign! traumatises! users!
‘Vile, unfriendly interface’ attacked by world+dog. But format stays

Yahoo! has told thousands of users who are complaining about the Purple Palace’s pisspoor redesign of its Groups service that it will not be rolled back to the old format — despite a huge outcry.

The Marissa Mayer-run company revamped Yahoo! Groups last week, but it was immediately inundated with unhappy netizens who grumbled that the overhaul was glitchy, difficult to navigate and “severely degraded”.

In response, Yahoo! told its users:

    We deeply value how much you, our users, care about Yahoo! Groups … we launched our first update to the Groups experience in several years and while these changes are an important step to building a more modern Groups experience, we recognise that this is a considerable change.

    We are listening to all of the community feedback and we are actively measuring user feedback so we can continuously make improvements.

But the complaints have continued to flood in since Yahoo! made the tweak by changing its “classic” (read: ancient) interface to one dubbed “neo” that appeared to have been quickly spewed on to the interwebs with little testing before going live.

And — while the company claimed it was listening closely to its users about the new look Yahoo! Groups — it has ignored pleas from thousands of people who want it to reverse the update.

For users who access Yahoo Groups through the website, the new design has completely befuddled many, hiding functions (and even group names) and making it far more difficult to search for older posts (you reportedly have to search by message number: no other searches are supported). Even for those who only receive email updates, the Neo redesign included odd and sometimes completely unreadable email formatting, broken links, and other highly irritating issues.

This is the real problem with “free” services: when things go wrong, as a user of the service, you don’t have much leverage to complain or to get things fixed.

August 29, 2013

New Zealand bans (most) software patents

Filed under: Law, Technology — Nicholas Russon @ 09:27

Hurrah for New Zealand:

A major new patent bill, passed in a 117-4 vote by New Zealand’s Parliament after five years of debate, has banned software patents.

The relevant clause of the patent bill actually states that a computer program is “not an invention.” Some have suggested that was a way to get around the wording of the TRIPS intellectual property treaty, which requires patents to be “available for any inventions, whether products or processes, in all fields of technology.”

Processes will still be patentable if the computer program is merely a way of implementing a patentable process. But patent claims that cover computer programs “as such” will not be allowed.

It seems there will be some leeway for computer programs directly tied to improved hardware. The bill includes the example of a better washing machine. Even if the improvements are implemented with a computer program, “the actual contribution is a new and improved way of operating a washing machine that gets clothes cleaner and uses less electricity,” so a patent could be awarded.
