Charles Stross thinks he understands why Steve Jobs won’t allow Adobe Flash onto the iPhone and iPad:
Steve Jobs believes he’s gambling Apple’s future — the future of a corporation with a market cap well over US $200Bn — on an all-or-nothing push into a new market. HP have woken up and smelled the forest fire, two or three years late; Microsoft are mired in a tar pit, unable to grasp that the inferno heading towards them is going to burn down the entire ecosystem in which they exist. There is the smell of panic in the air, and here’s why . . .
We have known since the mid-1990s that the internet was the future of computing. With increasing bandwidth, data doesn’t need to be trapped in the hard drives of our desktop computers: data and interaction can follow us out into the world we live in. Modem uptake drove dot-com 1.0; broadband uptake drove dot-com 2.0. Now everyone is anticipating what you might call dot-com 3.0, driven by a combination of 4G mobile telephony (LTE or WiMax, depending on which horse you back) and wifi everywhere. Wifi and 4G protocols will shortly be delivering 50-150Mbps to whatever gizmo is in your pocket, over the air. (3G is already good for 6Mbps, which is where broadband was around the turn of the millennium. And there are ISPs in Tokyo who are already selling home broadband delivered via WiMax. It’s about as fast as my cable modem connection was in 2005.)
[. . .]
This is why there’s a stench of panic hanging over Silicon Valley. This is why Apple have turned into paranoid security Nazis, why HP have just ditched Microsoft from a forthcoming major platform and splurged a billion-plus on buying up a near-failure; it’s why everyone is terrified of Google:
The PC revolution is almost coming to an end, and everyone’s trying to work out a strategy for surviving the aftermath.
Read the whole thing. I don’t see any obvious flaw in his line of thought. It may not happen the way he predicts, but it is consistent with what we know, and it should frighten the heck out of Apple’s competitors.
Well, there’s one obvious flaw: assuming everything will follow the same general trend line ad infinitum into our Glorious Cloud Future. Which sounds an awful lot like the scaled-up version of our Glorious Mainframe Future, circa 1970, and our Glorious Virtualised Future, circa 2000.
Computer trends tend to ebb and flow between centralised and decentralised models, but no one architecture ever wins out decisively. Just look at the history of corporate computing: from centralised mainframes to decentralised individual servers and PCs, then from the support sprawl of PCs back to more concentrated and centralised virtual servers and “cloud” computing.
You can see this cycle play out in any given company as it goes through the insourcing/outsourcing cycle every couple of years. Nobody ever stays insourced or outsourced; that condition ebbs and flows as the company’s perception of its expenses, service levels, and so on changes. Likewise with the cloud: everyone will love it right up to the point where it suffers a major failure, and then nobody quite trusts the third-party provider as much as they used to (hello, McAfee). Then they will want greater control over their own destiny for a while, until they realise it’s a big investment of time and effort and treasure. And then they’ll decide to offload it into the cloud (or whatever replaces it).
There’s no final winner; the core model of distributed computing eternally drifts between its binary polar stars.
Comment by Chris Taylor — May 1, 2010 @ 08:06