Not yet commercially available, but a fascinating proof-of-concept model. H/T to Peter Suderman.
September 11, 2009
4 Comments
Zombie. Shmombie.
For a quarter I got to play this http://www.youtube.com/watch?v=g1EUc7oHulQ and I didn’t have to run around breaking out in a sweat doing it.
How, with all this caloric expenditure, is the next generation of obese and lethargic youth ever going to exceed what their parents managed to achieve?
I’ll reserve final judgment until I get a chance to spend a few hundred hours walking around town being unpleasant to the undead.
Comment by DWM — September 11, 2009 @ 14:04
Your quarter back then was — relatively speaking — a much greater chunk of your personal net worth than the cost of the rig for “ARhrrr” would be now.
Of course, maybe I’m just bitter because my l33t gaming skillz were not up to winning often enough to make the twenty-five cent investment yield a good enough return.
Naw, that couldn’t be it.
Comment by Nicholas — September 11, 2009 @ 14:19
The augmented reality thing could actually be useful, but here it’s being pissed away on a game.
The catch is that the device has to look at a reference grid: the map on the table in this example is a grid the device recognizes and uses to position the 3D drawings on its screen (a rough sketch of that kind of marker tracking follows this comment). If you could get around that, so you could point the thing at anything and have it render the correct perspective and whatnot, it could be handy for real-world applications: architectural rendering, surveying, undressing passers-by. You know, productive stuff.
Humph.
Comment by Lickmuffin — September 11, 2009 @ 16:17
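[Editor's sketch: the comment above describes marker-based AR, where the camera finds a known printed pattern and works out its own pose from it, which is what lets the device draw 3D content in correct perspective. The game in the video presumably uses its own tracking stack; this is only an illustration of the general technique using OpenCV's ArUco module (4.7+ API). The camera intrinsics, marker dictionary, and marker size below are made-up placeholders, not values from the demo.]

```python
# Sketch of marker-based AR tracking: detect a known fiducial pattern in the
# camera image, recover the camera-relative pose of that pattern, then draw
# 3D content in the right perspective. Assumes OpenCV 4.7+ with aruco.
import cv2
import numpy as np

# Placeholder intrinsics; a real setup gets these from cv2.calibrateCamera().
CAMERA_MATRIX = np.array([[800.0,   0.0, 320.0],
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]])
DIST_COEFFS = np.zeros(5)        # assume negligible lens distortion
MARKER_SIDE = 0.05               # printed marker size in metres (made up)

# Marker corners in the marker's own coordinate frame, in the order
# detectMarkers reports them: top-left, top-right, bottom-right, bottom-left.
half = MARKER_SIDE / 2.0
MARKER_CORNERS_3D = np.array([[-half,  half, 0.0],
                              [ half,  half, 0.0],
                              [ half, -half, 0.0],
                              [-half, -half, 0.0]], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break

    corners, ids, _rejected = detector.detectMarkers(frame)
    if ids is not None:
        for marker in corners:
            image_points = marker.reshape(4, 2).astype(np.float32)
            # Pose of the marker relative to the camera; a renderer would use
            # this to pin 3D models to the marker on screen.
            found, rvec, tvec = cv2.solvePnP(MARKER_CORNERS_3D, image_points,
                                             CAMERA_MATRIX, DIST_COEFFS)
            if found:
                cv2.drawFrameAxes(frame, CAMERA_MATRIX, DIST_COEFFS,
                                  rvec, tvec, MARKER_SIDE / 2.0)

    cv2.imshow("marker pose", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

The "point it at anything" version the commenter wants is what markerless approaches (natural-feature tracking, SLAM) aim for: pose is estimated from whatever geometry the scene happens to contain rather than from a printed grid.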
You’re forgetting, in your rush to condemn games, that games have been the second greatest driver of tech advancement in the last 20 years of computing. The greatest, of course, has been PR0N . . .
Comment by Nicholas — September 12, 2009 @ 10:02