Precut – Modern Japanese Timber Construction from BAKOKO on Vimeo.
H/T to Popular Woodworking for the link.

Responding to an article in Vanity Fair by Kurt Andersen, Cowen lists a few things that must be taken into consideration:
1. Movies: The Hollywood product has regressed, though one can cite advances in 3-D and CGI as innovations in the medium if not always the aesthetics. The foreign product is robust in quality, though European films are not nearly as innovative as during the 1960s and 70s. Still, I don’t see a slowdown in global cinema as a whole.
2. TV: We just finished a major upswing in quality for the best shows, though I fear it is over, as no-episode-stands-alone series no longer seem to be supported by the economics.
3. Books/fiction: It’s wrong to call graphic novels “new,” but they have seen lots of innovation. If we look at writing more broadly, the internet has led to plenty of innovation, including of course blogs. The traditional novel is doing well in terms of quality even if this is not a high innovation era comparable to say the 1920s (Mann, Kafka, Proust, others).
4. Computer and video games: This major area of innovation is usually completely overlooked by such discussions.
He also includes something which — at least for me — counts as a “killer app” for this kind of discussion:
7. Your personal stream: This is arguably the biggest innovation in recent times, and it is almost completely overlooked. It’s about how you use modern information technology to create your own running blend of sources, influences, distractions, and diversions, usually taken from a blend of the genres and fields mentioned above. It’s really fun and most of us find it extremely compelling.
That “personal stream” is so pervasive that we generally don’t notice that it’s a hallmark of the modern era. We don’t get our news from single sources anymore: not just a single local newspaper, or a single TV newscast. We can easily find like-minded communities for just about all our niche interests with relatively minimal effort. In the past, such communities were severely distance-challenged to even form, never mind to thrive.
That we now can easily control and direct our personal streams to include and exclude in such fine gradations is something that few could even imagine 20 years ago. In a sense, we all have “clipping services” providing us with interesting and relevant snippets, but we can literally have hundreds of such services — at little or no cost — to chase down our merest whims for fresh information. It doesn’t show up on the GNP as a gain, but it’s very much a differentiator between today and just a few years back. Yet we don’t notice because we’re immersed in it.
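To make the “clipping service” point concrete, here is a minimal sketch of one: a dozen lines that poll a few RSS feeds and keep only the entries matching whatever you currently care about. The feed URLs and keywords are placeholders of my own, and it assumes the third-party feedparser library is installed.

```python
# A minimal "personal stream": poll a few feeds, keep only entries that match
# your current interests. Feed URLs and keywords below are placeholders.
import feedparser  # third-party: pip install feedparser

FEEDS = [
    "https://example.com/economics.rss",
    "https://example.com/woodworking.rss",
    "https://example.com/scifi.rss",
]
KEYWORDS = {"innovation", "timber", "memristor"}  # today's whims

def personal_stream(feeds, keywords):
    """Yield (feed title, entry title, link) for entries matching any keyword."""
    for url in feeds:
        parsed = feedparser.parse(url)
        for entry in parsed.entries:
            text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
            if any(k.lower() in text for k in keywords):
                yield parsed.feed.get("title", url), entry.get("title"), entry.get("link")

if __name__ == "__main__":
    for feed, title, link in personal_stream(FEEDS, KEYWORDS):
        print(f"[{feed}] {title}\n    {link}")
```

Run something like that on a schedule and you have, essentially for free, one of the hundreds of clipping services described above.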
Stephen Gordon thinks that the term “innovation” is well on the way to being just another way of saying “corporate handout”:
The theory of economic growth includes roles for such well-defined concepts as investment, human capital, research and development, productivity, and technical progress. I don’t know where innovation fits into this. My guess would have been that innovation is another name for R&D, but apparently there’s an ineffable distinction between innovation and R&D.
There are well-known policy instruments at the government’s disposal for increasing investment in human and physical capital and for increasing R&D activities. (Their relative effectiveness is another question.) But so far, the only proposals I’ve seen for an innovation policy consist of programs in which governments give money to deserving firms. This is problematic on a couple fronts.
Firstly, there are already many — too many — ‘economic development’ programs whose purpose is to channel public money to companies that enjoy the favour of the government. It’s hard to believe we need more of them.
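For what it is worth, the textbook slot for “technical progress” in that growth theory is the productivity term of a Solow-style production function. The statement below is my gloss, not Gordon’s:

```latex
% Solow-style production function with labour-augmenting technical progress A(t):
\[
  Y(t) = K(t)^{\alpha}\bigl(A(t)\,L(t)\bigr)^{1-\alpha}, \qquad 0 < \alpha < 1
\]
% Growth accounting then splits output growth into contributions from capital,
% labour, and productivity; anything called "innovation" has to show up, if it
% shows up at all, in the growth rate of A:
\[
  \frac{\dot{Y}}{Y} \;=\; \alpha\,\frac{\dot{K}}{K}
  \;+\; (1-\alpha)\,\frac{\dot{L}}{L}
  \;+\; (1-\alpha)\,\frac{\dot{A}}{A}
\]
```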
This is the sort of thing that more science fiction authors should take into account before they write, though too few seem to:
So here are some rules of thumb I use, tending towards an increasingly narrow focus. (Sorry if you were expecting me to address the broader uses of confabulation as a fictional tool; this is very much a set of practical guidelines rather than an examination of the theory behind the activity.)
1. Humans are interested in reading fiction about humans.
Constraint #1 on any work of fiction is that it needs to provide an environment in which recognizable human protagonists can exist. If they’re not human (e.g. “Diaspora”, by Greg Egan; “Saturn’s Children”, by me) you need to provide some sort of continuity with the human and give the reader reasons to feel concerned for them. Or you can go for the “they’re not human, don’t look human, and they have no connection with us”, but what you get is either borderline-unreadable at best, or suffers from human-mind-in-a-giant-land-snail-body syndrome (which risks demolishing the reader’s willing suspension of disbelief).
So I’m going to focus on providing a human environment …
2. In general, High Fantasy steals its dress from pre-modern history; Urban Fantasy buys off-the-shelf in TK-Maxx; and Science Fiction goes for that bold futurist look.
Which is to say, if you’re going to write a trilogy with a young soldier on the rise and a throne and an evil emperor, you can do a lot worse than plunder the decline and fall of the Roman Empire for your social background. Note, however, that you’ll do a lot better if you read some social history texts rather than believing what you see in the movies.
That last bit is especially good advice: the more you know about cultures other than the one you were raised in, the better you can understand why things are different. Ancient Babylonians were not just Englishmen in funny clothes. Classical Greece, for all that it provided many of the underpinnings of our Western culture, was functionally very different from life as we know it now.
Lewis Page on the new lightweight ammunition the US Army is considering introducing:
The US Army has announced successful tests of a new, lightweight portable machine-gun which fires special plastic ammunition. The gun and ammo are so much lighter than current weapons and their brass-cased cartridges that some soldiers are suggesting that every infantryman could in future pack the sort of firepower reserved today for heavy-weapons specialists.
[. . .]
[M]ost soldiers are armed with assault rifles not intended to deliver sustained automatic fire and holding less ammo. These lighter weapons are handier for close-in fighting and permit other kit to be carried.
But US military boffins at the famous Picatinny Arsenal have been working on this situation for some time. Since ammo weight and bulk is much of the problem, they have come up with a new kind of ammunition: Cased Telescoped cartridges.
In a cased telescoped round, the bullet is no longer attached to the tip of a brass case full of propellant powder. The new case is shorter, fatter and made of plastic, so weighing substantially less, and the bullet is sunk into the middle of the propellant which makes the whole round shorter — it has been “telescoped”. A shorter round weighs less itself, and also means that the gun’s action, feed equipment etc is smaller and thus lighter as well. It’s a trick originally developed for tanks, to make the turret smaller and easier to protect.
According to the Picatinny scientists, their new LMG and a thousand rounds of its plastic-cased-telescoped ammo weigh no less than 20.4 pounds less than the current M249 (a version of which is also used by British troops) and a thousand ordinary 5.56mm brass cartridges. The new LMG shaves no less than 8.3 pounds off the 15.7-lb M249, coming in at just 7.4lb — actually lighter than a standard British SA80 assault rifle! This, perhaps, explains Specialist Smith’s opinion that it would be reasonable for all soldiers to carry such weapons, rather than just heavy-weapons specialists.
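As a quick sanity check on Page’s figures, the per-cartridge saving implied by his totals comes to roughly five and a half grams. The short script below only re-derives that from the numbers quoted above; none of it comes from the Army’s own data.

```python
# Back-of-the-envelope check of the weight figures quoted in the article.
# All inputs are the article's numbers; the per-round saving is derived.
LB_TO_G = 453.6

m249_weight_lb = 15.7        # current M249 LMG
new_lmg_weight_lb = 7.4      # the new cased-telescoped LMG
total_saving_lb = 20.4       # gun + 1,000 rounds vs. M249 + 1,000 brass rounds
rounds = 1000

gun_saving_lb = m249_weight_lb - new_lmg_weight_lb       # 8.3 lb
ammo_saving_lb = total_saving_lb - gun_saving_lb         # ~12.1 lb per 1,000 rounds
per_round_saving_g = ammo_saving_lb * LB_TO_G / rounds   # ~5.5 g per cartridge

print(f"gun saving:    {gun_saving_lb:.1f} lb")
print(f"ammo saving:   {ammo_saving_lb:.1f} lb per {rounds} rounds")
print(f"per cartridge: {per_round_saving_g:.1f} g lighter")
```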
Gawker tries to beat the rush to switch from praising the dead to exposing their flaws:
We mentioned much of the good Jobs did during his career earlier. His accomplishments were far-reaching and impossible to easily summarize. But here’s one way of looking at the scope of his achievement: It’s the dream of any entrepreneur to effect change in one industry. Jobs transformed half a dozen of them forever, from personal computers to phones to animation to music to publishing to video games. He was a polymath, a skilled motivator, a decisive judge, a farsighted tastemaker, an excellent showman, and a gifted strategist.
One thing he wasn’t, though, was perfect. Indeed there were things Jobs did while at Apple that were deeply disturbing. Rude, dismissive, hostile, spiteful: Apple employees — the ones not bound by confidentiality agreements — have had a different story to tell over the years about Jobs and the bullying, manipulation and fear that followed him around Apple. Jobs contributed to global problems, too. Apple’s success has been built literally on the backs of Chinese workers, many of them children and all of them enduring long shifts and the specter of brutal penalties for mistakes. And, for all his talk of enabling individual expression, Jobs imposed paranoid rules that centralized control of who could say what on his devices and in his company.
[. . .]
Before he was deposed from Apple the first time around, Jobs already had a reputation internally for acting like a tyrant. Jobs regularly belittled people, swore at them, and pressured them until they reached their breaking point. In the pursuit of greatness he cast aside politeness and empathy. His verbal abuse never stopped.
[. . .]
Steve Jobs created many beautiful objects. He made digital devices more elegant and easier to use. He made a lot of money for Apple Inc. after people wrote it off for dead. He will undoubtedly serve as a role model for generations of entrepreneurs and business leaders. Whether that’s a good thing or a bad thing depends on how honestly his life is appraised.
Update: ESR has some thoughts on the legacy — good and bad — and the man:
It’s easy to point at the good Steve Jobs did. While he didn’t invent the personal computer, he made it cool, twice. Once in 1976 when the Apple II surpassed all the earlier prototypes, and again in 1984 with the introduction of the Mac. I’ll also always be grateful for the way Jobs built Pixar into a studio that combined technical brilliance with an artistic sense and moral centeredness that has perhaps been equaled in the history of animated art, but never exceeded.
But the Mac also set a negative pattern that Jobs was to repeat with greater amplification later in his life. In two respects: first, it was a slick repackaging of design ideas from an engineering tradition that long predated Jobs (in this case, going back to the pioneering Xerox PARC WIMP interfaces of the early 1970s). Which would be fine, except that Jobs created a myth that arrogated that innovation to himself and threw the actual pioneers down the memory hole.
Second, even while Jobs was posing as a hip liberator from the empire of the beige box, he was in fact creating a hardware and software system so controlling and locked down that the case couldn’t even be opened without a special cracking tool. The myth was freedom, but the reality was Jobs’s way or the highway. Such was Jobs’s genius as a marketer that he was able to spin that contradiction as a kind of artistic integrity, and gain praise for it when he should have been slammed for hypocrisy.
[. . .]
What’s really troubling is that Jobs made the walled garden seem cool. He created a huge following that is not merely resigned to having their choices limited, but willing to praise the prison bars because they have pretty window treatments.
[. . .]
Commerce is powerful, but culture is even more persistent. The lure of high profits from secrecy rent can slow down the long-term trend towards open source and user-controlled computing, but not really stop it. Jobs’s success at hypnotizing millions of people into a perverse love for the walled garden is more dangerous to freedom in the long term than Bill Gates’s efficient but brutal and unattractive corporatism. People feared and respected Microsoft, but they love and worship Apple — and that is precisely the problem, precisely the reason Jobs may in the end have done more harm than good.
In his defence of the late Steve Jobs, Brendan O’Neill pinpoints the exact moment that Apple stopped being the ne plus ultra of status signalling devices for the Guardianista set:
It is absolutely no coincidence that it became cool to hate Apple just as Apple started to make products for (whisper it) ‘the masses’. Back when Apple was largely known as the provider of smooth computers to graphic designers and Guardian columnists, there was nothing cooler than being an Applehead. But then it made the iPod and the iPhone, which you can now see everyone from paint-covered builders to Romanian au pairs tapping away on, and that meant it was just another engine of ‘mass consumerism’, the thing the chattering classes hate most. So where in the Nineties, people who used Apple products were presumed to be erudite and tasteful, now people who use Apple products are ‘iZombies’ or ‘hostages’, as one columnist calls them. In the eyes of the opinion-forming classes, Jobs’ great crime was to include the little people in his techno-revolution, to give glossy gadgets to the masses as well as the intellectuals, since that robbed these gadgets of the special symbolism that allowed their users to declare: ‘I am above the crowd.’
As to the idea that Jobs was the killer of Chinese people, this, too, is fuelled by the perverse fantasies of the uncomfortable-with-capitalism cultural elite. Following some suicides at the factories in China in which Apple stuff is put together, it became fashionable here in the West to indulge in orgies of iGuilt, to whip both yourself and everyone else for wanting gadgets so badly that we’re willing to turn a blind eye to ‘enslavement’ in China. The deaths in China were referred to as ‘The iPad suicides’, with journalists saying: ‘Should you blame yourself for all those deaths at the Chinese electronics factory? Yes.’
Yet as I argued on spiked last year, anyone who looked at the number of suicides in these vast factories, which can employ up to 400,000 people, would have realised that the suicide rate was lower in these places than it was in China as a whole. The self-flagellation of iPad-using hacks in the West merely revealed how shallow and moralistic so-called anti-capitalism is these days, where the aim is not to analyse social relations, all the better to overhaul them, but rather to partake in a borderline Catholic guilt trip about the impact of our greed on their lives. In one fell swoop, Jobs-bashers manage to criminalise the material aspirations of Western consumers, the iZombies whose desires are apparently dangerous, and to infantilise Chinese workers, who are depicted as hapless victims, in need of rescue by that super-super-cool tribe of East Coast and Shoreditch hipsters who now actually boycott Apple products. Rad, man.
Kevin D. Williamson explains why the late Steve Jobs did more good by avoiding big-ticket philanthropy and concentrating on his business:
Mr. Jobs’s contribution to the world is Apple and its products, along with Pixar and his other enterprises, his 338 patented inventions — his work — not some Steve Jobs Memorial Foundation for Giving Stuff to Poor People in Exotic Lands and Making Me Feel Good About Myself. Because he already did that: He gave them better computers, better telephones, better music players, etc. In a lot of cases, he gave them better jobs, too. Did he do it because he was a nice guy, or because he was greedy, or because he was a maniacally single-minded competitor who got up every morning possessed by an unspeakable rage to strangle his rivals? The beauty of capitalism — the beauty of the iPhone world as opposed to the world of politics — is that that question does not matter one little bit. Whatever drove Jobs, it drove him to create superior products, better stuff at better prices. Profits are not deductions from the sum of the public good, but the real measure of the social value a firm creates. Those who talk about the horror of putting profits over people make no sense at all. The phrase is without intellectual content. Perhaps you do not think that Apple, or Goldman Sachs, or a professional sports enterprise, or an internet pornographer actually creates much social value; but markets are very democratic — everybody gets to decide for himself what he values. That is not the final answer to every question, because economic answers can only satisfy economic questions. But the range of questions requiring economic answers is very broad.
I was down at the Occupy Wall Street protest today, and never has the divide between the iPhone world and the politics world been so clear: I saw a bunch of people very well-served by their computers and telephones (very often Apple products) but undeniably shortchanged by our government-run cartel education system. And the tragedy for them — and for us — is that they will spend their energy trying to expand the sphere of the ineffective, hidebound, rent-seeking, unproductive political world, giving the Barney Franks and Tom DeLays an even stronger whip hand over the Steve Jobses and Henry Fords. And they — and we — will be poorer for it.
H/T to Jon, my former virtual landlord, for the link.
Update: An obituary from The Economist seems pretty accurate to me:
NOBODY else in the computer industry, or any other industry for that matter, could put on a show like Steve Jobs. His product launches, at which he would stand alone on a black stage and conjure up a “magical” or “incredible” new electronic gadget in front of an awed crowd, were the performances of a master showman. All computers do is fetch and shuffle numbers, he once explained, but do it fast enough and “the results appear to be magic”. He spent his life packaging that magic into elegantly designed, easy to use products.
[. . .]
His on-stage persona as a Zen-like mystic notwithstanding, Mr Jobs was an autocratic manager with a fierce temper. But his egomania was largely justified. He eschewed market researchers and focus groups, preferring to trust his own instincts when evaluating potential new products. “A lot of times, people don’t know what they want until you show it to them,” he said. His judgment proved uncannily accurate: by the end of his career the hits far outweighed the misses. Mr Jobs was said by an engineer in the early years of Apple to emit a “reality distortion field”, such were his powers of persuasion. But in the end he changed reality, channelling the magic of computing into products that reshaped music, telecoms and media. The man who said in his youth that he wanted to “put a ding in the universe” did just that.
Update, the second: “Death is very likely the single best invention of life.” Steve Jobs, 2005.
Neal Stephenson on the ability of science fiction to inspire:
In early 2011, I participated in a conference called Future Tense, where I lamented the decline of the manned space program, then pivoted to energy, indicating that the real issue isn’t about rockets. It’s our far broader inability as a society to execute on the big stuff. I had, through some kind of blind luck, struck a nerve. The audience at Future Tense was more confident than I that science fiction [SF] had relevance — even utility — in addressing the problem. I heard two theories as to why:
1. The Inspiration Theory. SF inspires people to choose science and engineering as careers. This much is undoubtedly true, and somewhat obvious.
2. The Hieroglyph Theory. Good SF supplies a plausible, fully thought-out picture of an alternate reality in which some sort of compelling innovation has taken place. A good SF universe has a coherence and internal logic that makes sense to scientists and engineers. Examples include Isaac Asimov’s robots, Robert Heinlein’s rocket ships, and William Gibson’s cyberspace. As Jim Karkanias of Microsoft Research puts it, such icons serve as hieroglyphs — simple, recognizable symbols on whose significance everyone agrees.
Researchers and engineers have found themselves concentrating on more and more narrowly focused topics as science and technology have become more complex. A large technology company or lab might employ hundreds or thousands of persons, each of whom can address only a thin slice of the overall problem. Communication among them can become a mare’s nest of email threads and Powerpoints. The fondness that many such people have for SF reflects, in part, the usefulness of an over-arching narrative that supplies them and their colleagues with a shared vision. Coordinating their efforts through a command-and-control management system is a little like trying to run a modern economy out of a Politburo. Letting them work toward an agreed-on goal is something more like a free and largely self-coordinated market of ideas.
Ever read some of those late-60s or early-70s SF stories that assumed that computers that ran using non-binary logic would replace the binary logic machines of the day? Welcome to the future:
In 2008, Hewlett-Packard’s labs demonstrated the first memristor recognized as such. A portmanteau of “memory” and “resistor,” the memristor was first described in 1971 as the theoretical fourth fundamental circuit element. While HP stock will probably not yield the sort of profits we’re looking for here, it will help generate them indirectly.
Because of their unique properties, memristors will enable far more powerful circuitry. Unlike transistor-based circuits that form the core of modern electronics, memristive circuits retain their state after losing power. Theoretically, you could power on a memristor-based computer and have all the data in memory that it had when you powered off. Memristor memory could replace hard drives and transistor-based RAM.
Memristors, however, can do more than act as memory. They can replace existing processing components. This means that much more functionality can be implemented in a single component. Instead of busing data back and forth between separate memory and processing locations on a circuit board, memristors do it all. Data, then, are available for processing with shorter wait times. Memristors reduce total hardware size, cost and energy consumption. Yet memristors can multitask in other ways, opening up a whole range of exciting possibilities.
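To make the “retains its state after losing power” claim concrete, here is a minimal simulation of the linear ion-drift model HP used to describe its device: driving a current through it moves the internal state (and hence the resistance), and cutting the power freezes it. The device parameters below are illustrative placeholders, not HP’s published figures.

```python
# Minimal simulation of the linear ion-drift memristor model (Strukov et al., 2008).
# Memristance: M = R_on * x + R_off * (1 - x), with x = w/D the normalized
# doped-region width. State equation: dx/dt = (mu_v * R_on / D**2) * i(t).
# With zero current, x (and therefore M) is frozen: the non-volatile behaviour
# described above. All parameter values here are illustrative, not HP's.

R_ON, R_OFF = 100.0, 16_000.0   # ohms: fully doped / fully undoped resistance
D = 10e-9                       # m: device thickness
MU_V = 1e-14                    # m^2/(V*s): dopant mobility
DT = 1e-3                       # s: simulation time step

def simulate(voltages, x0=0.1):
    """Return the memristance trace for a sequence of applied voltages."""
    x, trace = x0, []
    for v in voltages:
        m = R_ON * x + R_OFF * (1.0 - x)       # current memristance
        i = v / m                              # Ohm's law for this step
        x += (MU_V * R_ON / D**2) * i * DT     # linear ion drift
        x = min(max(x, 0.0), 1.0)              # state stays in [0, 1]
        trace.append(m)
    return trace

if __name__ == "__main__":
    # 1 s write pulse at +1 V, then the power is removed (0 V) for 1 s.
    trace = simulate([1.0] * 1000 + [0.0] * 1000)
    print(f"before the pulse:    {trace[0]:.0f} ohm")
    print(f"after the pulse:     {trace[999]:.0f} ohm")
    print(f"after power removed: {trace[-1]:.0f} ohm  (unchanged, i.e. non-volatile)")
```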
John Tierney reports on the conservation efforts on one of the most revolutionary warships in history:
[In 1861] a shipyard in Greenpoint, Brooklyn, launched not merely an ironclad but an entirely new kind of warship. The U.S.S. Monitor had no masts and no line of cannons. It was essentially a submarine beneath a revolving gun turret, something so tiny and bizarre-looking that many experts doubted the “cheese box on a raft” would float, much less fight.
But somehow it survived both the Navy bureaucracy and a broadside barrage to become one of the most celebrated ships in the world. Its designer and crew were the 19th-century celebrity equivalent of astronauts. Long after the ship sank in a storm off Cape Hatteras, N.C., the turret remained a cultural icon: an “armored tower” in Melville’s poetry, an image on book covers and film posters, a shape reproduced in items from toys to refrigerators.
Now the original turret, which was recovered from the ocean floor nine years ago and placed in a freshwater tank to protect it from corrosion, is on display again. It has been temporarily exposed to the air so that it can be scraped clean — very carefully, in front of museum visitors and a live webcam — by a team of researchers at the U.S.S. Monitor Center of the Mariners’ Museum here in Newport News. The team expects to have nearly all the barnacles and sediment removed by the end of this month, giving the public a new look at the dents from the Confederate cannonballs and shells that would have sunk any ordinary ship of its day. Then the turret will be submerged again in fresh water for 15 more years, until enough ocean salt has been removed from the metal to allow it to face the air permanently.
Strategy Page titled this one “Four Decades To Become An Overnight Sensation”:
Wonder weapons, in general, aren’t. Those spiffy and seemingly magical new “wonder weapons” tend to be old weapons designs that finally got to the point where they lived up to the original hype. Take smart bombs. They were invented, and used quite successfully, during World War II. But these were radio controlled, and required skilled operators to succeed. Expensive as well, and no one wanted to spend the money to train effective operators in peacetime. In wartime, price was no object, and experience was easy to get.
Thus the U.S. dropped smart bombs from their arsenal after World War II, and didn’t revive them until the 1960s, when lasers (developed a decade earlier) were used to bounce their light off a target. A bomb was equipped with a seeker that could home on the reflected laser light, and a guidance kit (battery and motors to operate small wings) to hit the target without an operator. This was cheaper and more effective than the earlier smart bombs. The next big jump, in the 1990s, was the GPS guided bomb, which finally perfected the smart bomb. Thus this wonder weapon took four decades to become an overnight sensation.
Other examples are helicopters, which became iconic of the Vietnam War: first flown in 1907, used sparingly by both sides in World War II, but not in wide use until the 1950s.
While many of these systems are called “wonder weapons,” they aren’t. That’s because every new weapon quickly produces new tactics and combat techniques that reduce the improved capabilities of the new weapons. This is often ignored by historians. Self-preservation is a great motivator, and in the face of new weapons, the enemy will quickly find ways to diminish the wonder.
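As a toy illustration of what the seeker-plus-guidance-kit arrangement described above actually does, here is a crude two-dimensional pursuit sketch, a simplification of my own rather than any real weapon control law: at each step the “seeker” measures the bearing to the laser-designated point and the fins nudge the heading a bounded amount toward it.

```python
# Toy 2D "home on the reflected spot" guidance: simple pursuit with limited
# fin authority. A deliberate simplification, not an actual weapon control law.
import math

def drop(bomb_xy, target_xy, speed=250.0, max_turn_deg=3.0, dt=0.1, steps=2000):
    """Return the miss distance after a guided glide toward target_xy."""
    x, y = bomb_xy
    tx, ty = target_xy
    heading = math.atan2(ty - y, tx - x) + 0.3   # start with a deliberate aim error
    max_turn = math.radians(max_turn_deg)        # per-step steering limit
    for _ in range(steps):
        bearing = math.atan2(ty - y, tx - x)                  # seeker line of sight
        error = (bearing - heading + math.pi) % (2 * math.pi) - math.pi
        heading += max(-max_turn, min(max_turn, error))       # bounded correction
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        if math.hypot(tx - x, ty - y) < speed * dt:           # close enough: impact
            break
    return math.hypot(tx - x, ty - y)

if __name__ == "__main__":
    # Released 4 km short of and 3 km above the target, with a bad initial aim.
    miss = drop(bomb_xy=(0.0, 3000.0), target_xy=(4000.0, 0.0))
    print(f"miss distance: {miss:.1f} m")
```

Even this crude closed loop removes the initial aiming error without an operator, which is the whole point of strapping a seeker and movable fins onto a dumb bomb.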