Quotulatiousness

August 28, 2014

Digital “ecosystems”, “platforms”, and sunk costs

Filed under: Business, Technology — Nicholas Russon @ 09:19

The Guardian Technology Blog looks at how digital product vendors attempt to lock you into their own (more profitable) platform or ecosystem:

Depending on your view, the stuff you own is either a boon to business or a tremendous loss of opportunity.

For example, your collection of spice bottles in your pantry means that I could possibly sell you a spice rack. On the other hand, it also means that I can’t design a special spice rack that only admits spice bottles of my own patent-protected design, which would thereby ensure that if you wanted to buy spices in the future you’d either have to buy them from me or throw away that very nice spice rack I sold you.

In the tech world, this question is often framed in terms of “ecosystems” (as in the “Google/Chrome/Android ecosystem”) or platforms (as in the “Facebook platform”), but whatever you call it, the discussion turns on a crucial concept: sunk cost.

That’s the money, time, mental energy and social friction you’ve already sunk into the stuff you own. Your spice rack’s sunk cost includes the money you spent on the rack, the time you spent buying fixings for it and the time you spent affixing it, the emotional toil of getting your family to agree on a spice rack, and the incredible feeling of dread that arises when you contemplate going through the whole operation again.

If you’ve already got a lot of sunk costs, the canny product strategy is to convince you that you can buy something that will help you organise your spices, rip all your CDs and put them on a mobile device, or keep your clothes organised.

But what a vendor really wants is to get you to sink cost into his platform, ecosystem, or what have you — to convince you to buy his wares in order to increase the likelihood that you’ll go on doing so, because they match the decor, because you already have the adapters, and so on.

The vendor wants to impose a switching cost on you, to penalise you for disloyalty should you defect to another ecosystem/platform. The higher your switching costs, the worse the vendor can afford to treat you — rather than supplying the best goods at the best price, he can provide the best goods at the best price, plus the switching cost you’d have to pay if you went somewhere else. Or he can offer the best price, but offer goods whose manufacture — and quality — is cheaper by a sum of about the cost you’d have to pay for switching.
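The pricing logic Doctorow describes fits in a few lines of Python. The numbers below are mine, purely for illustration:

```python
# Illustrative numbers only: a vendor's price can exceed the best
# competing offer by up to the customer's switching cost before
# defecting becomes the rational choice.
competitor_price = 100.00   # best goods at the best price, elsewhere
switching_cost = 35.00      # new adapters, re-ripped media, lost purchases

max_viable_price = competitor_price + switching_cost
print(max_viable_price)     # 135.0: the premium that lock-in lets the vendor keep
```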

August 25, 2014

Want the coolest thing to hit the waves recently? $10 million will be the starting price

Filed under: Military, Technology — Nicholas Russon @ 14:52

Slashgear‘s Chris Davies says this is just the ticket for up-and-coming Bond villains:

It looks like a half-submerged X-Wing, or maybe a Star Trek Shuttle, but it’s actually Ghost, one American start-up’s vision for what an attack helicopter designed for the navy might look like. Mustering 4,000 HP from two engines on the end of powered legs, Ghost promises to whip across the ocean in a supercavitation bubble, avoiding radar and with a silky smooth ride for the crew inside.

What makes the boat special is how it improves on hydroplane technology, more commonly used in racing boats. Hydroplanes increase their top speed by skimming across the top of the water, rather than burying their hulls in it, reducing drag in the process.

However, that also makes them relatively unstable and prone to flipping — so Ghost’s manufacturer, Juliet Marine Systems, turned to supercavitation, which creates a bubble of gas around each of the legs and cuts drag by a factor of 900. Air is pulled down through the struts, while the propellers are at the front of the 62-foot-long tubes, effectively pulling the vessel along.

August 23, 2014

Defining the “best tank of World War 2”

Filed under: Britain, Europe, History, Military, Technology, USA — Nicholas Russon @ 11:14

Nigel Davies revisits one of the perpetual debates among amateur WW2 historians:

Let us start with the issue of tanks from the perspective of propaganda. More rubbish has been written about who had the best tanks during the Second World War than about any other topic to do with that war. Again and again you get supposedly serious historians talking about how the Germans started the Second World War with overwhelming tank superiority; that the Allies were only brought back into the race by the arrival of the Sherman tank; and how German technology leapt ahead again at the end of the Second World War to give them unrivalled vehicles. All these statements are of course completely incorrect.

One of the problems of course, is ‘best tank when, and for what?’

Comparing what was available in 1939/40 to what was being produced in 1945 (say a Panzer III or Matilda II with a Centurion or Stalin) is worse than useless. There is no comparison. Only the Panzer IV was actually produced throughout the war: and the heavily armoured final version of the tank — with a long-barrelled 75mm gun capable of taking on almost every tank yet operational in mid-1945 — bore only a passing resemblance to the lightly armoured tank of 1939/40, with its short-barrelled infantry support gun that had minimal ability to do more than scratch the paint of a Char B.

SOMUA 35 tank at Bovington Tank Museum (via Wikipedia)

It’s relatively easy to do a quick measurables test comparing one tank against another: thickness and location of the armour, size and muzzle velocity of the main gun, engine horsepower, road speed, etc. But the very best tank on all of those measurements could still be beaten by an enemy using better combat tactics. The French SOMUA 35 and the British Matilda II were the best tanks in the world in 1939 and 1940 respectively (according to Davies). In spite of its superior measurables, the SOMUA 35 was severely limited by having the tank commander also act as gunner and loader, and it lacked a radio for communication (even if the tanks had been so equipped, the already overworked commander would have had to serve as radio operator, too). The Matilda was designed as an infantry tank, so it was very heavily armoured, but relatively slow and somewhat undergunned (the 40mm main gun only had solid shot for anti-armour use: there was no high explosive round for softer targets).

Matilda II at Yad la-Shiryon Museum (via Wikipedia)

The Matilda and its successor the Valentine were probably still the best Allied tanks in the world in early 1941, when they swept Italian forces before them, and several times fought the German Afrika Korps to a standstill. The German response to their shocking failures in 1940 had been to upgrade the Panzer III and IV with slightly improved armour and the short-barrelled 50 mm gun. But they were still on a losing wicket engaging the British infantry tanks in any sort of close terrain, such as in the siege of Tobruk. Fortunately for Rommel, out in the open terrain of the desert he could deploy his tanks behind screens of high-powered anti-tank guns, which the British tanks lacked the long-range high explosive shells to engage effectively.

It also helped that too many British cavalry officers in the desert war still had a “tally ho!” attitude and were frequently drawn into unsupported tank charges against German or Italian tanks, whose commanders were able to lure the fast but lightly armoured British cruisers into easy killing range of their anti-tank guns.

M4A1 Sherman tank at Canadian Forces Base Borden (via Wikipedia)

This is where the myth of the value of the Sherman tank comes from. The Sherman arrived at a time when its armour and weapon were on a par with the Panzer III and IV tanks that it was facing. Despite the fact that its 75 mm gun was greatly inferior as an anti-tank weapon to the new British six pounder guns that were starting to equip British tanks, the high explosive shell that the Sherman could fire was incredibly useful for engaging Rommel’s 88 mm guns at long distance in the flat desert terrain.

For several months, it seemed as though the mechanically reliable Sherman would be a war winner, despite its notable tendency to explode in flames whenever it was hit. (Allied troops referred to it as a Ronson — “lights first time every time”. German troops just referred to it as a “Tommy Cooker”.) But this concept was fantasy, as could easily be demonstrated within a few months, though it took the US government another two years to admit it.

[...]

T-34/85 at musée des blindés de Saumur (via Wikipedia)

In all of this so far, I have barely mentioned the Russians at all. Their T34 tank was possibly the single most effective of the war, and was the breakthrough that forced everyone else to rethink their designs. So we can say without a shadow of a doubt that the T34 was the best tank of the war for almost two years — from the time of Barbarossa (June 22, 1941) until the appearance of the Panther at Kursk (July 5, 1943). It certainly held this title unchallenged by the Sherman and Churchill tanks that appeared during its reign, and probably by the Tiger as well.

The Tiger is a problem for this sort of discussion, because it re-introduces the concept of ‘what for’ into the debate. The Tiger was a far superior heavy infantry support or assault tank to the T34, but a far inferior battlefield manoeuvre or pursuit tank. In fact the Tiger was so slow and limited in cross country ability, that it was actually more effective as a defensive weapon once the Germans were thrown back on that approach, than it had been for re-igniting their Blitzkrieg glory days.

He sums up the post with a league table of “best tanks” for given years and purposes:

Having noted the necessary division between medium cruisers and heavy assault/infantry support tanks, however, we can still make a fair summary.

So, in contrast to what many history books and documentaries will tell you, the French had the best tanks in 1939, and the British had the best tanks of 1940 and 1945. Also in contrast to what many history books will tell you, the Sherman’s effective front-line role can best be defined as the few months between the battle of Alamein and the arrival of Tiger tanks in Tunisia. All attempts to use it after that in Italy or northern France just demonstrated how pathetic it was in modern engagements. Even the British Firefly version with the 17 pounder was extremely vulnerable to any German tank. In fact it is amusing to note that they came into their own for the blitzkrieg across open country in pursuit of the defeated German armies across France, which has a direct parallel to the inferior German tanks pursuing the defeated French in 1940. (The equally inadequate British Cromwell tanks, being significantly faster, were actually still better at this pursuit than the Shermans.) The best tank of the Sherman’s period of functional use was, of course, the T34.

So our list of ‘best tanks’ could go something like this.
1939 — Best cruiser – SOMUA 35, Best support – Char B.
1940 — Best support becomes Matilda II.
1941 — Best cruiser initially Panzer III/IV with short 50mm guns, becomes T34 when Russia enters the war.
1942 — Best support is Tiger.
1943 — Best cruiser is Panther.
1944 — Best support is Tiger II.
1945 — Best ‘all purpose’ is Centurion.

SpaceX test launch goes wrong

Filed under: Space, Technology — Nicholas Russon @ 08:36

As they say, this is why you do the testing: to find out what can go wrong (and hopefully fix the design to prevent that from happening again). The Washington Post‘s Christian Davenport reports:

A new test rocket manufactured by Elon Musk’s upstart space company, SpaceX, blew itself up a few hundred feet over the Texas prairie after a malfunction was detected, the company said in a statement Friday evening.

At its facility in McGregor, Tex., the company was testing a three-engine version of the F9R test vehicle, the successor to its reusable Grasshopper rocket, which was designed to launch and then land on the same site.

“During the flight, an anomaly was detected in the vehicle and the flight termination system automatically terminated the mission,” company spokesman John Taylor said in the statement.

The rocket never veered off course, and there were no injuries or near injuries, the statement said. A representative from the Federal Aviation Administration was on site during the test flight.

The company stressed that rooting out problems like the one exposed in the flight is the purpose of the test program and said Friday’s test “was particularly complex, pushing the limits of the vehicle further than any previous test. As is our practice, the company will be reviewing the flight record details to learn more about the performance of the vehicle prior to our next test.”

August 5, 2014

New ways to bug a room

Filed under: Technology — Nicholas Russon @ 08:33

MIT, Adobe and Microsoft have developed a technique that allows conversations to be reconstructed based on the almost invisible vibrations of surfaces in the same room:

Researchers at MIT, Microsoft, and Adobe have developed an algorithm that can reconstruct an audio signal by analyzing minute vibrations of objects depicted in video. In one set of experiments, they were able to recover intelligible speech from the vibrations of a potato-chip bag photographed from 15 feet away through soundproof glass.

In other experiments, they extracted useful audio signals from videos of aluminum foil, the surface of a glass of water, and even the leaves of a potted plant. The researchers will present their findings in a paper at this year’s Siggraph, the premier computer graphics conference.

“When sound hits an object, it causes the object to vibrate,” says Abe Davis, a graduate student in electrical engineering and computer science at MIT and first author on the new paper. “The motion of this vibration creates a very subtle visual signal that’s usually invisible to the naked eye. People didn’t realize that this information was there.”

[...]

Reconstructing audio from video requires that the frequency of the video samples — the number of frames of video captured per second — be higher than the frequency of the audio signal. In some of their experiments, the researchers used a high-speed camera that captured 2,000 to 6,000 frames per second. That’s much faster than the 60 frames per second possible with some smartphones, but well below the frame rates of the best commercial high-speed cameras, which can top 100,000 frames per second.

I was aware that you could “bug” a room by monitoring the vibrations of a non-soundproofed window, at least under certain circumstances, but this is rather more subtle. I wonder how long this development has been known to the guys at the NSA…
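To make the sampling constraint concrete, here’s a little Python sketch (my own toy simulation, not the researchers’ code): sample a faint 440 Hz surface vibration at 2,000 frames per second and pull the tone back out with an FFT. Strictly speaking, the recoverable band tops out at half the frame rate (the Nyquist limit), which is why the high-speed cameras matter.

```python
import numpy as np

fps = 2000                 # high-speed camera frame rate (samples per second)
tone = 440.0               # surface vibration to recover; must be below fps/2

t = np.arange(0, 1.0, 1.0 / fps)
# Per-frame surface displacement: a faint 440 Hz wobble buried in camera noise
motion = 1e-3 * np.sin(2 * np.pi * tone * t) + 1e-4 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(motion))
freqs = np.fft.rfftfreq(t.size, d=1.0 / fps)
print("strongest component: %.1f Hz" % freqs[spectrum.argmax()])  # ~440.0
```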

August 4, 2014

Who is to blame for the outbreak of World War One? (Part six of a series)

Filed under: Britain, Europe, History, Military, Technology — Nicholas Russon @ 00:02

Over the last week, I’ve posted entries on what I think are the deep origins of the First World War (part one, part two, part three, part four, part five). And yes, to be honest, I didn’t think it would take quite this many entries to begin to explain how the world catastrophe of August 1914 came about — putting together this series of blog posts has been educational for me, and I hope it’s been at least of interest to you. The previous post examined the history of the Dual Monarchy of Austria-Hungary, in some detail (yes, it matters). Today, we finally clear the Victorian era altogether and begin to look at the last decade-or-so before the outbreak of the war.

The Anglo-German naval race

Even after the creation of the German Reich in 1871, Germany was not seen (by the British government) to be a major threat to British interests: Germany had no significant presence beyond Europe to worry the Colonial Office, and instead was seen as a potentially useful balancing factor in the European theatre. That all changed with the accession of Kaiser Wilhelm II as explained by Christopher Clark in The Sleepwalkers:

The 1890s were [...] a period of deepening German isolation. A commitment from Britain remained elusive and the Franco-Russian Alliance seemed to narrow considerably the room for movement on the continent. Yet Germany’s statesmen were extraordinarily slow to see the scale of the problem, mainly because they believed that the continuing tension between the world empires was in itself a guarantee that these would never combine against Germany. Far from countering their isolation through a policy of rapprochement, German policy-makers raised the quest for self-reliance to the status of a guiding principle. The most consequential manifestation of this development was the decision to build a large navy.

In the mid-1890s, after a long period of stagnation and relative decline, naval construction and strategy came to occupy a central place in German security and foreign policy. Public opinion played a role here — in Germany, as in Britain, big ships were the fetish of the quality press and its educated middle-class readers. The immensely fashionable “navalism” of the American writer Alfred Thayer Mahan also played a part. Mahan foretold in The Influence of Sea Power upon History: 1660–1783 (1890) a struggle for global power that would be decided by vast fleets of heavy battleships and cruisers. Kaiser Wilhelm II, who supported the naval programme, was a keen nautical hobbyist and an avid reader of Mahan; in the sketchbooks of the young Wilhelm we find many battleships — lovingly pencilled floating fortresses bristling with enormous guns. But the international dimension was also crucial: it was above all the sequence of peripheral clashes with Britain that triggered the decision to acquire a more formidable naval weapon. After the Transvaal episode, the Kaiser became obsessed with the need for ships, to the point where he began to see virtually every international crisis as a lesson in the primacy of naval power.

The Royal Navy (RN) had been Britain’s most obvious sign of global dominance, and Britain’s fleets had gone through many technological changes over the century since Waterloo. What had been for centuries a slow, steady process of gradual improvement and incremental change suddenly became the white-hot centre of rapid, even revolutionary, change:

At the same time that you need to add armour to protect the ship, you also need to mount heavier, larger guns. Between placing your order with the shipyard and taking delivery of the new ship, the metallurgical wizards may have (and frequently did) come up with bigger, better guns that could defeat the armour on your not-yet-launched ship. Oh, and you now needed to revise the design of the ship to carry the newer, heavier guns, too.

The ship designers were in a race with the gun designers to see who could defeat the latest design by the other group. It’s no wonder that ships could become obsolete between ordering and coming into service: sometimes, they could become obsolete before launch.

The weapons themselves were undergoing change at an unprecedented rate. As late as the mid-1870s, a good case could be made for muzzle-loading cannon being mounted on warships: until the gas seal of the breech-loader could be made safe, muzzle-loaders had the advantage of not killing their own crews at distressingly high frequency. Once that technological handicap had been overcome, the argument came down to the best way to mount the weapons: turrets or barbettes.

The RN’s international prestige invited envious imitators (like Wilhelm) and challengers (the United States Navy and the Imperial Japanese Navy), but the RN was the supreme naval power against which all other nations measured themselves. In 1889, parliament passed the Naval Defence Act, which specified that the Royal Navy would be maintained at the “two-power standard”: that the RN’s fleet of capital ships would be at least equal to the number of battleships maintained by the next two largest navies (at that time, the French and Russian navies). The increased spending allowed ten battleships plus cruisers and torpedo boats to be added to the fleet … but the French and Russian navies added twelve battleships between them over the same period of time. “Another British expansion, known as the Spencer Programme, followed in 1894, aiming to match foreign naval growth at a cost of over £31 million. Instead of deterring the naval expansion of foreign powers, Britain’s Naval Defence Act contributed to a naval arms race. Other powers including Germany and the United States bolstered their navies in the following years as Britain continued to increase its own naval expenditures.”

In The War That Ended Peace, Margaret MacMillan describes the implicit power of the RN in peacetime:

In August 1902 another great naval review took place at Spithead in the sheltered waters between Britain’s great south coast port of Portsmouth and the Isle of Wight, this time to celebrate the coronation of Edward VII. Because he had suddenly come down with appendicitis earlier in the summer, the coronation itself and all festivities surrounding it had been postponed. As a result, most of the ships from foreign navies (except those of Britain’s new ally Japan) as well as those from the overseas squadrons of the British navy had been obliged to leave. The resulting smaller review was, nevertheless, The Times said proudly, a potent display of Britain’s naval might. The ships displayed at Spithead were all in active service and all from the fleets already in place to guard Britain’s home waters. “The display may be less magnificent than the wonderful manifestation of our sea-power witnessed in the same waters five years ago. But it will demonstrate no less plainly what that power is, to those who remember that we have a larger number of ships in commission on foreign stations now than we had then, and that we have not moved a single ship from Reserve.” “Some of our rivals,” The Times warned, “have worked with feverish activity in the interval, and they are steadily increasing their efforts.” They should know that Britain remained vigilant and on guard, and prepared to spend whatever funds were necessary to maintain its sovereignty of the seas.

Admiral Fisher’s new broom

Admiral Sir John "Jackie" Fisher (via Wikipedia)

Admiral Sir John “Jackie” Fisher (via Wikipedia)

In 1904, Admiral Sir John “Jackie” Fisher was appointed as First Sea Lord (the professional head of the RN, reporting to the First Lord of the Admiralty, a cabinet minister). Fisher was a full-steam-ahead reformer, with vast notions of modernizing and reforming the navy. He was brilliant, argumentative, abrasive, tactless, and aggressive but could also be charming and persuasive. “When addressing someone he could become carried away with the point he was seeking to make, and on one occasion, the king asked him to stop shaking his fist in his face.” (Fortunately for Fisher, the king was a personal friend, so this did not hinder his career.)

Margaret MacMillan describes him in The War That Ended Peace:

Jacky Fisher, as he was always known, shoots through the history of the British navy and of the prewar years like a runaway Catherine wheel, showering sparks in all directions and making some onlookers scatter in alarm and others gasp with admiration. He shook the British navy from top to bottom in the years before the Great War, bombarding his civilian superiors with demands until they usually gave way and steamrollering over his opponents in the navy. He spoke his mind freely in his own inimitable language. His enemies were “skunks”, “pimps”, “fossils”, or “frightened rabbits”. Fisher was tough, dogged and largely immune to criticism, not surprising perhaps in someone from a relatively modest background who had made his own way in the navy since he was a boy. He was also supremely self-confident. Edward VII once complained that Fisher did not look at different aspects of an issue. “Why should I waste my time,” the admiral replied, “looking at all sides when I know my side is the right side?”

Fisher had been a maverick throughout his career (which makes it even more amazing that he eventually did rise to become First Sea Lord), as his actions when he took command of the Mediterranean Fleet clearly illustrate:

A programme of realistic exercises was adopted including simulated French raids, defensive manoeuvres, night attacks and blockades, all carried out at maximum speed. He introduced a gold cup for the ship which performed best at gunnery, and insisted upon shooting at greater range and from battle formations. He found that he too was learning some of the complications and difficulties of controlling a large fleet in complex situations, and immensely enjoyed it.

Notes from his lectures indicate that, at the start of his time in the Mediterranean, useful working ranges for heavy guns without telescopic sights were considered to be only 2000 yards, or 3000-4000 yards with such sights, whereas by the end of his time discussion centred on how to shoot effectively at 5000 yards. This was driven by the increasing range of the torpedo, which had now risen to 3000-4000 yards, necessitating ships fighting effectively at greater ranges. At this time he advocated relatively small main armaments on capital ships (some had 15 inch guns or greater), because the improved technical design of the relatively small (10 inch) modern guns allowed a much greater firing rate and greater overall weight of broadside. The potentially much greater ranges of large guns were not an issue, because no one knew how to aim them effectively at such ranges. He argued that “the design of fighting ships must follow the mode of fighting instead of fighting being subsidiary to and dependent on the design of ships.” As regards how officers needed to behave, he commented, “‘Think and act for yourself’ is the motto for the future, not ‘Let us wait for orders’.”

Lord Hankey, then a marine serving under Fisher, later commented, “It is difficult for anyone who had not lived under the previous regime to realize what a change Fisher brought about in the Mediterranean fleet. … Before his arrival, the topics and arguments of the officers messes … were mainly confined to such matters as the cleaning of paint and brasswork. … These were forgotten and replaced by incessant controversies on tactics, strategy, gunnery, torpedo warfare, blockade, etc. It was a veritable renaissance and affected every officer in the navy.” Charles Beresford, later to become a severe critic of Fisher, gave up a plan to return to Britain and enter parliament, because he had “learnt more in the last week than in the last forty years”.

One of his first changes was to sell nearly one hundred elderly ships and move dozens of less capable vessels from the active list to the reserve fleet, freeing up their crews (and the maintenance budget) for more modern vessels; he described the discarded ships as “too weak to fight and too slow to run away” and “a miser’s hoard of useless junk”. Between his reforms as Third Sea Lord (where he had championed the development of the modern destroyer and vastly increased the efficiency and productivity of the shipyards) and his new role as First Sea Lord, Fisher was able to get more done than his predecessor had managed, even on a budget that dropped nearly 10% in the year of his appointment.

HMS Dreadnought and the naval revolution

Fisher was not a naval designer, but he knew how to push new ideas to the front and get them adopted. The one thing that most people remember him for is the revolutionary battleship HMS Dreadnought, the first fast battleship with an all-big-gun armament and steam turbine propulsion; from the moment she went into commission, every other capital ship in every navy was obsolete.

HMS Dreadnought underway, circa 1906-07

Dreadnought was the platonic ideal of a battleship: she was faster than any other capital ship in any other navy, her guns were at least the equivalent in range, rate of fire, and throw of shot, and her armour was sufficient to allow her to take punishment from opposing ships and still deal out damage herself. She was the first British ship to be equipped with electrical controls allowing the entire main armament to be fired from a central location. Thanks to Fisher’s earlier efforts with the shipyards, Dreadnought took just a year to build — far faster than any other battleship had been built.

The “entirely crazy Dreadnought policy of Sir J. Fisher and His Majesty”

The Kaiser was not happy with the new British battleship, as it had been German policy since his accession to build up the German navy to at least provide a tool for pressuring Britain (if not for actually confronting the Royal Navy in battle). Now his entire naval plan had been upset by the Dreadnought revolution. Margaret MacMillan:

As far as the Kaiser and [Admiral] Tirpitz were concerned the responsibility for taking the naval race to a new level rested with what Wilhelm called the “entirely crazy Dreadnought policy of Sir J. Fisher and His Majesty”. The Germans were prone to see Edward VII as bent on a policy of encircling Germany. The British had made a mistake in building dreadnoughts and heavy cruisers, in Tirpitz’s view, and they were angry about it: “This annoyance will increase as they see that we follow them immediately.” [...] Who could tell what the British might do? Did their history not show them to be hypocritical, devious and ruthless? Fears of a “Kopenhagen”, a sudden British attack just like the one in 1807 when the British navy had bombarded Copenhagen and seized the Danish fleet, were never far from the thoughts of the German leadership once the naval race had started.

German fears of British attack increased almost in lockstep with British fears of German attack (William Le Queux had his equivalents among the German press and popular novelists). The thought had actually occurred to Fisher himself, who outlined a possible coup de main against the German fleet. The king responded “My God, Fisher, you must be mad!” and the suggestion was ignored, thankfully.

The popular worries about an attack from Britain fed the support for the German Navy laws, which funded dreadnought and battlecruiser building programs. In direct proportion, the increased German support for their naval expansion worked to the advantage of British politicians who wanted to build more dreadnoughts of their own. And, in fairness, Britain risked far more by allowing an enlarged German navy than Germany risked by stopping their building program … but in either case, the fear of popular unrest kept the shipyards humming anyway. As Churchill later wrote, “The Admiralty had demanded six ships; the economists offered four; and we finally compromised on eight.”

There we go, finally getting within striking distance of the triggering events of the First World War … and I’m still not sure how many more posts it will take to get us there! More to come this week.

July 31, 2014

NFL to test player tracking RFID system this year

Filed under: Football, Media, Technology — Nicholas Russon @ 07:01

Tom Pelissero talks about the new system which will be installed at 17 NFL stadiums this season:

The NFL partnered with Zebra Technologies, which is applying the same radio-frequency identification (RFID) technology that it has used the past 15 years to monitor everything from supplies on automotive assembly lines to dairy cows’ milk production.

Work is underway to install receivers in 17 NFL stadiums, each connected with cables to a hub and server that logs players’ locations in real time. In less than a second, the server can spit out data that can be enhanced graphically for TV broadcasts with the press of a button.

[...]

TV networks have experimented in recent years with route maps and other visual enhancements of players’ movements. But league-wide deployment of the sensors and all the data they produce could be the most significant innovation since the yellow first-down line.

The data also will go to the NFL “cloud,” where it can be turned around in seconds for in-stadium use and, eventually, a variety of apps and other visual and second-screen experiences. Producing a set of proprietary statistics on players and teams is another goal, Shah said.

NFL teams — many already using GPS technology to track players’ movements, workload and efficiency in practice — won’t have access to the in-game information in 2014 because of competitive considerations while the league measures the sustainability and integrity of the data.

“But as you imagine, longer-term, that is the vision,” Shah said. “Ultimately, we’re going to have a whole bunch of location-based data that’s coming out of live-game environment, and we want teams to be able to marry that up to what they’re doing in practice facilities themselves.”

Zebra’s sensors are oblong, less than the circumference of a quarter, and installed under the top cup of the shoulder pad, Stelfox said. They blink with a signal 25 times a second and run on a watch battery. The San Francisco 49ers and Detroit Lions and their opponents wore them for each of the two teams’ home games last season as part of a trial run.

About 20 receivers will be placed around the bands between the upper and lower decks of the 17 stadiums that were selected for use this year. They’ll provide a cross-section of environments and make sure the technology is operational across competitive settings before full deployment.
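Pelissero doesn’t say how Zebra’s servers turn those radio pings into positions, but the standard approach for systems like this is multilateration: solve for the one spot on the field consistent with the ranges measured at several receivers. A minimal sketch, with invented receiver coordinates (not Zebra’s actual layout or algorithm):

```python
import numpy as np

# Hypothetical receiver positions around a field, in metres (invented layout)
receivers = np.array([[0.0, 0.0], [120.0, 0.0], [120.0, 60.0], [0.0, 60.0]])
player = np.array([80.0, 25.0])                 # true position, unknown to us
d = np.linalg.norm(receivers - player, axis=1)  # ranges implied by signal timing

# Subtract the first range equation from the rest to linearise, then solve
A = 2.0 * (receivers[1:] - receivers[0])
b = (d[0] ** 2 - d[1:] ** 2
     + np.sum(receivers[1:] ** 2, axis=1) - np.sum(receivers[0] ** 2))
print(np.linalg.lstsq(A, b, rcond=None)[0])     # ~[80. 25.]
```

With twenty receivers instead of four, the same least-squares step simply gets more rows, which is what lets a real system shrug off blocked or noisy signals.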

July 28, 2014

US government department to be replaced by Google

Filed under: Business, Government, Technology — Nicholas Russon @ 09:18

The National Journal‘s Alex Brown talks about a federal government department facing the end of the line thanks to search engines like Google:

A little-known branch of the Commerce Department faces elimination, thanks to advances in technology and a snarkily named bill from Sens. Tom Coburn and Claire McCaskill.

The National Technical Information Service compiles federal reports, serving as a clearinghouse for the government’s scientific, technical, and business documents. The NTIS then sells copies of the documents to other agencies and the public upon request. It’s done so since 1950.

But Coburn and McCaskill say it’s hard to justify 150 employees and $66 million in taxpayer dollars when almost all of those documents are now available online for free.

Enter the Let Me Google That for You Act.

“Our goal is to eliminate you as an agency,” the famously grumpy Coburn told NTIS Director Bruce Borzino at a Wednesday hearing. Pulling no punches, Coburn suggested that any NTIS documents not already available to the public be put “in a small closet in the Department of Commerce.”

H/T to Jim Geraghty for the link. He assures us that despite any similarities to situations portrayed in his recent political novel The Weed Agency, he didn’t make this one up.

July 25, 2014

QotD: The singularity already happened

Filed under: Media, Quotations, Technology — Nicholas Russon @ 00:01

The gulf that separates us from the near past is now so great that we cannot really imagine how one could design a spacecraft, or learn engineering in the first place, or even just look something up, without a computer and a network. Journalists my age will understand how profound and disturbing this break in history is: Do you remember doing your job before Google? It was, obviously, possible, since we actually did it, but how? It is like having a past life as a conquistador or a phrenologist.

Colby Cosh, “Who will be the moonwalkers of tomorrow?”, Maclean’s, 2014-07-24.

July 21, 2014

The science of ballistics, the art of war, and the birth of the assault rifle

Filed under: History, Military, Technology — Nicholas Russon @ 15:47

Defence With A “C” summarizes the tale of how we got to the current suite of modern military small arms. It’s a long story, but if you’re interested in firearms, it’s a fascinating one.

To understand why we’ve arrived where we are now with the NATO standard 5.56mm calibre round you have to go all the way back to the war of 1939-1945. Much study of this conflict would later inform decision making surrounding the adoption of the 5.56, but for now there was one major change that took place which would set the course for the future.

The German Sturmgewehr 44 is widely accepted as the world’s first true assault rifle. Combining the ability to hit targets out to around 500 yards with individual shots in semi-automatic mode with the ability to fire rapidly in fully automatic mode (almost 600 rounds per minute), the StG 44 represented a bridge between short-ranged sub-machine guns and longer-ranged bolt action rifles.

[...]

After the Second World War the US army began conducting research to help it learn the lessons of its previous campaigns, as well as preparing it for potential future threats. As part of this effort it began to contract the services of the Operations Research Office (ORO) of the Johns Hopkins University in Baltimore, Maryland, for help in conducting the scientific analysis of various aspects of ground warfare.

On October 1st, 1948, the ORO began Project ALCLAD, a study into the means of protecting soldiers from the “casualty producing hazards of warfare”. In order to determine how best to protect soldiers from harm, it was first necessary to investigate the major causes of casualties in war.

After studying large quantities of combat and casualty reports, ALCLAD concluded that first and foremost the main danger to combat soldiers was from high explosive weapons such as artillery shells, fragments from which accounted for the vast majority of combat casualties. It also determined that casualties inflicted by small arms fire were essentially random.

Allied troops in WW2 had been generally armed with full-sized bolt action rifles (while US troops were being issued the semi-automatic M1 Garand), optimized to be accurate out to 600 yards or more, yet most actual combat took place at much shorter ranges. Accuracy is directly affected by the stress, tension, distraction, and all-around confusion of the battlefield: even at such short ranges, riflemen had to expend many shots in hopes of inflicting a hit on an enemy. The ORO ran a series of tests to simulate battle conditions for both expert and ordinary riflemen and found some unexpected results:

A number of significant conclusions were thus drawn from these tests. Firstly, that accuracy — even for prone riflemen, some of them expert shots, shooting at large static targets — was poor beyond ranges of about 250 yards. Secondly, that under simulated conditions of combat shooting an expert level marksman was no more accurate than a regular shot. And finally that the capabilities of the individual shooters were far below the potential of the rifle itself.

This in turn — along with the analysis of missed shots caught by a screen behind the targets — led to three further conclusions.

First, that any effort to try and make the infantry’s general purpose weapon more accurate (such as expensive barrels) was largely a waste of time and money. The weapon was, and probably always would be, inherently capable of shooting much tighter groups than the human behind it.

Second, that there was a practical limit to the value of marksmanship training for regular infantry soldiers. Beyond a certain basic level of training any additional hours were of limited value*, and the number of hours required to achieve a high level of proficiency would be prohibitive. This was particularly of interest for planning in the event of another mass mobilisation for war.
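A back-of-the-envelope model shows why the ORO reached those conclusions. Treat shot dispersion as a circular normal distribution and compare the rifle’s intrinsic accuracy with that of a shooter under combat stress; the dispersion figures below are my own illustrative guesses, not the ORO’s data:

```python
import math

def hit_probability(sigma_moa, range_yards, target_radius_in=9.0):
    """P(hit) on a circular target, treating sigma_moa as the angular
    standard deviation of a circular normal shot distribution."""
    sigma_in = sigma_moa * 1.047 * (range_yards / 100.0)  # 1 MOA ~ 1.047"/100 yd
    return 1.0 - math.exp(-(target_radius_in ** 2) / (2.0 * sigma_in ** 2))

for rng in (100, 250, 500):
    rifle = hit_probability(2.0, rng)    # the rifle alone, fired from a rest
    soldier = hit_probability(8.0, rng)  # the same rifle under combat stress
    print(f"{rng:4d} yd: rifle {rifle:.0%}, stressed shooter {soldier:.0%}")
```

With these made-up numbers the stressed shooter drops below a one-in-ten chance by 250 yards while the rifle itself is still hitting three times out of four, which is exactly the gap between human and hardware that the ORO tests exposed.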

July 16, 2014

Rolling Stone is your top source for reliable, informative gun news

Filed under: Media, Technology, USA — Nicholas Russon @ 08:10

The folks at Rolling Stone are concerned for your safety, so they’ve helpfully put together a primer on the five “most dangerous” guns in America. Because they love you, America:

Rolling Stone dangerous guns 3

Yes, we’re apparently talking about grenade launchers here. I didn’t even know grenade launchers were available to civilians. Awesome!

Rolling Stone dangerous guns 5

Wait, “the explosive that creates the energy to fire the gun occurs in the fixed shell of a shotgun rather than the metallic cartridge of a rifle”. Why would I expect the charge that propels the shot out of a shotgun to be ignited in a rifle cartridge? Is this some sort of magic that allows you to fire a different weapon than the one you’re holding? No wonder Rolling Stone thinks this is such a dangerous weapon!

H/T to Charles C. W. Cooke for the link.

July 15, 2014

The sheer difficulty of obtaining a warrant

Filed under: Government, Law, Liberty, Technology, USA — Nicholas Russon @ 08:21

Tim Cushing wonders why we don’t seem to sympathize with the plight of poor, overworked law enforcement officials who find the crushing burden of getting a warrant for accessing your cell phone data to be too hard:

You’d think approved warrants must be like albino unicorns for all the arguing the government does to avoid having to run one by a judge. It continually acts as though there aren’t statistics out there that show obtaining a warrant is about as difficult as obeying the laws of thermodynamics. Wiretap warrants have been approved 99.969% of the time over the last decade. And that’s for something far more intrusive than cell site location data.

But still, the government continues to argue that location data, while possibly intrusive, is simply Just Another Business Record — records it is entitled to have thanks to the Third Party Doctrine. Any legal decision that suggests even the slightest expectation of privacy might have arisen over the past several years as the public’s relationship with cell phones has shifted from “luxury item/business tool” to “even grandma has a smartphone” is greeted with reams of paper from the government, all of it metaphorically pounding on the table and shouting “BUSINESS RECORDS!”

When that fails, it pushes for the lower bar of the Stored Communications Act [PDF] to be applied to its request, dropping it from “probable cause” to “specific and articulable facts.” The Stored Communications Act is the lowest bar, seeing as it allows government agencies and law enforcement to access electronic communications older than 180 days without a warrant. It’s interesting that the government would invoke this to defend the warrantless access to location metadata, seeing as the term “communications” is part of the law’s title. This would seem to imply what’s being sought is actual content — something that normally requires a higher bar to obtain.

Update: Ken White at Popehat says warrants are not particularly strong devices to protect your liberty and lists a few distressing cases where warrants have been issued recently.

We’re faced all the time with the ridiculous warrants judges will sign if they’re asked. Judges will sign a warrant to give a teenager an injection to induce an erection so that the police can photograph it to fight sexting. Judges will, based on flimsy evidence, sign a warrant allowing doctors to medicate and anally penetrate a man because he might have a small amount of drugs concealed in his rectum. Judges will sign a warrant to dig up a yard based on a tip from a psychic. Judges will kowtow to an oversensitive politician by signing a warrant to search the home of the author of a patently satirical Twitter account. Judges will give police a warrant to search your home based on a criminal libel statute if your satirical newspaper offended a delicate professor. And you’d better believe judges will oblige cops by giving them a search warrant when someone makes satirical cartoons about them.

I’m not saying that warrants are completely useless. Warrants create a written record of the government’s asserted basis for an action, limiting cops’ ability to make up post-hoc justifications. Occasionally some prosecutors turn down weak warrant applications. The mere process of seeking a warrant may regulate law enforcement behavior somewhat.

Rather, I’m saying that requiring the government to get a warrant isn’t the victory you might hope. The numbers — and the experience of criminal justice practitioners — suggest that judges in the United States provide only marginal oversight over what is requested of them. Calling it a rubber stamp is unfair; sometimes actual rubber stamps run out of ink. The problem is deeper than court decisions that excuse the government from seeking warrants because of the War on Drugs or OMG 9/11 or the like. The problem is one of the culture of the criminal justice system and the judiciary, a culture steeped in the notion that “law and order” and “tough on crime” are principled legal positions rather than political ones. The problem is that even if we’d like to see the warrant requirement as interposing neutral judges between our rights and law enforcement, there’s no indication that the judges see it that way.

The economic side of Net Neutrality

Filed under: Business, Economics, Technology — Nicholas Russon @ 07:41

In Forbes, Tim Worstall ignores the slogans to follow the money in the Net Neutrality argument:

The FCC is having a busy time of it as their cogitations into the rules about net neutrality become the second most commented upon in the organisation’s history (second only to Janet Jackson’s nip-slip, which gives us a good idea of the priorities of the citizenry). The various internet content giants, the Googles, Facebooks and so on of this world, are arguing very loudly that strict net neutrality should be the standard. We could, of course, attribute this to all in those organisations being fully up with the hippy dippy idea that information just wants to be free. Apart from the obvious point that Zuckerberg, for one, is a little too young to have absorbed that along with the patchouli oil, we’d probably do better to examine the underlying economics of what’s going on to work out why people are taking the positions they are.

Boiling “net neutrality” down to its essence the argument is about whether the people who own the connections to the customer, the broadband and mobile airtime providers, can treat different internet traffic differently. Should we force them to be neutral (thus the “neutrality” part) and treat all traffic exactly the same? Or should they be allowed to speed up some traffic, slow down other, in order to prioritise certain services over others?

We can (and many do) argue that we the consumers are paying for this bandwidth so it’s up to us to decide and we might well decide that they cannot. Others might (and they do) argue that certain services require very much more of that bandwidth than others, further, require a much higher level of service, and it would be economically efficient to charge for that greater volume and quality. For example, none of us would mind all that much if there was a random second or two delay in the arrival of a gmail message but we’d be very annoyed if there were random such delays in the arrival of a YouTube packet. Netflix would be almost unusable if streaming were subject to such delays. So it might indeed make sense to prioritise such traffic and slow down other to make room for it.
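Mechanically, the prioritisation Worstall describes is nothing more exotic than a priority queue at the router. A toy sketch (my own, in Python) of the non-neutral case:

```python
import heapq

PRIORITY = {"video": 0, "email": 1}   # lower number = transmitted first

def transmit_order(packets):
    # Stable priority queue: video packets overtake any queued email packets
    q = [(PRIORITY[kind], i, kind) for i, kind in enumerate(packets)]
    heapq.heapify(q)
    return [heapq.heappop(q)[2] for _ in range(len(q))]

arrivals = ["email", "video", "email", "video", "video"]
print(transmit_order(arrivals))
# ['video', 'video', 'video', 'email', 'email']: smooth streams, patient email
```

Under strict neutrality the packets would simply go out in arrival order; the whole policy fight is over who gets to set that priority table, and at what price.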

You can balance these arguments as you wish: there’s not really a “correct” answer to this, it’s a matter of opinion. But why are the content giants all arguing for net neutrality? What’s their reasoning?

As you’d expect, it all comes down to the money. Who pays more for what under a truly “neutral” model and who pays more under other models. The big players want to funnel off as much of the available profit to themselves as possible, while others would prefer the big players reduced to the status of regulated water company: carrying all traffic at the same rate (which then allows the profits to go to other players).

The attraction (and danger) of computer-based models

Filed under: Environment, Science, Technology — Nicholas Russon @ 00:02

Warren Meyer explains why computer models can be incredibly useful tools, but they are not the same thing as an actual proof:

    Among the objections, including one from Green Party politician Chit Chong, were that Lawson’s views were not supported by evidence from computer modeling.

I see this all the time. A lot of things astound me in the climate debate, but perhaps the most astounding has been to be accused of being “anti-science” by people who have such a poor grasp of the scientific process.

Computer models and their output are not evidence of anything. Computer models are extremely useful when we have hypotheses about complex, multi-variable systems. It may not be immediately obvious how to test these hypotheses, so computer models can take these hypothesized formulas and generate predicted values of measurable variables that can then be used to compare to actual physical observations.

[...]

The other problem with computer models, besides the fact that they are not and cannot constitute evidence in and of themselves, is that their results are often sensitive to small changes in tuning or setting of variables, and that these decisions about tuning are often totally opaque to outsiders.

I did computer modelling for years, though of markets and economics rather than climate. But the techniques are substantially the same. And the pitfalls.

Confession time. In my very early days as a consultant, I did something I am not proud of. I was responsible for a complex market model based on a lot of market research and customer service data. Less than a day before the big presentation, and with all the charts and conclusions made, I found a mistake that skewed the results. In later years I would have the moral courage and confidence to cry foul and halt the process, but at the time I ended up tweaking a few key variables to make the model continue to spit out results consistent with our conclusion. It is embarrassing enough I have trouble writing this for public consumption 25 years later.

But it was so easy. A few tweaks to assumptions and I could get the answer I wanted. And no one would ever know. Someone could stare at the model for an hour and not recognize the tuning.
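His confession is easy to believe, because the mechanics really are trivial. Here’s a toy model of my own, with made-up numbers, showing how nudging two assumptions by amounts no reviewer would notice flips the verdict:

```python
# A toy "market model": five-year net value of a project, in $ millions.
def net_value(base_revenue, growth, margin, cost=6.0, years=5):
    revenue = base_revenue * (1 + growth) ** years
    return revenue * margin - cost

print(net_value(100.0, growth=0.03, margin=0.049))  # -0.32: "don't do it"
print(net_value(100.0, growth=0.04, margin=0.052))  # +0.33: "full speed ahead"
```

One point of growth here, three-tenths of a point of margin there, and the model delivers the opposite conclusion; unless the tuning is published alongside the output, nobody staring at the results would ever know.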

July 10, 2014

Throwing a bit of light on security in the “internet of things”

Filed under: Technology — Nicholas Russon @ 07:36

The “internet of things” is coming: more and more of your surroundings are going to be connected in a vastly expanded internet. A lot of attention needs to be paid to security in this new world, as Dan Goodin explains:

In the latest cautionary tale involving the so-called Internet of things, white-hat hackers have devised an attack against network-connected lightbulbs that exposes Wi-Fi passwords to anyone in proximity to one of the LED devices.

The attack works against LIFX smart lightbulbs, which can be turned on and off and adjusted using iOS- and Android-based devices. Ars Senior Reviews Editor Lee Hutchinson gave a good overview here of the Philips Hue lights, which are programmable, controllable LED-powered bulbs that compete with LIFX. The bulbs are part of a growing trend in which manufacturers add computing and networking capabilities to appliances so people can manipulate them remotely using smartphones, computers, and other network-connected devices. A 2012 Kickstarter campaign raised more than $1.3 million for LIFX, more than 13 times the original goal of $100,000.

According to a blog post published over the weekend, LIFX has updated the firmware used to control the bulbs after researchers discovered a weakness that allowed hackers within about 30 meters to obtain the passwords used to secure the connected Wi-Fi network. The credentials are passed from one networked bulb to another over a mesh network powered by 6LoWPAN, a wireless specification built on top of the IEEE 802.15.4 standard. While the bulbs used the Advanced Encryption Standard (AES) to encrypt the passwords, the underlying pre-shared key never changed, making it easy for the attacker to decipher the payload.
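To see why a never-changing pre-shared key is fatal, consider this sketch. It’s my own illustration of the general flaw using the pycryptodome library, not the actual LIFX firmware or protocol:

```python
# Not the actual LIFX protocol: a sketch of the design flaw, where every
# device shares one fixed AES key, so extracting it once breaks all of them.
from Crypto.Cipher import AES  # pip install pycryptodome

SHARED_KEY = bytes(16)  # stand-in for a key baked into every unit's firmware

def bulb_broadcast(secret: bytes) -> bytes:
    cipher = AES.new(SHARED_KEY, AES.MODE_CBC, iv=bytes(16))
    pad = 16 - len(secret) % 16                  # PKCS#7 padding
    return cipher.encrypt(secret + bytes([pad]) * pad)

def attacker_decrypt(ciphertext: bytes) -> bytes:
    # Anyone who has dumped the firmware key can read every network's payload
    plain = AES.new(SHARED_KEY, AES.MODE_CBC, iv=bytes(16)).decrypt(ciphertext)
    return plain[:-plain[-1]]

print(attacker_decrypt(bulb_broadcast(b"ssid=HomeWiFi;psk=hunter2")))
```

The encryption itself isn’t the weak point; the key management is. Once each network negotiates its own key instead of sharing one across every unit, dumping a single bulb’s firmware no longer unlocks everyone else’s Wi-Fi.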
