Quotulatiousness

January 7, 2011

How not to handle public health issues like influenza

Filed under: Bureaucracy, Health, Media — Nicholas @ 09:38

I was astonished to hear a radio reporter yesterday admit that much of the reason for the drop in people getting flu shots is the massively overblown oh-my-god-we’re-all-going-to-die media panic last year over swine flu (H1N1). In case you somehow managed to miss out on it, every news broadcast seemed to feature yet another doctor or public health official telling us that we faced a worldwide pandemic of H1N1, the invincible, all-conquering Überflu to top all plagues we’d ever faced before. Death tolls in the millions were confidently predicted. Every individual who died seemed to be mentioned personally . . . because there were so few of them compared to those poor folks who died of “ordinary” seasonal flu.

Lorne Gunter gives a bit of credit where it’s due:

Give Allison McGeer credit for being frank about what’s behind this winter’s flu outbreak in Ontario: unnecessary panic over last year’s swine flu “pandemic.” Dr. McGeer, head of infection control at Toronto’s Mount Sinai Hospital, says flu cases are way up this season because vaccinations are way down; and vaccinations are way down, likely, because too much was made of the swine flu by media and officialdom last winter.

It is a medical case of the doctors who cried wolf, in other words.

[. . .]

There is a fine line between erring on the side of caution and crying wolf. And last year, the UN’s World Health Organization (WHO) blew through that barrier with abandon.

Just as it had on SARS and bird flu and the Ebola virus, the WHO overreacted to swine flu, issuing cautions that were out of all proportion to the risk the disease posed to the public. (Remember in 2003 when the WHO recommended people from around the world stay away from Toronto because the city was host to a few hundred SARS infections?)

But unlike those earlier panics, the WHO pulled out every stop on swine flu. It was as if the UN agency had been surprised that its earlier scares had failed to grow into full-blown pandemics; and so they figured that, finally, swine flu was due to become a worldwide infection requiring a dramatic response from international health officials.

As I wrote last year in May, when even the most panic-stricken media outlets were no longer playing the JuggernautOfDoom theme:

This would have been a good opportunity for de-escalating the panic-mongering (and perhaps even attempting to rein in the media, who were equally to blame for the tone of the information getting to the public). They chose, instead, to actively hide the fact that H1N1 cases were running below the level of ordinary seasonal flu cases (total H1N1 deaths: approximately 18,000; typical annual death toll from seasonal flu: 250,000-500,000).

The biggest problem isn’t that they overreacted this time; it’s that the overreaction has damaged their credibility for the next time they start issuing health warnings. And that’s a bad thing. Unless they pull the same stunt next time, too. In which case, we may start hearing talk about setting up competing organizations to do the job the current entities appear to have given up on.

January 6, 2011

QotD: The “information elite”

Filed under: Media, Quotations, Randomness — Nicholas @ 16:50

I noticed in the mid-nineties the new buzzword was “the Information Elite,” a proposed new class that included, by definition, anyone in the media, no matter how low-level or rote/mechanical in their actual job function. And you know who couldn’t get enough of talking about the “Information Elite”? The media, of course! Because every time they brought it up, and fretted about this new class distinction that might have harmful effects for sooociiiiety, they were of course flattering themselves by naming themselves “the Elite.”

While pretending to worry about this new class, of course they were all delighting inside. Who wouldn’t? The dirty little secret is that pretty much anyone wants to be “elite” in some way or another. So any cute new catchphrase putting you into some new elite is going to be, well, a little attractive.

Anyway, that’s how class distinctions harden, I’m pretty sure, at the lower levels of the class, among the more marginal/aspirational members of the purported class, because they want the class to exist, because they need for it to exist — in order for them to belong to it.

The low-level line producer at MSNBC needs the fiction of the “Information Elite” as a class a hell of a lot more than, say, Steven Spielberg does. Steven Spielberg doesn’t really have to worry about his status or position in the pecking order. He has enough individual accomplishments that he has no need to inflate his ego with the accomplishments of other people, to whom he is connected only by his purported class.

Ace, “The Illusion of the ‘Professional’ Class and the Rise of the Liberal Aristocracy”, Ace of Spades HQ, 2011-01-06

Drug-sniffing dogs nowhere near as accurate as billed

Filed under: Law, Liberty, USA — Nicholas @ 13:17

Everyone loves dogs, right? They’re “man’s best friend”. They’re also a significant part of the war on drugs. And they’re far from infallible:

Drug-sniffing dogs can give police probable cause to root through cars by the roadside, but state data show the dogs have been wrong more often than they have been right about whether vehicles contain drugs or paraphernalia.

The dogs are trained to dig or sit when they smell drugs, which triggers automobile searches. But a Tribune analysis of three years of data for suburban departments found that only 44 percent of those alerts by the dogs led to the discovery of drugs or paraphernalia.

For Hispanic drivers, the success rate was just 27 percent.

For something so important to the arsenal of the drug warriors, drug-sniffing dogs and their handlers don’t appear to be held to any consistent training standards:

But even advocates for the use of drug-sniffing dogs agree with experts who say many dog-and-officer teams are poorly trained and prone to false alerts that lead to unjustified searches. Leading a dog around a car too many times or spending too long examining a vehicle, for example, can cause a dog to give a signal for drugs where there are none, experts said.

“If you don’t train, you can’t be confident in your dog,” said Alex Rothacker, a trainer who works with dozens of local drug-sniffing dogs. “A lot of dogs don’t train. A lot of dogs aren’t good.”

The dog teams are not held to any statutory standard of performance in Illinois or most other states, experts and dog handlers said, though private groups offer certification for the canines.

No standards for training? Lucrative police department budgets? Nope, no possible way that unscrupulous folks would ever take advantage of that opening.

Orders of magnitude, US dollar version

Filed under: Economics, Randomness, USA — Nicholas @ 07:53

Page Tutor provides a very useful visual reference for the terms Million, Billion, and Trillion:

Believe it or not, this next little pile is $1 million dollars (100 packets of $10,000). You could stuff that into a grocery bag and walk around with it.
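
Just to put numbers on those piles, here’s a quick bit of arithmetic of my own (it assumes only the $10,000 packet from the quote above; nothing here comes from the Page Tutor page itself):

```python
# A rough sketch: how many $10,000 packets it takes to make each pile.
PACKET = 10_000  # one strapped bundle of $100 bills, as in the quote

for label, amount in [("million", 10**6), ("billion", 10**9), ("trillion", 10**12)]:
    packets = amount // PACKET
    print(f"$1 {label:<8} = {packets:>11,} packets of $10,000")
```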

H/T to Tim Harford for the link.

Mark Steyn on the state of Britain

Filed under: Britain, Education, Government, History — Nicholas @ 07:38

From a longer column on the general state of decline in the Anglosphere, Mark Steyn points out the damaging effects of British public education on modern-day “Britons”:

In cutting off two generations of students from their cultural inheritance, the British state has engaged in what we will one day come to see as a form of child abuse, one that puts a huge question mark over the future. Why be surprised that legions of British Muslims sign up for the Taliban? These are young men who went to school in Luton and West Bromwich and learned nothing of their country of nominal citizenship other than that it’s responsible for racism, imperialism, colonialism, and all the other bad -isms of the world. If that’s all you knew of Britain, why would you feel any allegiance to Queen and country? And what if you don’t have Islam to turn to? The transformation of the British people is, in its own malign way, a remarkable achievement. Raised in schools that teach them nothing, they nevertheless pick up the gist of the matter, which is that their society is a racket founded on various historical injustices. The virtues Hayek admired? Ha! Strictly for suckers.

When William Beveridge laid out his blueprint for the modern British welfare state in 1942, his goal was the “abolition of want,” to be accomplished by “cooperation between the State and the individual.” In attempting to insulate the citizenry from the vicissitudes of fate, Sir William succeeded beyond his wildest dreams: Want has been all but abolished. Today, fewer and fewer Britons want to work, want to marry, want to raise children, want to lead a life of any purpose or dignity. Churchill called his book The History of the English-Speaking Peoples — not the English-Speaking Nations. The extraordinary role played by those nations in the creation and maintenance of the modern world derived from their human capital.

What happens when, as a matter of state policy, you debauch your human capital? The United Kingdom has the highest drug use in Europe, the highest incidence of sexually transmitted disease, the highest number of single mothers; marriage is all but defunct, except for toffs, upscale gays, and Muslims. For Americans, the quickest way to understand modern Britain is to look at what LBJ’s Great Society did to the black family and imagine it applied to the general population. One-fifth of British children are raised in homes in which no adult works. Just under 900,000 people have been off sick for over a decade, claiming “sick benefits,” week in, week out, for ten years and counting. “Indolence,” as Machiavelli understood, is the greatest enemy of a free society, but rarely has any state embraced this oldest temptation as literally as Britain. There is almost nothing you can’t get the government to pay for.

And then there’s this bit, where he shows that the British government defies parody:

For its worshippers, Big Government becomes a kind of religion: the state as church. After the London Tube bombings, Gordon Brown began mulling over the creation of what he called a “British equivalent of the U.S. Fourth of July,” a new national holiday to bolster British identity. The Labour Party think-tank, the Fabian Society, proposed that the new “British Day” should be July 5th, the day the National Health Service was created. Because the essence of contemporary British identity is waiting two years for a hip operation. A national holiday every July 5th: They can call it Dependence Day.

Even Time Lords could get confused by this matchup

Filed under: Britain, Media — Nicholas @ 07:18

I can’t improve on The Register‘s take:

Doctor Who the 10th, David Tennant, is planning to get hitched to his fictional daughter Georgia Moffett, who also happens to be the real daughter of his fictional fifth incarnation.

Moffett is the real-life fruit of former Time Lord Peter Davison’s loins, and played Who offspring Jenny in 2008’s The Doctor’s Daughter. Davison and his future son-in-law Tennant appeared together in 2007’s Children in Need Doctor Who special Time Crash, well after Ms Moffett really existed, but before she was spawned as her soon-to-be husband’s television child.

Paradoxically, this means that Davison and Tennant came together as both individuals and the same person, while one was the father of the future daughter of the other.

January 5, 2011

Starting to recover

Filed under: Administrivia — Nicholas @ 14:57

Whatever the bug was, it pretty much wiped me out for the last 36-48 hours. While I wasn’t actually running (much of) a fever, I was getting all the joys of fever dreams interrupting what sleep I could get. I’d manage to fall asleep, then whatever dream I was having would segue into a weird kind of video game (think something like Tetris or Bejewelled) with the same song clip playing over and over again (“Imelda” by Mark Knopfler). I’d wake up, overheated and sweaty, throw off the covers, chill down again, cover up and go back to sleep. Repeat and repeat and repeat.

Physically, it wasn’t too bad (except for the express train running through my intestines), but my brain was running on far too little sleep. Hopefully it’s pretty much over.

January 4, 2011

Posting will continue to be light for a bit

Filed under: Administrivia — Nicholas @ 08:01

I had planned on resuming normal blogging today, after the holidays, but I’m fighting off some kind of stomach bug at the moment, so blogging will have to wait until later.

January 3, 2011

Healthy skepticism about study results

Filed under: Bureaucracy, Media, Science — Nicholas @ 13:30

John Allen Paulos provides some useful mental tools to use when presented with unlikely published findings from various studies:

Ioannidis examined the evidence in 45 well-publicized health studies from major journals appearing between 1990 and 2003. His conclusion: the results of more than one third of these studies were flatly contradicted or significantly weakened by later work.

The same general idea is discussed in “The Truth Wears Off,” an article by Jonah Lehrer that appeared last month in the New Yorker magazine. Lehrer termed the phenomenon the “decline effect,” by which he meant the tendency for replication of scientific results to fail — that is, for the evidence supporting scientific results to seemingly weaken over time, disappear altogether, or even suggest opposite conclusions.

[. . .]

One reason for some of the instances of the decline effect is provided by regression to the mean, the tendency for an extreme value of a random quantity dependent on many variables to be followed by a value closer to the average or mean.

[. . .]

This phenomenon leads to nonsense when people interpret regression to the mean as the result of something real, rather than as the natural behavior of any randomly varying quantity.
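
To see Paulos’s point in action, here’s a quick simulation of my own (a sketch, not anything from his article): give every subject identical true ability, pick out the apparent “stars” on a noisy first test, and retest exactly that group. Their average falls back toward the mean even though nothing real has changed.

```python
import random

random.seed(1)

def noisy_score(true_ability=100, noise=15):
    # Every subject has the same true ability; all variation is measurement noise.
    return true_ability + random.gauss(0, noise)

# First round of testing for 10,000 subjects.
first_round = [noisy_score() for _ in range(10_000)]

# Select the apparent "top performers" (roughly the top 5 percent) ...
cutoff = sorted(first_round)[-500]
top_scores = [s for s in first_round if s >= cutoff]

# ... then retest that same group. Nothing real changed, yet the scores regress.
retest_scores = [noisy_score() for _ in top_scores]

print(f"top group, first test: {sum(top_scores) / len(top_scores):.1f}")
print(f"top group, retest:     {sum(retest_scores) / len(retest_scores):.1f}")
```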

[. . .]

In some instances, another factor contributing to the decline effect is sample size. It’s become common knowledge that polls that survey large groups of people have a smaller margin of error than those that canvass a small number. Not just a poll, but any experiment or measurement that examines a large number of test subjects will have a smaller margin of error than one having fewer subjects.

Not surprisingly, results of experiments and studies with small samples often appear in the literature, and these results frequently suggest that the observed effects are quite large — at one end or the other of the large margin of error. When researchers attempt to demonstrate the effect on a larger sample of subjects, the margin of error is smaller and so the effect size seems to shrink or decline.
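
Here’s a second small simulation of my own (again, not from the article) to make the sample-size point concrete: when the true effect is tiny, small studies scatter widely and occasionally throw up the dramatic-looking numbers that get noticed, while larger studies close in on the unimpressive truth.

```python
import random
import statistics

random.seed(2)

TRUE_EFFECT = 0.1  # a genuinely small effect, in standard-deviation units

def observed_effect(n):
    """Estimated treatment effect from one study with n subjects per group."""
    treated = [random.gauss(TRUE_EFFECT, 1) for _ in range(n)]
    control = [random.gauss(0.0, 1) for _ in range(n)]
    return statistics.mean(treated) - statistics.mean(control)

# Re-run the same study many times at different sample sizes.
for n in (20, 200, 2000):
    estimates = [observed_effect(n) for _ in range(1000)]
    spread = statistics.stdev(estimates)  # the margin of error shrinks as n grows
    loudest = max(estimates)              # the kind of result that gets written up
    print(f"n={n:4d}: spread of estimates {spread:.2f}, most dramatic estimate {loudest:.2f}")
```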

[. . .]

Publication bias is, no doubt, also part of the reason for the decline effect. That is to say that seemingly significant experimental results will be published much more readily than those that suggest no experimental effect or only a small one. People, including journal editors, naturally prefer papers announcing or at least suggesting a dramatic breakthrough to those saying, in effect, “Ehh, nothing much here.”

The availability error, the tendency to be unduly influenced by results that, for one reason or another, are more psychologically available to us, is another factor. Results that are especially striking or counterintuitive or consistent with experimenters’ pet theories are also more likely to result in publication.

January 2, 2011

Dave Barry’s 2010 review

Filed under: Government, History, Humour, Media, Politics — Nicholas @ 13:53

Who better than Dave Barry to recount to us the manifold miseries we endured and depths of despair we plumbed:

Let’s put things into perspective: 2010 was not the worst year ever. There have been MUCH worse years. For example, toward the end of the Cretaceous Period, Earth was struck by an asteroid that wiped out about 75 percent of all of the species on the planet. Can we honestly say that we had a worse year than those species did? Yes, we can, because they were not exposed to “Jersey Shore.”

So on second thought we see that this was, in fact, the worst year ever. The perfect symbol for the awfulness of 2010 was the BP oil spill, which oozed up from the depths and spread, totally out of control, like some kind of hideous uncontrollable metaphor. (Or “Jersey Shore.”) The scariest thing about the spill was, nobody in charge seemed to know what to do about it. Time and again, top political leaders personally flew down to the Gulf of Mexico to look at the situation firsthand and hold press availabilities. And yet somehow, despite these efforts, the oil continued to leak. This forced us to face the disturbing truth that even top policy thinkers with postgraduate degrees from Harvard University — Harvard University! — could not stop it.

The leak was eventually plugged by non-policy people using machinery of some kind. But by then our faith in our leaders had been shaken, especially because they also seemed to have no idea of what to do about this pesky recession. Congress tried every remedy it knows, ranging all the way from borrowing money from China and spending it on government programs, to borrowing MORE money from China and spending it on government programs. But in the end, all of this stimulus created few actual jobs, and most of those were in the field of tar-ball collecting.

January 1, 2011

QotD: The end of nerd subculture

Filed under: History, Media, Quotations — Nicholas @ 17:14

That was the year the final issue of Watchmen came out, in October. After that, it seemed like everything that was part of my otaku world was out in the open and up for grabs, if only out of context. I wasn’t seeing the hard line between “nerds” and “normals” anymore. It was the last year that a T-shirt or music preference or pastime (Dungeons & Dragons had long since lost its dangerous, Satanic, suicide-inducing street cred) could set you apart from the surface dwellers. Pretty soon, being the only person who was into something didn’t make you outcast; it made you ahead of the curve and someone people were quicker to befriend than shun. Ironically, surface dwellers began repurposing the symbols and phrases and tokens of the erstwhile outcast underground.

Fast-forward to now: Boba Fett’s helmet emblazoned on sleeveless T-shirts worn by gym douches hefting dumbbells. The Glee kids performing the songs from The Rocky Horror Picture Show. And Toad the Wet Sprocket, a band that took its name from a Monty Python riff, joining the permanent soundtrack of a night out at Bennigan’s. Our below-the-topsoil passions have been rudely dug up and displayed in the noonday sun. The Lord of the Rings used to be ours and only ours simply because of the sheer goddamn thickness of the books. Twenty years later, the entire cast and crew would be trooping onstage at the Oscars to collect their statuettes, and replicas of the One Ring would be sold as bling.

The topsoil has been scraped away, forever, in 2010. In fact, it’s been dug up, thrown into the air, and allowed to rain down and coat everyone in a thin gray-brown mist called the Internet. Everyone considers themselves otaku about something — whether it’s the mythology of Lost or the minor intrigues of Top Chef. American Idol inspires — if not in depth, at least in length and passion — the same number of conversations as does The Wire. There are no more hidden thought-palaces — they’re easily accessed websites, or Facebook pages with thousands of fans. And I’m not going to bore you with the step-by-step specifics of how it happened. In the timeline of the upheaval, part of the graph should be interrupted by the words the Internet. And now here we are.

Patton Oswalt, “Wake Up, Geek Culture. Time to Die”, Wired, 2010-12-27
