The gap between Americans raised before World War II and after was huge in a way that’s difficult to recall for those of us who came of age after the ’60s. Greatest Generation parents who might have grown up without on-demand indoor plumbing and survived the Depression and fighting in Europe, the Pacific, North Africa, and Korea came from a different planet than the one on which they raised their kids. To their credit, they bequeathed to the baby boomers a world that was still full of major problems but one that was much richer and full of opportunities. And to their credit, the boomers (of which I’m a very late example, having been born in 1963) readily went about using new opportunities and freedoms (expressive, sexual, educational, economic) to build the world they wanted to live in.
In the late ’60s and a good chunk of the ’70s, youth-oriented pop music was central to that project. Whatever you might think of the Beatles’ music, their very existence — and their constant self-recreations — made everything seem possible. They were far from alone as pop music maguses, too.
Simply by talking with major pop figures, Rolling Stone could be a vital and compelling magazine because it served as something like a boomer conversation pit. Over time, however, music stopped playing the same sort of vital role in generational conversations — don’t get me wrong, it’s still a part of it all. But as the mainstream in every area of life splintered and recombined into a million different subspecies, no single form of cultural expression matters so much to so many people anymore.
That’s a good thing for the culture and the country (and the planet, really), but Rolling Stone has been looking for a replacement core identity for decades now. The magazine that once published New Journalism masterpieces about David Cassidy and stardom, Patty Hearst’s rescuers, and “Charlie Simpson’s Apocalypse” had trouble figuring out how to deal with a world in which pop and movie stars were less interesting than ever (and more disciplined in terms of talking with the press) and in which men and women of good faith might actually disagree over complicated aesthetic and ideological matters. There has been a lot of good writing and reporting over the years, but there’s no question, I think, that the magazine is chasing trends and insights rather than creating them.
In a world in which pop culture — especially youth-oriented pop culture — allows a thousand flowers to bloom in a way that was unimaginable even 40 years ago, Rolling Stone can no longer get by simply by talking with Patti Smith or John Lennon or Bob Dylan for 25,000 words at a time. It might have reinvented itself as a clubhouse where people who love music or movies or whatever could get together to argue over politics, economics, and policy. That could indeed be interesting, especially in a world where large chunks of young Americans are going right, left, and especially libertarian. Just as there is no longer one dominant mode of music, there is no longer one dominant mode of politics.
But the people at the helm of Rolling Stone seemingly cannot even acknowledge that anyone who might disagree with them on, say, the effects of minimum wage laws on the poor, is worth a second thought. All they can do, out of a sense of liberal guilt, is publish radical calls to arms that they must know are ridiculous. Sadly, a magazine that was once required reading for anyone who wanted to know what the younger generation cared about is now a pedantic, insecure, and ultimately ineffective tool of Democratic Party groupthink.
Nick Gillespie, “Rolling Stone’s Sad ‘5 Economic Reforms Millennials Should Be Fighting For’”, Hit and Run, 2014-01-04.
April 2, 2015
Jonathan Kay, formerly of the National Post, is now the editor-in-chief of The Walrus. Here’s the start of his first editorial for the magazine:
“Any slighting reference to Canada is bound to produce a flurry of anguished letters, most of them attached to manuscripts,” Michael Kinsley wrote in The New Republic three decades ago. “On the other hand, so is any favorable reference to Canada, so it would be futile to add at this point that I think it’s a lovely country and we’re darn lucky to have it next door, especially considering the alternatives. Yet Canada is, for all its acknowledged merits, a nation of assistant professors, each armed with articles designed to ‘dispel misunderstanding.’ These literary missiles are aimed at the American media, ready to be fired at the slightest provocation.”
Any Canadian past the age of thirty will recognize the whiny writing that Kinsley aptly skewered: until recently, our relationship with the United States was the great neurotic obsession of our intellectual life. This neurosis didn’t just produce insecurity; it also produced bad writing.
In the domain of foreign policy, especially, virtually every debate — missile defence, Cuba, Afghanistan, Iraq, terrorism, peacekeeping — was brought back to the question of whether we were doing enough to distinguish ourselves from the southern hegemon. To describe our place in the world in a way that made us feel morally superior, we became reliant on a canonical set of clichés — honest broker, human security, global citizenship, soft power. The dreariness of these tropes was unavoidable, because the approved form of argumentation among all those assistant professors was to string old ideas together in new ways.
This attitude is gone — or at least very much on the wane. Whatever you may think about the way Stephen Harper has changed Canada, it is undeniable that we have become a richer, more interesting, and less insecure country than we were just a decade ago. I’ve lost count of the number of international surveys that Canada (and Toronto, its largest city) now tops. Ambitious Canadians in every field have better reasons to stick around than they did even a few years ago.
And all those assistant professors whom Michael Kinsley disparaged have become less whiny: having shed our anxieties about our relationship with the US, Canadian intellectuals now draft their impassioned manifestos in a country that is important and interesting in its own right.
Needless to say, this is good news for The Walrus, a magazine that explores Canada and its place in the world. Never in my lifetime has there been a better time to write — and read — about this wonderful country.
February 10, 2015
Richard Anderson supplies the appropriate level of disdain:
It’s always nice when a big important magazine notices Canada. It’s also a big important British magazine. Even nowadays it’s extra special when mother says we’ve done so very well for ourselves. Did we mention the solarium we’re having installed? The Americans don’t have a solarium. Just thought we’d mention that. We got a great deal with the contractor. Excellent references.
Torontonians are known throughout our fair dominion for two things: having a gigantic tower that is no longer the most gigantic in the world, and being incredibly smug. The original logo for Toronto actually featured a very smug-looking beaver carefully ignoring the rest of Canada. If you paid close attention it was obvious the beaver was looking at New York but in a very nonchalant sort of way.
I hate it when The Economist or the OECD or UN or the OAS or whoever the hell puts out these surveys. Like most rankings the whole thing is a bit of numerical legerdemain. A recurring example of how the easiest way to bullshit your way through life is to use numbers. In what real common sense way is Toronto better than Sydney? Did you talk to someone who has lived in both cities?
Didn’t bloody think so. That would be journalism.
As a native Torontonian I would like to ask the editors of The Economist, those non-byline using smug bastards, why they think Toronto is so wonderful? Yes I know you visited here one summer for a conference. You strolled down Bloor Street and bought something at the Roots Store or Holts. It was so terribly clean and the homeless people were so very polite. Have you lived here? Would you ever in your right mind move from Chelsea to the Annex? Exactly. You’d prefer to be cramped and gouged in London than less cramped and less gouged in Toronto. Why? Because it’s friggin’ London! The potholes are older and more historic than the whole of Toronto.
January 8, 2015
At The Federalist, Robert Tracinski nominates perhaps the most appropriate candidate for media man of the year:
As the year winds to a close, it is traditional to pick a “man of the year,” or in our more enlightened age, a “person of the year.” I’ve never done that before, but this year there is one candidate who has left his mark so indelibly on 2014 that I would be remiss if I did not acknowledge his vast influence.
Thus, my own personal pick for 2014’s Person of the Year: Ben Trovato.
He has been everywhere and had a hand in just about every big story, from Ferguson to the University of Virginia. He has been most active in his usual fields, journalism and politics, but we can see his impact as far afield as espionage and even retail.
You’ve never heard of him? Maybe so, but you already know him very well.
For those who suspect that Ben Trovato is not a real, literal person, you’re right. But the whole point of old Ben’s influence is that it doesn’t matter whether he’s literally real. Or whether anything is literally real, for that matter.
I first heard of Ben Trovato while reading a curious little volume of unusual word origins. A number of these supposed etymologies, most of the really colorful ones, were attributed to “Ben Trovato.” The name is taken from an old Italian saying: se non è vero, è ben trovato. Roughly translated: if it’s not true, it’s a good story. These were the kind of word origins that you really wanted to be true, but for which there was no real evidence. In contemporary parlance, they are “too good to check.”
I think you can begin to see why 2014 has been the year of Ben Trovato. It has been a year full of things that were non vero, but which had really good narratives. Or at least really convenient narratives.
January 7, 2015
Claire Berlinski wasn’t working as a journalist earlier today, but she happened to be right in the area of the terrorist attack on the offices of the French satirical magazine, Charlie Hebdo:
If I sound incoherent, it’s because I am shaken. The reasons will be obvious.
I had no intention of reporting on this from the scene of the Charlie-Hebdo massacre. I was walking up Boulevard Richard Lenoir to meet a friend who lives in the neighborhood. But the moment I saw what I did, I knew for sure what had happened. A decade in Turkey teaches you that. That many ambulances, that many cops, that many journalists, and those kinds of faces can mean only one thing: a massive terrorist attack.
I also knew from the location just who’d been attacked: Charlie-Hebdo, the magazine known for many things, but, above all, for its fearlessness in publishing caricatures of Mohamed. They’d been firebombed for this in 2011, but their response — in effect — was the only one free men would ever consider: “As long as we’re alive, you’ll never shut us up.”
They are no longer alive. They managed to shut them up.
The only thing I didn’t immediately know was how many of them had died.
All of them, it seems, or close enough. So did two police officers who had been assigned to protect their offices. Twelve are dead for sure; I assume that number will rise; seven are seriously injured. At the time I was there, it was unclear how many were wounded.
And the attackers are still at large.
Given that two police officers are dead, now doesn’t seem the time to say what comes to mind about the fact that the assailants escaped. I will say this much, though: if they’re not dead before nightfall, I’ll say exactly what comes to mind, respect for the dead be damned.
This was the Twitter update sent shortly before the attack began:
Meilleurs vœux, au fait. [“Best wishes, by the way.”] pic.twitter.com/a2JOhqJZJM
— Charlie Hebdo (@Charlie_Hebdo_) January 7, 2015
This was the worst terrorist attack in Europe since the London tube bombings of 2005. If I’m correct — I have not checked carefully — it was also the worst in France since the Nazis were running the place.
I was there only by luck: I had no desire to see this. Luck is probably not the right word. I wish I hadn’t seen it. But lucky certainly is the right word to use in noting that I was running late, and thus arrived a few minutes after the fact. Had I not been running late, it’s fairly obvious what might have happened. They weren’t discriminating in their targets.
There wasn’t much for me to do. I didn’t even have a pen on me. I spoke to a cameraman from France 3, to make sure I understood the facts. I didn’t ask if I could quote him, so I won’t use his name. But his comment summed up the sentiment. “This is the kind of thing you expect in Pakistan. And now it’s coming here.”
January 2, 2015
The radicalization of renown is good for America.
In these times of seemingly limited job and business opportunities, celebrity has become a goal attainable by all.
Gaining public attention by performing for the masses once required skills — deft strokes with ochre on the walls of Paleolithic caves, facility with trident and net in the Roman coliseum, recitation of iambic pentameter by the swath from the stage at the Globe.
Talent and practice were needed for popularity from the dawn of time until the debut of America’s Funniest Home Videos in 1990. And even then a contestant had to have steady hands and steely resolve to keep the video rolling while his son pedaled off an improvised plywood ramp trying to leap a row of Tonka toys on his Big Wheel and got whacked in the testicles.
But what does 18-year-old Bethany Mota who still lives at home with her parents (two-page spread, People, pp. 196-7) do? She does “reviews of new makeup, clothes, and other mall finds.” Her YouTube channel has 5.9 million subscribers. She “reportedly makes $40,000 a month.”
There are 10,900,000 teenage girls in America, an estimated 10,899,999 of whom have the same skill set as Bethany. This includes the teenage girl at my house who is presently locked in her bedroom sharing “reviews of new makeup, clothes, and other mall finds” with her 5.9 million Facebook friends. She is about to get pages 196 and 197, torn from People and heavily marked with a highlighter pen, shoved under her door. Bethany Mota, you are a beacon of hope.
December 30, 2014
Charles Stross outlines the reason SF writers pretty much stopped writing short stories en masse in the mid-to-late 1950s:
A typical modern novel is in the range 85,000-140,000 words. But there’s nothing inevitable about this. The shortest work of fiction I ever wrote and sold was seven words long; the longest was 196,000 words. I’ve written plenty of short stories, in the 3000-8000 word range, novelettes (8000-18,000 words), and novellas (20,000-45,000 words). (Anything longer than a novella is a “short novel” and deeply unfashionable these days, at least in adult genre fiction, which seems to be sold by the kilogram.)
Genre science fiction in the US literary tradition has its roots in the era of the pulp magazines, from roughly 1920 to roughly 1955. (The British SF/F field evolved similarly, so I’m going to use the US field as my reference point.) These were the main supply of mass-market fiction to the general public in the days before television, when reading a short story was a viable form of mass entertainment, and consequently there was a relatively fertile market for short fiction up to novella length. In addition, many of these magazines serialized novels: it was as serials that Isaac Asimov’s Foundation and E. E. “Doc” Smith’s The Skylark of Space were originally published, among others.
For a while, during this period, it was possible to earn a living (not a very good living) churning out pulp fiction in short formats. It’s how Robert Heinlein supplemented his navy pension in the 1930s; it’s how many of the later-great authors first gained their audiences. But it was never a good living, and in the 1950s the bottom fell out of the pulp market — the distribution channel itself largely dried up and blew away, a victim of structural inefficiencies and competition from other entertainment media. The number of SF titles on sale crashed, and the number of copies each sold also crashed. Luckily for the writers a new medium was emerging: the mass market paperback, distributed via the same wholesale channel as the pulp magazines and sold through supermarkets and drugstore wire-racks. These paperbacks were typically short by modern standards: in some cases they provided a market for novellas (25,000 words and up — Ace Doubles consisted of two novellas, printed and bound back-to-back and upside-down relative to one another, making a single book).
The market for short fiction gradually recovered somewhat. In addition to the surviving SF magazines (now repackaged as digest-format paperback monthlies) anthologies emerged as a market. But after 1955 it was never again truly possible to earn a living writing short stories (although this may be changing thanks to the e-publishing format shift — it’s increasingly possible to publish stand-alone shorter works, or to start up a curatorial e-periodical or “web magazine” as the hip young folks call them). And the readership profile of the remaining magazines slowly began to creep upwards, as new readers discovered SF via the paperback book rather than the pulp magazine. With this upward trending demographic profile, the SF magazines entered a protracted, generational spiral of dwindling sales: today they still exist, but nobody would call a US newsstand magazine with monthly sales of 10,000-15,000 copies a success story.
A side-effect of dwindling sales is that the fixed overheads of running a magazine (the editor’s pay check) remain the same, but there’s less money to go around. Consequently, pay rates for short fiction stagnated from the late 1950s onwards. 2 cents/word was a decent wage in 1955 — it was $20 for a thousand words, so $80-500 for a short story or novelette. But the monthly magazines were still paying 5 cents/word in the late 1990s! This was pin money. It was a symbolic reward. It would cover your postage and office supplies bill — if you were frugal.
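To put Stross’s “pin money” point in perspective, here’s a back-of-the-envelope sketch in Python (mine, not his — the CPI figures are approximations I’m assuming, not anything from his post). The arithmetic shows that a nominal rise from 2 cents/word to 5 cents/word over four decades was actually a steep pay cut in real terms.

```python
# Rough check: was 5 cents/word in the late 1990s a real raise over
# 2 cents/word in 1955? Approximate US CPI values (my assumption,
# not from Stross's post): 1955 ~ 26.8, 1998 ~ 163.0.
CPI_1955, CPI_1998 = 26.8, 163.0

rate_1955 = 0.02  # dollars/word, a decent pulp rate in 1955
rate_1998 = 0.05  # dollars/word, typical magazine rate in the late 1990s

rate_1955_adjusted = rate_1955 * (CPI_1998 / CPI_1955)  # in 1998 dollars
print(f"2 cents/word in 1955 is about {rate_1955_adjusted * 100:.1f} "
      f"cents/word in 1998 dollars")
print(f"So 5 cents/word in 1998 was roughly a "
      f"{1 - rate_1998 / rate_1955_adjusted:.0%} cut in real terms")
```

On those assumed figures, the late-1990s rate works out to less than half the 1955 rate in constant dollars — hence “a symbolic reward”.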
December 22, 2014
Celebrity gossip is psychologically healthy.
It provides an outlet, a useful sublimation, of our self-destructive subconscious compulsion to lean over the back fence and cluck (or tweet) about the godawful things our relatives, friends, and neighbors do.
Celebrities are not our family. Although there are so many celebrities that we are probably related to some. But they’re not the niece looking daggers at us across the Thanksgiving turkey because of what we said to Uncle Bill about her hookup with that McDermott idiot. They’re not the daughter locked in her bedroom running up our Visa card bill with online shopping for new makeup, clothes, and other mall finds.
Celebrities are not our friends. They don’t borrow our money or power tools. They don’t forget it’s their turn to carpool the kids to junior high. They don’t come over when we’re busy watching The View and litter the kitchen table with used Kleenex, pouring their hearts out about their (remarkably frequent) divorces. They don’t get caught — unless Dean McDermott is late to the set for his televised therapy session on True Tori — necking with our spouses in the coat closet at our cocktail parties.
December 14, 2014
Let’s just say that there’s not a lot of profit for a monthly magazine with single-issue newsstand sales as low as this:
From a business standpoint, The New Republic was undoubtedly facing an uphill battle for profitability, even before last week’s events. According to the Pew Research Center and the Alliance for Audited Media, single copy sales of the magazine (considered the most objective measure of a magazine’s print appeal) have steadily declined over the past year, dropping to around 1,900 per issue.
They note that, between the first and second halves of 2013, newsstand sales fell by 57%, and fell a further 20% in the first half of 2014.
One thousand, nine hundred readers. Per month.
Let me just give you a bit of perspective here:
At shortly after 10 a.m. on a quiet Sunday, I’ve already had more visitors to my obscure little personal blog today than there were copies of The New Republic sold in a recent month (not counting subscriptions).
That is not a viable business.
H/T to Kathy Shaidle, who also gets more daily traffic than TNR sells in a month.
December 8, 2014
The two Reason stalwarts did an “Ask Me Anything” session at Reddit last week:
We’re Matt Welch (/u/MattWelchReason) and Nick Gillespie (/u/Nick_Gillespie), the editors of Reason magazine, Reason.com and Reason TV and co-authors of 2011’s The Declaration of Independents: How Libertarian Politics Can Fix What’s Wrong With America.
Matt’s also the co-host of The Independents on Fox Business Network and Nick is a columnist for The Daily Beast and Time.com.
Go ahead and ask us anything about politics, culture, and ideas and the libertarian movement, 2016, you name it. But we’ve got to warn you that quite probably the toughest question — “Ever wonder what it’d look like if you switched faces?” — has already been asked and answered #triggerwarning
Proof: Matt and Nick
December 1, 2014
Nicholas Frankovich on how at least some liberals view their conservative foes:
In the liberal imagination, the conservative plays many parts, all of them villainous, the most flamboyant being that of the crank who combines political activism with mental instability: a dangerous combination. Earlier this week Ian Tuttle documented a few random but typical reports from those who have recently sighted this menacing character. I especially liked Ian’s excerpt from a column by Charles Blow, who sees “the fear that makes the face flush when people stare into a future in which traditional power — their power — is eroded.”
Blow means status anxiety. The idea is that conservatives are either downwardly mobile or fearful of becoming so. Conservatism is reduced to the image of people blustering and raging as they tumble down the social ladder, either in fact or in their fevered delusions. The term “status anxiety” has fallen out of fashion, but obviously the concept has not. As an explanation for conservatism and for anti-Communism particularly, it came into vogue in the mid 20th century, popularized by the sociologists Daniel Bell and Seymour Martin Lipset but especially by the Columbia historian Richard Hofstadter, who in the run-up to the 1964 presidential election published “The Paranoid Style in American Politics” (Harper’s, November 1964), the classic essay on conservatism as mental illness.
Hofstadter began with a reference to the “angry minds at work mainly among extreme right-wingers, who have now demonstrated in the Goldwater movement how much political leverage can be got out of the animosities and passions of a small minority.” This was less a news hook for a groundbreaking psychoanalysis of American history than the psychoanalysis of American history was a context in which Hofstadter could situate Barry Goldwater and his supporters.
Meanwhile, “The Unconscious of a Conservative: A Special Issue on the Mind of Barry Goldwater” appeared as the October–November issue of the newly founded (and short-lived, as it would turn out) Fact magazine. “1,189 psychiatrists say Goldwater is psychologically unfit to be president!” the cover read. (The American Psychiatric Association later established the “Goldwater rule”: “It is unethical for a psychiatrist to offer [to media] a professional opinion [of a public figure’s mental health] unless he or she has conducted an examination and has been granted proper authorization for such a statement.”)
September 30, 2014
Nick Gillespie responds to a really dumb argument against libertarianism:
As one of the folks (along with Matt Welch, natch) who started the whole “Libertarian Moment” meme way back in 2008, it’s been interesting to see all the ways in which folks on the right and left get into such a lather at the very notion of expanding freedom and choice in many (though sadly not all) aspects of human activity.
Indeed, the brain freeze can get so intense that it turns occasionally smart people into mental defectives.
To wit, Damon Linker’s recent essay in The Week (a great magazine, by the way), which argues that the outcomes of U.S. military intervention in Iraq and Libya disprove libertarianism, in particular, the Hayekian principle of “spontaneous order.”
No shit. Linker is being super-cereal here, kids:
Now it just so happens that within the past decade or so the United States has, in effect, run two experiments — one in Iraq, the other in Libya — to test whether the theory of spontaneous order works out as the libertarian tradition would predict.
In both cases, spontaneity brought the opposite of order. It produced anarchy and civil war, mass death and human suffering.
You got that? An archetypal effort in what Hayek would call “constructivism,” neocon hawks would call “nation building,” and what virtually all libertarians (well, me anyways) called a “non sequitur” in the war on terror that was doomed to failure from the moment of conception is proof positive that libertarianism is, in Linker’s eyes, “a particularly bad idea” whose “pernicious consequences” are plain to see.
In the sort of junior-high-school rhetorical move to which desperate debaters cling, Linker even plays a variation on the reductio ad Hitlerum in building his case:
Some bad ideas inspire world-historical acts of evil. “The Jews are subhuman parasites that deserve to be exterminated” may be the worst idea ever conceived. Compared with such a grotesquely awful idea, other bad ideas may appear trivial. But that doesn’t mean we should ignore them and their pernicious consequences.
Into this category I would place the extraordinarily influential libertarian idea of “spontaneous order.”
What nuance: Exterminating Jews may be the worst idea…! When a person travels down such a rhetorical path, it’s best to back away quickly, with a wave of the hand and best wishes for the rest of his journey. Who can seriously engage somebody who starts a discussion by saying, “You’re not as bad as the Nazis, I’ll grant you that”…? I’d love to read his review of the recent Teenage Mutant Ninja Turtles movie: “Not as bad as Triumph of the Will, but still a bad film…”
September 8, 2014
When Mao died, The Economist wrote:
“In the final reckoning, Mao must be accepted as one of history’s great achievers: for devising a peasant-centered revolutionary strategy which enabled China’s Communist Party to seize power, against Marx’s prescriptions, from bases in the countryside; for directing the transformation of China from a feudal society, wracked by war and bled by corruption, into a unified, egalitarian state where nobody starves; and for reviving national pride and confidence so that China could, in Mao’s words, ‘stand up’ among the great powers.” (emphasis mine)
The current estimate is that, during the Great Leap Forward, between thirty and forty million Chinese peasants starved to death. Critics questioning that figure have suggested that the number might have been as low as two and a half million.
I am curious — has the Economist ever published an explicit apology or an explanation of how they got the facts so completely backwards, crediting the man responsible for what was probably the worst famine in history with creating a state “where nobody starves?” Is it known who wrote that passage, and has anyone ever asked him how he could have gotten the facts so terribly wrong?
David D. Friedman, “A Small Mistake”, Ideas, 2014-09-07.
July 13, 2014
From this week’s Goldberg File email from Jonah Goldberg:
I think I’ve stumbled onto a handy heuristic — or, if that word makes you want to smash my guitar on the Delta House wall, rule of thumb — for listening to Obama. Whenever he talks about himself, immediately flip it around so he’s saying the opposite. Think about it. “I’m not interested in photo-ops.” Boom. Translation: “I think photo-ops are really, really important. And that’s why I’m not going to have my picture taken with a bunch of kids at the border.”
Now, sometimes, a literal reversal of meaning doesn’t work. But the key is to look at any statement he offers about others as an insight into his own mental state.
When Obama denounces cynicism, he’s actually being cynical. What he’s doing is expressing his frustration with people who are justifiably cynical about him. Why can’t you people fall for what I am saying!?
When he says he doesn’t care about “politics,” just problem-solving, what he’s really saying is he wants his political agenda to go unchallenged by other political agendas.
[…] whenever he says ideology and ideologues are a problem, what he’s actually saying is that competing ideologues and ideologies are the problem. That is, unless, you’re the sort of person who actually thinks Obama isn’t an ideologue, which is just adorable.
It’s not so much that he’s lying. Though if he were a Game of Thrones character, “Obama the Deceiver, First of His Name” would be a pretty apt formal title. No, he’s projecting. It’s an ego thing. I am fond of pointing out Obama’s insufficiently famous confession, “I actually believe my own bullsh*t.” What I like about it is that it’s like a verbal Escher drawing. He believes his own b.s. but by calling it b.s. he acknowledges it’s not believable. It’s like sarcastically insisting that you’re being serious. It’s earnest irony or ironic earnestness. If you take the statement too seriously, you could end up like android #1 in “I, Mudd.”
Anyway, I don’t take psychoanalysis too seriously (“If you did, what would happen to me?” — The Couch). But I think Obama’s penchant for deriding his opponents as cynics and opportunists stems from the fact that he sees the world through precisely those sorts of prisms. But he tells himself he’s different because he does it for good purposes and besides, he’s so awesome his b.s. is true. No one knows if God can make a rock so heavy He can’t lift it, but Obama can sling such exquisite b.s. even he can believe. And because he believes it, he can’t tolerate the idea that others don’t.
Every President’s public image fades as his term of office runs down. It’s like the law of gravity … yet most of the media are still in love with the glamour of early-term Obama and keep hoping that somehow everyone else will believe hard enough with them that it will come back.
July 3, 2014
We’ve all seen many examples of health news stories where the headline promised much more than the article delivered: this is why stories have headlines in the first place — to get you to read the rest of the article. This sometimes means the headline writer (except on blogs, the headline is usually written by someone other than the story’s author), knowing less of what went into writing the story, grabs a few key statements to come up with an appealing (or appalling) headline.
This is especially true with science and health reporting, where the writer may not be as fully informed on the subject and the headline writer almost certainly doesn’t have a scientific background. The correct way to read any kind of health report in the mainstream media is to read skeptically — and knowing a few things about how scientific research is (or should be) conducted will help you to determine whether a reported finding is worth paying attention to:
Does the article support its claims with scientific research?
Your first concern should be the research behind the news article. If an article touts a treatment or some aspect of your lifestyle that is supposed to prevent or cause a disease, but doesn’t give any information about the scientific research behind it, then treat it with a lot of caution. The same applies to research that has yet to be published.
Is the article based on a conference abstract?
Another area for caution is if the news article is based on a conference abstract. Research presented at conferences is often at a preliminary stage and usually hasn’t been scrutinised by experts in the field. Also, conference abstracts rarely provide full details about methods, making it difficult to judge how well the research was conducted. For these reasons, articles based on conference abstracts should be treated with extra caution. Don’t panic or rush off to your GP.
Was the research in humans?
Quite often, the ‘miracle cure’ in the headline turns out to have only been tested on cells in the laboratory or on animals. These stories are regularly accompanied by pictures of humans, which creates the illusion that the miracle cure came from human studies. Studies in cells and animals are crucial first steps and should not be undervalued. However, many drugs that show promising results in cells in laboratories don’t work in animals, and many drugs that show promising results in animals don’t work in humans. If you read a headline about a drug or food ‘curing’ rats, there is a chance it might cure humans in the future, but unfortunately a larger chance that it won’t. So there is no need to start eating large amounts of the ‘wonder food’ featured in the article.
How many people did the research study include?
In general, the larger a study the more you can trust its results. Small studies may miss important differences because they lack statistical “power”, and are also more susceptible to finding things (including things that are wrong) purely by chance.
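To make the small-study point concrete, here’s a minimal Python simulation (my illustration, not part of the quoted guide — the sample sizes and the half-standard-deviation “large effect” threshold are arbitrary choices). Both groups are drawn from the same distribution, so the true effect is exactly zero; the question is how often a study of a given size reports a big difference anyway, purely by chance.

```python
import random
import statistics

random.seed(42)

def simulated_study(n):
    """One null study: 'treatment' and 'control' groups of size n drawn
    from the SAME distribution, so the true effect is exactly zero.
    Returns the observed difference in group means."""
    treatment = [random.gauss(0, 1) for _ in range(n)]
    control = [random.gauss(0, 1) for _ in range(n)]
    return statistics.mean(treatment) - statistics.mean(control)

def spurious_rate(n, trials=2000, threshold=0.5):
    """Fraction of null studies of size n that show an apparent effect
    of at least half a standard deviation -- despite none existing."""
    hits = sum(1 for _ in range(trials)
               if abs(simulated_study(n)) >= threshold)
    return hits / trials

for n in (10, 50, 500):
    print(f"n={n:3d} per group: {spurious_rate(n):5.1%} of null studies "
          "report a 'large' effect by chance")
```

On a typical run, the ten-per-group studies cross the threshold roughly a quarter of the time, while the 500-per-group studies essentially never do — that, loosely, is what statistical “power” buys you.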
Did the study have a control group?
There are many different types of studies appropriate for answering different types of questions. If the question being asked is about whether a treatment or exposure has an effect or not, then the study needs to have a control group. A control group allows the researchers to compare what happens to people who have the treatment/exposure with what happens to people who don’t. If the study doesn’t have a control group, then it’s difficult to attribute results to the treatment or exposure with any level of certainty.
Also, it’s important that the control group is as similar to the treated/exposed group as possible. The best way to achieve this is to randomly assign some people to be in the treated/exposed group and some people to be in the control group. This is what happens in a randomised controlled trial (RCT) and is why RCTs are considered the ‘gold standard’ for testing the effects of treatments and exposures. So when reading about a drug, food or treatment that is supposed to have an effect, you want to look for evidence of a control group and, ideally, evidence that the study was an RCT. Without either, retain some healthy scepticism.
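And to illustrate why the control group matters, here’s a companion sketch (again mine, with invented numbers): the simulated patients improve somewhat on their own, and the simulated drug does nothing at all. An uncontrolled before-and-after reading credits the drug with that natural recovery; the randomised control group cancels it out.

```python
import random
import statistics

random.seed(1)

# Invented cohort: each patient would improve by ~2 units (arbitrary
# symptom scale) over the study period with no treatment at all.
patients = [random.gauss(2.0, 1.0) for _ in range(200)]

random.shuffle(patients)            # random assignment: the heart of an RCT
treated, control = patients[:100], patients[100:]

TRUE_TREATMENT_EFFECT = 0.0         # the hypothetical drug is a dud

treated_outcomes = [p + TRUE_TREATMENT_EFFECT + random.gauss(0, 0.5)
                    for p in treated]
control_outcomes = [p + random.gauss(0, 0.5) for p in control]

# Uncontrolled reading -- looks like the drug "worked":
print(f"treated patients improved by "
      f"{statistics.mean(treated_outcomes):.2f} units on average")

# Controlled reading -- the background recovery cancels out:
print(f"control patients improved by "
      f"{statistics.mean(control_outcomes):.2f} units on average")
print(f"estimated treatment effect: "
      f"{statistics.mean(treated_outcomes) - statistics.mean(control_outcomes):+.2f} units")
```

The first print line, taken alone, reads like a success story; the comparison against randomised controls shows the estimated effect is approximately zero.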