• The psychosocial causes and consequences of online video game play were evaluated.
• Over 1- and 2-year periods, evidence for social compensation processes was found.
• Among young adults, online games appear to be socially compensating spaces.
• No significant displacement or compensation patterns were found for adolescents.
• No significant displacement or compensation patterns were found for older adults.
Due to the worldwide popularity of online video gaming, researchers have grown concerned about whether engagement within these environments poses a threat to public health. Previous research has uncovered inverse relationships between frequency of play and a range of psychosocial outcomes; however, a reliance on cross-sectional research designs and opportunity sampling of only the most involved players has limited the broader understanding of these relationships. Enlisting a large representative sample and a longitudinal design, the current study examined these relationships and the mechanisms that underlie them to determine whether poorer psychosocial outcomes are a cause (i.e., pre-existing psychosocial difficulties motivate play) or a consequence (i.e., poorer outcomes are driven by use) of online video game engagement. The results dispute previous claims that online game play has negative effects on the psychosocial well-being of its users and instead indicate that individuals play online games to compensate for pre-existing social difficulties.
December 22, 2014
December 15, 2014
At BoingBoing, Jason Louv talks about getting back into his teenage passion (Dungeons and Dragons), but also worries that as a culture, we’re losing our opportunities — and capability — to imagine:
There’s just something about high Arthurian or Tolkienesque fantasy that cuts so deeply into the Western unconscious, finding a far more central vein than anything that Lovecraft or Edgar Rice Burroughs or Jack Kirby were able to mine. Nothing beats the experience of the Grail Quest, of becoming a heroic adventurer in a medieval world full of fantastic creatures, on a mission to slay the dragon and liberate the princess — or at least get some decent gold, treasure and experience points.
Until I left for college, fantasy paperbacks and comics were my world when I was alone, and role-playing games were my world when I was with friends. And how much more real, in a way, the inner palaces of my adolescent imagination felt to me than the gritty “reality” of so-called adult life, of endless war, losing friends to drugs, economic chaos, tumultuous relationships, chasing dollars.
Am I so wrong to want to go back to the Garden?
The Interior Castle
While our culture dismisses any use of the imagination as wasted time — something that distracts us from the “real” world of quantification and monetization — mystics and artists throughout history have told us that the imagination is the vehicle which brings us into contact with reality, not away from it.
William Blake is an exemplar of this approach — “The world of imagination is the world of eternity,” he wrote. “It is the divine bosom into which we shall all go after the death of the vegetated body. This world of imagination is infinite and eternal, whereas the world of generation is finite and temporal.”
In 1577, the Spanish Carmelite nun Teresa of Ávila wrote a prayer manual called The Interior Castle, which describes her path to union with God as a kind of epic single-player Dungeons and Dragons game. In it, she describes a vision she received of the soul as a castle-shaped crystal globe, containing seven mansions. These mansions — representing seven stages of deepening faith — were to be traversed through internal prayer. Throughout the book, she warns that this imaginary internal world will be consistently assaulted by reptilian specters, “toads, vipers and other venomous creatures,” representing the impurities of the soul to be vanquished by the spiritual pilgrim.
Sixty-five years earlier, St. Ignatius of Loyola designed his Spiritual Exercises as the training manual of the Jesuits, in which adherents were to deeply imagine themselves partaking in incidents from the life of Christ, creating inward virtual realities built up over years as a way of coming closer to God. Similar techniques exist in many world religions — in the stark inner visualizations of Tantric Buddhism, for instance. Such mystics speak not just of the vital importance of daydreaming and fantasy, but of the disciplined imagination as literally the door to divinity.
As we progress into the 21st century, this is a door that we are slowly losing the key to. The French Situationist author Annie Le Brun, in her 2008 book The Reality Overload: The Modern World’s Assault on the Imaginal Realm, suggests that information technology is causing blight and desertification in the world of the imagination just as surely as pollution and global warming are causing blight and desertification in the physical world. We are gaining the ability to communicate and hoard information, but losing the ability to imagine.
I literally cannot get my head around what it must be like to be a child or teenager now, raised in a completely digitized world — where fantasy and long reverie have given way to the instant gratification of electronic media. There can be no innocence or imagination or wonderment in the world of Reddit, Pornhub and 4Chan — just blank, numb, drooling fixation on a screen flickering with horrors in a dark and lonely room, the hell of isolation within one’s own id. I recently saw a blog post about a toilet training apparatus with an attachment for an iPad. No, no, no.
Just as electronic media is stripping us of our right to privacy, so is it stripping us of our right to an inner world. Everything is to be put on public display, even our most intimate moments and thoughts.
We need to go back. We need to re-discover the door to the inner worlds — a door that I believe encouraging young people to read printed books, and to play analog role-playing games like Dungeons and Dragons, can re-open.
December 13, 2014
It’s ridiculous to claim that smoking marijuana is a healthy habit. It does increase the risk of certain kinds of cancers; the numbers are not huge, but they’re also not zero. Jacob Sullum says “Marijuana Kills! But Not Very Often. Especially When Compared to Alcohol and Tobacco.”
In a new Heritage Foundation video, anti-pot activist Kevin Sabet bravely tackles “the myth that marijuana doesn’t kill.” Although cannabis consumers (unlike drinkers) do not die from acute overdoses, he says, “marijuana does kill people” through suicide, chronic obstructive pulmonary disease, car crashes, and other accidents.
I won’t say Sabet is attacking a straw man, since overenthusiastic cannabis fans have been known to say that “marijuana doesn’t kill anyone” (although the top Google result for that phrase is an article by Sabet explaining why that’s not true). But I will say that Sabet manages to obscure the fact that marijuana does not kill people very often, especially compared to the death tolls from legal drugs such as tobacco and alcohol, which is the relevant point in evaluating the scientific basis for pot prohibition. Let’s take a closer look at the four ways that marijuana kills, according to Sabet:
Suicide. Some research does find a correlation between suicide and marijuana use, but that does not mean the relationship is causal. A longitudinal study published by The British Journal of Psychiatry in 2009 reached this conclusion:
Although there was a strong association between cannabis use and suicide, this was explained by markers of psychological and behavioural problems. These results suggest that cannabis use is unlikely to have a strong effect on risk of completed suicide, either directly or as a consequence of mental health problems secondary to its use.
Furthermore, there is some evidence that letting patients use marijuana for symptom relief reduces the risk of suicide. Still, if reefer has ever driven anyone to kill himself, that would be enough to prove Sabet’s point. You can’t say it has never happened!
December 6, 2014
In an interview with Jenny Vrentas, former Viking great Fran Tarkenton discusses this year’s crop of rookie quarterbacks (including the Vikings’ Teddy Bridgewater), the NFL’s ongoing disciplinary issues with Ray Rice and Adrian Peterson, the long-term issues with NFL doctors dispensing painkillers, and the advent of performance-enhancing drugs. On the issue of league discipline, he believes the league should not allow Rice or Peterson to play again:
VRENTAS: Are you saying the Vikings should move on from Peterson because of his age, or because of the child abuse case that led to his suspension?
TARKENTON: I followed the Clippers thing. That owner [Donald Sterling] didn’t get indicted for any crime, but the racial comments he made were totally inappropriate, and we took a stand. The whole world and the NBA, we have zero tolerance to racism. And I think that’s right. I agree with that. But I also think we ought to have zero tolerance to child abuse and domestic violence. I don’t think [Peterson] should play again in the NFL. I don’t think Ray Rice should play again. Either we have zero tolerance, or we don’t. And what is more egregious than domestic violence and child abuse? I don’t know of anything, unless you kill somebody.
VRENTAS: Peterson has not played since the child-abuse charges first surfaced in September, and now he’s been suspended for the rest of the season, pending appeal. Do you think the response shows that teams and the league are starting to take these issues more seriously?
TARKENTON: Kind of. They have been a little bit wishy-washy. [The Vikings] were going to play Adrian Peterson [before reversing course in September]. Other teams were going to play other players [involved in cases of domestic violence]. And the NFL was going to give just a two-game suspension to Ray Rice. I don’t think we’ve gotten beyond “win at any cost” yet. And I think we need to get there. We should have zero tolerance to racism. We don’t believe that, right? Is that more important than zero tolerance to domestic abuse and child abuse? Unless we as a society think that way, then we won’t make progress. And the whole domestic violence thing, that has been tolerated universally, but certainly in the NFL. We can’t tolerate that. All these behaviors that are so egregious continue. We need to set an example.
And on the topic of team doctors and the use of drugs to get players back into games (despite potentially serious long-term health implications):
VRENTAS: You wrote a letter to the New York Times regarding painkiller abuse, in response to the DEA’s recent spot checks of NFL team medical staffs. This has been a subject you have been vocal about. What was your experience with painkiller use during your playing career?
TARKENTON: This has been going on forever. I was playing for the New York Giants, and I hurt my shoulder in a game against the Pittsburgh Steelers. I came in at halftime, and the doctor had a great big long needle, punched a few different places, and told me, “Show me where it hurts the worst.” I said, “Ow,” and he jammed a combination of xylocaine and cortisone into my shoulder. That’s not good for my shoulder, but he’s my team doctor. I don’t think he’s going to do something that hurts my career, right? He’s like my family doctor. If my family doctor tells me to take a pill, I’ll take a pill. So every Friday, I went on the subway from old Yankee Stadium, where we practiced, all the way down to lower Manhattan to St. Vincent’s Hospital, and they did the same thing they did at halftime. They shot my shoulder. It didn’t really help me, but it allowed me to play. Now, when I come back to Minnesota, my shoulder is worse. The year we played the Pittsburgh Steelers in the Super Bowl in New Orleans, my shoulder was already deteriorating, and I hurt it early in the season in Dallas. The rest of the year I could not throw a ball in practice; I could not throw a ball in warm-ups over 10 yards. When I got in the game, I could throw it maybe 40 yards, because my adrenaline was up, but there was nothing on it. But every Friday, guess what they shot me with? Butazolidin. That’s what they shot horses with. Shot me up every Friday, all the way to the Super Bowl. I retired at age 39, and I see my doctors down here [in Atlanta] because my shoulder is killing me. They say, “You’ve got the shoulder of a 75-year old man. You need your shoulder replaced.” I talked to a lot of the old guys — Roger Staubach, Otto Graham, Sammy Baugh, Johnny Unitas, Y.A. Tittle — and none of them had shoulders replaced. I had my shoulder replaced, because they shot me up. Where was the conscience back then? 
People say, “You knew what they were doing.” I knew what they were doing, but I didn’t think they would hurt me. I didn’t think my shoulder was going to fall apart.
December 5, 2014
Michael White says we need to follow up our success in reading our own genetic code by decoding a different one:
There are thousands of mutations that occur in the breast cancer-linked genes BRCA1 and BRCA2. Some of these cause breast or ovarian cancer, while others are harmless. When we design a genetic test for predisposition to breast cancer, we have to know which ones to test for. The same is true of almost any gene that plays a role in disease — you’ll find many mutations in that gene in the general population, only some of which cause health problems. So how do we know which mutations to worry about?
We start by using the genetic code. The genetic code, cracked by scientists in the 1960s, makes it surprisingly easy to “read” our DNA and understand how a particular mutation affects a gene. As genetic testing takes on a bigger role in predicting, diagnosing, and treating disease, we rely on this code to help us make sense of the data. Unfortunately, the genetic code applies to less than two percent of our DNA. In an effort to read the rest, researchers are trying to crack a new genetic code — and this next one is turning out to be much more difficult to solve than the first. In fact, scientists may have to give up the idea that we can use a “code” to “read” the rest of our DNA.
When scientists were working out the original genetic code in the 1950s and ’60s, all sorts of complicated schemes were proposed to explain how information is stored in our genes. The problem they were trying to solve was how a gene, made of DNA, codes the information to make a particular protein — an enzyme, a pump, a piece of cellular scaffolding, or some other critical component of the cell’s working machinery. They were looking for a code that would translate the four-letter DNA alphabet of genes into the 20-letter amino acid alphabet of proteins.
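(An aside, not from White’s article: the four-letter-to-twenty-letter translation he describes is, mechanically, just a lookup table read three letters at a time. A minimal Python sketch — only a handful of the 64 codons are included here, for illustration:)

```python
# A toy subset of the genetic code: DNA triplets (codons) -> amino acids.
# The real table has 64 entries; these few are enough to show the mechanics.
CODON_TABLE = {
    "ATG": "Met",                  # methionine, also the "start" codon
    "TGG": "Trp",                  # tryptophan
    "GAA": "Glu", "GAG": "Glu",    # redundancy: two codons, one amino acid
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",
}

def translate(dna: str) -> list[str]:
    """Read a coding sequence three letters at a time, halting at a stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        aa = CODON_TABLE.get(dna[i:i + 3], "?")  # '?' marks codons outside this toy table
        if aa == "STOP":
            break
        protein.append(aa)
    return protein

print(translate("ATGGAGTGGTAA"))  # ['Met', 'Glu', 'Trp']
```

The table’s redundancy (GAA and GAG both giving glutamate) is also why reading mutations is tractable: some substitutions change the protein, others are “silent,” and the table tells you which is which — a simplicity that, as White notes, the non-coding genome does not share.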
Thanks to its simplicity, the genetic code is a powerful tool in our hunt for mutations that cause disease. Unfortunately, it has also led to the genetic equivalent of a drunk looking for his lost keys under the lamppost. Researchers have put much of their effort into looking for disease mutations in those parts of our genomes that we can read with the genetic code — that is, parts that consist of canonical genes that code for proteins. But these genes make up less than two percent of our DNA; much more of our genetic function is outside of genes in the relatively uncharted “non-coding” portions. We have no idea how many disease-causing mutations are in that non-coding portion — for some types of mutations, it could be as high as 90 percent.
December 4, 2014
Before I forget it, I must record two valuable health hints that I learned from Xenophon. He used to say: “The man is a fool who puts good manners before health. If you are troubled with wind, never hold it in. It does great injury to the stomach. I knew a man who once nearly killed himself by holding in his wind. If for some reason or other you cannot conveniently leave the room — say, you are sacrificing or addressing the Senate — don’t be afraid to belch or break wind downwards where you stand. Better that the company should suffer some slight inconvenience than that you should permanently injure yourself. And again, when you suffer from a cold, don’t constantly blow your nose. That only increases the flow of rheum and inflames the delicate membranes of your nose. Let it run. Wipe, don’t blow.” I have always taken Xenophon’s advice, at least about nose-blowing: my colds don’t last nearly so long now as they did. Of course, caricaturists and satirists soon made fun of me as having a permanently dripping nose, but what did I care for that? Messalina told me that she thought I was extremely sensible to take such care of myself: if I were suddenly to die or fall seriously ill, what would become of the City and Empire, not to mention herself and our little boy?
Robert Graves, Claudius the God, 1935.
December 2, 2014
On his blog, Charles Stross talks about the mundane irritations and accumulated friction of a life lived past age 50 or so:
Beyond the obvious (gross physiological deterioration and pathologies of senescence), what are the psychological symptoms of ageing?
I tend to be somewhat impatient or short-tempered these days. Examples: getting worked up about people obstructing a sidewalk in front of me, or carelessly blowing smoke over their shoulder and into my face, walking while texting … you know the drill. This I put down largely to the chronic low-grade pain of the middle-aged body: joints that creak and pop, muscles that need an extra stretch, sore feet. […]
My memory, as previously noted, is a sieve. Partly I find myself living in a cluttered cognitive realm: I have so much context to apply to any new piece of incoming data. If middle-aged people seem slow at times it may not be because they’re stupid (although stupidity is a non-ageist affliction) but because they’re processing a lot more data than a young mind has on hand to digest. That shop window display? You’re not just looking at this season’s clothing fashions, but integrating changes in fashion across multiple decades and recognizing when this stuff was last new. (And if fashion is your thing, you’re trying to remember how far back in the wardrobe you hung it last time you wore it, all those years ago.) A side-effect of this: when experiencing something familiar through long repetition you forget it — you don’t remember it as a new experience but merely as an instance of a familiar one and (eventually) as nothing at all. (For those of you with a workday routine, this can cut in quite early: how well do you remember your last commute to work? If you do remember it, do you remember it only because it was exceptional—a truck nearly t-boning you, for example?)
An intersecting effect of the aches and pains and the difficulty retrieving information is that you have to focus hard on tasks — it’s hard to execute a day with six or seven distinct non-routine activities in it, because that requires planning and planning requires lots of that difficult mental integration. Planning is exhausting. Instead you focus on maintaining routines (get up, brush teeth, take meds, shave, use toilet, make coffee … check. Go to gym: check. Eat lunch: check. Work at desk: check …) and scheduling one or two exceptional tasks. Mental checklists help a lot, but you run into the sieve-shaped memory problem again: this is where digital prostheses (or an overflowing filofax) come in handy.
Your perspective on current events changes. Take the news media. Everything new is old after a time: you see the large-scale similarities across decades even without becoming a student of history. Today’s invasion or oil crisis is just like the one before last. Our current political leadership are stuck in the same ideological monkey’s-paw trap as their predecessors the last time their party was in power. And so on. So you tend to discount current events and lose interest in the news until something new happens. (If you’re wondering why I’m obsessively interested in the Scottish independence thing this year, it’s because it’s a disruptive event: nothing like it has happened in UK politics for a very long time indeed. It’s fresh.)
December 1, 2014
In Reason, Baylen Linnekin looks at the FDA’s soon-to-be-implemented rules on menu labelling:
Earlier this week, the FDA released rules that will force food sellers around the country to provide point-of-sale calorie information to consumers. The rules cover chain restaurants, vending machines, “movie theaters, sports stadiums, amusement parks, bowling alleys and miniature golf courses that serve prepared foods.” The rules apply to foods and beverages — including beer, wine, and spirits — sold at these places.
Farley’s enthusiasm might have been tempered by research showing mandatory menu-labeling doesn’t work — and may even be counterproductive.
Because the new rules will cost more than a billion dollars while doing little or nothing to curb the obesity epidemic, some who have to spend that money aren’t pleased.
For example, that potato salad you buy at your grocery deli counter will fall under the new rules. That doesn’t sit well with grocery store owners.
“Grocery stores are not chain restaurants, which is why Congress did not initially include them in the law,” said National Grocers Association president and CEO, Peter J. Larkin in a statement. “We are disappointed that the FDA’s final rules will capture grocery stores, and impose such a large and costly regulatory burden on our members.”
As I wrote last year, the NRA, which represents restaurant chains across the country, supported the national menu-labeling rule as a shield against a growing, costly, and unworkable patchwork of different state and local menu-labeling laws.
It’s the same reason that food manufacturers, facing mandatory GMO-labeling pressure in dozens of states, counties, and cities around the country, are pushing for Congress to pass a uniform national GMO-labeling law.
Do I understand why the restaurant industry and food manufacturers are pushing for one bad federal law instead of hundreds or thousands of worse laws at the state and local level? Absolutely. Do I support such laws? Not at all.
November 25, 2014
Scott Alexander wrote this back in July. I think it’s still relevant as a useful perspective-enhancer:
The year 1969 comes up to you and asks what sort of marvels you’ve got all the way in 2014.
You explain that cameras, which 1969 knows as bulky boxes full of film that takes several days to get developed in dark rooms, are now instant affairs of point-click-send-to-friend that are also much higher quality. Also they can take video.
Music used to be big expensive records, and now you can fit 3,000 songs on an iPod and get them all for free if you know how to pirate or scrape the audio off of YouTube.
Television not only has gone HDTV and plasma-screen, but your choices have gone from “whatever’s on now” and “whatever is in theaters” all the way to “nearly every show or movie that has ever been filmed, whenever you want it”.
Computers have gone from structures filling entire rooms with a few Kb memory and a punchcard-based interface, to small enough to carry in one hand with a few Tb memory and a touchscreen-based interface. And they now have peripherals like printers, mice, scanners, and flash drives.
Lasers have gone from only working in special cryogenic chambers to working at room temperature to fitting in your pocket to being ubiquitous in things as basic as supermarket checkout counters.
Telephones have gone from rotary-dial wire-connected phones that still sometimes connected to switchboards, to cell phones that fit in a pocket. But even better is bypassing them entirely and making video calls with anyone anywhere in the world for free.
Robots now vacuum houses, mow lawns, clean office buildings, perform surgery, participate in disaster relief efforts, and drive cars better than humans. Occasionally if you are a bad person a robot will swoop down out of the sky and kill you.
For better or worse, video games now exist.
Medicine has gained CAT scans, PET scans, MRIs, lithotripsy, liposuction, laser surgery, robot surgery, and telesurgery. Vaccines for pneumonia, meningitis, hepatitis, HPV, and chickenpox. Ceftriaxone, furosemide, clozapine, risperidone, fluoxetine, ondansetron, omeprazole, naloxone, suboxone, mefloquine — and for that matter Viagra. Artificial hearts, artificial livers, artificial cochleae, and artificial legs so good that their users can compete in the Olympics. People with artificial eyes can only identify vague shapes at best, but they’re getting better every year.
World population has tripled, in large part due to new agricultural advantages. Catastrophic disasters have become much rarer, in large part due to architectural advances and satellites that can watch the weather from space.
We have a box which you can type something into and it will tell you everything anyone has ever written relevant to your query.
We have a place where you can log into from anywhere in the world and get access to approximately all human knowledge, from the scores of every game in the 1956 Roller Hockey World Cup to 85 different side effects of an obsolete antipsychotic medication. It is all searchable instantaneously. Its main problem is that people try to add so much information to it that its (volunteer) staff are constantly busy deleting information that might be extraneous.
We have the ability to translate nearly any major human language to any other major human language instantaneously at no cost with relatively high accuracy.
We have navigation technology that over fifty years has gone from “map and compass” to “you can say the name of your destination and a small box will tell you step by step which way you should be going”.
We have the aforementioned camera, TV, music, videophone, video games, search engine, encyclopedia, universal translator, and navigation system all bundled together into a small black rectangle that fits in your pocket, responds to your spoken natural-language commands, and costs so little that Ethiopian subsistence farmers routinely use them to sell their cows.
But, you tell 1969, we have something more astonishing still. Something even more unimaginable.
“We have,” you say, “people who believe technology has stalled over the past forty-five years.”
1969’s head explodes.
November 23, 2014
At Mother Jones, Kevin Drum looks at some of the reasons Obamacare is not being embraced by the working and middle classes the way many expected:
Here’s an interesting chart that follows up on a post I wrote a few days ago about Democrats and the white working class. Basically, I made the point that Democrats have recently done a lot for the poor but very little for the working and middle classes, and this is one of the reasons that the white working class is increasingly alienated from the Democratic Party.
I got various kinds of pushback on this, but one particular train of criticism suggested that I was overestimating just how targeted Democratic programs were. Sure, they help the poor, but they also help the working class a fair amount, and sometimes even the lower reaches of the middle class. However, while there’s some truth to this for certain programs (unemployment insurance, SSI disability), the numbers I’ve seen in the past don’t really back this up for most social welfare programs.
Obamacare seems like an exception, since its subsidies quite clearly reach upward to families in the working and middle classes. Today, however, Bill Gardner points me to a Brookings paper from a few months ago that suggests just the opposite. The authors calculate net gains and losses from Obamacare, and conclude that nearly all its benefits flow to the poor. If I interpolate their chart a bit, winners are those with household incomes below $25,000 or so, and losers are those with incomes above $25,000.
November 22, 2014
In The Diplomat, James R. Holmes says that we can learn a lot about fighting infectious diseases like Ebola by reading what Thucydides wrote about the plague that struck Athens during the opening stages of the Peloponnesian War:
Two panelists from our new partner institution, a pair of Africa hands, offered some striking reflections on the fight against Ebola.
Their presentations put me in the mind of … classical Greece. Why? Mainly because of Thucydides. Thucydides’ history of the Peloponnesian War isn’t just a (partly) eyewitness account of a bloodletting from antiquity; it’s the Good Book of politics and strategy. Undergraduates at Georgia used to look skeptical when I told them they could learn ninety percent of what they needed to know about bareknuckles competition from Thucydides. The remainder? Technology, tactics, and other ephemera. Thucydides remains a go-to source on the human factor in diplomacy and warfare.
But I digress. Ancient Greece suffered its own Ebola outbreak, a mysterious plague that struck Athens oversea during the early stages of the conflict. And the malady struck, perchance, at precisely the worst moment for Athens, after “first citizen” Pericles had arranged for the entire populace of Attica, the Athenian hinterland, to withdraw within the city walls. The idea was to hold the fearsome Spartan infantry at bay with fixed fortifications while the Athenian navy raided around the perimeter of the Spartan alliance.
That’s where the parallel between then and now becomes poignant. Thucydides notes, for example, that doctors died “most thickly” from the plague. The Brown presenters noted that, likewise, public-health workers in Africa — doctors, nurses, stretcher-bearers — are among the few to deliberately make close contact with the stricken. Relief teams, consequently, take extravagant precautions to quarantine the disease within makeshift facilities while shielding themselves from contagion. Sometimes these measures fail.
Now as in ancient Greece, furthermore, the prospect of disease and death deters some would-be healers altogether from succoring the afflicted. Selflessness has limits. Some understandably remain aloof — today as in Athens of yesteryear.
Teams assigned to bury the slain also find themselves in dire peril. Perversely, the dead from Ebola are more contagious than living hosts. That makes disposing of bodies in sanitary fashion a top priority. As the plague ravaged Athens, similarly, corpses piled up in the streets. No one would perform funeral rites — even in this deeply religious society. Classicist Victor Davis Hanson ascribes some of Athens’ barbarous practices late in the war — such as cutting off the hands of captured enemy seamen to keep them from returning to war — in part to the plague’s debasing impact on morals, ethics, and religion.
November 18, 2014
Except stereotypes are not inaccurate. There are many different ways to test for the accuracy of stereotypes, because there are many different types or aspects of accuracy. However, one type is quite simple — the correspondence of stereotype beliefs with criteria. If I believe that 60% of adult women are over 5′ 4″ tall, that 56% voted for the Democrat in the last Presidential election, and that 35% of all adult women have college degrees, how well do my beliefs correspond to the actual probabilities? One can do this sort of thing for many different types of groups.
And lots of scientists have. And you know what they found? That stereotype accuracy — the correspondence of stereotype beliefs with criteria — is one of the largest relationships in all of social psychology. The correlations of stereotypes with criteria range from .4 to over .9, and average almost .8 for cultural stereotypes (the correlation of beliefs that are widely shared with criteria) and .5 for personal stereotypes (the correlation of one individual’s stereotypes with criteria, averaged over lots of individuals). The average effect in social psychology is about .20. Stereotypes are more valid than most social psychological hypotheses.
Which raises a question: Why do so many psychologists emphasize stereotype inaccuracy when the evidence so clearly demonstrates such high accuracy? Why is there this Extraordinary Scientific Delusion?
There may be many explanations, but one that fits well is the leftward lean of most psychologists. If we can self-righteously rail against other people’s inaccurate stereotypes, we cast ourselves as good, decent egalitarians fighting the good fight, siding with the oppressed against their oppressors. Furthermore, as Jon Haidt has repeatedly shown, ideology blinds people to facts that are right under their noses. Liberal social scientists often have assumed stereotypes were inaccurate without bothering to test for inaccuracy, and, when the evidence has been right under their noses, they have avoided looking at it. And when something happens where they can’t avoid looking at it, they have denigrated its importance. Which is, in some ways, very amusing — after 100 years of proclaiming the inaccuracy of stereotypes to the world, can we really just say “Never mind, it’s not that important” after the evidence comes in showing that stereotype accuracy is one of the largest relationships in all of social psychology?
November 14, 2014
In Reason, Elizabeth Nolan Brown reviews the findings of a recent survey on what kind of kinks are no longer considered weird or unusual (because so many people fantasize about ‘em or are actively partaking of ‘em):
Being sexually dominated. Having sex with multiple people at once. Watching someone undress without their knowledge. These are just a few of the totally normal sexual fantasies uncovered by recent research published in the Journal of Sexual Medicine. The overarching takeaway from this survey of about 1,500 Canadian adults is that sexual kink is incredibly common.
While plenty of research has been conducted on sexual fetishes, less is known about the prevalence of particular sexual desires that don’t rise to the level of pathological (i.e., don’t harm others or interfere with normal life functioning and aren’t a requisite for getting off). “Our main objective was to specify norms in sexual fantasies,” said lead study author Christian Joyal. “We suspected there are a lot more common fantasies than atypical fantasies.”
Joyal’s team surveyed 717 Québécois men and 799 women, with a mean age of 30. Participants rated 55 different sexual fantasies and also wrote in their own. Each fantasy was then rated as statistically rare, unusual, common, or typical.
Of course, the statistics also show where men and women differ in some areas:
Notably, men were more likely than women to say they wanted their sexual fantasies to become sexual realities. “Approximately half of women with descriptions of submissive fantasies specified that they would not want the fantasy to materialize in real life,” the researchers note. “This result confirms the important distinction between sexual fantasies and sexual wishes, which is usually stronger among women than among men.”
The researchers also found a number of write-in “favorite” sexual fantasies that were common among men had no equivalent in women’s fantasies. These included having sex with a trans woman (included in 4.2 percent of write-in fantasies), being on the receiving end of strap-on/non-homosexual anal sex (6.1 percent), and watching a partner have sex with another man (8.4 percent).
Next up, the researchers plan to map subgroups of sexual fantasies that often go together (for instance, those who reported submissive fantasies were also more likely to report domination fantasies, and both were associated with higher levels of overall sexual satisfaction). For now, they caution that “care should be taken before labeling (a sexual fantasy) as unusual, let alone deviant.”
It would be interesting to see the results of this study replicated in other areas — Quebec may or may not be representative of the rest of western society.
Update, 28 November: Maggie McNeill is not impressed by the study at all.
But there’s a bigger problem, which as it turns out I’ve written on before when the titillation du jour was the claim that fewer men were paying for sex:
… the General Social Survey … has one huge, massive flaw that was mentioned by my psychology professors way back in the Dark Ages of the 1980s, yet seems not to trouble those who rely upon it so heavily these days: it is conducted in person, face to face with the respondents. And that means that on sensitive topics carrying criminal penalties or heavy social stigma, the results are less than solid; negative opinions of its dependability on such matters range from “unreliable” to “useless”. The fact of the matter is that human beings want to look good to authority figures (like sociologists in white lab coats) even when they don’t know them from Adam, so they tend to deviate from strict veracity toward whatever answer they think the interviewer wants to hear…
“Clinically, we know what pathological sexual fantasies are: they involve non-consenting partners, they induce pain, or they are absolutely necessary in deriving satisfaction,” Christian Joyal, the lead author of the study, said…The researchers found that only two sexual fantasies were…rare: Sexual activities with a child or an animal…only nine sexual fantasies were considered unusual…[including] “golden showers,” cross-dressing, [and] sex with a prostitute…
Joyal’s claim that sadistic and rape fantasies are innately “pathological” is both insulting and totally wrong; we “know” no such thing. And did you think it was a coincidence that pedophilia and bestiality were the only two fantasies to fall into the “rare” category during a time when those are the two most vilified kinks in the catalog, kinks which will result in permanent consignment to pariah status if discovered? Guess again; as recently as the 1980s it was acceptable to at least talk about both of these, and neither is as rare as this “study” pretends. But Man is a social animal, and even if someone is absolutely certain of his anonymity (which in the post-Snowden era would be a much rarer thing than either of those fantasies), few are willing to risk the disapproval of a lab-coated authority figure even if he isn’t sitting directly in front of them. What this study shows is not how common these fantasies actually are, but rather how safe people feel admitting to them. And while that’s an interesting thing in itself, it isn’t what everyone from researchers to reporters to readers is pretending the study measured.
I do occasional work for my hospital’s Addiction Medicine service, and a lot of our conversations go the same way.
My attending tells a patient trying to quit that she must take a certain pill that will decrease her drug cravings. He says it is mostly covered by insurance, but that there will be a copay of about one hundred dollars a week.
The patient freaks out. “A hundred dollars a week? There’s no way I can get that much money!”
My attending asks the patient how much she spends on heroin.
The patient gives a number like thirty or forty dollars a day, every day.
My attending notes that this comes out to $210 to $280 a week, and suggests that she quit heroin, take the anti-addiction pill, and make a “profit” of $110.
At this point the patient always shoots my attending an incredibly dirty look. Like he’s cheating somehow. Just because she has $210 a week to spend on heroin doesn’t mean that after getting rid of that she’d have $210 to spend on medication. Sure, these fancy doctors think they’re so smart, what with their “mathematics” and their “subtracting numbers from other numbers”, but they’re not going to fool her.
At this point I accept this as a fact of life. Whatever my patients do to get money for drugs — and I don’t want to know — it’s not something they can do to get money to pay for medication, or rehab programs, or whatever else. I don’t even think it’s consciously about them caring less about medication than about drugs, I think that they would be literally unable to summon the motivation necessary to get that kind of cash if it were for anything less desperate than feeding an addiction.
Scott Alexander, “Apologia Pro Vita Sua”, Slate Star Codex, 2014-05-25.
November 12, 2014
In The Federalist, Daniel Payne explains what the food nannies really mean by the term “national food policy”:
In the past I have used the term “food system” as shorthand for the industrial paradigm of food production, but for Bittman et al. to talk about the “food system” in such a way exposes it for the ridiculous concept it really is. There is no “food system,” not in the sense of a truly unified body of fully interdependent constituent parts: the “food system” is actually composed of millions of individuals acting privately and voluntarily, in different cities, counties, and states, as part of different companies and corporations and individual businesses, in elective concert with each other and with the rest of the world. To speak of it as a single “system” is deeply misguided, at least insofar as it is not a single entity but an endlessly complex patchwork of fully autonomous beings.
Thus when the authors write about “align[ing] agricultural policies,” they are not speaking in some ill-defined abstract about government policy; they are talking about forcing actual farmers to grow and do things the authors want. When they write of the Environmental Protection Agency and the U.S. Department of Agriculture monitoring “food production,” they are actually advocating that these federal agencies go after and punish people who are not farming in the way the authors want them to farm — and all this without Congress having passed a single law.
The authors are advocating, in other words, for a kind of executive dictatorship over the nation’s farmers, farms, and food supply. While it is unsurprising that they would use this dictatorship to attack the people who grow the food, it is also undeniable that this “national food policy” would target consumers as well. Such a “food system” cannot exist, after all, without people who are willing to purchase and consume its products.
The authors are not merely fed up with their big agribusiness boogeymen; they are also fed up with you for buying agribusiness products, and they want to use the government to make you stop. That you have broken no laws now, and will have broken no laws even after this “policy” goes into effect, is immaterial. They wish for the government to boss you around simply because your shopping purchases displease them. That they are too cowardly to come right out and say so is very telling of who they are — as men, and as advocates of the “public health.” Shame on them for being too spineless to tell the truth of their motives.