exurb1a
Published on 4 Jul 2016
I notice that it’s also independence day. How fitting. You just wait until we throw all your tea in the fucking ocean.
The music is Pomp and Circumstance No.1 by Elgar ► https://www.youtube.com/watch?v=moL4MkJ-aLk
November 5, 2017
England: A Beginner’s Guide
September 22, 2017
QotD: Microaggressions and the out-groups
The debate over microaggressions often seems to focus on whether they are real. This is silly. Of course they’ve always been real; only the label is new. Microaggressions from the majority to the minority are as real as Sunday, and the effect of their accumulated weight is to make you feel always slightly a stranger in a strange land. The phenomenon is dispiriting, even more so because the offenders frequently don’t realize that their words were somewhere between awkward and offensive (once again).
On the other hand, in a diverse group, the other thing you have to say about microaggressions is that they are unavoidable. And that a culture that tries to avoid them is setting up to tear itself apart.
I’m using microaggressions broadly here: to define the small slights by which any majority group subtly establishes its difference from its minority members. That means that I am including groups that may not come to mind for victim status, like conservatives in very liberal institutions. And no doubt many of my readers are preparing to deliver a note or a comment saying I shouldn’t dare to compare historically marginalized groups with politically powerful ones.
I dare because it highlights the basic problem with extensively litigating microaggressions, which is that it is a highly unstable way of mediating social disputes. Deciding who is eligible to complain about microaggressions is itself an act by which the majority imposes its will, and it is felt as alienating by the minorities who are effectively told that they don’t have the same right to ask for decent treatment as other groups. As a conservative social scientist once told me, “When I think of my own laments about being an ideological minority, most of it is basically microaggression.”
Megan McArdle, “How Grown-Ups Deal With ‘Microaggressions’”, Bloomberg View, 2015-09-11.
September 5, 2017
QotD: Microaggressions
Whenever I first heard the word “microaggression,” sometime in the last five years, I’m sure I was unaware how big “micro” could get. The accusation of a microaggression was about to become a pervasive feature of the Internet, and particularly social media. An offense most of us didn’t even know existed, suddenly we were all afraid of being accused of.
We used to call this “rudeness,” “slights” or “ignorant remarks.” Mostly, people ignored them. The elevation of microaggressions into a social phenomenon with a specific name and increasingly public redress marks a dramatic social change, and two sociologists, Bradley Campbell and Jason Manning, have a fascinating paper exploring what this shift looks like, and what it means. (Jonathan Haidt has provided a very useful CliffsNotes version.)
Western society, they argue, has shifted from an honor culture — in which slights are taken very seriously, and avenged by the one slighted — to a dignity culture, in which personal revenge is discouraged, and justice is outsourced to third parties, primarily the law. The law being a cumbersome beast, people in dignity cultures are encouraged to ignore slights, or negotiate them privately by talking with the offender, rather than seeking some more punitive sanction.
Microaggressions mark a transition to a third sort of culture: a victim culture, in which people are once again encouraged to take notice of slights. This sounds a lot like honor culture, doesn’t it? Yes, with two important differences. The first is that while victimhood is shameful in an honor culture — and indeed, the purpose of taking vengeance is frequently to avoid this shame — victim status is actively sought in the new culture, because victimhood is a prerequisite for getting redress. The second is that victim culture encourages people to seek help from third parties, either authorities or the public, rather than seeking satisfaction themselves.
Megan McArdle, “How Grown-Ups Deal With ‘Microaggressions’”, Bloomberg View, 2015-09-11.
March 25, 2017
If Walls Could Talk: The History of the Home, Episode 1: The Living Room
Published on 21 Jan 2017
First episode about the Living Room with Lucy Worsley. Give a thumbs up for more episodes! 😀
February 25, 2017
“Sophisticated and affluent Americans, as a group, are pretty gullible”
Andrew Ferguson on the gullibility of SAPs (sophisticated and affluent people) in social science fields:
Every few weeks, it seems, a new crack appears in the seemingly impenetrable wall of social-science dogma. The latest appeared last month with the publication of a paper by the well-known research psychologist Scott Lilienfeld, a professor at Emory University and coauthor of the indispensable primer 50 Great Myths of Popular Psychology. Among other things, he is a great debunker, and he has trained his skeptical eye on “microaggressions.”
Sophisticated, affluent people in the United States (SAPs) have been trained through years of education to respect whatever is presented to them as “science,” even if it’s not very good science, even if it’s not science at all. Their years of education have not trained them how to tell the difference. Sophisticated and affluent Americans, as a group, are pretty gullible.
So when their leaders in journalism, academia, and business announce a new truth of human nature, SAPs around the country are likely to embrace it. The idea of microaggressions is one of these. It was first popularized a decade ago, and now the pervasiveness of microaggressions in American life is taken as settled fact.
We could have seen it coming. Already, by the time microaggressions became widely known, social scientists had invented the Implicit Association Test (IAT). The test, administered online and to college students throughout the country, pretended to establish that anti-black and anti-Latino prejudice among white Americans was ever-present yet, paradoxically, nearly invisible, often unrecognized by perpetrator and victim alike. Even people who had never uttered a disparaging remark about someone of another color were shown by the IAT to be roiling cauldrons of racial animus. You know who you are.
The IAT thus laid the predicate for microaggressions. They were the outward, unwitting expressions of implicit racism; not only were they evidence of it, they were offered as proof of it. (Circularity is a common tool in cutting-edge social science.) Microaggressions are usually verbal but they don’t have to be. In their pathbreaking paper “Racial Microaggressions in Everyday Life” (2007), the psychologist Derald Wing Sue and his team of researchers from Columbia University helpfully listed many common microaggressions. Saying “America is a melting pot” is really a demand that someone “assimilate to the dominant culture.” Having an office that “has pictures of American presidents” on the wall announces that “only white people can succeed.” Also, an “overabundance of liquor stores in communities of color” carries the microaggressive message that “people of color are deviant.”
H/T to Colby Cosh for the link.
October 28, 2016
Farewell to adolescence … we probably won’t miss it
In Aeon, Paula Fass discusses an odd social invention of the 20th century that appears to have gone well past its best-before date:
Adolescence as an idea and as an experience grew out of the more general elevation of childhood as an ideal throughout the Western world. By the closing decades of the 19th century, nations defined the quality of their cultures by the treatment of their children. As Julia Lathrop, the first director of the United States Children’s Bureau, the first and only agency exclusively devoted to the wellbeing of children, observed in its second annual report, children’s welfare ‘tests the public spirit and democracy of a community’.
Progressive societies cared for their children by emphasising play and schooling; parents were expected to shelter and protect their children’s innocence by keeping them from paid work and the wrong kinds of knowledge; while health, protection and education became the governing principles of child life. These institutional developments were accompanied by a new children’s literature that elevated children’s fantasy and dwelled on its special qualities. The stories of Beatrix Potter, L. Frank Baum and Lewis Carroll celebrated the wonderland of childhood through pastoral imagining and lands of Oz.
The United States went further. In addition to the conventional scope of childhood from birth through to age 12 – a period when children’s dependency was widely taken for granted – Americans moved the goalposts of childhood as a democratic ideal by extending protections to cover the teen years. The reasons for this embrace of ‘adolescence’ are numerous. As the US economy grew, it relied on a complex immigrant population whose young people were potentially problematic as workers and citizens. To protect them from degrading work, and society from the problems that they could create by idling on the streets, the sheltering umbrella of adolescence became a means to extend their socialisation as children into later years. The concept of adolescence also stimulated Americans to create institutions that could guide adolescents during this later period of childhood; and, as they did so, adolescence became a potent category.
With the concept of adolescence, American parents, especially those in the middle class, could predict the staging of their children’s maturation. But adolescence soon became a vision of normal development that was applicable to all youth – its bridging character (connecting childhood and adulthood) giving young Americans a structured way to prepare for mating and work. In the 21st century, the bridge is sagging at both ends as the innocence of childhood has become more difficult to protect, and adulthood is long delayed. While adolescence once helped frame many matters regarding the teen years, it is no longer an adequate way to understand what is happening to the youth population. And it no longer offers a roadmap for how they can be expected to mature.
December 22, 2015
QotD: Communes
The anthropologist Richard Sosis examined the history of two hundred communes founded in the United States in the nineteenth century. Which kind of commune survived longest? Sosis found that the difference was stark: just 6% of the secular communes were still functioning twenty years after their founding, compared to 39% of religious communes. He found one master variable: the number of costly sacrifices that each commune demanded from its members. It was things like giving up alcohol and tobacco, fasting for days at a time, conforming to a communal dress code or hairstyle, or cutting ties with outsiders. For religious communes, the effect was perfectly linear: the more sacrifice a commune demanded, the longer it lasted. But Sosis was surprised to discover that demands for sacrifice did not help secular communes. Most of them failed within eight years, and there was no correlation between sacrifice and longevity.
Jonathan Haidt, quoted by Scott Alexander in “List Of The Passages I Highlighted In My Copy Of Jonathan Haidt’s The Righteous Mind”, Slate Star Codex, 2014-06-12.
December 21, 2015
QotD: Witches
It turns out that witchcraft beliefs arise in surprisingly similar forms in many parts of the world, which suggests either that there really are witches or (more likely) that there’s something about human minds that often generates this cultural institution. The Azande believed that witches were just as likely to be men as women, and the fear of being called a witch made the Azande careful not to make their neighbors angry or envious. That was my first hint that groups create supernatural beings not to explain the universe but to order their societies.
Jonathan Haidt, quoted by Scott Alexander in “List Of The Passages I Highlighted In My Copy Of Jonathan Haidt’s The Righteous Mind”, Slate Star Codex, 2014-06-12.
November 24, 2015
QotD: The real lack of diversity issue
As an editor, I have the privilege of working with all sorts of interesting and influential Canadians. On paper, many of these people are “diverse” — men, women, black, white, straight, gay, trans, cis, Jew, Christian, Hindu, Muslim. Yet scratch the surface, and you find a remarkable sameness to our intellectual, cultural, and political elites, no matter what words they use to self-identify. In most cases, they grow up middle-class or wealthier, attend the same good schools, and join the same high-value social networks. They have nice teeth because mom and dad pay for braces, and hit a nice forehand (or three-iron) because mom and dad pay for lessons. They know the best patisseries in Paris, because of that epic backpacking trip between undergrad and law school. And as ambitious young adults, they feel okay about ditching the law-firm grind for a prominent life in politics, art, journalism or activism — because a wealthy parent or spouse is paying the mortgage.
We rightly worry about how many women, or blacks, or First Nations individuals are represented in public life. Yet that concern is rarely extended to people whose marginalization cannot be reduced to tidy demographic categories.
In two decades of journalism, I have written and edited countless articles about Canada’s criminal justice system. But never once have I, or any of my close journalistic colleagues, ever spent a night in prison. I have written and edited countless articles about the Canadian military. But never once have I, or any of my close journalistic colleagues, witnessed the hell of war. Nor, to my knowledge, have I ever had a close colleague who lived in public housing; who experienced real hunger; who suffered from a serious health condition that went untreated for economic reasons; whose career or education was compromised by the need to support impoverished relatives; or who had been forced to remain in an abusive relationship for purely financial reasons. We often describe people like this as living “on the margins.” But collectively, this is a vast bulk of Canadians whose hardship and anxiety are rarely witnessed by politicians and media except through survey data and think-tank reports.
Jonathan Kay, “Diversity’s Final Frontier: The real schism in our society isn’t sex or race. It’s social class”, The Walrus, 2015-11-03.
October 17, 2015
Moynihan’s scissors
David Warren looks at the work of the late Daniel Patrick Moynihan:
We are celebrating this year, if that is the word, the fiftieth anniversary of perhaps the most inconsequential sociological study ever published. That was, The Negro Family: The Case For National Action, by the brilliant American politician and thinker, Daniel Patrick Moynihan (1927–2003).
Working then in the U.S. Department of Labour, Moynihan focused his attention on a counter-intuitive statistical fact. Unemployment among black males was falling, in 1965. But rates of welfare enrollment for black families were rising. This did not make sense. The two lines on this chart had always fallen or risen together. But they had crossed over in 1962. He had put his finger in what came to be called, “Moynihan’s scissors.”
[…] while the “Moynihan Report” is famous, and at one time, everyone claimed to have read it, it contains something so obnoxious to enlightened post-modern thought as to remain invisible to all participants in the discussion.
This was Moynihan’s sociological and anthropological observation that the American black culture was becoming “matriarchal.” Whether without, or more likely with the help of welfare programmes, women were becoming the heads of households, and men were being removed from that station.
(The background: All of the higher civilizations have been unambiguously patriarchal; matriarchy is associated in the prehistoric and anthropological record with savage, gratuitously violent, self-destructive tribes.)
Already, in 1965, one in four black kids in the USA were born out of wedlock. Today it is more than three in four, and levels of bastardy among the other races have risen in course. By the end of the last century (1990s), white children were as likely to be raised in fatherless homes as black children had been in the 1960s. “Progress” has been progressing rapidly.
The Nanny State has replaced fathers as the principal source of income for such families (bankrupting itself in the process), and the feminist movement has supplied the arguments — or more precisely, misandrist slogans and vindictive clichés — for the overthrow of “patriarchy” and its systematic replacement with a shrewish matriarchy in all facets of social life. The movement has been, moreover, so successful in achieving its objects — the emasculation of men, and degradation or actual inversion of traditional morality — that it has now moved on. For with the defeat of masculinity, new horizons of “gender-bending” or “transgendering” have come into view.
Now, part of the reason people can’t get their little heads around what has actually happened — first to the black family, then to the brown, then to the white — is the surviving, basically modern (i.e. pre-post-modern) belief that eunuchs behave much like fairies; that they become docile and effeminate, harmless and nurturing, sensitive and sweet; that their previously reprehensible “masculine” traits will quietly disappear. Some men do indeed respond to emasculation by becoming the pathetic, contemptible wimps that all women, including feminists, instinctively abhor. But some do not.
As a well-read student of social sciences and history, Moynihan knew better than this. The masculine capacity for violence (at all levels, spiritual as much as physical) does not go away. From Spartan Laconia, backwards and forwards through history on all continents, we see that eunuchs and other “homosexual” (the word is inadequate) guards and soldiers have been employed by the great warrior despots. This is because they make the fiercest fighters. Having no families, no heritage to protect, no women and children to feed and shelter in safety, they become a purely destructive force. They become men who do not care even for their own lives, let alone for the lives of others.
September 9, 2015
The rise of victimhood culture
Ronald Bailey thinks the rise of microaggression-awareness is a symptom of a decline in dignity culture and a sign of the coming of a new victimhood-based culture, and that it’s a really bad development:
Over at the Righteous Mind blog, New York University moral psychologist Jonathan Haidt is signposting a fascinating article, “Microaggression and Moral Cultures,” by two sociologists in the journal Comparative Sociology. The argument in the article is that U.S. society is in the midst of a large-scale moral change in which we are experiencing the emergence of a victimhood culture that is distinct from the honor cultures and dignity cultures of the past. If true, this bodes ill for future social and political peace.
In honor cultures, people (men) maintained their honor by responding to insults, slights, violations of rights by self-help violence. Generally honor cultures exist where the rule of law is weak. In honor cultures, people protected themselves, their families, and property through having a reputation for swift violence. During the 19th century, most Western societies began the moral transition toward dignity cultures in which all citizens were legally endowed with equal rights. In such societies, persons, property, and rights are defended by recourse to third parties, usually courts, police, and so forth, that, if necessary, wield violence on their behalf. Dignity cultures practice tolerance and are much more peaceful than honor cultures.
Sociologists Bradley Campbell and Jason Manning are arguing that the U.S. is now transitioning to a victimhood culture that combines the honor culture’s quickness to take offense with the dignity culture’s use of third parties to police and punish transgressions. The result is that people are encouraged to think of themselves as weak, marginalized, and oppressed. This is nothing less than demoralizing and polarizing as everybody seeks to become a “victim.”
August 29, 2015
We need a new publication called The Journal of Successfully Reproduced Results
We depend on scientific studies to provide us with valid information on so many different aspects of life … it’d be nice to know that the results of those studies actually hold up to scrutiny:
One of the bedrock assumptions of science is that for a study’s results to be valid, other researchers should be able to reproduce the study and reach the same conclusions. The ability to successfully reproduce a study and find the same results is, as much as anything, how we know that its findings are true, rather than a one-off result.
This seems obvious, but in practice, a lot more work goes into original studies designed to create interesting conclusions than into the rather less interesting work of reproducing studies that have already been done to see whether their results hold up.
Everyone wants to be part of the effort to identify new and interesting results, not the more mundane (and yet potentially career-endangering) work of reproducing the results of older studies:
Why is psychology research (and, it seems likely, social science research generally) so stuffed with dubious results? Let me suggest three likely reasons:
A bias towards research that is not only new but interesting: An interesting, counterintuitive finding that appears to come from good, solid scientific investigation gets a researcher more media coverage, more attention, more fame both inside and outside of the field. A boring and obvious result, or no result, on the other hand, even if investigated honestly and rigorously, usually does little for a researcher’s reputation. The career path for academic researchers, especially in social science, is paved with interesting but hard to replicate findings. (In a clever way, the Reproducibility Project gets around this issue by coming up with the really interesting result that lots of psychology studies have problems.)
An institutional bias against checking the work of others: This is the flipside of the first factor: Senior social science researchers often actively warn their younger colleagues — who are in many cases the best positioned to check older work — against investigating the work of established members of the field. As one psychology professor from the University of Southern California grouses to the Times, “There’s no doubt replication is important, but it’s often just an attack, a vigilante exercise.”
[…]
Small, unrepresentative sample sizes: In general, social science experiments tend to work with fairly small sample sizes — often just a few dozen people who are meant to stand in for everyone else. Researchers often have a hard time putting together truly representative samples, so they work with subjects they can access, which in a lot of cases means college students.
A couple of years ago, I linked to a story about the problem of using Western university students as the default source of your statistical sample for psychological and sociological studies:
A notion that’s popped up several times in the last couple of months is that the easy access to willing test subjects (university students) introduces a strong bias to a lot of the tests, yet until recently the majority of studies disregarded the possibility that their test results were unrepresentative of the general population.
February 21, 2015
QotD: Campbell’s Law
The most common problem is that all these new systems — metrics, algorithms, automated decisionmaking processes — result in humans gaming the system in rational but often unpredictable ways. Sociologist Donald T. Campbell noted this dynamic back in the ’70s, when he articulated what’s come to be known as Campbell’s law: “The more any quantitative social indicator is used for social decision-making,” he wrote, “the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.”
On a managerial level, once the quants come into an industry and disrupt it, they often don’t know when to stop. They tend not to have decades of institutional knowledge about the field in which they have found themselves. And once they’re empowered, quants tend to create systems that favor something pretty close to cheating. As soon as managers pick a numerical metric as a way to measure whether they’re achieving their desired outcome, everybody starts maximizing that metric rather than doing the rest of their job — just as Campbell’s law predicts.
Felix Salmon, “Why Quants Don’t Know Everything”, Wired, 2014-01-14.
February 19, 2015
When you “believe” in science
In last week’s Goldberg File newsletter, Jonah Goldberg looked at the odd situation of people who “believe in” science:
When I hear people talk about science as if it’s something to “believe in,” particularly people who reject all sorts of science-y things (vaccines, nuclear power, etc. as discussed above), I immediately think of one of my favorite lines from Eric Voegelin: “When God is invisible behind the world, the contents of the world will become new gods; when the symbols of transcendent religiosity are banned, new symbols develop from the inner-worldly language of science to take their place.” This will be true, he added, even when “the new apocalyptics insist that the symbols they create are scientific.”
In other words, the “Don’t you believe in evolution!?!” people don’t really believe in science qua science; what they’re really after is dethroning God in favor of their own gods of the material world (though I suspect many don’t even realize why they’re so obsessed with this one facet of the disco ball called “science”). “Criticism of religion is the prerequisite of all criticisms,” quoth Karl Marx, who then proceeded to create his own secular religion.
This is nothing new of course. This tendency is one of the reasons why every time Moses turned his back on the Hebrews they started worshipping golden calves and whatnot.
At least Auguste Comte, the French philosopher who coined the term “sociology,” was open about what he was really up to when he created his “Religion of Humanity,” in which scientists, statesmen, and engineers were elevated to Saints. As I say in my column, the fight over evolution is really a fight over the moral status of man. And, if we are nothing but a few bucks’ worth of chemicals connected by water and electricity, then there’s really nothing holding us back from elevating “science” to divine status and in turn anointing those who claim to be its champions as our priests. It’s no coincidence that Herbert Croly was literally — not figuratively, the way Joe Biden means literally — baptized into Comte’s Religion of Humanity.
Personally, I think the effort to overthrow Darwin along with Marx and Freud is misguided. I have friends invested in that project and I agree that all sorts of terrible Malthusian and materialist crap is bound up in Darwinism. But that’s an argument for raking out the manure, not burning down the stable.
November 17, 2014
A proposal to permanently fix the gender wage gap
Ashe Schow thinks we need to get serious about addressing this issue, and here is her proposal on how to accomplish this worthy end:
For example, if men want to go into gender studies, let them — that way, they’ll make less money and it will help close the gender gap. But women need to be kept away from such majors. Colleges and universities should in fact create separate lists of majors to give to men and women. If possible, women should not be told about any course of study that will yield lower-paying career choices in the future.
Among others, social science majors feed the gender gap. When women ask about those subjects or departments, colleges should tell them they don’t exist, or that all classes are full, except maybe the ones in economics. Even better, colleges should tell women that engineering, mathematics and finance are actually social sciences. Class rosters must then be watched carefully. If a woman somehow manages to sign up for a sociology class, she should instead be given the classroom number for a course in mechanical engineering.
When women express a desire to pursue teaching or social work jobs, they should be discouraged. In fact, college counselors should be instructed to tell them there are no such jobs available, along with some sort of plausible explanation, like: “There are no teaching jobs available anymore, because Republicans cut the budget and the government is closing all of the schools. How about a nice career in accounting?”
Women who ask too many questions should be promptly steered into a nearby organic chemistry class, because no one can remain mentally alert for too long.
Feminists who might disapprove of this proposal should first ask themselves if they would be making more money had someone forced them to become an engineer rather than an activist. Would they have avoided the misfortunes and oppression they now suffer and condemn had they pursued a more useful course of studies and ended up with a higher-paying job?