If you’re unfamiliar, the Dunning-Kruger Effect is the name of a cognitive bias where people consistently rate themselves as more skilled than others, even (especially?) when they are decidedly not. In other words, people are nowhere near as good as they think they are.
Diametrically opposed to that is Impostor Syndrome, where people refuse to acknowledge their accomplishments and competencies.
If you’re aware of both of them, you might constantly vacillate between them, occasionally thinking you’re awesome, then realizing that it probably means you aren’t, going back and forth like a church bell. I know nothing of this, I assure you. But the point is that I think they’re almost certainly related to the people that we surround ourselves with.
Matt Simmons, “The Impostor Effect vs Dunning-Kruger”, Standalone Sysadmin, 2013-02-27.
October 23, 2014
January 30, 2014
Canadians are often found wanting in comparison to Norwegians, Swedes, Finns, or Danes in any international ranking. Except for smugness, where Canada (of course) is the undisputed world leader. But according to Michael Booth, things are not quite as wonderful in Scandinavia as we’re led to believe:
Whether it is Denmark’s happiness, its restaurants, or TV dramas; Sweden’s gender equality, crime novels and retail giants; Finland’s schools; Norway’s oil wealth and weird songs about foxes; or Iceland’s bounce-back from the financial abyss, we have an insatiable appetite for positive Nordic news stories. After decades dreaming of life among olive trees and vineyards, these days for some reason, we Brits are now projecting our need for the existence of an earthly paradise northwards.
I have contributed to the relentless Tetris shower of print columns on the wonders of Scandinavia myself over the years but now I say: enough! Nu er det nok! Enough with foraging for dinner. Enough with the impractical minimalist interiors. Enough with the envious reports on the abolition of gender-specific pronouns. Enough of the unblinking idolatry of all things knitted, bearded, rye bread-based and licorice-laced. It is time to redress the imbalance, shed a little light Beyond the Wall.
First, let’s look at Denmark, where Booth has lived for several years:
Why do the Danes score so highly on international happiness surveys? Well, they do have high levels of trust and social cohesion, and do very nicely from industrial pork products, but according to the OECD they also work fewer hours per year than most of the rest of the world. As a result, productivity is worryingly sluggish. How can they afford all those expensively foraged meals and hand-knitted woollens? Simple, the Danes also have the highest level of private debt in the world (four times as much as the Italians, to put it into context; enough to warrant a warning from the IMF), while more than half of them admit to using the black market to obtain goods and services.
Perhaps the Danes’ dirtiest secret is that, according to a 2012 report from the Worldwide Fund for Nature, they have the fourth largest per capita ecological footprint in the world. Even ahead of the US. Those offshore windmills may look impressive as you land at Kastrup, but Denmark burns an awful lot of coal. Worth bearing that in mind the next time a Dane wags her finger at your patio heater.
Okay, but how about Norway? Aren’t they doing well?
The dignity and resolve of the Norwegian people in the wake of the attacks by Anders Behring Breivik in July 2011 was deeply impressive, but in September the rightwing, anti-Islamist Progress party — of which Breivik had been an active member for many years — won 16.3% of the vote in the general election, enough to elevate it into coalition government for the first time in its history. There remains a disturbing Islamophobic sub-subculture in Norway. Ask the Danes, and they will tell you that the Norwegians are the most insular and xenophobic of all the Scandinavians, and it is true that since they came into a bit of money in the 1970s the Norwegians have become increasingly Scrooge-like, hoarding their gold, fearful of outsiders.
Finland? I’ve always gotten on famously with Finns (and Estonians), although I haven’t met all that many of them:
I am very fond of the Finns, a most pragmatic, redoubtable people with a Sahara-dry sense of humour. But would I want to live in Finland? In summer, you’ll be plagued by mosquitoes, in winter, you’ll freeze — that’s assuming no one shoots you, or you don’t shoot yourself. Finland ranks third in global gun ownership behind only America and Yemen; has the highest murder rate in western Europe, double that of the UK; and by far the highest suicide rate in the Nordic countries.
The Finns are epic Friday-night bingers and alcohol is now the leading cause of death for Finnish men. “At some point in the evening around 11.30pm, people start behaving aggressively, throwing punches, wrestling,” Heikki Aittokoski, foreign editor of Helsingin Sanomat, told me. “The next day, people laugh about it. In the US, they’d have an intervention.”
If you do decide to move there, don’t expect scintillating conversation. Finland’s is a reactive, listening culture, burdened by taboos too many to mention (civil war, second world war and cold war-related, mostly). They’re not big on chat. Look up the word “reticent” in the dictionary and you won’t find a picture of an awkward Finn standing in a corner looking at his shoelaces, but you should.
“We would always prefer to be alone,” a Finnish woman once admitted to me. She worked for the tourist board.
Sweden, though, must be the one without any real serious issues, right?
Anything I say about the Swedes will pale in comparison to their own excoriating self-image. A few years ago, the Swedish Institute of Public Opinion Research asked young Swedes to describe their compatriots. The top eight adjectives they chose were: envious, stiff, industrious, nature-loving, quiet, honest, dishonest, xenophobic.
I met with Åke Daun, Sweden’s most venerable ethnologist. “Swedes seem not to ‘feel as strongly’ as certain other people”, Daun writes in his excellent book, Swedish Mentality. “Swedish women try to moan as little as possible during childbirth and they often ask, when it is all over, whether they screamed very much. They are very pleased to be told they did not.” Apparently, crying at funerals is frowned upon and “remembered long afterwards”. The Swedes are, he says, “highly adept at insulating themselves from each other”. They will do anything to avoid sharing a lift with a stranger, as I found out during a day-long experiment behaving as un-Swedishly as possible in Stockholm.
H/T to Kathy Shaidle (via Facebook) for the link.
January 19, 2014
There’s been some noise made about how the “reality TV” show 16 and Pregnant has influenced teens to such a degree that the teenage pregnancy rate dropped by a significant figure. Nick Gillespie has a few questions about the claims:
Television: Is there anything it can’t do?
After decades of being slammed by bluenoses, bureaucrats, and Bruce Springsteen for sexing up and dumbing down the masses, it turns out that the small screen has accomplished what no amount of promise rings, Twilight movies, or mandatory banana-on-a-condom classes have managed to do: reduce the number of teenage births.
At least that’s what the authors of a widely discussed new study say. In “Media Influences on Social Outcomes: The Impact of MTV’s 16 and Pregnant on Teen Childbearing” (available online for the low, low price of $5.00 from the National Bureau of Economic Research), economists Melissa S. Kearney (University of Maryland) and Phillip B. Levine (Wellesley College) write “The introduction of 16 and Pregnant along with its partner shows, Teen Mom and Teen Mom 2, led teens to noticeably reduce the rate at which they give birth.” According to their calculations, the shows are responsible for “a 5.7 percent reduction in teen births in the 18 months following [their] introduction.”
The study is far less interesting for the specific claims it makes about teen birth rates than it is as a variation on persistent attitudes toward cultural production and consumption redolent of Frankfurt School anxieties over media’s impact on the proletariat. In many ways, “Media Influences on Social Outcomes” is simply the latest echo of the idea that TV, music, movies, novels, and the like don’t simply move audiences to laughter, tears, or contemplation but compel them to act in particular ways.
In other words, we’re all just mindless, easily brainwashed dupes who are being programmed by our media.
In more doctrinaire versions of Frankfurt School analysis, the producers of content are drivers and audience members are, well, just passengers along for the ride. To their credit, Kearney and Levine aren’t nearly so deterministic, even though they are quick to ascribe causative power to a particular set of programs.
In 2002’s Is Art Good for Us?, University of Tulsa professor Joli Jensen refers to this sort of thinking as an “instrumental view of culture.” It presumes “that art is an instrument like medicine or a toxin that can be injected into us and transform us.” This view, says Jensen, “is very tempting because if certain kinds of culture cause bad things in society, then you can change that culture and fix society.” The instrumental view implies formal or informal commissars that must oversee and direct cultural production, making sure more “good” art is made. After all, you are what you read, or watch, or hear. Morally suspect art leads to crime, chaos, and bad behavior.
January 18, 2014
Why do so many people believe in a god? Dennett’s Breaking the Spell is an attempt to examine that question, for Christian fundamentalists, Islamic teachers, Buddhist monks, atheists, and others. He begins by pointing to the commonality of pre-scientific answers in groups of people: “How do thunderstorms happen?” answered by “It must be someone up there with a giant hammer” (our example, not his). Then, probably after a minimum of discussion, a name such as “Thor” becomes agreed. Having successfully sorted out thunderstorms, in the sense that you now have an agreed answer to why they happen, other forces of nature are similarly identified and named. Soon you have a pantheon, a community of gods to blame everything on. It’s very satisfying when everyone around you agrees, so the pantheon soon becomes the accepted wisdom, and few question it. In some cultures, few dare to question it, because there are penalties if you do.
Terry Pratchett, Ian Stewart, & Jack Cohen, “Disbelief System”, The Science of Discworld IV: Judgement Day, 2013.
December 29, 2013
People in all cultures grow up and acquire a set of beliefs. One way of looking at this is to call the beliefs that are inherited “memes”. Just as “genes” code for hereditary traits, so memes are intended to show the inheritance of individual items, rather than a whole belief system. A tune like “Happy Birthday”, a concept like Father Christmas, atom, bicycle, or fairy — all are memes. A whole slew of memes that forms an interacting whole is called a memeplex, and religions are the best examples, which at various times and in various cultures have had, or still do have, many linked-up memes like “There is Heaven and there is Hell …” and “Unless you pray to this God you’ll go to Hell” and “You must kill those who don’t believe in this …” and so on. You will have some familiarity with other religions, and you will appreciate that we’re not saying that your religion is like that. It’s all the others, the mistaken ones …
Terry Pratchett, Ian Stewart, & Jack Cohen, “Disbelief System”, The Science of Discworld IV: Judgement Day, 2013.
December 17, 2013
Writing in Time, Camille Paglia tries to counter some of the received wisdom of academic feminism:
If men are obsolete, then women will soon be extinct — unless we rush down that ominous Brave New World path where women clone themselves by parthenogenesis, as famously do Komodo dragons, hammerhead sharks and pit vipers.
A peevish, grudging rancor against men has been one of the most unpalatable and unjust features of second- and third-wave feminism. Men’s faults, failings and foibles have been seized on and magnified into gruesome bills of indictment. Ideologue professors at our leading universities indoctrinate impressionable undergraduates with carelessly fact-free theories alleging that gender is an arbitrary, oppressive fiction with no basis in biology.
Is it any wonder that so many high-achieving young women, despite all the happy talk about their academic success, find themselves in the early stages of their careers in chronic uncertainty or anxiety about their prospects for an emotionally fulfilled private life? When an educated culture routinely denigrates masculinity and manhood, then women will be perpetually stuck with boys, who have no incentive to mature or to honor their commitments. And without strong men as models to either embrace or (for dissident lesbians) to resist, women will never attain a centered and profound sense of themselves as women.
From my long observation, which predates the sexual revolution, this remains a serious problem afflicting Anglo-American society, with its Puritan residue. In France, Italy, Spain, Latin America and Brazil, in contrast, many ambitious professional women seem to have found a formula for asserting power and authority in the workplace while still projecting sexual allure and even glamour. This is the true feminine mystique, which cannot be taught but flows from an instinctive recognition of sexual differences. In today’s punitive atmosphere of sentimental propaganda about gender, the sexual imagination has understandably fled into the alternate world of online pornography, where the rude but exhilarating forces of primitive nature rollick unconstrained by religious or feminist moralism.
October 1, 2013
I linked to an entertaining rant by Ace last week that talked about the “nummification” of modern life. At risk of being identified with the “get off my lawn you [26-year-old] kids” bracket, here’s another tale of western society’s almost complete flight from adulthood by Christopher Taylor:
But the culture has become a bit too childish and cutesy for me. If you look around you can see what’s happening easily enough. Adam Carolla recently went on a rant about Starbucks “coffee” and how childish it’s all become. I won’t link it here because it gets pretty foul and sexualized, but the basic gist is this: you didn’t have a coffee before work, you had a shake. That Caramel Moccachino with whipped cream and sprinkles on top wasn’t a coffee, it was candy in a cup.
You can extend this further. I saw an ad recently on TV for adult vitamins, clearly targeted at men. The selling point? They’re gummy vitamins. Multi-Vites! They’re chewable and sweet! Take a few of those in the morning before your coffee shake. And for lunch? A “power bar” which is a candy bar with vitamins in it.
This isn’t adult behavior, it’s Halloween all day long. Remember when you were 11 and mom wouldn’t let you gorge yourself out of the plastic pumpkin bucket you filled on Halloween night? And you kicked the side of the bed vowing that when you grew up you’d eat all the candy you wanted?
You’re supposed to grow out of that stage.
I’ve written about the annoyance of frat boy culture here many times, where men are perpetually the party boy they imagined themselves being in college. Never grow up, never get serious, always avoid responsibility. Your hair getting gray? Return it to your “natural” color with dye! Hey, idiot, gray is your natural color. Put away the Viagra, you’re old. Deal with it.
Except that’s not even the problem any more. We’re being told by sociologists that adolescence now extends to age 25. Yes, I know sociology is about as much science as astrology, but this isn’t a suggestion, it’s a diagnosis.
Taylor also links to this BBC News Magazine article from last week, which advances the notion that expecting young people to become adults at 18 or even 25 is no longer realistic:
Frank Furedi, professor of sociology at the University of Kent, says we have infantilised young people and this has led to a growing number of young men and women in their late 20s still living at home.
“Often it’s claimed it’s for economic reasons, but actually it’s not really for that,” says Furedi. “There is a loss of the aspiration for independence and striking out on your own. When I went to university it would have been a social death to have been seen with your parents, whereas now it’s the norm.
“So you have this kind of cultural shift which basically means that adolescence extends into your late twenties and that can hamper you in all kinds of ways, and I think what psychology does is it inadvertently reinforces that kind of passivity and powerlessness and immaturity and normalises that.”
Furedi says that this infantilised culture has intensified a sense of “passive dependence” which can lead to difficulties in conducting mature adult relationships. There’s evidence of this culture even in our viewing preferences.
“There’s an increasing number of adults who are watching children’s movies in the cinema,” says Furedi. “If you look at children’s TV channels in America, 25% of the viewers are adults rather than children.”
He does not agree that the modern world is far more difficult for young people to navigate.
“I think that what it is, is not that the world has become crueller, it’s just that we hold our children back from a very early age. When they’re 11, 12, 13 we don’t let them out on their own. When they’re 14, 15, we hover all over them and insulate them from real-life experience. We treat university students the way we used to treat school pupils, so I think it’s that type of cumulative effect of infantilisation which is responsible for this.”
September 26, 2013
I think Ace is making a good point here … modern culture is being retuned to a younger, less adult-oriented default:
A moccachino, topped with lots of nummy whipped cream, is not a sophisticated taste. We emerge from the womb craving the sweetness of sugar, after all.
Again, it’s one thing to indulge in a treat. But it’s another thing to decide to simply revert to one’s childhood self.
Now when he was on this rant, I thought he was full of shit and just being annoyed because Being Annoyed is how Adam Carolla makes his rent.
He also, I’m sure, went off on his typical rant about adult men watching Super Hero Movies, which does in fact hurt my butt. And I’m sure he connected that to the New Nummy.
We are indeed becoming a more childlike people. We are more and more shirking the expected obligations of adulthood, such as marriage and procreation, and even more basically, we’re rejecting the obligation of adults to actually think, in terms of numbers, and of best outcomes, and so forth.
The national mode of thinking is now Nummy. “We” — and by we I mean Americans, not “we” meaning us here right now — increasingly think in terms of cute, and easy, and glib, and dumb, and fun.
Why, Yes, actually. Because having all of your trivial cultural preferences flattered by impersonal corporations at every turn is itself Very Nummy Indeed. All little girls want to be told that they’re the Best and Prettiest Little Girl there is, and all little boys want to be told they will play for the Yankees when they Get Big.
To have one’s head patted and cheeks pinched by Admiring Grown Ups at all possible times is the Nummiest Nummy Thing there is.
Now I have to caveat this: Prior to Tweener Girls becoming the default National Tastemakers, our national culture was determined by the tastes of 19-year-old boys, per the Zanuck Postulate.*
So this isn’t just a sexist thing. It’s about losing at least those seven years of maturation, too.
We are drowning in nostalgia and crushing debt and we can’t see the latter because we’ve checked out into our Happy Place to chase the former.
I can’t blame the White House or BuzzFeed for these trends. They’re pushers, but they didn’t create the sad addiction. This stuff works in America.
But why? Why does it work?
When did we all check out of adulthood to revert to tweenerhood? And when did we stop thinking that might be a little indulgent and shameful?
August 31, 2013
Coyote Blog links to a Daily Mail article on the woman who wants to run your life (and Obama wants to help her):
I am a bit late on this, but like most libertarians I was horrified by this article in the Mail Online about Obama Administration efforts to nudge us all into “good” behavior. This is the person, Maya Shankar, who wants to substitute her decision-making priorities for your own [...]
If the notion — that a 20-something person who has apparently never held a job in the productive economy is telling you she knows better what is good for you — is not absurd on its face, here are a few other reasons to distrust this plan.
- Proponents’ first, second, and third arguments for doing this kind of thing are that it is all based on “science”. But a lot of the so-called science is total crap. Medical literature is filled with false panics that are eventually retracted. And most social science findings are frankly garbage. If you have some behavior you want to nudge, and you give a university a nice grant, I can guarantee you that you can get a study supporting whatever behavior you want to foster or curtail. Just look at the number of public universities in corn-growing states that manage to find justifications for ethanol subsidies. Recycling is a great example, mentioned several times in the article. Research supports the sensibility of recycling aluminum and steel, but says that recycling glass, plastic, and paper is either worthless or costs more in resources than it saves. But nudgers nevertheless push for recycling of all this stuff. Nudging quickly starts looking more like religion than science.
- The 300 million people in this country have 300 million different sets of priorities and personal circumstances. It is the worst hubris to think that one can make one decision that is correct for everyone. Name any supposedly short-sighted behavior — say, not getting health insurance when one is young — and I can name numerous circumstances where this is a perfectly valid choice and risk to take.
August 18, 2013
Charles Stross points out that there’s been a vast change in the working world that the NSA and other acronyms didn’t see coming and haven’t prepared themselves to face:
The big government/civil service agencies are old. They’re products of the 20th century, and they are used to running their human resources and internal security processes as if they’re still living in the days of the “job for life” culture; potential spooks-to-be were tapped early (often while at school or university), vetted, then given a safe sinecure along with regular monitoring to ensure they stayed on the straight-and-narrow all the way to the gold watch and pension. Because that’s how we all used to work, at least if we were civil servants or white collar paper pushers back in the 1950s.
Let’s leave aside the prognostications of sociologists about over-broad cultural traits of an entire generation. The key facts are: Generation X’s parents expected a job for life, but with few exceptions Gen Xers never had that — they’re used to nomadic employment, hire-and-fire, right-to-work laws, the whole nine yards of organized-labour deracination. Gen Y’s parents are Gen X. Gen Y has never thought of jobs as permanent things. Gen Y will stare at you blankly if you talk about loyalty to their employer; the old feudal arrangement (“we’ll give you a job for life and look after you as long as you look out for the Organization”) is something their grandparents maybe ranted about, but it’s about as real as the divine right of kings. Employers are alien hive-mind colony intelligences who will fuck you over for the bottom line on the quarterly balance sheet. They’ll give you a laptop and tell you to hot-desk or work at home so that they can save money on office floorspace and furniture. They’ll dangle the offer of a permanent job over your head but keep you on a zero-hours contract for as long as is convenient. This is the world they grew up in: this is the world that defines their expectations.
To Gen X, a job for life with the NSA was a probably-impossible dream — it’s what their parents told them to expect, but few of their number achieved. To Gen Y the idea of a job for life is ludicrous and/or impossible.
This means the NSA and their fellow swimmers in the acronym soup of the intelligence-industrial complex are increasingly reliant on nomadic contractor employees, and increasingly subject to staff churn. There is an emerging need to security-clear vast numbers of temporary/transient workers … and workers with no intrinsic sense of loyalty to the organization. For the time being, security clearance is carried out by other contractor organizations that specialize in human resource management, but even they are subject to the same problem: Quis custodiet ipsos custodes?
July 14, 2013
At Samizdata, Natalie Solent had a rather strong reaction to an unwanted form of contact the other day:
Discussion point: the ethical issues surrounding unsolicited sales phone calls
Is it better to just hang them or should we draw and quarter first?
A few days on, and she’s a bit more philosophical about it:
Before being overwhelmed by phone-induced homicidal rage the other day, I had intended to discuss a subject that has been interesting me lately, namely how difficult it is to specify in advance rules for social interaction. More specifically, I was pondering how hard it is to lay down rules for dealing with unwanted contact. Cold calling is one form of that; what are traditionally described as “unwanted advances” are another.
The problem is that word “unwanted”. To say, as the organisational psychologist quoted in this article does, that “An unwanted advance is a form of injustice”, strikes me as unfair. We are not telepaths. Quite often the only way one can find out that unwanted contact is unwanted is to ask, that is, to initiate unwanted contact. On the other hand while we may not have telepathy, we most of us do have empathy to help us guess in advance when advances might be unwelcome. Phone sales companies know to the fifth decimal place exactly how likely their calls are to be welcome. They know that the first four of those decimal places are filled by zeros, scumbags that they are. Few men asking a woman out have quite such a large database of prior results upon which to draw. I’m glad I’m not a guy! That last breath before you open your mouth to begin the sentence that might get you rejected cruelly or rejected kindly must be painful.
July 8, 2013
In sp!ked, Brendan O’Neill discusses the unlikely comeback of “fate”:
Fate is making a comeback. The idea that a human being’s fortunes are shaped by forces beyond his control is returning, zombie-like, from the graveyard of bad historical ideas. The notion that a man’s character and destiny are determined for him rather than by him is back in fashion, after 500-odd years of having been criticised and ridiculed by humanist thinkers.
Of course, we’re far too sophisticated these days actually to use the f-word, fate. We don’t talk about a god called Fortuna, as the Romans did, believing that this blind, mysterious creature decided people’s fates with the spin of a wheel. Unlike long-gone Norse communities we don’t believe in goddesses called Norns, who would attend the birth of every child to determine his or her future. No, today we use scientific terms to argue that people’s fortunes are determined by higher powers than their little, insignificant selves.
We use and abuse neuroscience to claim certain people are ‘born this way’. We claim evolutionary psychology explains why people behave and think the way they do. We use phrases like ‘weather of mass destruction’, in place of ‘gods’, to push the idea that mankind is a little thing battered by awesome, destiny-determining forces. Fate has been brought back from the dead and she’s been dolled up in pseudoscientific rags.
[. . .]
It’s hard to overstate what a radical idea this was at the tail end of the Dark Ages. It’s this idea that gives rise to the concept of free will, to the concept of personality even. And it was an idea carried through to the Enlightenment and on to the humanist liberalism of the nineteenth and early twentieth centuries. In the words of the greatest liberal, John Stuart Mill, it is incumbent upon the individual to never ‘let the world, or his portion of it, choose his plan of life for him’.
But today, in our downbeat era that bears a bit of a passing resemblance to the Dark Ages, we’re turning the clock back on this idea. We’re rewinding the historic breakthroughs of the Renaissance and Enlightenment, and we’re breathing life back into the fantasy of fate. Ours is an era jam-packed with deterministic theories, claims that human beings are like amoeba in a Petri dish being prodded and shaped by various forces. But the new determinism isn’t religious or supernatural, as it was in the pre-Enlightened era — it’s scientific determinism, or rather pseudo-scientific determinism.
May 3, 2013
An interesting take from Jonah Goldberg:
Is the American body politic suffering from an autoimmune disease?
The “hygiene hypothesis” is the scientific theory that the rise in asthma and other autoimmune maladies stems from the fact that babies are born into environments that are too clean. Our immune systems need to be properly educated by being exposed early to germs, dirt, whatever. When you consider that for most of human evolutionary history, we were born under shady trees or, if we were lucky, in caves or huts, you can understand how unnatural Lysol-soaked hospitals and microbially baby-proofed homes are. The point is that growing up in a sanitary environment might cause our immune systems to freak out about things that under normal circumstances we’d just shrug off.
Hence, goes the theory, the explosion in asthma rates in the industrialized world, the rise in peanut and wheat allergies and, quite possibly, the spike in autism rates. There’s also a puzzling explosion in autoimmune diseases. That’s where the body attacks healthy organs or tissues as if they were deadly invaders.
Which brings me to my point. If you think of bigotry as a germ or some other infectious disease vector, we live in an amazingly sanitized society. That’s not to say it doesn’t exist, of course. And we can all debate how prevalent it is later.
My point is that the institutions — the organs of the body politic — that are the most obsessed with eradicating bigotry (as liberals define it) tend to be the places that have to worry about it the least. The Democratic party is consumed with institutionalized angst about prejudice, intolerance, and bigotry in America. But the odds are that relatively few of these people (particularly those under the age of 50) have been exposed to much real racism or intolerance.
April 10, 2013
An interesting jaunt along the byways of human perception and social organization:
I think the book that taught me to ask “What if it really was like that?” systematically might have been Julian Jaynes’s The Origin of Consciousness in the Breakdown of the Bicameral Mind. Jaynes observed that Bronze Age literary sources take for granted the routine presence of god-voices in peoples’ heads. Instead of dismissing this as fantasy, he developed a theory that until around 1000 BC it really was like that — humans had a bicameral consciousness in which one chamber or operating subsystem, programmed by culture, manifested to the other as the voice of God or some dominant authority figure (“my ka is the ka of the king”). Jaynes’s ideas were long dismissed as brilliant but speculative and untestable; however, some of his predictions are now being borne out by neuroimaging techniques not available when he was writing.
A recent comment on this blog pointed out that many cultures — including our own until around the time of the Industrial Revolution — constructed many of their customs around the belief that women are nigh-uncontrollably lustful creatures whose sexuality has to be restrained by strict social controls and even the amputation of the clitoris (still routine in large parts of the Islamic world). Of course today our reflex is to dismiss this as pure fantasy with no other function than keeping half the human species in perpetual subjection. But some years ago I found myself asking “What if it really was like that?”
Let’s be explicit about the underlying assumptions here and their consequences. It used to be believed (and still is over much of the planet) that a woman in her fertile period left alone with any remotely presentable man not a close relative would probably (as my commenter put it) be banging him like a barn door in five minutes. Thus, as one consequence, the extremely high value traditionally placed on physical evidence of virginity at time of marriage.
Could it really have been like that? Could it still be like that in the Islamic world and elsewhere today? One reason I think this question demands some attention is that the costs of the customs required to restrain female sexuality under this model are quite high on many levels. At minimum you have to prevent the mixing of the sexes, which is not merely unpleasant for both men and women but requires everybody to invest lots of effort in the system of control (wives and daughters cannot travel or in extreme cases even go outside without male escort, homes have to be built with zenanas). At the extreme you find yourself mutilating the genitalia of your own daughters as they scream under the knife.