June 23, 2017
QotD: Philosophy
I believe the most important moment in the foreseeable future of philosophy will come when we realize that mad old Nazi bastard Heidegger had it right when he said that we are thrown into the world and must cope, and that theory-building consists of rearranging our toolkit for coping. I believe the biggest blind spot in analytical philosophy is its refusal to grapple with Heidegger’s one big insight, but that evolutionary biology coupled with Peirce offers us a way to stop being blind. I believe that when the insights of what is now called “evolutionary psychology” are truly absorbed by philosophers, many of the supposedly intractable problems of philosophy will vanish.
Eric S. Raymond, “What Do You Believe That You Cannot Prove?”, Armed and Dangerous, 2005-01-06.
June 18, 2017
QotD: Punishment, Coercion, and Revenge
Because I’m both a libertarian and famous for conducting a successful propaganda campaign, libertarian activists sometimes come to me for tactical advice. During a recent email exchange, one of these criticized me for wishing (as he thought) to “punish” the Islamist enemies of the U.S. and Western civilization.
I explained that I have no desire to punish the perpetrators of 9/11; what I want is vengeance and death. Vengeance for us, death for them. Whether they experience ‘punishment’ during the process is of little or no interest to me.
My correspondent was reflecting a common confusion about the distinctions among coercion, revenge, and punishment. Coercion is intended to make another do your will instead of their own; vengeance is intended to discharge your own anger and fear. Punishment is neither of these things.
Punishment is a form of respect you pay to someone who is at least potentially a member of the web of trust that defines your ethical community. We punish ordinary criminals to deter them from repeating criminal behavior, because we believe they know what ethical behavior is and that by deterring them from crime we help them re-integrate with an ethical community they have never in any fundamental sense departed.
By contrast, we do not punish the criminally insane. We confine them and sometimes kill them for our own safety, but we do not make them suffer in an effort to deter them from insanity. Just to state the aim is to make obvious how absurd it is. Hannibal Lecter, and his all-too-real prototypes, lack the capacity to respond to punishment by re-integrating with an ethical community.
In fact, criminal psychopaths are not even potentially members of an ethical community to begin with. There is something broken or missing in them that makes participation in the web of trust impossible; perhaps the capacity to emotionally identify with other human beings, perhaps conscience, perhaps something larger and harder to name. They have other behavioral deficits, including poor impulse control, associated with subtle neurological damage. By existing, they demonstrate something most of us would rather not know; which is that there are creatures who — though they speak, and reason, and feign humanity — have nothing but evil in them.
Eric S. Raymond, “Punishment, Coercion, and Revenge”, Armed and Dangerous, 2005-07-05.
June 16, 2017
QotD: Cultural decline markers
In response to my previous post noting that the Flynn effect turns out to be a mirage, at least two respondents have suggested that average IQ has actually been falling, and have pointed to the alleged dumbing-down of politics and popular culture in the last fifty years.
I think both those respondents and the psychometricians are correct. That is, it seems to me that during my lifetime I’ve seen evidence that average IQ has risen a little, but that other traits involved in the “smart or stupid” judgment have eroded.
On the one hand, I’ve previously described the emergence of geek culture, which I take among other things as evidence that there are more bright and imaginative individuals around than there were when I was a kid. Enough of us, now, to claim a substantial slice of turf in the cultural marketplace. This good news is reinforced for me by the explosive growth of the hacker community, which today is easily a hundred times the size it was in, say, 1975 — and far larger than I ever dreamed it would be then.
On the other hand, when I compare Americans today to the country of my childhood there are ways the present comes off rather badly. We are more obese, we have shorter attention spans, our divorce rate has skyrocketed. All these and other indicators tell me that we have (on average) lost a significant part of our capacity to exert self-discipline, defer gratification, and honor contracts when the going gets tough.
To sum up, we’re brighter than we used to be, but lazier. We have more capacity, but we use less of it. Physically and mentally we are self-indulgent, flabby, unwilling to wake up from the consumer-culture dream of entitlement. We pursue happiness by means ever more elaborate and frenetic, diminishing returns long since having set in. When reality hands us a wake-up call like 9/11, too many of us react with denial and fantasy.
This is, of course, not a new complaint. Juvenal, Horace, and Petronius Arbiter wrote much the same indictment of their popular culture at the height of the Roman Empire. They were smart enough to understand, nigh on two millennia ago, that this is what happens to elites who have it easy, who aren’t tested and winnowed by war and famine and plague and poverty.
But there are important differences. One is that while decadence used to be an exclusive problem of the upper crust, we are all aristocrats now. More importantly, where the Romans believed that decadence in individuals and societies was inevitable, we know (because we’ve kept records) that as individuals we are taller, stronger, healthier, longer-lived and more intelligent than our ancestors — that, in fact, we have reaped large gains merely within the last century.
We have more capacity, but we use less of it. And, really, is it any surprise? Our schools are abandoning truth for left-wing bullshit about multiculturalism and right-wing bullshit about “intelligent design”. Our politics has become a wasteland of rhetorical assassinations in which nobody but the fringe crazies believe even their own slogans any more. Our cultural environment has become inward-turned, obsessed with petty intramural squabbles, clogged with cant. Juvenal would find it all quite familiar.
Eric S. Raymond, “People Getting Brighter, Culture Getting Dimmer”, Armed and Dangerous, 2005-08-28.
June 10, 2017
QotD: Quoting and mis-quoting Orwell
The interpretation of George Orwell could be a paradigm for how dead literary figures get knocked from pillar to post by the winds of political interpretation. During his lifetime, the author of 1984 and Animal Farm went from darling of the left to exile for having been willing to write the truth about Communist totalitarianism in allegories too pointed to ignore.
With the end of the Cold War, forty-two years after Orwell’s death, the poisonous fog breathed on Western intellectual life by Soviet agents of influence slowly began to lift. It became possible to say that Communist totalitarianism was evil and had always been evil, without being dismissed as a McCarthyite or reactionary not merely by those agents but by a lot of “no enemy to the left” liberal patsies who should have known better. In this climate, Orwell’s uncompromising truth-telling shone even more brightly than before. For some on the left, belated shame at their own complicity with evil transmuted itself into more adulation for Orwell, and more attempted identification with Orwell’s positions, than at any time in the previous fifty years.
Then came 9/11. Orwell’s sturdy common sense about the war against the fascisms of his day made him a model for a few thinkers of the left who realized they had arrived at another of Marx’s “world-historical moments”, another pivot point at which everything changed. Foremost among these was Christopher Hitchens, who would use Orwell to good effect in taking an eloquent and forceful line in favor of the liberation of Afghanistan and Iraq. For this, he was rewarded with the same vituperation and shunning by the Left that had greeted the publication of Orwell’s anti-totalitarian allegories fifty years before.
Eric S. Raymond, “Getting Orwell Wrong”, Armed and Dangerous, 2005-08-29.
June 3, 2017
QotD: Gay, Lesbian, and Bisexual
Fascinating. This NYT article bears out a suspicion I’ve held for a long time about the plasticity of sexual orientation. The crude one-sentence summary is that, if you go by physiological arousal reactions, male bisexuality doesn’t exist, while female bisexuality is ubiquitous.
I’ve spent most of my social time for the last thirty years around science fiction fans, neopagans, and polyamorists — three overlapping groups of people not exactly noted for either sexual inhibitions or reluctance to explore sexual roles that don’t fit the neat typologies of the mainstream culture. And there are a couple of things it’s hard not to notice about them:
First, a huge majority of the women in these cultures are bisexual. To the point where I just assume any female I meet in these contexts is bi. This reality is only slightly obscured by the fact that many of these women describe themselves and are socially viewed by others as ‘straight’, even as they engage in sexual play with each other during group scenes with every evidence of enjoyment. In fact, in these cultures the operational definition of ‘straight female’ seems to be one who has recreational but not relational/romantic sex with other women.
Second, this pattern is absolutely not mirrored in their male peers. Even in these uninhibited subcultures, homoerotic behavior involving self-described ‘straight’ men is rare and surprising. Such homoeroticism as does go on is almost all self-describedly gay men fucking other self-describedly gay men; bisexuality in men, while an accepted and un-tabooed orientation, is actually less common than gayness and not considered quite normal by anybody. The contrast with everybody’s matter-of-fact acceptance of female bisexual behavior is extreme.
It is also an observable fact that many women in these cultures change either their sexual orientation or their sexual presentation over time, but that this is seldom true of men. That is, a woman may move from being sexually involved mostly with other women to being mostly involved with men, and back, several times during her adolescent and adult lifetime; nobody considers this surprising and it doesn’t involve much of a change in either self-image or social identity. Not so for men in these cultures; they tend to start out as straight or gay and stay that way, and on the unusual occasions that this changes it tends to involve a significant break in both self-image and social identity.
Eric S. Raymond, “Gayness is hard, lesbianism soft”, Armed and Dangerous, 2005-07-06.
June 1, 2017
QotD: Economics
Science may be the noblest endeavor of the human mind, but I believe (though I cannot prove) that the most crippling and dangerous kind of ignorance in the modern West is ignorance of economics, the way markets work, and the ways non-market allocation mechanisms are doomed to fail. Such economic ignorance is toxic, because it leads to insane politics and the empowerment of those whose rhetoric is altruist but whose true agenda is coercive control.
Eric S. Raymond, “What Do You Believe That You Cannot Prove?”, Armed and Dangerous, 2005-01-06.
May 29, 2017
QotD: Western intellectuals’ anti-Western bias
Much of the West’s intelligentsia is persistently in love with anything anti-Western (and especially anti-American), an infatuation that has given a great deal of aid and comfort to tyrants and terrorists in the post-9/11 world. Besides these obvious political consequences, the phenomenon Julien Benda famously called la trahison des clercs has laid waste to large swathes of the soft sciences through ideologies like deconstructionism, cultural relativism, and postmodernism.
I believe, but cannot prove, that la trahison des clercs is not a natural development of Western thought but a creation of deliberate propaganda, directly traceable to the successes of Nazi and Stalinist attempts to manipulate the climate of opinion in the early and mid-20th century. Consequently I believe that one of the most difficult and necessary tasks before us in the next half century will be to banish the influence of totalitarian nihilism from science in particular and our culture in general.
Eric S. Raymond, “What Do You Believe That You Cannot Prove?”, Armed and Dangerous, 2005-01-06.
May 24, 2017
ESR presents Open Adventure
Eric S. Raymond recently was entrusted with the original code for ADVENT, and he’s put it up on gitlab for anyone to access:
Colossal Cave Adventure was the origin of many things: the text adventure game, the dungeon-crawling D&D (computer) game, the MOO, the roguelike genre. Computer gaming as we know it would not exist without ADVENT (as it was known in its original PDP-10 incarnation).
Long ago, you might have played this game. Or maybe you’ve just heard stories about it, or vaguely know that “xyzzy” is a magic word, or have heard people say “You are in a maze of twisty little passages, all alike”.
Though there’s a C port of the original 1977 game in the BSD games package, and the original FORTRAN sources could be found if you knew where to dig, Crowther & Woods’s final version – Adventure 2.5 from 1995 – has never been packaged for modern systems and distributed under an open-source license.
Until now, that is.
With the approval of its authors, I bring you Open Adventure. And with it some thoughts about what it means to be respectful of an important historical artifact when it happens to be software.
This is code that fully deserves to be in any museum of the great artifacts of hacker history. But there’s a very basic question about an artifact like this: should a museum preserve it in a static form as close to the original as possible, or is it more in the right spirit to encourage the folk process to continue improving the code?
Modern version control makes this question easier; you can have it both ways, keeping a pristine archival version in the history and improving it. Anyway, I think the answer to the general question is clear; if heritage code like this is relevant at all, it’s as a living and functional artifact. We respect our history and the hackers of the past best by carrying on their work and their playfulness.
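The "have it both ways" approach above can be sketched with git. This is a minimal illustration, not the actual Open Adventure repository layout: the file names and tag name are hypothetical stand-ins. The idea is simply to commit the pristine sources first, tag that commit, and then modernize freely on top of it, so the untouched artifact remains permanently recoverable.

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .

# Stand-in for the unmodified 1995 sources (hypothetical file name).
echo "ADVENTURE 2.5 ORIGINAL SOURCE" > advent.f
git add advent.f
git -c user.email=a@b -c user.name=archivist commit -q -m "Pristine import of Adventure 2.5"
git tag original-2.5            # the archival snapshot, frozen forever

# The folk process continues on the tip of the branch...
echo "modern C port" > advent.c
git add advent.c
git -c user.email=a@b -c user.name=archivist commit -q -m "Begin modernization"

# ...while anyone can still retrieve the untouched original at any time:
git show original-2.5:advent.f
```

A lightweight tag is enough here; the point is that the museum copy lives in the history itself, so preservation and continued improvement never conflict.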
May 20, 2017
QotD: Speaking (actual) truth to (actual) power
Nobody should want journalists ever to fear attacking the behavior of the U.S. military when they have actual evidence that it is wrong. Militaries are dangerous and terrible things, and a free press is a vital means of keeping them in check. It is right and proper that we make heroes of those who speak damning truths to power.
But it makes all the difference in the world when a journalist does not have actual evidence of wrongdoing. Especially when the journalist is a U.S. citizen and the claim gives aid and comfort to the declared enemies of the U.S. in wartime. Under those circumstances, such an attack is not heroic but traitorous.
I hope this is a teachable moment. Oliver Wendell Holmes observed that shouting “fire” in a crowded theater is not protected speech; if the speaker has no evidence of actual fire, the consequences to that speaker should be as dire as the risk of death by trampling he created for others. The Holmes test should be applied in politics as well.
[…]
After Vietnam and Watergate, a lot of journalists (and other people) lost the distinction between speaking truth to power and simply attacking whoever is in charge (especially any Republican in charge) on any grounds, no matter how factually baseless. Mere oppositionalism was increasingly confused with heroism even as the cultural climate made it ever less risky. Eventually we arrived at the ludicrous spectacle of multimillionaire media personalities posing as persecuted victims and wailing about the supposed crushing of dissent on national news and talk shows.
Eric S. Raymond, “Lies and Consequences”, Armed and Dangerous, 2005-02-12.
May 11, 2017
The transactional nature of “identity”
Eric S. Raymond on the rising chatter about “identity”:
These criticisms imply a theory of “identity” that is actually coherent and useful. Here it is:
Your “identity” is a set of predictive claims you assert about yourself, mostly (though not entirely) about what kinds of transactions other people can expect to engage in with you.
As an example of an exception to “mostly”, the claim “I am white” implies that I sunburn easily. But usually, an “identity” claim implies the ability and willingness to meet behavioral expectations held by other people. For example, if I describe my “identity” as “male, American, computer programmer, libertarian” I am in effect making an offer that others can expect me to need to shave daily, salute the Stars and Stripes, sling code, and argue for the Non-Aggression Principle as an ethical fundamental.
Thus, identity claims can be false (not cashed out in observed behavior) or fraudulent (intended to deceive). You don’t get to choose your identity; you get to make an offer and it’s up to others whether or not to accept.
[…]
I can anticipate several objections to this transactional account of identity. One is that it is cruel and illiberal to reject an offer of “I claim identity X” if the person claiming feels that identity strongly enough. This is essentially the position of those journalists from The Hill.
To which I can only reply: you can feel an identity as a programmer as strongly as you want, but if you can’t already sling code and aren’t visibly working hard on repairing that deficiency, you simply don’t make the nut. Cruelty doesn’t enter into this; if I assent to your claim I assist your self-deceit, and if I repeat it I assist you in misleading or defrauding others.
It is pretty easy to see how this same analysis applies to “misgendering” people with the “wrong” pronouns. People who use the term “misgender” generally follow up with claims about the subject’s autonomy and feelings. Which is well enough, but such considerations do not justify being complicit in the deceit of others any more than they do with respect to “I am a programmer”.
A related objection is that I have stolen the concept of “identity” by transactionalizing it. That is, true “identity” is necessarily grounded not in public performance but private feelings – you are what you feel, and it’s somehow the responsibility of the rest of the world to keep up.
But…if I’m a delusional psychotic who feels I’m Napoleon, is it the world’s responsibility to keep up? If I, an overweight clumsy shortish white guy, feel that I’m a tall agile black guy under the skin, are you obligated to choose me to play basketball? Or, instead, are you justified in predicting that I can’t jump?
You can’t base “identity” on a person’s private self-beliefs and expect sane behavior to emerge any more than you can invite everyone to speak private languages and expect communication to happen.
May 8, 2017
QotD: The hidden power of language
I believe, but don’t know how to prove, a much stronger version of the Sapir-Whorf hypothesis than is currently fashionable. That is, I believe the way humans think is shaped in important ways by the linguistic categories they have available; thinking outside those categories is possible but more difficult, has higher friction costs. Accordingly, I believe that some derivation of Alfred Korzybski’s discipline of General Semantics will eventually emerge as an essential tool of the first mature human civilizations.
Eric S. Raymond, “What Do You Believe That You Cannot Prove?”, Armed and Dangerous, 2005-01-06.
May 7, 2017
QotD: The privilege of colourblindness
In Slate magazine, SF author Ursula LeGuin complains that the producers of the new Earthsea miniseries have butchered her work. One form of butchery she zeroes in on is the casting of characters she intended to be red, brown, or black as white people.
I have mixed feelings. LeGuin has every right to be POed at how her intentions were ignored, but on the other hand my opinion of her has not been improved by learning that she intended the books as yet another wearisomely PC exercise in multiculturalism/multiracialism.
I liked those books when I read them as a teenager. I didn’t notice any character’s skin color. I would really prefer not to have had my experience of those characters retrospectively messed with by LeGuin’s insistence that the race thing is important.
Note: I am not claiming that all casting should be colorblind. I remember once watching an otherwise excellent Kenneth Branagh production of Much Ado About Nothing that was somewhat marred for me by Branagh’s insistence on casting an American black man as a Renaissance Italian lord. This was wrong in exactly the same way that casting a blue-eyed blond as Chaka Zulu or Genghis Khan would be — it’s so anti-historical that it interferes with the suspension of disbelief. Fantasy like LeGuin’s, however, doesn’t have this kind of constraint. Ged and Tenar don’t become either more or less plausible if their skin color changes.
But what really annoyed me was LeGuin’s claim that only whites have the “privilege” of being colorblind. This is wrong and tendentious in several different ways. Colorblindness is not a privilege of anyone, it’s a duty of everyone — to judge people not by the color of their skin but the content of their character, and to make race a non-issue by whatever act of will it takes. (It doesn’t take any effort at all for me.)
Eric S. Raymond, “The Racist of Earthsea”, Armed and Dangerous, 2004-12-16.
April 9, 2017
QotD: Re-assessing the pulp era’s racism
The skepticism I’m now developing about ascriptions of racism in pulp fiction really began, I think, when I learned that it had become fashionable to denigrate Rudyard Kipling’s Kim and other India stories as racist. This is clearly sloppy thinking at work. Kim was deeply respectful of its non-European characters, especially the Pathan swashbuckler Mahbub Ali and Teshoo Lama. Indeed, the wisdom and compassion of Kipling’s lama impressed me so greatly as a child that I think it founded my lifelong interest in and sympathy with Buddhism.
But I didn’t begin thinking really critically about race in pulp fiction until I read Tarzan and the Castaways a few years ago and noticed something curious about the way Burroughs and his characters used the adjective “white” (applied to people). That is: while it appeared on the surface to be a racial distinction, it was actually a culturist one. In Burroughs’s terms of reference (at least as of 1939), “white” is actually code for “civilized”; the distinction between “civilized” and “savage” is actually more important than white/nonwhite, and non-Europeans can become constructively “white” by exhibiting civilized virtues.
Realizing this caused me to review my assumptions about racial attitudes in Burroughs’s time. I found myself asking whether the use of “white” as code for “civilized” was prejudice or pragmatism. Because there was this about Burroughs’s European characters: (1) in their normal environments, the correlation between “civilized” and “white” would have been pretty strong, and (2) none of them seemed to have any trouble treating nonwhite but civilized characters with respect. In fact, in Burroughs’s fiction, fair dealing with characters who are black, brown, green, red, or gorilla-furred is the most consistent virtue of the white gentleman.
I concluded that, given the information available to a typical European in 1939, it might very well be that using “white” as code for “civilized” was pragmatically reasonable, and that the reflex we have today of ascribing all racially-correlated labels to genuinely racist beliefs is unfair to Burroughs and his characters!
Eric S. Raymond, “Reading racism into pulp fiction”, Armed and Dangerous, 2010-01-18.
February 17, 2017
QotD: The rise of the geekgirls
When I was a teenager in the 1970s, there was not yet anything you could call “geek culture”. Sure, there were bright kids fascinated by computers or math or science, kids who were often “poorly socialized” in the jargon of the day and hung together as a defensive measure; I was one of them. But we didn’t see ourselves as having a social identity or affiliation the way the jocks or surfers or hippies did. We weren’t a subculture, nor even a community; we didn’t even have a label for ourselves.
Slowly, slowly that began to change. One key event was the eruption of science fiction into pop culture that began with the first Star Wars movie in 1977. This was our stuff and we knew it, even though most of us never joined the subculture of SF fandom proper. Personal computers made another big difference after 1980; suddenly, technology was cool and sexy in a way it hadn’t been for decades, and people who were into it started to get respect rather than (or in addition to) faint or not-so-faint scorn.
You could see the trend in movies. WarGames in 1983; Revenge of the Nerds in 1984; Real Genius in 1985. To kids today Revenge of the Nerds doesn’t seem remarkable, because geek culture is more secure and confident today than a lot of older tribes like bikers or hippies. But at the time, the idea that you could have an entire fraternity of geeks — an autonomous social group with reason to be proud of itself and a recognized place in the social ecology — was funny; all by itself it was a comedy premise.
The heroes of Revenge of the Nerds were people who created a fraternity of their own, who bootstrapped a niche for themselves in Grant McCracken’s culture of plenitude. The movie was an extended joke, but it described and perhaps helped create a real phenomenon.
The term ‘geek’ didn’t emerge as a common label, displacing the older and much more sporadically-used ‘nerd’, until around the time of the Internet explosion of 1993-1994. I noticed this development because I didn’t like it; I still prefer to tell people I hang out with hackers (all hackers are geeks, but not all geeks are hackers). Another index of the success of the emerging geek culture is that around that time it stopped being an almost exclusively male phenomenon.
Yes, you catch my implication. When I was growing up we didn’t have geekgirls. Even if the label ‘geek’ had been in use at the time, the idea that women could be so into computers or games or math that they would identify with and hang out with geek guys would have struck us as sheerest fantasy. Even the small minority of geek guys who were good with women (and thus had much less reason to consider them an alien species) would have found the implications of the term ‘geekgirl’ unbelievable before 1995 or so.
(There are people who cannot read an account like the above without assuming that the author is simply projecting his own social and sexual isolation onto others. For the benefit of those people, I will report here that I had good relations with women long before this was anything but rare in my peer group. This only made the isolation of my peers easier to notice.)
What changed? Several things. One is that geek guys are, on the whole, better adjusted and healthier and more presentable today than they were when I was a teenager. Kids today have trouble believing the amount of negative social pressure on intelligent people to pass as normal and boring that was typical before 1980, the situation Revenge of the Nerds satirized and inverted. It meant that the nascent geek culture of the time attracted only the most extreme geniuses and misfits — freaks, borderline autists, obsessives, and other people in reaction against the mainstream. Women generally looked at this and went “ugh!”
But over time, geeky interests became more respectable, even high-status (thanks at least in part to the public spectacle of übergeeks making millions). The whole notion of opposition to the mainstream started to seem dated as ‘mainstream’ culture gradually effloresced into dozens of tribes freakier than geeks (two words: “body piercings”). Thus we started to attract people who were more normal, in psychology if not in talent. Women noticed this. I believe it was in 1992, at a transhumanist party in California, that I first heard a woman matter-of-factly describe the Internet hacker culture as “a source of good boyfriends”. A few years after that we started to get a noticeable intake of women who wanted to become geeks themselves, as opposed to just sleeping with or living with geeks.
The loner/obsessive/perfectionist tendencies of your archetypal geek are rare in women, who are culturally encouraged (and perhaps instinct-wired) to value social support and conformity more. Thus, women entering the geek subculture was a strong sign that it had joined the set of social identities that people think of as ‘normal’. This is still a very recent development; I can’t recall the term ‘geekgirl’ being used at all before about 1998, and I don’t think it became commonly self-applied until 2000 or so.
Eric S. Raymond, “The Revenge of the Nerds is Living Well”, Armed and Dangerous, 2004-12-20.
February 14, 2017
QotD: Explaining why men tend to be slobs, but women very much don’t
The central fact that controls the preferences of both sexes is that bearing children is difficult and dangerous for women, but fertilizing a woman is almost trivially easy for a man. Furthermore, the female investment in childbearing is front-loaded (proportionately more of the risk is before and at birth) while the male investment is back-loaded (proportionately more of the risks and costs are incurred after birth).
Moderns living in a largely disease-free environment seldom realize how cruel and pressing these differences were over most of our species’ history. But before modern sanitation, death in childbirth was so common that men wealthy enough to afford it expected to have several wives during their lifetimes, losing many of them to childbed fever and other complications.
Also relevant is the extremely high rate of childhood death from infectious diseases and parasites that was characteristic of premodern societies. Disease resistance in humans is highly variable and generally increases with genetic mixing (the same reason a mongrel puppy or kitten is less likely to catch a disease than a purebred). Thus, both men and women have instincts intended to maximize genetic variety in their offspring in order to maximize the chances that some will survive to reproductive age.
Our instincts evolved to cope with these patterns of life and death. The next piece we need to understand those instincts is what physical beauty means. Recent anthropology revealing strong cross-cultural patterns in the perception of pulchritude is helpful here.
In both sexes, the most important beauty indicators include symmetrical features and a good complexion (clear skin without blemishes, warts, etc.). It turns out these are indicators of resistance to infection and parasites, especially resistance in childhood and during adolescent growth. Good hair is also a health indicator.
In men, physical signs of strength, dexterity, and agility are also favored; this reflects the value female instinctive wiring puts on male specializations in burst exertion, hunting, and warfare. In women, signs of fertility and fitness to bear are favored (healthy and generous breasts, a certain range of hip-to-waist ratios).
Men fixate on physical beauty and youth because under primitive conditions it is a leading indicator of the ability to bear and suckle children. Through most of history, plain or ugly women were bad risks for the next round of infectious diseases — and their children, carrying their genes, were too.
The last piece of the puzzle is that men and women have asymmetrical information about the parentage of their children. A woman is seldom in doubt about which children are the issue of her womb; a man, by contrast, can never be as sure which are the fruit of his seed. Thus, genetic selfishness motivates the woman in a mated pair to sacrifice more for her children than it does the man. This is why women abandon their children far less often than men do.
While women do respond to male good looks, it’s not the agenda-topper for them that it is for men. To understand why this is, it helps to know that the optimal mating strategy for a woman begins with hooking a good provider, a man who will stick around to support the kids in spite of not being as sure that he’s their father as the woman is of being their mother. Where men look for fitness to bear children, women seek the capability and willingness to raise them.
Thus, robust health and infection resistance, while desirable in a potential husband, are not the be-all and end-all. Behavior traits indicating attachment, loyalty, nurturance, and kindness are more important than a tight six-pack. Men instinctively worry about these things less because they know women are more certain of parentage and thus more tightly bonded to their children. Fitness-to-raise also means that indicators of success and social status count for more in men. Men marry health and beauty, women marry security and good prospects.
There is, however, one important exception — one circumstance under which women are just as physical, beauty-oriented, and “shallow” in their mating preferences as men. That’s when they’re cheating.
Both sexes have a genetic-diversity incentive to screw around, but it manifests in different ways. Again, the reason is parentage uncertainty. For a man, diversity tactics are simple — boff as many hot babes as possible, accepting that you don’t know which of their kids are yours and counting on stronger maternal bonding to ensure they will have at least one devoted parent around. Because a woman can be more sure of who her offspring are, her most effective diversity tactic is different — get married to a good provider and then cheat on him.
Under those circumstances, she doesn’t have to value good character in a mating partner as much; hubby, who can’t tell the kids aren’t his, will supply that. Thus the relative value of handsomeness goes up when a woman is taking a lover on the sly. Marrying the lord and screwing the gardener is an old game, and from a genetic-selfishness point of view a very effective one.
Eric S. Raymond, “A Unified Theory of Male Slobbishness and Female Preening”, Armed and Dangerous, 2005-01-06.