In any species that lives lives other than the solitary, brutish, and short variety, members cooperate. Cooperation is often a utility maximizing approach for basic economic reasons: if I’m well fed because I had a good hunting day, and you’re hungry because you had a bad day, a marginal calorie is worth much less to me than it is to you, so I should share some of my catch with you. This is true for two reasons: first, because if we’re kin, your future reproductive success redounds to the benefit of (some of) my genes, and second, because you might return the favor a day or a year later.
Nature, however, is better at generating frenemies than friends. A better way for me to reproduce my genes is to use a mixed strategy: helping you when it’s easy, defecting when I think I can get away with it, etc. I should ideally take food from you when offered, yet give back as little as I can get away with. I should be seen to be a good ally, and fair, and yet stab you in the back when I can get away with it.
In social species, there’s advanced technology to accomplish these goals: I can marshal alliances, vote people off the island, harass males away from fertile females, seize more than my share of the food for myself and my offspring.
It doesn’t matter if it’s nice; it matters if it’s effective. Gnon has no pity and laughs at your human ideals…especially because he created your human ideals to help you be a convincing liar in social games.
And thus deception slithered its way into the garden of Eden and/or earthly delights.
What is the takeaway here? It is this: evolution has crafted every one of us for one mission: to pass our genes on to the next generation. The fact that you, or you, or you, have chosen not to have kids does not refute this; in fact, it supports this. Your genes will not be present in the next generation, and Gnon will laugh.
And what effects does this mission have on us? High libidos? Well, yes, some of that — but so much more. We’re the ape with the runaway brains. Any ape that just had a high libido is long removed from the gene pool. Only the apes that also are excellent at joining alliances, marshaling allies, sniffing when the winds are changing, and defecting strategically reproduced with enough success to have contributed meaningfully to our genome.
A million years ago this alliance-making skill meant being on the right side of the alpha ape…and perhaps sneakily supporting the up-and-coming number two male.
Ten thousand years ago it meant being a member of a hunter gatherer tribe, and making status-degrading jokes about the one guy who was acting a bit big for his (deer hide) britches.
A thousand years ago, it meant … well, by a thousand years ago, social alliances for status games were starting to look pretty damned modern. It meant cobbling together wacky alliances from diverse groups like Diggers, Levelers, and Fifth Monarchists in order to overthrow one set of rulers and establish yourself in their place. Once in power there are all sorts of food-and-sex optimizing strategies for those good at the alliance game… like capturing the foot soldiers of the old regime and selling them into slavery overseas, seizing their land, and more.
Clark, “Gamer Gate: Three Stages to Obit”, Popehat, 2014-10-21.
October 27, 2014
October 23, 2014
If you’re unfamiliar, the Dunning-Kruger Effect is the name of a cognitive bias where people consistently rate themselves as being higher skilled than others, even (especially?) when they are decidedly not. In other words, people are nowhere near as good as they think they are.
Diametrically opposed to that is Impostor Syndrome, where people refuse to acknowledge their accomplishments and competencies.
If you’re aware of both of them, you might constantly vacillate between them, occasionally thinking you’re awesome, then realizing that it probably means you aren’t, going back and forth like a church bell. I know nothing of this, I assure you. But the point is that I think they’re almost certainly related to the people that we surround ourselves with.
Matt Simmons, “The Impostor Effect vs Dunning-Kruger”, Standalone Sysadmin, 2013-02-27.
October 19, 2014
We are creatures of the sun, we men and women. We love light and life. That is why we crowd into the towns and cities, and the country grows more and more deserted every year. In the sunlight — in the daytime, when Nature is alive and busy all around us, we like the open hill-sides and the deep woods well enough: but in the night, when our Mother Earth has gone to sleep, and left us waking, oh! the world seems so lonesome, and we get frightened, like children in a silent house. Then we sit and sob, and long for the gas-lit streets, and the sound of human voices, and the answering throb of human life. We feel so helpless and so little in the great stillness, when the dark trees rustle in the night-wind. There are so many ghosts about, and their silent sighs make us feel so sad. Let us gather together in the great cities, and light huge bonfires of a million gas-jets, and shout and sing together, and feel brave.
Jerome K. Jerome, Three Men in a Boat (to say nothing of the dog), 1889.
October 1, 2014
In Time, Camille Paglia says that universities are unable to understand the real risks to young women on campus:
The gender ideology dominating academe denies that sex differences are rooted in biology and sees them instead as malleable fictions that can be revised at will. The assumption is that complaints and protests, enforced by sympathetic campus bureaucrats and government regulators, can and will fundamentally alter all men.
But extreme sex crimes like rape-murder emanate from a primitive level that even practical psychology no longer has a language for. Psychopathology, as in Richard von Krafft-Ebing’s grisly Psychopathia Sexualis (1886), was a central field in early psychoanalysis. But today’s therapy has morphed into happy talk, attitude adjustments, and pharmaceutical shortcuts.
There is a ritualistic symbolism at work in sex crime that most women do not grasp and therefore cannot arm themselves against. It is well-established that the visual faculties play a bigger role in male sexuality, which accounts for the greater male interest in pornography. The sexual stalker, who is often an alienated loser consumed with his own failures, is motivated by an atavistic hunting reflex. He is called a predator precisely because he turns his victims into prey.
Sex crime springs from fantasy, hallucination, delusion, and obsession. A random young woman becomes the scapegoat for a regressive rage against female sexual power: “You made me do this.” Academic clichés about the “commodification” of women under capitalism make little sense here: It is women’s superior biological status as magical life-creator that is profaned and annihilated by the barbarism of sex crime.
Many primitive societies believe that maleficent spirits cause all sorts of human misfortune that in the modern West we have learned to attribute to natural causes — cattle dying, crops failing, disease, drought, that sort of thing. A few societies have developed a more peculiar form of supernaturalism, in which evil spirits recede into the background and all misfortune is caused by the action of maleficent human sorcerers who must be found and rooted out to end the harm.
A society like that may be a grim, paranoid place with everyone constantly on the hunt for sorcerers — but a sorcerer can be punished or killed more easily than a spirit or a blind force of nature. Therein lies the perverse appeal of this sort of belief system, what I’ll call “sorcerism” — you may not be able to stop your cattle from dying, but at least you can find the bastard who did it and hurt him until you feel better. Maybe you can even prevent the next cattle-death. You are not powerless.
English needs, I think, a word for “beliefs which are motivated by the terror of being powerless against large threats”. I think I tripped over this in an odd place today, and it makes me wonder if our society may be talking itself into a belief system not essentially different from sorcerism.
Eric S. Raymond, “Heavy weather and bad juju”, Armed and Dangerous, 2011-02-03.
September 30, 2014
I have been living with someone from the Millennial generation for the last four years (he’s now 27) and sometimes I’m charmed and sometimes I’m exasperated by how he and his friends — as well as the Millennials I’ve met and interacted with both in person and in social media — deal with the world, and I’ve tweeted about my amusement and frustration under the banner “Generation Wuss” for a few years now. My huge generalities touch on their over-sensitivity, their insistence that they are right despite the overwhelming proof that suggests they are not, their lack of placing things within context, the overreacting, the passive-aggressive positivity, and, of course, all of this exacerbated by the meds they’ve been fed since childhood by over-protective “helicopter” parents mapping their every move. These are late-end Baby Boomers and Generation X parents who were now rebelling against their own rebelliousness because of the love they felt that they never got from their selfish narcissistic Boomer parents and who end up smothering their kids, inducing a kind of inadequate preparation in how to deal with the hardships of life and the real way the world works: people won’t like you, that person may not love you back, kids are really cruel, work sucks, it’s hard to be good at something, life is made up of failure and disappointment, you’re not talented, people suffer, people grow old, people die. And Generation Wuss responds by collapsing into sentimentality and creating victim narratives rather than acknowledging the realities of the world and grappling with them and processing them and then moving on, better prepared to navigate an often hostile or indifferent world that doesn’t care if you exist.
Bret Easton Ellis, “Generation Wuss”, Vanity Fair, 2014-09-26.
September 27, 2014
In the present instance, going back to the liver-pill circular, I had the symptoms, beyond all mistake, the chief among them being “a general disinclination to work of any kind.”
What I suffer in that way no tongue can tell. From my earliest infancy I have been a martyr to it. As a boy, the disease hardly ever left me for a day. They did not know, then, that it was my liver. Medical science was in a far less advanced state than now, and they used to put it down to laziness.
“Why, you skulking little devil, you,” they would say, “get up and do something for your living, can’t you?” — not knowing, of course, that I was ill.
And they didn’t give me pills; they gave me clumps on the side of the head. And, strange as it may appear, those clumps on the head often cured me — for the time being. I have known one clump on the head have more effect upon my liver, and make me feel more anxious to go straight away then and there, and do what was wanted to be done, without further loss of time, than a whole box of pills does now.
You know, it often is so — those simple, old-fashioned remedies are sometimes more efficacious than all the dispensary stuff.
Jerome K. Jerome, Three Men in a Boat (to say nothing of the dog), 1889.
August 25, 2014
There is a price to be paid for divorcing actions and concepts from the words that describe them. Government, and the law that undergirds it, is made up of words. Devalue the words, strip them of meaning, and you do the same thing to the concepts those words describe. Action follows Thought, and for Thought to exist there must be the Word.
This was George Orwell’s central insight when he invented Newspeak for his novel 1984. Language doesn’t just describe what we think about, and allow us to communicate with each other; in a major way, it actually determines what we think about, and how we think. We conceptualize the way we do, even in the abstract, using constructs of language — even mathematics and computer code are kinds of language. Orwell understood that the Word could actually be turned into a weapon, an invisible knife to cut away a man’s ability to think (and thus, to act). All you have to do is convince a man that the Word he’s hearing means something other than what he thought it meant … or can mean anything, really. Or nothing at all. Science, history, literature, even music — they evaporate like a puddle in the hot sun because the Words used to build them stop conveying meaning.
Words have meaning. They must have meaning, for if we are to communicate at all we must transmit meaning from one person to another. This is perhaps the most unforgivable part of the postmodernist assault on the language itself: it has weakened our ability to even describe the loss of meaning.
Monty, “In the beginning was the Word”, Ace of Spades HQ, 2014-01-27.
August 23, 2014
The second known fact is that people prefer the side of the room to the middle. This is obvious from the way a restaurant fills up. The tables along the left wall are occupied first, then those at the far end, then those along the right wall, and finally (and with reluctance) those in the middle. Such is the human revulsion to the central space that managements often despair of filling it and so create what is termed a dance floor. It will be realized that this behavior pattern could be upset by some extraneous factor, like a view of the waterfall from the end windows. If we exclude cathedrals and glaciers, the restaurant will fill up on the lines indicated, from left to right. Reluctance to occupy the central space derives from prehistoric instincts. The caveman who entered someone else’s cave was doubtful of his reception and wanted to be able to have his back to the wall and yet with some room to maneuver. In the center of the cave he felt too vulnerable. He therefore sidled round the walls of the cave, grunting and fingering his club. Modern man is seen to do much the same thing, muttering to himself and fingering his club tie. The basic trend of movement at a cocktail party is the same as in a restaurant. The tendency is toward the sides of the space, but not actually reaching the wall.
C. Northcote Parkinson, “Personality Screen, Or The Cocktail Formula”, Parkinson’s Law (and other studies in administration), 1957.
August 21, 2014
Media carries with it a credibility that is totally undeserved. You have all experienced this, in what I call the Murray Gell-Mann Amnesia effect. (I refer to it by this name because I once discussed it with Murray Gell-Mann, and by dropping a famous name I imply greater importance to myself, and to the effect, than it would otherwise have.)
Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray’s case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward — reversing cause and effect. I call these the “wet streets cause rain” stories. Paper’s full of them.
In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.
That is the Gell-Mann Amnesia effect. I’d point out it does not operate in other arenas of life. In ordinary life, if somebody consistently exaggerates or lies to you, you soon discount everything they say. In court, there is the legal doctrine of falsus in uno, falsus in omnibus, which means untruthful in one part, untruthful in all. But when it comes to the media, we believe against evidence that it is probably worth our time to read other parts of the paper. When, in fact, it almost certainly isn’t. The only possible explanation for our behavior is amnesia.
Michael Crichton, quoted in “The Gell-Mann Amnesia Effect”, Stephen Bodio’s Querencia, 2012-02-21.
August 13, 2014
Jack Shafer on the cyclical nature of the news and an explanation for certain story types growing into mythic form:
Has some wise guy flipped a switch and thrown the news into summer reruns?
Everywhere you look in your news feed is a story you’ve seen before. In northern Iraq, conquering jihadists have the Kurds calling on the United States for more help. North Korea is again stating its desire to nuke the White House. A virulent contagion abroad has Americans worrying when it will break out on our shores. And, in a rerun of a rerun, a Gaza war of tunnels, rockets, invasions, ceasefires, withdrawals, broken ceasefires, and shuttle diplomacy is claiming a record harvest of headlines.
But the periodicity of the news has another cause, as press scholar Jack Lule discovered more than a decade ago in his book Daily News, Eternal Stories. Lule proposed that the news was less a pure journalistic creation than it was the modern expression of ancient myths.
Like many all-encompassing formulas, Lule’s reduction of news into myth suffers by attempting to explain too much. But after reading his book, you can’t help but notice how many front-page stories collapse into the seven master myths he assembles (which will sound familiar to anybody who has brushed up against Joseph Campbell’s The Hero With a Thousand Faces): the victim, a casualty of randomness or a villain; the scapegoat, who is punished for straying outside the social order; the hero, who smites evil; the good mother, who “offers maternal comfort and protection”; the trickster, the rogue who disturbs the social order; the other world, typically foreign countries; and the flood, or any other disaster.
Few, if any, journalists would confess to consciously calling on myths to convey the news, perhaps in part because so few of them are aware of the mythic thrust of their work. Instead, the ancient outlines express themselves spontaneously in copy, as reporters, who are usually voluminous readers, seek to infuse higher meaning into the disparate facts they’ve collected in their notebooks, even if they’re covering something as prosaic as a funeral or a legislative battle.
Few readers would confess to myth-seeking in their media choices, yet Lule makes the undeniable case that audiences prefer news when it is fashioned into something more eternal than pure information. Lule writes:
Newspaper sales, magazine circulation, television news ratings, and website traffic all surge during dramatic and sensational events: schoolyard killings, royal weddings, hurricanes, assassinations, airline crashes, and inaugurations. What are people seeking? They’re not going to use these stories to vote for a candidate. They want compelling dramas. They want satisfying stories that speak to them of history and fate and the fragility of life. They want myth.
July 27, 2014
This is an elementary and self-evident Principle. Indeed, it is so axiomatic that few examples of it will be given in these pages. The only point to stress is that it is useless to hope to obtain complete security in passive defense. It is also unsound. “He who tries to defend everything saves nothing,” declared Marshal Foch, echoing Frederick the Great. It should be noted that the very act of assuming the offensive imparts a certain degree of security. Make as if to strike a man, and he instinctively assumes a defensive attitude. As General Rowan Robinson expresses it in his Imperial Defence, “The highest form of strategic security is that obtained through the imposition of our will upon the enemy, through seizing the initiative and maintaining it by offensive action.” There may sometimes be an element of risk in this, but, as we have seen, war in its nature involves risk.
Lt. Colonel Alfred H. Burne, The Art of War on Land, 1947.
July 11, 2014
DSM-5 turns “everyday anxiety, eccentricity, forgetting and bad eating habits into mental disorders”
Helene Guldberg reviews Saving Normal: An Insider’s Revolt Against Out-of-Control Psychiatric Diagnosis, DSM-5, Big Pharma, and the Medicalization of Ordinary Life by Allen Frances.
Frances’ arguments about the dangers of inflating psychiatric conditions and psychiatric diagnosis are persuasive — maybe more so because he honestly admits to his own role in developing such an inflation. He is keenly aware of the risks of diagnostic inflation ‘because of painful firsthand experience’, he writes. ‘Despite our efforts to tame excessive diagnostic exuberance, DSM-IV had since been misused to blow up the diagnostic bubble’. He is particularly concerned about the exponential increase in the diagnosis of psychiatric conditions in children, writing: ‘We failed to predict or prevent three new false epidemics of mental disorder in children — autism, attention deficit, and childhood bipolar disorder. And we did nothing to contain the rampant diagnostic inflation that was already expanding the boundary of psychiatry far beyond its competence.’
Take Attention Deficit Hyperactivity Disorder (ADHD), which is ‘spreading like wildfire’. This diagnosis is applied so promiscuously that ‘an amazing 10 per cent of kids now qualify’, Frances writes. He points out that in the US, boys born in January are 70 per cent more likely to be diagnosed with ADHD than boys born in December. The reason diagnosing ADHD is so problematic is that it essentially is a description of immaturity, including symptoms such as ‘lack of impulse control’, ‘hyperactivity’ or ‘inattention’. Boys born in January are the youngest in their school year group (in the US) and thus they are more likely to be immature; in the UK, the youngest children in a school classroom are born in August, and so here, August-born kids are more likely to be diagnosed with ADHD. We have medicalised immaturity.
Until 1980, the DSMs were ‘deservedly obscure little books that no one much cared about or read’. DSM-I (published in 1952) and DSM-II (published in 1968) were ‘unread, unloved and unused’. Now, says Frances, this ‘bible’ of psychiatry ‘determines all sorts of important things that have an enormous impact on people’s lives — like who is considered well and who sick; what treatment is offered; who pays for it; who gets disability benefit; who is eligible for mental health, school vocational and other services; who gets to be hired for a job, can adopt a child, or pilot a plane, or qualifies for life insurance; whether a murderer is a criminal or mental patient; what should be the damages awarded in lawsuits; and much, much more’.
Today, as a result of various trends, including the impact of the DSMs, many human behaviours, quirks, eccentricities and woes which in the past would have been seen as parts of the rich tapestry of life are now branded mental disorders.
July 10, 2014
Lindsay Leigh Bentley contrasts her own “tomboy” childhood with that of Ryland, who was born female but whose parents have transitioned her (at age 5):
I have no degree in early childhood development, nor have I studied psychology. I didn’t even graduate from college.
I am also not here to pass judgement on Ryland’s parents. I believe that they are doing what they believe to be the most loving thing for their child. I’m simply sharing my story because I see so much of my 5-year-old self in this child.
I was born the second daughter to two loving, amazing, supportive parents. They would go on to have 2 more daughters. The four of us couldn’t be more different, even down to our hair and eye color. Our parents embraced our differences and allowed us to grow as individuals, not concerned with the social “norms” for girls. I often joke that I was the boy my dad never had. My dad is a free spirit, 100% unconcerned with what people think of him, and he thought nothing of “out of the box” behavior. I function more as a firstborn than a second born (however, this does not make me the firstborn, amiright?)
I wanted to be a boy. Desperately wanted to be a boy. I thought boys had more fun. I felt like a boy in the way that our society views genders. I liked blue and green more than pink and purple. I remember sitting up as high as I could climb in our huge mulberry tree, bow & arrow in hand, trying to kiss my elbow (a neighbor lady had told me that if I could accomplish this, that I would turn into a boy, which was what I wanted in that moment, as a child, more than anything.)
Thankfully, my parents didn’t adhere to the archaic stereotypes that “boys like blue” and “girls like pink;” that “boys play with dinosaurs, and girls play with dolls.” Had they told me that liking these things made me a boy, I would have concluded that I was a boy.
They just let me be me. They let me be a girl who wore jeans more often than skirts. They let me play with slingshots rather than princess wands. They didn’t conclude that I was gay, or transgender. They didn’t put me in a box that would shape my future, at the expense of my own free will.