The second known fact is that people prefer the side of the room to the middle. This is obvious from the way a restaurant fills up. The tables along the left wall are occupied first, then those at the far end, then those along the right wall, and finally (and with reluctance) those in the middle. Such is the human revulsion to the central space that managements often despair of filling it and so create what is termed a dance floor. It will be realized that this behavior pattern could be upset by some extraneous factor, like a view of the waterfall from the end windows. If we exclude cathedrals and glaciers, the restaurant will fill up on the lines indicated, from left to right. Reluctance to occupy the central space derives from prehistoric instincts. The caveman who entered someone else’s cave was doubtful of his reception and wanted to be able to have his back to the wall and yet with some room to maneuver. In the center of the cave he felt too vulnerable. He therefore sidled round the walls of the cave, grunting and fingering his club. Modern man is seen to do much the same thing, muttering to himself and fingering his club tie. The basic trend of movement at a cocktail party is the same as in a restaurant. The tendency is toward the sides of the space, but not actually reaching the wall.
C. Northcote Parkinson, “Personality Screen, Or The Cocktail Formula”, Parkinson’s Law: And Other Studies in Administration, 1957.
August 21, 2014
Media carries with it a credibility that is totally undeserved. You have all experienced this, in what I call the Murray Gell-Mann Amnesia effect. (I refer to it by this name because I once discussed it with Murray Gell-Mann, and by dropping a famous name I imply greater importance to myself, and to the effect, than it would otherwise have.)
Briefly stated, the Gell-Mann Amnesia effect is as follows. You open the newspaper to an article on some subject you know well. In Murray’s case, physics. In mine, show business. You read the article and see the journalist has absolutely no understanding of either the facts or the issues. Often, the article is so wrong it actually presents the story backward — reversing cause and effect. I call these the “wet streets cause rain” stories. Paper’s full of them.
In any case, you read with exasperation or amusement the multiple errors in a story, and then turn the page to national or international affairs, and read as if the rest of the newspaper was somehow more accurate about Palestine than the baloney you just read. You turn the page, and forget what you know.
That is the Gell-Mann Amnesia effect. I’d point out it does not operate in other arenas of life. In ordinary life, if somebody consistently exaggerates or lies to you, you soon discount everything they say. In court, there is the legal doctrine of falsus in uno, falsus in omnibus, which means untruthful in one part, untruthful in all. But when it comes to the media, we believe against evidence that it is probably worth our time to read other parts of the paper. When, in fact, it almost certainly isn’t. The only possible explanation for our behavior is amnesia.
Michael Crichton, quoted in “The Gell-Mann Amnesia Effect”, Stephen Bodio’s Querencia, 2012-02-21.
August 13, 2014
Jack Shafer on the cyclical nature of the news and an explanation for certain story types growing into mythic form:
Has some wise guy flipped a switch and thrown the news into summer reruns?
Everywhere you look in your news feed is a story you’ve seen before. In northern Iraq, conquering jihadists have the Kurds calling on the United States for more help. North Korea is again stating its desire to nuke the White House. A virulent contagion abroad has Americans worrying when it will break out on our shores. And, in a rerun of a rerun, a Gaza war of tunnels, rockets, invasions, ceasefires, withdrawals, broken ceasefires, and shuttle diplomacy is claiming a record harvest of headlines.
But the periodicity of the news has another cause, as press scholar Jack Lule discovered more than a decade ago in his book Daily News, Eternal Stories. Lule proposed that the news was less a pure journalistic creation than it was the modern expression of ancient myths.
Like many all-encompassing formulas, Lule’s reduction of news into myth suffers by attempting to explain too much. But after reading his book, you can’t help but notice how many front-page stories collapse into the seven master myths he assembles (which will sound familiar to anybody who has brushed up against Joseph Campbell’s The Hero With a Thousand Faces): the victim, a casualty of randomness or a villain; the scapegoat, who is punished for straying outside the social order; the hero, who smites evil; the good mother, who “offers maternal comfort and protection”; the trickster, the rogue who disturbs the social order; the other world, typically foreign countries; and the flood, or any other disaster.
Few, if any, journalists would confess to consciously calling on myths to convey the news, perhaps in part because so few of them are aware of the mythic thrust of their work. Instead, the ancient outlines express themselves spontaneously in copy, as reporters, who are usually voluminous readers, seek to infuse higher meaning into the disparate facts they’ve collected in their notebooks, even if they’re covering something as prosaic as a funeral or a legislative battle.
Few readers would confess to myth-seeking in their media choices, yet Lule makes the undeniable case that audiences prefer news when it is fashioned into something more eternal than pure information. Lule writes:
Newspaper sales, magazine circulation, television news ratings, and website traffic all surge during dramatic and sensational events: schoolyard killings, royal weddings, hurricanes, assassinations, airline crashes, and inaugurations. What are people seeking? They’re not going to use these stories to vote for a candidate. They want compelling dramas. They want satisfying stories that speak to them of history and fate and the fragility of life. They want myth.
July 27, 2014
This is an elementary and self-evident Principle. Indeed, it is so axiomatic that few examples of it will be given in these pages. The only point to stress is that it is useless to hope to obtain complete security in passive defense. It is also unsound. “He who tries to defend everything saves nothing,” declared Marshal Foch, echoing Frederick the Great. It should be noted that the very act of assuming the offensive imparts a certain degree of security. Make as if to strike a man, and he instinctively assumes a defensive attitude. As General Rowan Robinson expresses it in his Imperial Defence, “The highest form of strategic security is that obtained through the imposition of our will upon the enemy, through seizing the initiative and maintaining it by offensive action.” There may sometimes be an element of risk in this, but, as we have seen, war in its nature involves risk.
Lt. Colonel Alfred H. Burne, The Art of War on Land, 1947.
July 11, 2014
DSM-5 turns “everyday anxiety, eccentricity, forgetting and bad eating habits into mental disorders”
Helene Guldberg reviews Saving Normal: An Insider’s Revolt Against Out-of-Control Psychiatric Diagnosis, DSM-5, Big Pharma, and the Medicalization of Ordinary Life by Allen Frances.
Frances’ arguments about the dangers of inflating psychiatric conditions and psychiatric diagnosis are persuasive — maybe more so because he honestly admits to his own role in developing such an inflation. He is keenly aware of the risks of diagnostic inflation ‘because of painful firsthand experience’, he writes. ‘Despite our efforts to tame excessive diagnostic exuberance, DSM-IV had since been misused to blow up the diagnostic bubble’. He is particularly concerned about the exponential increase in the diagnosis of psychiatric conditions in children, writing: ‘We failed to predict or prevent three new false epidemics of mental disorder in children — autism, attention deficit, and childhood bipolar disorder. And we did nothing to contain the rampant diagnostic inflation that was already expanding the boundary of psychiatry far beyond its competence.’
Take Attention Deficit Hyperactivity Disorder (ADHD), which is ‘spreading like wildfire’. This diagnosis is applied so promiscuously that ‘an amazing 10 per cent of kids now qualify’, Frances writes. He points out that in the US, boys born in January are 70 per cent more likely to be diagnosed with ADHD than boys born in December. The reason diagnosing ADHD is so problematic is that it essentially is a description of immaturity, including symptoms such as ‘lack of impulse control’, ‘hyperactivity’ or ‘inattention’. Boys born in January are the youngest in their school year group (in the US) and thus they are more likely to be immature; in the UK, the youngest children in a school classroom are born in August, and so here, August-born kids are more likely to be diagnosed with ADHD. We have medicalised immaturity.
Until 1980, the DSMs were ‘deservedly obscure little books that no one much cared about or read’. DSM-I (published in 1952) and DSM-II (published in 1968) were ‘unread, unloved and unused’. Now, says Frances, this ‘bible’ of psychiatry ‘determines all sorts of important things that have an enormous impact on people’s lives — like who is considered well and who sick; what treatment is offered; who pays for it; who gets disability benefit; who is eligible for mental health, school vocational and other services; who gets to be hired for a job, can adopt a child, or pilot a plane, or qualifies for life insurance; whether a murderer is a criminal or mental patient; what should be the damages awarded in lawsuits; and much, much more’.
Today, as a result of various trends, including the impact of the DSMs, many human behaviours, quirks, eccentricities and woes which in the past would have been seen as parts of the rich tapestry of life are now branded mental disorders.
July 10, 2014
Lindsay Leigh Bentley contrasts her own “tomboy” childhood with that of Ryland, who was born female but whose parents have transitioned her (at age 5):
I have no degree in early childhood development, nor have I studied psychology. I didn’t even graduate from College.
I am also not here to pass judgement on Ryland’s parents. I believe that they are doing what they believe to be the most loving thing for their child. I’m simply sharing my story because I see so much of my 5-year-old self in this child.
I was born the second daughter to two loving, amazing, supportive parents. They would go on to have 2 more daughters. The four of us couldn’t be more different, even down to our hair and eye color. Our parents embraced our differences and allowed us to grow as individuals, not concerned with the social “norms” for girls. I often joke that I was the boy my dad never had. My dad is a free spirit, 100% unconcerned with what people think of him, and he thought nothing of “out of the box” behavior. I function more as a firstborn than a second born (however, this does not make me the firstborn, amiright?)
I wanted to be a boy. Desperately wanted to be a boy. I thought boys had more fun. I felt like a boy in the way that our society views genders. I liked blue and green more than pink and purple. I remember sitting up as high as I could climb in our huge mulberry tree, bow & arrow in hand, trying to kiss my elbow (a neighbor lady had told me that if I could accomplish this, that I would turn into a boy, which was what I wanted in that moment, as a child, more than anything.)
Thankfully, my parents didn’t adhere to the archaic stereotypes that “boys like blue” and “girls like pink;” that “boys play with dinosaurs, and girls play with dolls.” Had they told me that liking these things made me a boy, I would have concluded that I was a boy.
They just let me be me. They let me be a girl who wore jeans more often than skirts. They let me play with slingshots rather than princess wands. They didn’t conclude that I was gay, or transgender. They didn’t put me in a box that would shape my future, at the expense of my own free will.
July 6, 2014
At first sight the chances would appear to favor the defender; for he can remain still, he can dig, he can shoot accurately; whereas the assailant, while on the move, is dangerously exposed and can do none of these things. The latter, however, has important advantages on his side. The forward rush, the excitement, a goal to win, combine to give him a moral uplift wholly lacking in the defender, who is always looking to right and left, anxious lest his flanks be turned and communications severed. The assailant, especially against a passive defense, has freedom of action and power of maneuver and can accordingly concentrate superior forces against any selected point of his adversary’s line, or where the front is not continuous against his flanks and rear.
Major-General H. Rowan Robinson, quoted in The Art of War on Land by Lt. Colonel Alfred H. Burne, 1947.
July 1, 2014
Patriotic effusions, whosever they may be, seldom please citizens of other nations, because they are generally so self-congratulatory; and self-congratulation, which is no doubt an inescapable part of the human condition, is best kept to oneself even when justified. Occasional outbursts may be acceptable, as after a triumph in a just war — but just wars are themselves infrequent events in human history. Ignorance of and disdain for others are often the corollaries of noisy patriotism; it was with good reason that Doctor Johnson said that patriotism is the last refuge of a scoundrel. Quite often it is the first refuge as well.
On the other hand, some kind of collective self-belief is necessary, for otherwise effort would be in vain and achievement impossible. A country completely without patriotism, even if this state remained implicit rather than explicit, would be an unhappy place. As in most things human, a balance must be struck.
Theodore Dalrymple, “A More Sinister Equality”, Taki’s Magazine, 2014-04-06
June 24, 2014
In (of all places) the Toronto Star, Brian Platt says that “gaydar” is a real thing and that it works better for conservatives than for liberals:
In less than the blink of an eye, your subconscious “gaydar” makes a judgment about someone’s sexual orientation based entirely on facial traits — and it’s usually right.
So says the research of Nicholas Rule, a University of Toronto psychologist giving a talk on the subject this week as part of WorldPride. “The gist of it is that people can accurately judge someone’s sexual orientation from very minimal information about them,” Rule said in an interview.
“You only need to see a face for less than 40 milliseconds to judge sexual orientation with the same level of accuracy that you get if you take all the time in the world.
“To put that in perspective, it takes 400 milliseconds to blink your eye.”
Facial “gaydar” is 65-per-cent accurate on average, according to Rule and his co-researchers at U of T’s Social Perception & Cognition Laboratory. These judgments can be reliably made based on the eyes alone, though facial shape and texture are also big factors.
“Conservatives are more accurate than liberals in making these judgments when they study a face, because conservatives are more likely to use stereotypes,” Rule said. “Of course, stereotypes are often wrong, but they do have what we call kernels of truth. Liberals tend to not want to use stereotypes in making judgments, and it impairs their accuracy.”
June 2, 2014
I talked a few times with the realtor, and they were as helpful as realtors usually are: not helpful. They couldn’t answer any important questions for me, because realtors don’t know anything important about the properties they sell. Well, that’s not entirely true. They often know very important things about the properties they sell. Those are invariably the things they’re hiding from you, hoping to entice you into standing in the decrepit shack they’re listing while they perform their Svengali perorations about its potential. Weave a tapestry of possibilities in the air that’ll have you frisking yourself in no time, looking for your checkbook before that handyman that’s interested in the property snatches it from under your nose.
Oh, I know that handyman. That guy gets around. I never learned his name, but he seemed to be interested in every property I was interested in, in Maine. No matter where I went — Turner, Cornish, Peru, Livermore Falls, Norway, Rumford…
Anyway, that polymath handyman with the lead foot and the nose for diamonds in the rough was always one step ahead of us, ready to stuff our defeat into the jaws of his victory. He was very interested in Turner, I hear.
Sippican Cottage, “I’m Fixing A Hole Where The Intertunnel Gets In”, Sippican Cottage, 2013-11-13
May 31, 2014
My dear friend, clear your mind of cant. You may talk as other people do: you may say to a man, ‘Sir, I am your most humble servant.’ You are not his most humble servant….You tell a man, ‘I am sorry you had such bad weather the last day of your journey, and were so much wet.’ You don’t care six-pence whether he was wet or dry. You may talk in this manner; it is a mode of talking in society: but don’t think foolishly.
Dr. Samuel Johnson to James Boswell on May 15th, 1783. (quoted by John Derbyshire in “A Whining Pretension to Goodness”, Taki’s Magazine, 2013-10-17)
May 30, 2014
All commanders must have been aware of the advantages of vigorous pursuit; hence the mere fact that they did not succeed in achieving it shows that there must be some big predisposing cause militating against its attainment. This cause may be defined as lassitudo certamine (to coin an expression), that moral and physical fatigue and reaction that usually supervenes toward the close of a hard-fought struggle as the daylight departs and the pursuit should just be starting. At the battle of Orthez Wellington thoroughly defeated Soult but omitted to pursue him. Why? Almost certainly because he was himself wounded just at the close of the action, and his physical and mental powers at that critical moment no doubt suffered temporary eclipse. In the same way Marlborough after his brilliant exploit in forcing the Lines of the Geet in 1705 made no attempt to pursue. He had just taken part himself in a fierce cavalry charge, and was physically bouleversé. It is doubtful whether in any army this potential weakness is sufficiently recognized and systematically combatted.
Lt. Colonel Alfred H. Burne, The Art of War on Land, 1947.
May 29, 2014
Kevin Williamson on the most recent mass killing:
Mass murders on the Elliot Rodger model are not a modern thing; we all know the story of Columbine, but the worst school slaughter in American history happened in 1927 in Michigan. Nor are they a gun thing; that Michigan massacre required no firearms, and neither did the crimes of Timothy McVeigh. They are not a “white privilege” thing, soiled as I feel for being obliged to write the words “white privilege”; the worst such massacre in recent U.S. history was carried out by a Korean-born American. They are not a male thing; Brenda Spencer’s explanation of her shooting spree in San Diego inspired the song “I Don’t Like Mondays.” They are not an American thing; Anders Breivik of Norway carried out the largest mass murder in modern history, though it is possible that Beijing’s Tian Mingjian killed more; Europe, the Americas, and Asia have experienced roughly comparable numbers of mass murders, with the Asian numbers slightly ahead of the rest. They are not an ideological thing; mass murderers sometimes issue manifestos, but they are generally incoherent and shallow. The phenomenon of mass killings has little to do with race, sex, politics, economics, or the availability of legal firearms. Such episodes are primarily an act of theater.
Elliot Rodger’s family was in relatively difficult financial circumstances, though relatively must be emphasized. His father was the assistant director of The Hunger Games, and the young man was apparently proud of his BMW coupe, but his family’s financial position was modest by Hollywood standards. Through his family, Rodger enjoyed some enviable social connections, but could not achieve the connection he desired, a romantic one. His was an individualism suffered as a burden. In another century, his life might have been given some structure by the church or by his extended family, or simply by the fundamental struggle to feed and shelter himself, which was the organizing principle of the great majority of human lives for millennia. Modernity sets us free, but it does not offer any answer to the question, “Free to do what?”
Art, particularly theater, has for a long time helped to answer that question. What we see on stage, however far removed from our own experience, is an intensified version of our own lives. The Mass is, if nothing else, an act of theater, but it is also the case, as Mikhail Bakunin wrote, that “the passion for destruction is a creative passion.” It is not mere coincidence that so many mass murderers, from the Columbine killers to McVeigh, imagine themselves to be instigators of revolution, or that their serial-killer cousins so often think of themselves as artists. Their delusions are pathetic, but they are not at all alien to common human experience. That they so often end in suicide is not coincidence, either. Their rampages are at once a quest for significance and a final escape from significance and its burdens. Whatever particular motive such killers cite is secondary at best. The killing itself is the point — it is not a means to some other end.
May 20, 2014
Charles Stross doesn’t typically write stories with traditional hero characters, and he explains why, before digging deeper into the likely origins of the stereotype:
I will confess that I find it difficult to write fictional heroes with a straight face. After all, we are all the heroes of our internal narrative (even those of us who others see as villains: nobody wakes up in the morning, twirls their moustache, and thinks, how can I most effectively act to further the cause of EVIL™ today?). And people who might consider themselves virtuous or heroic within their own framework, may be villains when seen from the outside: it’s a common vice of fascists (who seem addicted to heroic imagery — it’s a very romantic form of political poison, after all, the appeal to the clean and manly virtue of cold steel in subordination to the will of the State), and also of paternalist authoritarians.
[...] it seems pretty damn clear that the superhero archetypes hail back to the polytheistic religions of yore, to the Greek, Roman, Norse, and Egyptian pantheons and their litany of family feuds and bad-tempered bickering. (And is it just me or are half the biggest plots in pre-monotheist mythology the punch-line to the God-Father (or occasionally one of his more troublesome sons) failing to keep his cock to himself, and the other half due to a jealous squabble between goddesses that escalates into a nuclear grudge-fest until suddenly Trojan Wars break out?)
We have this in common with our 5000-years-dead ancestors: we’re human beings, and our neural architecture hasn’t changed that much since the development of language and culture (unless you believe Julian Jaynes — and I don’t). We still have the same repertoire of emotional reactions. We still have a dismaying tendency to think it’s all about us, for any value of “it” you care to choose. We fall for a whole slew of common cognitive biases, including a complex of interacting heuristics that make us highly vulnerable to supernatural beliefs and religions. (The intentional stance per Dennett means we ascribe actions to intentionality; confirmation bias leads us to assume intentionality to natural events because this is something that’s been bred into us throughout the many millions of years of predator/prey arms races that weeded out those of our ancestors who weren’t fast enough to correlate signs such as lion prints at the nearby watering hole with other signs like Cousin Ugg going missing and realize there was a connection. So our ancestors looked on as lightning zapped another unfortunate Cousin Ugg, felt instinctively that there had to be a reason, and decided there was a Lightning God somewhere and he’d gotten mad at our tribe.)
We have other biases. We look at people with good skin and bilaterally symmetrical features (traits indicative of good health) and we see them as beautiful (hey, again: we’re the end product of endless generations of organisms that did best when they forged reproductive partnerships with other organisms that were in good health), so obviously they’ve been blessed by the gods. And the gods bless those who are virtuous, because virtue (by definition) is what the gods bless you for. So beauty comes to be equated with good; and this plays itself out in our fictions, where our heroes and favoured protagonists are mostly handsome or pretty and the villains are ugly as sin …