Paranoia is the political currency of the emotionally unstable culture warrior who is always certain that someone is looking at them the wrong way, threatening them on social media or committing some other microaggression against them.
There are people who can’t hear a “good morning” without seeing a conspiracy against them. In normal life, they’re diagnosed as paranoid schizophrenics. In political activism, they’re just sensitive to microaggressions.
The left has always valued oversensitivity because neurotics are easy to convince of their own victimhood. And career victims naturally dehumanize those they consider to be their oppressors. As neurosis has become culture, the only way to escape accusations of privilege is to find your own victimhood. And there’s nothing like being forced to develop insecurities to make you insecure.
Culture War 2.0 is culturally crazy. Its personal dysfunction is so entangled with politics that there is no way to tell where one begins and the other ends. It politicizes insecurity and narcissism for campaigns that are indistinguishable from trolling. It reduces every issue to personal unhappiness and demands the abolition of traditional freedoms and rights as the answer to that unhappiness.
Its demands are unanswerable because they are personal. There is no solution to them except sanity. And in a cultural conflict where insanity is an asset, sanity is no answer. It’s an accusation.
The truly paranoid see the world as hostile and are driven to destroy it. Their crusade to find happiness by making everyone else miserable can only [succeed] halfway. They can never be happy, but they can always make others miserable.
Daniel Greenfield, “Our Insecure Culture Warriors”, Sultan Knish, 2015-11-02.
November 2, 2015
A different view of Macbeth
Anthony King looks at Macbeth as a PTSD sufferer:
Although the descriptions are graphic, Shakespeare’s play itself includes few on-stage battle scenes. Only at the very end does Macbeth actually fight on stage, a last stand in which he kills the young Siward (his last victim) and is in turn killed by MacDuff. For the rest of the play, all of Macbeth’s violence is set off stage, described but never seen. The audience imagines his violence — they do not witness it.
Justin Kurzel’s striking new adaptation of Macbeth, released on October 2, 2015 to critical acclaim and starring Marion Cotillard and Michael Fassbender, represents a cinematographic inversion of the original. In his film, battle predominates. The film begins with an extended combat sequence. Macbeth and his army are gathered on a bleak moor as they prepare for battle against Macdonwald’s army, unseen in the dense fog. The camera pans across the black-striped war-painted faces until, initiated by Macbeth, the host issues a war cry and plunges toward their enemies, who appear spectrally in the distance through the murk. In ultra-slow motion, the two armies clash and brutal fighting follows. Most notably, one of Macbeth’s boy soldiers, on whom the camera dwells tellingly before the battle, has his throat cut during the fighting and bleeds out darkly on screen. Eventually, Macbeth charges Macdonwald and slashes him to the ground. The scene is followed by a long sequence in which the dead are gathered and prepared for cremation, including the boy soldier, whose image haunts the rest of the film.
Macbeth’s apparently fearless heroism and remorseless violence are on display throughout these sequences. Yet the sequences highlight an aspect of Macbeth’s character normally absent from adaptations of the play and presumably from the original play, but highly relevant to a 21st-century audience. Macbeth is a combat veteran and, despite his courage, he is plainly severely traumatized by his war experiences. Kurzel and Fassbender construct him as a victim of PTSD, and he displays the classic symptoms of this perturbing condition.
September 30, 2015
Helicopter parents have raised a generation of needy, emotionally fragile young adults
In Psychology Today, Peter Gray looks at how universities are unequipped to handle the anxieties and emotional neediness of today’s students:
A year ago I received an invitation from the head of Counseling Services to join other faculty and administrators, at the university I’m associated with, for discussions about how to deal with the decline in resilience among students. At the first meeting, we learned that emergency calls to Counseling had more than doubled over the past five years. Students are increasingly seeking help for, and apparently having emotional crises over, problems of everyday life. Recent examples mentioned included a student who felt traumatized because her roommate had called her a “bitch” and two students who had sought counseling because they had seen a mouse in their off-campus apartment. The latter two also called the police, who kindly arrived and set a mousetrap for them.
Faculty at the meetings noted that students’ emotional fragility has become a serious problem when it comes to grading. Some said they had grown afraid to give low grades for poor performance, because of the subsequent emotional crises they would have to deal with in their offices. Many students, they said, now view a C, or sometimes even a B, as failure, and they interpret such “failure” as the end of the world. Faculty also noted an increased tendency for students to blame them (the faculty) for low grades—they weren’t explicit enough in telling the students just what the test would cover or just what would distinguish a good paper from a bad one. They described an increased tendency to see a poor grade as reason to complain rather than as reason to study more, or more effectively. Much of the discussion had to do with the amount of handholding faculty should do versus the degree to which the response should be something like, “Buck up, this is college.” Does the first response simply play into and perpetuate students’ neediness and unwillingness to take responsibility? Does the second response create the possibility of serious emotional breakdown, or, who knows, maybe even suicide?
Two weeks ago, the head of Counseling (who has now moved up to another position in the University) sent us all a follow-up email, announcing a new set of meetings. His email included this sobering paragraph: “I have done a considerable amount of reading and research in recent months on the topic of resilience in college students. Our students are no different from what is being reported across the country on the state of late adolescence/early adulthood. There has been an increase in diagnosable mental health problems, but there has also been a decrease in the ability of many young people to manage the everyday bumps in the road of life. Whether we want it or not, these students are bringing their struggles to their teachers and others on campus who deal with students on a day-to-day basis. The lack of resilience is interfering with the academic mission of the University and is thwarting the emotional and personal development of students.”
[…]
In my next essay in this series I’ll examine the research evidence suggesting that so-called “helicopter parenting” really is at the core of the problem. But I don’t blame parents, or certainly not just parents. Parents are in some ways victims of larger forces in the society — victims of the continuous exhortations from “experts” about the dangers of letting kids be, victims of the increased power of the school system and the schooling mentality that says kids develop best when carefully guided and supervised by adults, and victims of increased legal and social sanctions for allowing kids into public spaces without adult accompaniment. We have become, unfortunately, a “helicopter society.”
If we want to prepare our kids for college — or for anything else in life! — we have to counter all these social forces. We have to give our children the freedom, which children have always enjoyed in the past, to get away from adults so they can practice being adults, that is, practice taking responsibility for themselves.
September 13, 2015
Teaching microaggressions
At Ace of Spades HQ, Ace sums up the deepest problem with the movement to find microaggressions everywhere and in everything:
The idea is that just as cognitive therapy teaches people to not make a big deal out of trivialities (like teaching people who have a phobia about elevators to learn to not be afraid of elevators), microaggression brain-programming is a malicious form of cognitive therapy teaching people the exact opposite — to fear this, hate that, fly off the handle about this other thing, and generally carry on like a lunatic about things that sane people do not even think about.
And just as the good form of cognitive therapy can make a hysteric or neurotic a well-functioning individual, so can the insidious form of it turn a well-functioning individual into a hysteric or neurotic.
I think it’s 100% right and I’m glad someone had the guts to say so.
Colleges and progressives generally are teaching young people how to be mentally ill.
April 19, 2015
The latest “breakthrough” in helping schizophrenics take their medicine
Scott Alexander recently attended a local psychiatry conference, with some essential themes being emphasized:
This conference consisted of a series of talks about all the most important issues of the day, like ‘The Menace Of Psychologists Being Allowed To Prescribe Medication’, ‘How To Be An Advocate For Important Issues Affecting Your Patients Such As The Possibility That Psychologists Might Be Allowed To Prescribe Them Medication’, and ‘Protecting Members Of Disadvantaged Communities From Psychologists Prescribing Them Medication’.
As somebody who’s noticed that the average waiting list for a desperately ill person to see a psychiatrist is approaching the twelve month mark in some places, I was pretty okay with psychologists prescribing medication. The scare stories about how psychologists might prescribe medications unsafely didn’t have much effect on me, since I continue to believe that putting antidepressants in a vending machine would be a more safety-conscious system than what we have now (a vending machine would at least limit antidepressants to people who have $1.25 in change; the average primary care doctor is nowhere near that selective). Annnnnyway, this made me kind of uncomfortable at the conference and I Struck A Courageous Blow Against The Cartelization Of Medicine by sneaking out without putting my name on their mailing list.
But before I did, I managed to take some notes about what’s going on in the wider psychiatric world, including:
– The newest breakthrough in ensuring schizophrenic people take their medication (a hard problem!) is bundling the pills with an ingestible computer chip that transmits data from the patient’s stomach. It’s a bold plan, somewhat complicated by the fact that one of the most common symptoms of schizophrenia is the paranoid fear that somebody has implanted a chip in your body to monitor you. Can you imagine being a schizophrenic guy who has to explain to your new doctor that your old doctor put computer chips in your pills to monitor you? Yikes. If they go through with this, I hope they publish the results in the form of a sequel to The Three Christs of Ypsilanti.
– The same team is working on a smartphone app to detect schizophrenic relapses. The system uses GPS to monitor location, accelerometer to detect movements, and microphone to check tone of voice and speaking pattern, then throws it into a machine learning system that tries to differentiate psychotic from normal behavior (for example, psychotic people might speak faster, or rock back and forth a lot). Again, interesting idea. But again, one of the most common paranoid schizophrenic delusions is that their electronic devices are monitoring everything they do. If you make every one of a psychotic person’s delusions come true, such that they no longer have any beliefs that do not correspond to reality, does that technically mean you’ve cured them? I don’t know, but I’m glad we have people investigating this important issue.
April 4, 2015
QotD: DO NOT GIVE PSYCHOTHERAPY TO PEOPLE WITHOUT THEIR CONSENT
You know, I love science as much as anyone, maybe more, but I have grown to dread the phrase “…according to the research”.
They say that “Confronting triggers, not avoiding them, is the best way to overcome PTSD”. They point out that “exposure therapy” is the best treatment for trauma survivors, including rape victims. And that this involves reliving the trauma and exposing yourself to traumatic stimuli, exactly what trigger warnings are intended to prevent. All this is true. But I feel like they are missing a very important point.
YOU DO NOT GIVE PSYCHOTHERAPY TO PEOPLE WITHOUT THEIR CONSENT.
Psychotherapists treat arachnophobia with exposure therapy, too. They expose people first to cute, little spiders behind a glass cage. Then bigger spiders. Then they take them out of the cage. Finally, in a carefully controlled environment with their very supportive therapist standing by, they make people experience their worst fear, like having a big tarantula crawl all over them. It usually works pretty well.
Finding an arachnophobic person, and throwing a bucket full of tarantulas at them while shouting “I’M HELPING! I’M HELPING!” works less well.
And this seems to be the arachnophobe’s equivalent of the PTSD “advice” in the Pacific Standard. There are two problems with its approach. The first is that it avoids the carefully controlled, anxiety-minimizing setup of psychotherapy.
The second is that YOU DO NOT GIVE PSYCHOTHERAPY TO PEOPLE WITHOUT THEIR CONSENT.
If a person with post-traumatic stress disorder or some other trigger-related problem doesn’t want psychotherapy, then even as a trained psychiatrist I am forbidden to override that decision unless they become an immediate danger to themselves or others.
And if they do want psychotherapy, then very likely they want to do it on their own terms. I try to read things that challenge my biases and may even insult or trigger me, but I do it when I feel like it and not a moment before. When I am feeling adventurous and want to become stronger in some way, I will set myself some strenuous self-improvement task, whether it be going on a long run or reading material I know will be unpleasant. But at the end of a really long and exasperating day when I’m at my wit’s end and just want to relax, I don’t want you chasing me with a sword and making me run for my life, and I don’t want you forcing traumatic material at me.
Scott Alexander, “The Wonderful Thing About Triggers”, Slate Star Codex, 2014-05-30.
February 27, 2015
The changes in language describing changing gender
Charlotte Allen discusses how quickly the language used to talk about transsexuality has changed:
In 2012 the board of trustees of the American Psychiatric Association (APA) approved a set of proposed revisions to its Diagnostic and Statistical Manual of Mental Disorders (the new version is the DSM-5), designed to remove the stigma of mental illness from the transgender classification. Earlier versions of the DSM had defined transgenderism as “gender identity disorder,” which seemed to imply illness. The DSM-5 changed that term to “gender dysphoria.” The change paralleled the association’s removal of homosexuality as a mental disorder in 1973. It signaled that whatever problems transgenders might experience were not due to a pathological misconception that their bodies and gender identities were mismatched but to the fact that their bodies and gender identities were mismatched. Hormones, surgery, cosmetics, and different clothes might still be the “cure” (enabling transgenders to qualify for medical reimbursement for a variety of procedures), but the APA was making it clear, as far as it was concerned, that the problem was not inside the transgender’s head.
The medical evidence for a mismatch between brains and bodies is ambiguous. The two studies cited most frequently by transgender activists, published in 1995 and 2000, examined the brains of a total of seven male-to-female transgenders and found that a region of the hypothalamus, an almond-shaped area of the brain that controls the release of hormones by the pituitary gland, was female-typical in those brains. But those studies have been criticized for not controlling for the estrogen—which affects the size of the hypothalamus—that most male-to-female transgenders take daily in order to maintain their feminine appearance.
Accompanying the APA’s change of classification was a change of vocabulary. Ever since the days of Christine Jorgensen (1926-1989), the World War II serviceman whose surgery in Denmark during the early 1950s brought transgenderism under the media spotlight for the first time, the procedure was known in popular parlance as a “sex change operation.” Then in the 1990s, when the idea of one’s “gender” as something distinct from one’s biological sex began to take hold (thanks to the efforts of academic feminists and other postmodernists, who argued that gender is “socially constructed”), the preferred term became “gender reassignment surgery.” Now the preferred phrase seems to be “gender confirmation surgery.” The change in terminology renders more credible transpeople’s claims to have always belonged to the gender to which they have transitioned.
The once commonly used word “transsexual” has thus become passé — even verboten in the most sensitive circles — just during the past decade. For example, Washington Post reporter Abby Ohlheiser issued a severe scolding to news media for using the word “transsexual” in reference to a 27-year-old male-to-female victim of a grisly murder and dismemberment at the hands of her 28-year-old male lover (who subsequently committed suicide) in Brisbane, Australia, in October 2014. “Although some individuals do identify as ‘transsexual,’ the term is often viewed as old-fashioned and not an appropriate umbrella word,” Ohlheiser wrote in a column deriding the coverage of the crime as “transphobic.” Ohlheiser also objected to media describing the victim, Mayang Prasetyo, as a “prostitute” (Prasetyo had been working as an escort before her death) and reproducing photos of Prasetyo’s busty self clad in a tiny swimsuit that she had posted on the Internet. “Many of the articles covering the murder are laden with provocative photographs of the victim in a bikini, as if any story about a trans person is an excuse to view and scrutinize trans bodies,” Ohlheiser wrote.
February 25, 2015
Dealing with “dark tetrad” personalities
Bobby Stein linked to this column in Psychology Today from last summer, talking about how to deal with sadists, psychopaths, narcissists, and Machiavellians:
There are several personality types that are more likely to harm another than the average person would. Sadists possess an intrinsic motivation to inflict suffering on innocent others, even when this comes at a personal cost. This is because for sadistic personalities, cruelty is pleasurable, generally exciting, and can be sexually stimulating.
In a recent study, Buckels and colleagues examined examples of everyday sadism as part of what they refer to as the “Dark Tetrad,” sadism plus the original members of the “Dark Triad”—psychopathy, narcissism, and Machiavellianism. These personalities have some overlap and are characterized by callous manipulation, self-centeredness, disagreeableness, and exploitation. In their research, the team sought to determine whether everyday sadism could be captured in the laboratory, as well as whether measures of sadistic personality would predict these behaviors beyond already established measures of the Dark Triad. Among the findings were that sadistic personalities were the most likely members of the Dark Tetrad to select the task involving killing from an array of unpleasant tasks. Those sadists who killed more bugs derived greater pleasure from the act than those who killed fewer bugs.
In a second, related study, those high in sadism, psychopathy, and/or narcissism, as well as those low in empathy and perspective-taking, were willing to aggress against an innocent person when aggression was easy. Only sadists increased the intensity of their attack once they realized the person would not fight back, however. Furthermore, sadists, unlike the other “dark personalities,” were the only ones willing to expend additional time and energy (in this case, first completing a boring task) in order to have the opportunity to hurt an innocent person.
Previous research has found that although psychopaths have no qualms about hurting others, they are more likely to do so when it serves a specific purpose. Narcissists are less likely to aggress upon another unless their ego is threatened. Machiavellians will usually aggress upon others only if there are sufficient perceived benefits and the risk to themselves is acceptably low.
February 7, 2015
Is there a relationship between physical illness and depression?
Last month, Scott Alexander tried to show the evidence, pro and con, on whether we have detected a causal relationship between physical ailments and depression:
Start with From inflammation to sickness and depression [PDF], Dantzer et al (2008), who note that being sick makes you feel lousy [citation needed]. Drawing upon evolutionary psychology, they theorize this is an adaptive response to make sick people stay in bed (or cave, or wherever) so the body can focus all of its energy on healing. A lot of sickness behavior – being tired, not wanting to do anything, not eating, not wanting to hang around other people – seems kind of like mini-depression.
All of this stuff is regulated by chemicals called cytokines, which are released by immune cells that have noticed an injury or infection or something. They are often compared to a body-wide “red alert” sending the message “sickness detected, everyone to battle stations”. This response is closely linked to the idea of “inflammation”, the classic example of which is the locally infected area that has turned red and puffy. Most inflammatory cytokines handle the immune response directly, but a few of them – especially interleukin-1B and tumor necrosis factor alpha – cause this depression-like sickness behavior.
[…]
Here are some other suspicious facts about depression and inflammation:
– Exercise, good diet and sleep reduce inflammation; they also help depression.
– Stress increases inflammation and is a known trigger for depression.
– Rates of depression are increasing over time, with the condition seemingly very rare in pre-modern non-Westernized societies. This is commonly attributed to the atomization and hectic pace of modern life. But levels of inflammation are also increasing over time, probably because we have a terrible diet that disrupts the gut microbiota that are supposed to be symbioting with the immune system. Could this be another one of the things we think are social that turn out to be biological?
– SSRI antidepressants, like most medications, have about five zillion effects. One of the effects is to reduce the level of inflammatory cytokines in the body. Is it possible that this is why they work, and all of this stuff about serotonin receptors in the brain is a gigantic red herring?
– It’s always been a very curious piece of trivia that treating depression comorbid with heart disease significantly decreases your chances of dying from the heart disease. People just sort of nod their heads and say “You know, mind-body connection”. But inflammation is known to be implicated in cardiovascular disease. If treating depression is a form of lowering inflammation, this would make perfect sense.
– Rates of depression are much higher in sick people. Cancer patients are especially famous for this. No one gets too surprised here, because having cancer is hella depressing. But it’s always been interesting (to me at least) that as far as we can tell, antidepressants treat cancer-induced depression just as well as any other type. Are antidepressants just that good? Or is the link between cancer being sad and cancer causing depression only part of the story, with the other part being that the body’s immune response to cancer causes inflammatory cytokine release, which antidepressants can help manage?
– Along with cancer, depression is common in many other less immediately emotion-provoking illnesses like rheumatoid arthritis and diabetes. The common thread among these illnesses is inflammation.
– Inflammation changes the activity level of the enzyme indoleamine 2,3 dioxygenase. This enzyme produces kynurenines which interact with the NMDA receptor, a neurotransmitter receptor implicated in depression and various other psychiatric diseases (in case your first question upon learning about this pathway is the same as mine: yes, kynurenines got their name because they were first found in dog urine).
– Sometimes doctors treat diseases like hepatitis by injecting artificial cytokines to make the immune system realize the threat and ramp up into action. Cytokine administration treatments very commonly cause depression as a side effect. This depression can be treated with standard antidepressants.
– Also, it turns out we can just check and people with depression have more cytokines.
There’s also some evidence against the theory. People with depression have more cytokines, but it’s one of those wishy-washy “Well, if you get a large enough sample size, you’ll see a trend” style relationships, rather than “this one weird trick lets you infallibly produce depression”.
[…]
So in conclusion, I think the inflammatory hypothesis of depression is very likely part of the picture. Whether it’s the main part of the picture or just somewhere in the background remains to be seen, but for now it looks encouraging. Anti-inflammatory drugs do seem to treat depression, which is a point in the theory’s favor, but right now the only one that has strong evidence behind it has side effects that make it undesirable for most people. There’s a lot of room to hope that in the future researchers will learn more about exactly how this cytokine thing works and be able to design antidepressant drugs that target the appropriate cytokines directly. Until then, your best bets are the anti-inflammatory mainstays: good diet, good sleep, plenty of exercise, low stress levels, and all the other things we already know work.
December 13, 2014
Tobacco – 480,000. Alcohol – 88,000. Marijuana – > 0
It’s ridiculous to claim that smoking marijuana is a healthy habit. It does increase the risk of certain kinds of cancers; while the numbers are not huge, they’re also not zero. Jacob Sullum says “Marijuana Kills! But Not Very Often. Especially When Compared to Alcohol and Tobacco.”
In a new Heritage Foundation video, anti-pot activist Kevin Sabet bravely tackles “the myth that marijuana doesn’t kill.” Although cannabis consumers (unlike drinkers) do not die from acute overdoses, he says, “marijuana does kill people” through suicide, chronic obstructive pulmonary disease, car crashes, and other accidents.
I won’t say Sabet is attacking a straw man, since overenthusiastic cannabis fans have been known to say that “marijuana doesn’t kill anyone” (although the top Google result for that phrase is an article by Sabet explaining why that’s not true). But I will say that Sabet manages to obscure the fact that marijuana does not kill people very often, especially compared to the death tolls from legal drugs such as tobacco and alcohol, which is the relevant point in evaluating the scientific basis for pot prohibition. Let’s take a closer look at the four ways that marijuana kills, according to Sabet:
Suicide. Some research does find a correlation between suicide and marijuana use, but that does not mean the relationship is causal. A longitudinal study published by The British Journal of Psychiatry in 2009 reached this conclusion:
Although there was a strong association between cannabis use and suicide, this was explained by markers of psychological and behavioural problems. These results suggest that cannabis use is unlikely to have a strong effect on risk of completed suicide, either directly or as a consequence of mental health problems secondary to its use.
Furthermore, there is some evidence that letting patients use marijuana for symptom relief reduces the risk of suicide. Still, if reefer has ever driven anyone to kill himself, that would be enough to prove Sabet’s point. You can’t say it has never happened!
October 23, 2014
QotD: When “impostor syndrome” meets the “Dunning-Kruger effect”
The more I think about things like the Dunning-Kruger Effect and Impostor Syndrome, the more I suspect they’re sociological as opposed to psychological.
If you’re unfamiliar, the Dunning-Kruger Effect is the name of a cognitive bias where people consistently rate themselves as being higher skilled than others, even (especially?) when they are decidedly not. In other words, people are nowhere near as good as they think they are.
Diametrically opposed to that is Impostor Syndrome, where people refuse to acknowledge their accomplishments and competencies.
If you’re aware of both of them, you might constantly vacillate between them, occasionally thinking you’re awesome, then realizing that it probably means you aren’t, going back and forth like a church bell. I know nothing of this, I assure you. But the point is that I think they’re almost certainly related to the people that we surround ourselves with.
Matt Simmons, “The Impostor Effect vs Dunning-Kruger”, Standalone Sysadmin, 2013-02-27.
September 16, 2014
QotD: The real value of work
People without meaningful work and copious free time don’t write symphonies or create great works of art. They don’t live a life of the mind. They drink too much, or get in fights, or watch a lot of internet porn, or commit crimes. They don’t contribute to the economy or culture, as a rule. They just…exist. And it goes on like that, sometimes for generations.
Labor is the fate of all humankind. Always has been. We work to live. Work gives shape and meaning to our lives. It’s not just the income we derive from it; it’s the knowledge that we are able to function as adults in the wider world, and provide for ourselves and our families. It’s feeling the satisfaction of having contributed something to the maintenance of civilization, even if it means we haul trash away or keep the grass mowed. It’s all honorable work, necessary work, and not something to be ashamed of.
It’s not an outrage, it’s just the way things are. To try and embitter people about that, to make them feel that the natural order of things is unfair, is just to do an enormous amount of harm to the very people you’re claiming to want to help.
Monty, “We’re now living in a post-labor Utopia. Have you heard about this?”, Ace of Spades HQ, 2014-02-06
August 14, 2014
QotD: How to create a depressive society
The widespread perception that almost everyone else was a moron — why, just look at the things people post and say on the Internet! — would facilitate a certain philosophy of narcissism; we would have people walking around convinced they’re much smarter, and much more sophisticated and enlightened, than everyone else.
Marinating in the perception that most people are stupid, hateful, sick, and needlessly cruel would undoubtedly alter people’s aspirations and ambitions in life. Why strive to create a new invention, miracle cure, remarkable technology, or wondrous innovation to help the masses? It would be pearls before swine, a gift to a thoroughly undeserving population that had earned its miserable circumstances. The hopeless ignorance and hateful philosophies of the great unwashed might, however, spur quiet calls for the restoration of a properly thinking aristocracy to help steer society in the correct direction.
If we wanted to build a society designed to promote depression, we would want to make children seem like a burden. Children are a smaller, slightly altered version of ourselves; Christopher Hitchens described parenthood as “realizing that your heart is running around in somebody else’s body.” To hate life, you have to hate children. If they are a form of immortality — half of our genetic code and half of our habits, good and ill, walking around a generation later — then a depressive society would condition its members to hate the possibilities of their future.
If we wanted to build a society designed to promote depression, we would want to make old age seem to be a horrible fate. (It is the only alternative to death!) Our depressive society would want to not merely celebrate youth, but we would want to constantly reinforce the sense that one is approaching mental and physical obsolescence. A celebrity who appeared much younger than her years would be celebrated and everyone would openly demand to know her secret. The unspoken expectation would be that anyone could achieve the same result if she simply tried hard enough. We would exclaim, “Man, he’s getting old!” in response to those who didn’t look the same as when we first saw them.
We would want to make sure that appearances not merely counted, but that attractiveness is preeminent. That anonymous and yet public realm of the Internet would ensure that anyone in the world could safely mock the appearance of others to a public audience and then return to picking Cheetos out of his chest hair.
Jim Geraghty, “Robin Williams and Our Strange Times: Does our society set the stage for depression?”, National Review, 2014-08-12.