I think it’s time to declare the internet a failure. At least with respect to its early promises of increased knowledge sharing and positive impact on collaboration.
Decentralization of media, in the form of social media, has increased the rate at which misinformation is transmitted. Once learned, invalid information has to be unlearned, and that is a much harder task than educating people with accurate information in the first place.
Social media also appears to have increased the rate at which people cluster around misinformation and create specialized groups of individuals that aggressively seek to disseminate their ideas.
The asocial aspect of social media encourages individuals to behave in ways that they normally wouldn’t when face-to-face with people that don’t share their views. It has made intolerant people more belligerent and it has forced tolerant people to adopt less tolerant stances.
The trend seems to be to continue to partition people into increasingly specialized and narrowly focused groups. At the extreme we see individuals with highly individualized views agitating groups with more generally accepted views.
People have become more militant, intolerant, and unaccepting of society. The impact on society is a weakening of collaborative spirit, increased cynicism, and further increases to militancy.
In the mid-90s I was very excited at the opportunities collective information sharing could produce. We’ve realized some of those, but I simply didn’t foresee the degradation of the democratic values that reveal the best in humanity.
Social media has increased the ability to create social anxiety by pouring misinformation into people’s lives with ideas that they are directly threatened or that there are limits to resources, ideas that are often mere fabrication.
Today we are bombarded daily with absurdity, aggression, fear mongering, and intolerance. It’s as if we unwound the clock a hundred years and abandoned the great freedom experiment. Only now the weapons to resolve differences of opinion are much more destructive.
Hard to be bullish on the consequences of increased nationalism around the globe.
The world does face some difficult issues we need to address but things are not nearly as bad as what has become status quo thinking.
Douglas Gunn, posting to Facebook, 2017-02-20.
March 3, 2017
November 8, 2016
The strongest bias in American politics is not a liberal bias or a conservative bias; it is a confirmation bias, or the urge to believe only things that confirm what you already believe to be true.
Emma Roller, “Your Facts or Mine?”, New York Times, 2016-10-25.
October 31, 2015
Shaunacy Ferro is here to harsh your paranormal mellow with six possible explanations for ghostly activities:
1. ELECTROMAGNETIC FIELDS
For decades, a Canadian neuroscientist named Michael Persinger has been studying the effects of electromagnetic fields on people’s perceptions of ghosts, hypothesizing that pulsed magnetic fields, imperceptible on a conscious level, can make people feel as if there is a “presence” in the room with them by causing unusual activity patterns in the brain’s temporal lobes. […]
2. INFRASOUND
Infrasound is sound at levels so low humans can’t hear it (though other animals, like elephants, can). Low frequency vibrations can cause distinct physiological discomfort. Scientists studying the effects of wind turbines and traffic noise near residences have found that low-frequency noise can cause disorientation, feelings of panic, changes in heart rate and blood pressure, and other effects that could easily be associated with being visited by a ghost [PDF]. […]
3. TOXIC MOLD
Shane Rogers, an engineering professor at Clarkson University, has spent the past few months touring reportedly haunted locations looking for not-so-paranormal activity: mold growth. Preliminary research indicates that some molds can cause symptoms that sound pretty ghostly—like irrational fear and dementia. […]
4. CARBON MONOXIDE POISONING
In 1921, a doctor named W.H. Wilmer published an odd story about a haunted house in the medical journal the American Journal of Ophthalmology. The family who lived in this haunted residence, called the H family in the medical literature, began experiencing weird phenomena when they moved into an old house—hearing furniture moving around and strange voices in the night, feeling the presence of invisible specters. They reported being held down in bed by ghosts, feeling weak, and more. As it turned out, a faulty furnace was filling their house with carbon monoxide, causing aural and visual hallucinations. The furnace was fixed, and the H family went back to their lives, sans ghosts.
5. SOMEONE ELSE SAID IT WAS REAL.
In a 2014 study, Goldsmiths, University of London psychologists had participants watch a video of a “psychic” supposedly bending a metal key with his mind. In one condition, study subjects watched the video with a “participant” who was actually working with the researchers and professed to see the key bending. Those subjects were more likely to report that they saw the key bend than subjects who were paired with someone who asserted that the key didn’t bend or said nothing. […]
6. WE WANT TO BELIEVE.
“There is a motivational side to belief in ghosts,” French explains. “We all want to believe in life after death. The idea of our mortality is one we are not generally comfortable with.” Confirmation bias holds powerful sway over our perceptions. “We find it much easier to believe evidence for something we want to believe anyway,” he says.
September 19, 2015
July 24, 2015
Matt Ridley on the danger to all scientific fields when one field is willing to subordinate fact to political expediency:
For much of my life I have been a science writer. That means I eavesdrop on what’s going on in laboratories so I can tell interesting stories. It’s analogous to the way art critics write about art, but with a difference: we “science critics” rarely criticise. If we think a scientific paper is dumb, we just ignore it. There’s too much good stuff coming out of science to waste time knocking the bad stuff.
Sure, we occasionally take a swipe at pseudoscience — homeopathy, astrology, claims that genetically modified food causes cancer, and so on. But the great thing about science is that it’s self-correcting. The good drives out the bad, because experiments get replicated and hypotheses put to the test. So a really bad idea cannot survive long in science.
Or so I used to think. Now, thanks largely to climate science, I have changed my mind. It turns out bad ideas can persist in science for decades, and surrounded by myrmidons of furious defenders they can turn into intolerant dogmas.
This should have been obvious to me. Lysenkoism, a pseudo-biological theory that plants (and people) could be trained to change their heritable natures, helped starve millions and yet persisted for decades in the Soviet Union, reaching its zenith under Nikita Khrushchev. The theory that dietary fat causes obesity and heart disease, based on a couple of terrible studies in the 1950s, became unchallenged orthodoxy and is only now fading slowly.
What these two ideas have in common is that they had political support, which enabled them to monopolise debate. Scientists are just as prone as anybody else to “confirmation bias”, the tendency we all have to seek evidence that supports our favoured hypothesis and dismiss evidence that contradicts it—as if we were counsel for the defence. It’s tosh that scientists always try to disprove their own theories, as they sometimes claim, and nor should they. But they do try to disprove each other’s. Science has always been decentralised, so Professor Smith challenges Professor Jones’s claims, and that’s what keeps science honest.
What went wrong with Lysenko and dietary fat was that in each case a monopoly was established. Lysenko’s opponents were imprisoned or killed. Nina Teicholz’s book The Big Fat Surprise shows in devastating detail how opponents of Ancel Keys’s dietary fat hypothesis were starved of grants and frozen out of the debate by an intolerant consensus backed by vested interests, echoed and amplified by a docile press.
December 18, 2014
“[C]onservatives are underrepresented in academia because they don’t want to be there, or they’re just not smart enough to cut it”
The advantage of the quote in the headline is that it allows the person saying it to feel more positive about his or her own worldview, while side-stepping the real issue. Pascal-Emmanuel Gobry looks at a new report that addresses this issue:
I have had the following experience more than once: I am speaking with a professional academic who is a liberal. The subject of the under-representation of conservatives in academia comes up. My interlocutor admits that this is indeed a reality, but says the reason why conservatives are under-represented in academia is because they don’t want to be there, or they’re just not smart enough to cut it. I say: “That’s interesting. For which other under-represented groups do you think that’s true?” An uncomfortable silence follows.
I point this out not to score culture-war points, but because it’s actually a serious problem. Social sciences and humanities cannot be completely divorced from the philosophy of those who practice them. And groupthink causes some questions not to be asked, and some answers not to be scrutinized. It is making our science worse. Anyone who cares about the advancement of knowledge and science should care about this problem.
That’s why I was very gratified to read this very enlightening draft paper [PDF] written by a number of social psychologists on precisely this topic, attacking the lack of political diversity in their profession and calling for reform. For those who have the time and care about academia, the whole thing truly makes for enlightening reading. The main author of the paper is Jonathan Haidt, well known for his Moral Foundations Theory (and a self-described liberal, if you care to know).
Although the paper focuses on the field of social psychology, its introduction as well as its overall logic make many of its points applicable to disciplines beyond social psychology.
The authors first note the well-known problems of groupthink in any collection of people engaged in a quest for the truth: uncomfortable questions get suppressed, confirmation bias runs amok, and so on.
But it is when the authors move to specific examples that the paper is most enlightening.
May 20, 2014
Charles Stross doesn’t typically write stories with traditional hero characters, and he explains why, before digging deeper into the likely origins of the stereotype:
I will confess that I find it difficult to write fictional heroes with a straight face. After all, we are all the heroes of our internal narrative (even those of us who others see as villains: nobody wakes up in the morning, twirls their moustache, and thinks, how can I most effectively act to further the cause of EVIL™ today?). And people who might consider themselves virtuous or heroic within their own framework, may be villains when seen from the outside: it’s a common vice of fascists (who seem addicted to heroic imagery — it’s a very romantic form of political poison, after all, the appeal to the clean and manly virtue of cold steel in subordination to the will of the State), and also of paternalist authoritarians.
[…] it seems pretty damn clear that the superhero archetypes hail back to the polytheistic religions of yore, to the Greek, Roman, Norse, and Egyptian pantheons and their litany of family feuds and bad-tempered bickering. (And is it just me or are half the biggest plots in pre-monotheist mythology the punch-line to the God-Father (or occasionally one of his more troublesome sons) failing to keep his cock to himself, and the other half due to a jealous squabble between goddesses that escalates into a nuclear grudge-fest until suddenly Trojan Wars break out?)
We have this in common with our 5000-years-dead ancestors: we’re human beings, and our neural architecture hasn’t changed that much since the development of language and culture (unless you believe Julian Jaynes — and I don’t). We still have the same repertoire of emotional reactions. We still have a dismaying tendency to think it’s all about us, for any value of “it” you care to choose. We fall for a whole slew of common cognitive biases, including a complex of interacting heuristics that make us highly vulnerable to supernatural beliefs and religions. (The intentional stance per Dennett means we ascribe actions to intentionality; confirmation bias leads us to assume intentionality to natural events because this is something that’s been bred into us throughout the many millions of years of predator/prey arms races that weeded out those of our ancestors who weren’t fast enough to correlate signs such as lion prints at the nearby watering hole with other signs like Cousin Ugg going missing and realize there was a connection. So our ancestors looked on as lightning zapped another unfortunate Cousin Ugg, felt instinctively that there had to be a reason, and decided there was a Lightning God somewhere and he’d gotten mad at our tribe.)
We have other biases. We look at people with good skin and bilaterally symmetrical features (traits indicative of good health) and we see them as beautiful (hey, again: we’re the end product of endless generations of organisms that did best when they forged reproductive partnerships with other organisms that were in good health), so obviously they’ve been blessed by the gods. And the gods bless those who are virtuous, because virtue (by definition) is what the gods bless you for. So beauty comes to be equated with good; and this plays itself out in our fictions, where our heroes and favoured protagonists are mostly handsome or pretty and the villains are ugly as sin …
April 10, 2014
The last few days have provided both a good laugh and some food for thought on the important question of confirmation bias — people’s tendency to favor information that confirms their pre-existing views and ignore information that contradicts those views. It’s a subject well worth some reflection.
The laugh came from a familiar source. Without (it seems) a hint of irony, Paul Krugman argued on Monday that everyone is subject to confirmation bias except for people who agree with him. He was responding to this essay Ezra Klein wrote for his newly launched site, Vox.com, which took up the question of confirmation bias and the challenges it poses to democratic politics. Krugman acknowledged the research that Klein cites but then insisted that his own experience suggests it is actually mostly people he disagrees with who tend to ignore evidence and research that contradicts what they want to believe, while people who share his own views are more open-minded, skeptical, and evidence driven. I don’t know when I’ve seen a neater real-world example of an argument that disproves itself. Good times.
Yuval Levin, “Confirmation Bias and Its Limits”, National Review, 2014-04-09.
October 15, 2013
David Harsanyi wishes the nonsense we tell to pollsters was a bit closer to the truth, at least in some cases:
A recent Rasmussen poll found that one in three Americans would rather win a Nobel Prize than an Oscar, Emmy or Grammy.
Though there’s no way to disprove this peculiar finding, I’m rather confident that it’s complete baloney. The average American probably can’t name more than one Nobel Prize winner — if that. Even if they could, it’s unlikely many would choose a life in physics or “peace” over being a celebrated actor, musician or television star. Put it this way: any man who tells you he wants the life of Nobel Prize-winning Ahmet Uzumcu, Director General of the Organization for the Prohibition of Chemical Weapons, instead of George Clooney is lying. And that includes Ahmet Uzumcu.
Polls might have been precise in forecasting recent elections (though 2012 pollsters received only an average “C+ grade” in a poll conducted by Pew Research Center; we’re waiting on a poll that tells us what to think about polls that poll polls), but it’s getting difficult to believe much of anything else. Beyond sampling biases or phraseology biases, many recent polls prove that Americans will tell pollsters what they think they think, but not how they intend to act. Part of the problem is social desirability bias — the tendency to give answers that they believe will be viewed favorably by others. That might explain why someone would tell a pollster that he would rather win a Nobel Prize than a Grammy. There is also confirmation bias — the tendency of people to say things that confirm their beliefs or theories. Whatever the case, voters are fooling themselves in various ways. And when it comes to politics, they’re also giving small-government types like myself false hope.
Over the last few months, we seem to have been added to some sort of polling telephone list, as we’ve had dozens of calls from various institutions conducting “important public research” and insisting that we have to take part in their surveys. It’s quite remarkable how angry they get when I say we don’t want to take part. They go from vaguely pleasant at the start of the call to downright authoritarian by the time I hang up the phone … how dare I not want to give them the data they’re asking for? They’ve collectively become more irritating than the calls from “Bob” at “Windows Technical Support”.
July 30, 2012
James Delingpole on the recent paper from Anthony Watts and his co-authors:
Have a look at this chart. It tells you pretty much all you need to know about the much-anticipated scoop by Anthony Watts of Watts Up With That?
What it means, in a nutshell, is that the National Oceanic and Atmospheric Administration (NOAA), the US government body in charge of America’s temperature record, has systematically exaggerated the extent of late 20th century global warming. In fact, it has doubled it.
Is this a case of deliberate fraud by Warmist scientists hell bent on keeping their funding gravy train rolling? Well, after what we saw in Climategate anything is possible. (I mean it’s not like NOAA is run by hard-left eco activists, is it?) But I think more likely it is a case of confirmation bias. The Warmists who comprise the climate scientist establishment spend so much time communicating with other warmists and so little time paying attention to the views of dissenting scientists such as Henrik Svensmark — or Fred Singer or Richard Lindzen or indeed Anthony Watts — that it simply hasn’t occurred to them that their temperature records need adjusting downwards not upwards.
What Watts has conclusively demonstrated is that most of the weather stations in the US are so poorly sited that their temperature data is unreliable. Around 90 per cent have had their temperature readings skewed by the Urban Heat Island effect. While he has suspected this for some time what he has been unable to do until his latest, landmark paper (co-authored with Evan Jones of New York, Stephen McIntyre of Toronto, Canada, and Dr. John R. Christy from the Department of Atmospheric Science, University of Alabama, Huntsville) is to put precise figures on the degree of distortion involved.
March 10, 2012
May 6, 2011
Remember that old saw about it being impossible to reason someone out of an opinion they were never reasoned into? Ian Leslie looks at a new paper about the function of reasoning:
This is a widespread habit, of course, and one we might notice in ourselves in other contexts. Whether it’s relationships or politics or the workplace, we have a tendency to start off with what we want and then reason backwards towards it; to cloak our true motivations or prejudices in the guise of reason. It’s been shown again and again in studies that we have a very strong ‘confirmation bias’; once we have an idea about the world we like (Obama is un-American, my girlfriend is cheating on me, the world is or isn’t getting warmer) we pick up on evidence we think supports our hypothesis and ruthlessly disregard evidence that undermines it, even without realising we’re doing so.
[…]
We tend to think of reason as an abstract, truth-seeking method that gets contaminated by our desires and motivations. But the paper argues it’s the other way around — that reasoning is a non-violent weapon given to us by evolution to help us get our way. Its capacity to help us get to the truth about things is a by-product, albeit a hugely important one. In many ways, reasoning does as much to screw us up as it does to help us. The paper’s authors, Dan Sperber and Hugo Mercier, put it like this:
The evidence reviewed here shows not only that reasoning falls quite short of reliably delivering rational beliefs and rational decisions. It may even be, in a variety of cases, detrimental to rationality. Reasoning can lead to poor outcomes, not because humans are bad at it, but because they systematically strive for arguments that justify their beliefs or their actions. This explains the confirmation bias, motivated reasoning, and reason-based choice, among other things.
H/T to Tim Harford for the link.