At The Daily Sceptic, Mike Hearn looks at the often incredible poll results turned up by YouGov that seem to indicate that well over half the population of Britain are budding medical fascists who want nothing more than a full-on pandemic tyranny from now to the end of time:
Recently YouGov announced that 64% of the British public would support mandatory booster vaccinations and another polling firm claimed that 45% would support indefinite home detention for the unvaccinated (i.e., forced vaccination of the entire population). The extreme nature of these claims immediately attracted attention, and not for the first time raised questions about how accurate polling on Covid mandates actually is. In this essay I’m going to explore some of the biases that can affect these types of poll, and in particular pro-social, mode and volunteering biases, which might be leading to inaccurately large pro-mandate responses.
There’s evidence that polling bias on COVID topics can be enormous. In January, researchers in Kenya compared self-reported mask wearing from an opinion poll with direct observation. They found that while 88% of people told the pollsters they wore masks outside, only 10% actually did. Suspicions about mandate polls, and YouGov specifically, are heightened by the fact that the firm explicitly took a position on what people “should” be doing in 2020, using language like “Britons still won’t wear masks”, “this could prove a particular problem”, “we are far behind our neighbours” and, most concerning of all, “our partnership with Imperial College”. Given widespread awareness of how easy it is to conduct so-called push polling, it’s especially damaging to public trust when a polling firm takes such strong positions on what the public should be thinking, particularly when those positions contradict evidence that mask mandates don’t work. Thus it makes sense to explore polling bias more deeply.
[…]
Given the frequency with which large institutions say things about COVID that just don’t add up, it’s not entirely surprising that people are suspicious of claims that most of their friends and neighbours are secretly nursing the desire to tear up the Nuremberg Code. But while we can debate whether the chat-oriented user interface is really ideal for presenting multi-path survey results, and it’s especially debatable whether YouGov should be running totally different kinds of polls under the same brand name, it’s probably not an attempt to manipulate people. Or if it is, it’s not a very competent one.
When I was much younger, I’d very occasionally get a call on our landline from a polling firm. I’d sometimes take part in the poll, although I don’t recall ever seeing any of the polls I took part in published later. After a few years I stopped taking part, and now I hang up as soon as it’s clear the call is from a polling company. Apparently I’m far from alone in this learned aversion to dealing with polls:
Online panel polling solves the problem of low phone response rates but introduces a new problem: the sort of people who answer surveys aren’t normal. People who answer an endless stream of surveys for tiny pocket-money-sized rewards are especially not normal, and thus aren’t representative of the general public. All online panel surveys face this problem, so pollsters compete on how well they adjust the resulting answers to match what the “real” public would say. One reason elections and referendums are useful for polling agencies is that they provide a form of ground truth against which their models can be calibrated. Those calibrations are then used to correct other types of survey response too.
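To make the adjustment idea concrete, here is a minimal sketch of post-stratification weighting, the simplest form of the correction described above. This is not YouGov’s actual method, and the demographic groups, population shares, and responses below are invented purely for illustration: raw panel answers are re-weighted so that the sample’s demographic mix matches known population totals.

```python
from collections import Counter

# Each respondent: (demographic_group, answered_yes).
# Groups and answers are invented for illustration.
respondents = [
    ("18-34", True), ("18-34", True), ("18-34", False),
    ("35-54", True), ("35-54", False),
    ("55+", False), ("55+", False), ("55+", True), ("55+", False), ("55+", False),
]

# Known population shares (e.g. from a census), which the raw panel
# sample does not match -- hypothetical figures.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Sample shares: how over- or under-represented each group is in the panel.
counts = Counter(group for group, _ in respondents)
n = len(respondents)
sample_share = {g: c / n for g, c in counts.items()}

# Post-stratification weight = population share / sample share.
# Over-represented groups get weights below 1, under-represented above 1.
weight = {g: population_share[g] / sample_share[g] for g in counts}

# Weighted estimate of the "yes" proportion.
weighted_yes = sum(weight[g] for g, yes in respondents if yes)
total_weight = sum(weight[g] for g, _ in respondents)
print(f"raw yes rate:      {sum(yes for _, yes in respondents) / n:.2f}")   # 0.40
print(f"weighted yes rate: {weighted_yes / total_weight:.2f}")              # 0.45
```

Note what this kind of weighting can and cannot do: it corrects for observable traits like age, but if panelists differ from the public in unobservable ways, such as an unusually strong taste for volunteering, no demographic weighting scheme will fix it. That is exactly the gap the next paragraph describes.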
A major source of problems is what’s known as “volunteering bias”, and the closely related “pro-social bias”. Not surprisingly, the sort of people who volunteer to answer polls are much more likely than the average member of the general population to say they volunteer for other things too. This effect is especially pronounced for anything that might be described as a “civic duty”. While these are classically considered positive traits, it’s easy to see how an unusually strong belief in civic duty and the value of community volunteering could lead to a strong dislike of people who do not volunteer to do their “civic duty”, e.g. by refusing to get vaccinated, disagreeing with community-oriented narratives, and so on.
In 2009, Abraham et al. showed that the share of Gallup respondents claiming to volunteer in their local community had risen implausibly, from 26% in 1977 to a whopping 46% in 1991. These rates differ drastically from those reported by the U.S. Census Bureau: in 2002 the census found that 28% of American adults volunteered.