Michael Blastland thinks there’s a serious issue with how pollsters do their work:
I don’t know about you, but quite often there seems to me only one sensible answer to the questions posed in these attempts to canvass opinion: I don’t know.
But that’s not really what I mean. What I really mean is: “it depends”. And for that reason, I might not answer.
Yet the standard way for pollsters to treat people like me is to ignore them.
“Excluding don’t-knows and no answers,” say the reports, before telling us that most of us think we should or shouldn’t do this or that. It’s as if the “don’t knows” haven’t been paying attention while the “no answers” don’t care.
Strip out the apathetic and the ignorant and see what’s left, they seem to say.
But isn’t it at least arguable that we’ve thought about it and decided uncertainty is the best response?
Lots of issues don’t fall into easily classified answers, yet pollsters often take the easy way out: they offer one or two obvious options (usually tailored to the interests of the commissioning organization, of course) and leave anyone with a more nuanced view out of the equation.
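To make the quoted practice concrete, here's a minimal Python sketch, with entirely hypothetical figures, of how "excluding don't-knows and no answers" can turn a plurality into an apparent majority:

```python
from collections import Counter

# Hypothetical poll: 1000 respondents, a fifth of whom answer "don't know"
responses = ["yes"] * 420 + ["no"] * 380 + ["don't know"] * 200

def headline(responses, exclude_undecided=True):
    """Report each answer as a percentage, optionally dropping non-answers
    first (the standard practice Blastland objects to)."""
    counts = Counter(responses)
    if exclude_undecided:
        counts.pop("don't know", None)
    total = sum(counts.values())
    return {answer: round(100 * n / total, 1) for answer, n in counts.items()}

print(headline(responses, exclude_undecided=False))
# {'yes': 42.0, 'no': 38.0, "don't know": 20.0}
print(headline(responses, exclude_undecided=True))
# {'yes': 52.5, 'no': 47.5}  -- a "majority" appears once 20% of answers vanish
```

Nothing in the arithmetic is wrong, of course; the question is whether the fifth of respondents who were dropped were apathetic, or whether uncertainty was their considered answer.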