Cory Doctorow points out that no “adult content” filter is a replacement for parental guidance and supervision:
Last week’s announcement of a national scheme to “block adult content at the point of subscription” (as the BBC’s website had it) was a moment of mass credulity on the part of the nation’s media, and an example of how complex technical questions and hot-button save-the-children political pandering are a marriage made in hell when it comes to critical analysis in the press.
Under No 10’s proposal, the UK’s major ISPs — BT, Sky, TalkTalk and Virgin — will invite new subscribers to opt in or out of an “adult content filter.” But for all the splashy reporting on this that dominated the news cycle, no one seemed to be asking exactly what “adult content” is, and how the filters’ operators will be able to find and block it.
Adult content covers a lot of ground. While the media of the day kept mentioning pornography in this context, existing “adult” filters often block gambling sites and dating sites (both subjects that are generally considered “adult” but aren’t anything like pornography), while others block information about reproductive health and counselling services aimed at GBLT teens (gay, bisexual, lesbian and transgender).
[. . .]
The web is vast, and “adult content” is a term so broad as to be meaningless. Even if we could all agree on what adult content was, there simply aren’t enough bluenoses and pecksniffs to examine and correctly classify even a large fraction of the web, let alone all of it (despite the Radio 4 newsreader’s repeated assertion that the new filter would “block all adult content”).
What that means is that parents who opt their families into the scheme are in for a nasty shock: first, when their kids (inevitably) discover the vast quantities of actual, no-fooling pornography that the filter misses; and second, when they themselves discover that their internet is now substantially broken, with equally vast swathes of legitimate material blocked.