In the Guardian, Cory Doctorow talks about the actual scale of effort the British government is attempting to mandate to “protect the children from pr0n”:
In order to filter out adult content on the internet, a company has to either look at all the pages on the internet and find the bad ones, or write a piece of software that can examine a page on the wire and decide, algorithmically, whether it is inappropriate for children.
Neither of these strategies is even remotely feasible. To filter content automatically and accurately would require software capable of making human judgments — working artificial intelligence, the province of science fiction.
As for human filtering: there simply aren’t enough people of sound judgment in all the world to examine all the web pages that have been created and continue to be created around the clock, and determine whether they are good pages or bad pages. Even if you could marshal such a vast army of censors, they would have to attain an inhuman degree of precision and accuracy, or would be responsible for a system of censorship on a scale never before seen in the world, because they would be sitting in judgment on a medium whose scale was beyond any in human history.
Think, for a moment, of what it means to have a 99% accuracy rate when it comes to judging a medium that carries billions of publications.
Consider a hypothetical internet of a mere 20bn documents, comprising one half “adult” content and one half “child-safe” content. A 1% misclassification rate applied to 20bn documents means 200m documents will be misclassified. That’s 100m legitimate documents that would be blocked by the government because of human error, and 100m adult documents that the filter does not touch and that any schoolkid can find.
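The arithmetic above is easy to check. A few lines of Python, using only the figures given in the passage (20bn documents, a 50/50 split, a 1% error rate), reproduce the numbers:

```python
# Back-of-the-envelope check of the passage's figures: a hypothetical
# internet of 20bn documents, half "adult" and half "child-safe",
# filtered with a 1% misclassification rate.
total_docs = 20_000_000_000
adult_fraction = 0.5
error_rate = 0.01  # 1% of classification judgments are wrong

# Total documents the filter gets wrong, across both halves.
misclassified = int(total_docs * error_rate)

# Legitimate pages wrongly blocked (false positives).
overblocked = int(total_docs * (1 - adult_fraction) * error_rate)

# Adult pages the filter misses (false negatives).
underblocked = int(total_docs * adult_fraction * error_rate)

print(f"{misclassified:,} documents misclassified")
print(f"{overblocked:,} legitimate documents blocked in error")
print(f"{underblocked:,} adult documents that slip past the filter")
```

Even with this generously low 1% error rate, the filter blocks 100m harmless pages while letting 100m adult pages through — which is the point of the next paragraph: real-world rates are worse.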
In practice, the misclassification rate is much, much worse. It’s hard to get a sense of the total scale of misclassification by censorware because these companies treat their blacklists as trade secrets, so it’s impossible to scrutinise their work and discover whether they’re exercising due care.