It will require no new legislation since it is a voluntary scheme being introduced by the ISPs themselves. The new elements announced by Perry are that it will be applied to existing as well as new customers and that it will be automatic. Subscribers will be able to opt out of the filtering, but only temporarily. “We will have automatic put on, so if you turn the filter off at 9pm, it turns on again at 7am,” said Perry (as reported by Ars Technica).
There are obvious questions. Who will provide the list of blocked websites? What definition of pornography will be used? What rights of appeal will website owners have? And what effect will an ISP-level filter have on P2P dissemination of porn?
For Americans it is also an issue of free speech. “The moral imperative of the issue is lost when filtering is forced; it becomes in this case not a defensive shield for families with inquisitive kids, but a societal-level imposition of one perspective of what’s smut, and what’s just fine,” reports The Next Web. “It’s not the place of a government to decide what form of information isn’t fit for its citizens to consume.”
The issue, however, is wider than pornography – it is one of state-controlled censorship. Once the technology is in place, the temptation to use it to block additional sites will be ever-present. As long ago as 2011 the British Phonographic Industry (BPI) asked BT to use its Cleanfeed filtering system to block The Pirate Bay. Beyond copyright censorship lies political censorship – could sites like WikiLeaks find themselves targeted?
“Yes,” the Martha Group (marthamitchelleffect.org) told Infosecurity. “Because the concept of what is or is not obscenity (does or does not constitute porn) is a malleable opinion that can be abused. Could the carnage of 'Collateral Murder' [allegedly leaked to WikiLeaks by Bradley Manning] be classed as obscene, or some kind of porn? Ridiculous as that may appear today, what about a situation analogous to WW2? And porn's usage within our lexicon is increasing: food-porn, tech-porn, war-porn.”
Meanwhile, Google is taking a different approach: it is building a database of child pornography images rather than a list of pornography sites to block. “Since 2008,” wrote Jacquelline Fuller, director of Google Giving, on the official company blog on Saturday, “we’ve used ‘hashing’ technology to tag known child sexual abuse images, allowing us to identify duplicate images which may exist elsewhere. Each offending image in effect gets a unique ID that our computers can recognize without humans having to view them again. Recently, we’ve started working to incorporate encrypted ‘fingerprints’ of child sexual abuse images into a cross-industry database. This will enable companies, law enforcement and charities to better collaborate on detecting and removing these images, and to take action against the criminals.”
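The mechanics behind that quote are straightforward in outline. Below is a minimal Python sketch of hash-based duplicate detection, assuming an exact-match cryptographic hash; Google has not disclosed its algorithm, and a production system would more plausibly use a perceptual hash that survives resizing and re-encoding. The function names and the in-memory database here are illustrative, not Google's.

```python
import hashlib

# Illustrative stand-in for the cross-industry database of
# "fingerprints" Fuller describes. In practice this would be a
# managed, access-controlled service, not an in-memory set.
known_fingerprints: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    # A hash gives each image a unique ID that software can compare
    # without a human ever viewing the image again. An exact-match
    # hash such as SHA-256 only catches byte-identical copies, which
    # is why real systems favour perceptual hashes robust to
    # resizing and re-compression.
    return hashlib.sha256(image_bytes).hexdigest()

def tag_known_image(image_bytes: bytes) -> None:
    # Called once when an image has been confirmed as abuse material.
    known_fingerprints.add(fingerprint(image_bytes))

def is_duplicate(image_bytes: bytes) -> bool:
    # Checking a newly uploaded image is a constant-time set lookup.
    return fingerprint(image_bytes) in known_fingerprints
```

This is also why the approach scales: each upload costs one hash computation and one constant-time lookup, and once an image is tagged, every duplicate of it can be detected automatically.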