In the second part of this series, I look at why filtering technology often fails, and at the current methods in place to deal with illegal forms of pornography
It sounds good in principle. Block all pornography in order to protect the children. Who can argue against that? David Cameron’s plan to block pornography has placated a frenzied Daily Mail for the time being, but in part two of this series on regulating pornography, I look at how technology used to limit expression often ends up blocking the good things we want access to. Under David Cameron’s proposal we would all start from a position of ‘porn blocked’: subscribers would have to systematically unblock the things they require access to.

Let’s start with a relatively simple question. What is pornography? Why not just define it as two people having sex? If that’s the case, then some television programmes broadcast after the watershed would be blocked under this definition. How about nudity? OK – Robin Thicke’s uncensored video for Blurred Lines is now blocked. Any film involving Kelly Brook – blocked, blocked, and blocked.

I am being rather simplistic, but consider this: according to the Internet Filter Review, there are 4.2 million pornographic websites and 68 million search engine requests to view them every day. According to a recent TechCrunch article, "16.6 per cent of the traffic that visits Tumblr takes place on adult blogs", and 11.4 per cent of its 200,000 most visited domains feature adult content. PrivCo estimates that some one in six of all web pages on Tumblr contain adult content.

In other words, porn makes up a massive amount of the web content out there. Why can’t we just stick a block on these pages and limit their visibility on the web? We already are doing that. Yet the regulatory methods available to the government can only do so much. We generally use two methods at the content layer: the sledgehammer approach of blocking a site completely (see the Pirate Bay), or earmarking specific webpages to block users from accessing. This is where the design features of the Internet’s architecture can actually act as a regulator.
In the UK we use a hybrid hierarchical/design control system called Cleanfeed. Suspect images are identified and blacklisted by an organisation called the Internet Watch Foundation (IWF). The suspect images are then blocked through Cleanfeed, a system that allows ISPs to match users’ requests against a blacklist of pages on the web. If a user tries to access a blacklisted page or image, the request is simply rejected by their ISP. This prevents the user from being prosecuted for possession of an indecent image of a child under Section 160 of the Criminal Justice Act. The IWF also passes information on to the criminal justice authorities to help them locate the sources of illegal material.

The IWF is very good at what it does, considering it operates with very limited resources, yet it does so without much public accountability. This was highlighted in 2008 when it blocked access to the Wikipedia page for the Scorpions’ album Virgin Killer. The cover art for the album portrays a ten-year-old girl posing nude, with a faux glass-shatter effect obscuring her genitalia. The end result of the block meant that nearly 95 per cent of all British users were barred from editing the site. The fallout brought the IWF into the limelight and it quickly reversed its decision. But it raises the question: who decides what is porn and what is not? Who are the gatekeepers that decide whether something is tasteful or obscene?
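Broadly, a hybrid system of this kind works in two stages: traffic bound for an IP address that hosts any blacklisted page gets diverted for closer inspection, and only that diverted traffic has its full URL compared against the blacklist, so the vast majority of requests are never deep-inspected at all. The sketch below is a simplification of that idea, not any ISP’s actual implementation; all the addresses and URLs are illustrative placeholders, not real IWF data.

```python
# Simplified two-stage filter in the style of Cleanfeed.
# All IPs and URLs below are illustrative placeholders, not real IWF data.

SUSPECT_IPS = {"203.0.113.10"}                        # hosts serving *some* blocked pages
URL_BLACKLIST = {"example-host.test/banned/img.jpg"}  # exact host/path entries

def resolve(host):
    # Stand-in for a DNS lookup.
    return "203.0.113.10" if host == "example-host.test" else "198.51.100.7"

def handle_request(url):
    host_and_path = url.split("://", 1)[1]
    host = host_and_path.split("/", 1)[0]
    # Stage 1: cheap IP-level check - most traffic is never inspected further.
    if resolve(host) not in SUSPECT_IPS:
        return "PASS"
    # Stage 2: diverted traffic has its full URL matched against the blacklist.
    return "BLOCKED" if host_and_path in URL_BLACKLIST else "PASS"

print(handle_request("http://example-host.test/banned/img.jpg"))   # BLOCKED
print(handle_request("http://example-host.test/news/index.html"))  # PASS
print(handle_request("http://innocent.test/anything"))             # PASS
```

One side effect of stage-one diversion is visible in the Wikipedia incident above: when all UK requests to a site are funnelled through a handful of filtering proxies, that site sees huge numbers of users arriving from the same few addresses, which is why Wikipedia ended up treating most British editors as a single misbehaving source.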
The Wikipedia page for the Scorpions’ Virgin Killer was blocked by the IWF
As Dr Emily Laidlaw of the University of East Anglia correctly points out, “communication technologies that enable or disable participation in discourse online are privately-owned”, and the owners of the infrastructure and intelligence that make up the net are “gatekeepers to our digital democratic experience”. Before I examine the role gatekeepers play, let’s look at the role filtering currently plays in the browsing experience.

Porn filters block good content. A 2007 paper by the University of California, Berkeley, tested 15 combinations of internet content filters and filter settings. The most restrictive of those filters managed to block 91 per cent of adult content – but it also mistakenly blocked 23 per cent of "clean" webpages. In one recent case reported in the mainstream press, a British Library search filter denied a man using its Wi-Fi network access to Hamlet, because it contained “violent content”. ONS statistics released a few weeks ago say 43 per cent of people aged over 16 use the internet to seek health-related information – exactly the sort of material that overzealous filters routinely catch.
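Much of this overblocking comes from crude keyword matching. A toy sketch makes the failure mode obvious; the word list and examples here are my own, not drawn from any real filter:

```python
# Naive substring filter of the kind behind the 'Scunthorpe problem'.
BANNED_SUBSTRINGS = ["sex", "porn"]  # illustrative list, not a real filter's

def is_blocked(text):
    lowered = text.lower()
    return any(s in lowered for s in BANNED_SUBSTRINGS)

print(is_blocked("Sexual health clinic opening hours"))  # True - useful page blocked
print(is_blocked("Essex county council"))                # True - 'sex' hides inside 'Essex'
print(is_blocked("Hamlet, Prince of Denmark"))           # False
```

The filter has no notion of context: it cannot distinguish a sexual health charity from a pornographic site, or a place name from an obscenity, which is precisely the problem the next paragraph describes.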
The citizens of Scunthorpe, a small town in Lincolnshire, have run into problems with unsophisticated filters, as have sexual health charities and politicians with unfortunate names. An investigation by the Telegraph into smartphone filters reveals that some UK carriers block LGBT sites because they deal with 'mature content'. Access to certain sites, including LGBT news sites, proves problematic for minors who cannot authorise the removal of the filters, or for users with company phones who may not want to approach their employers about the blocks.

Furthermore, questions must be asked about the accountability and oversight of these filtering technologies. Requests for filtered material on the Internet return a “404 error” and do not inform the user as to why the material has been blocked. 404 ('file not found') and 403 ('access denied') codes mean completely different things to different types of users. Would the IP address of someone trying to access a blocked site, even the Virgin Killer page on Wikipedia, be stored and added to a database for the police to ‘maintain’? What if someone sends you a link in a spam email and you accidentally click on it? Would you be added to a database of potential paedophiles?

In the real world, the offline world, pornography is regulated on television by a very small and often unheard-of regulator called the Authority for Television on Demand (ATVOD). ATVOD is the independent co-regulator for the editorial content of UK video on demand services that fall within the statutory definition of On Demand Programme Services. ATVOD gets a whopping two complaints a day. No complaint has yet resulted in a finding of a breach of content standards. It has four members of staff and a budget of £400,000. In terms of content, the requirements are much less intensive than those applicable (under EU and UK law) to television broadcasting, and deal only with incitement to hatred and the protection of minors.
However, the requirement that programmes which ‘might seriously impair the physical, mental or moral development of minors are only made available in such a way as to ensure that minors will not normally hear or see’ the content is relevant. ATVOD has suggested blocking websites that do not offer age-restriction protection to visitors from the UK. Although it seems counter-intuitive for porn sites in the US to be subject to UK law, in R v Waddon the court ruled that "the content of American websites could come under British jurisdiction when downloaded in the United Kingdom".

"The free stuff is the shop window," says Peter Johnson, ATVOD Chief Executive, referring to free pornographic content on websites accessible in the UK. "If you're offering [hardcore pornography] in your shop window, you're breaking UK law. Even if you're not in the UK, you're breaking UK law because our children can access it. Therefore your shop is trading illegally. Therefore funds should not be flowing from the UK to your shop, because your shop is fundamentally operating in an unlawful capacity."

The ATVOD approach of demanding age-restriction protection from websites accessible in the UK that provide access to hard-core pornography seems far more effective and fair than a default ‘on’ filtering system – as long as ATVOD is accountable.