Filtering, not Censoring
22nd July 2013
I'm strongly against censorship on the internet. The company I work for and I run a system hosting over 10,000 websites, which their administrators edit themselves and which vary enormously in content. I get quite a few requests from people wanting sites shut down, and sometimes we're threatened with defamation suits, but short of a court order or the site owner breaking our terms and conditions, we've so far resisted these requests.
However, I find myself agreeing with at least part of the initiative David Cameron announced today.
As I understand it, there are two main initiatives. The first is an 'opt-out' adult content filter to be put in place by all ISPs: by default, any broadband connection would block content deemed unsuitable for children unless the person who pays for the connection has told the ISP otherwise.
I don't have a problem with this, provided it's done carefully and its limitations are understood. The main limitation is this: it won't stop a determined teenager from accessing the content he or she wants - install TOR on your laptop, job done. What the filter will do, in 95% of cases, is stop your 8- or 9-year-old from stumbling across content that really isn't suitable for them.
I personally will happily opt out - I have no kids at home, and I'm an adult who wants to view adult content. In my view, the content filtered shouldn't just be pornography: it might also include race-hate material, violent (18-rated) films, or upsetting news images of faraway famines and massacres. I want my version of the internet warts and all, and as an adult I need feel no shame in saying so. If the 'opt-out' adult content filter is to succeed, there must be no stigma attached to opting out.
As such, I don't see any difference between this measure and placing lads' mags on the top shelf of the newsagent, or restricting swearing and violence on TV to after the 9 o'clock watershed. It's a perfectly sensible way of protecting our kids from things we don't want them to know about yet. Opting out should be seen in the same light: no more sinister than watching telly after 9pm.
It's also flexible, and therefore not a nanny-state measure - if you prefer to opt out and then manage the settings on your own system to control what your kids see, that's entirely up to you, and again should carry no stigma.
What kind of content should be filtered needn't be especially contentious, because nothing is being censored - it's merely being filtered. There is occasional debate over whether the BBC gets its watershed right, or whether WH Smith should put Vogue on the top shelf to keep it away from young girls at risk of anorexia, and I'd expect the adult content filter to attract much the same level of controversy.
One argument against the filter is that over-active blocking might prevent abused children from reaching the services meant to help them - we should counter this by fine-tuning the filter and making sure those avenues of help are better promoted elsewhere.
The second initiative is much, much more problematic. It involves extending the range of images which are illegal to possess. At the moment this covers photographs of under-sixteens involved in sexual activity, which is fairly easy to police, and images deemed 'extreme' - bestiality, necrophilia and acts which injure someone - which is much harder to police. However, extending this to images of rape or simulated rape seems almost impossible to define (even though it has apparently already been done here in Scotland). As many articles have pointed out, there are plenty of films in which a rape scene is inherent to the plot - surely these can't be considered illegal?
My great fear is that this would grow arms and legs - there have already been cases where the laws on obscenity have been used (such as this thankfully unsuccessful one) to prosecute someone for possessing images of consensual and entirely legal sexual activity.
If adults wish to exchange images of any consensual sexual activity, they should be able to do so without fear of prosecution. That those images may be staged to look non-consensual should make no difference - this article outlines some examples and arguments.