Monday, April 14, 2008

Censorship: Thinking of the Children

The Guardian reports that Dr Byron has not confirmed her intention of checking on government implementation of her reforms by 2011. This may be wise – her ministerial sponsor, Ed Balls, will soon be in the Treasury to sort out the mess he allowed to develop in the last decade (“goodbye, Darling”), and her Prime Ministerial simperer will be out of Downing Street as soon as he calls an election, by 2010 at the latest. There will perhaps be a Communications Bill as early as 2011, and the agenda will have moved beyond her ‘enforced self-regulation’ (sic) or co-regulation, to full-blown regulation and its outer limits. Reviews of a barely living media literacy strategy and a grand Council for Child Internet Safety will by then have somewhat faded from the political agenda.

It's worth examining the proposal in full. In her Impact Assessment, in the (unnumbered) Table at paragraph 3.121, the civil servants have persuaded the project to adopt six options, including ‘do nothing’, the holy trinity (regulate/co-regulate/self-regulate), and two agency options: a new agency or Ofcom. The assessment dismisses agencies as too independent of government, and therefore unable to exercise the political influence needed to engage disparate departments in ‘joined up government’. The same objection rules out pure self-regulation, while of course regulation is too inflexible (until 2011?). Therefore, “on balance” – though no formal method is ever revealed for this impact assessment outcome – the decision is to transfer the Home Secretary’s Task Force on Child Protection on the Internet (“HSTF”) into the “multi-stakeholder council”, the Council for Child Internet Safety. She states at Paragraph 3.122:
“this, broadly speaking, is a self-regulatory approach with industry and government working in partnership”
Crucially, she states that “the Council would need to think carefully about who was best-placed to monitor compliance with industry standards.”

Quite right – and who sets these industry standards? The report considers them in Chapter 4, and it is here that we arrive at the crux of the matter: enforced self-regulation – which Byron admits means that non-UK actors cannot join in the full work of the strategy. Given the preponderance of US-based actors in this sector, including all the major social networks and the large ISPs (excepting French Wanadoo, Italian Tiscali and UK-based BT), one might have thought this a pretty powerful argument, but the need to link political to regulatory to parental strategy (what a camel this will be with this Council!) overcomes considerations of international political economy.

So what censorship and codification is envisaged? Well, not censorship by ISPs, yet. “I do not recommend that the UK pursue a policy of blocking non-illegal material at a network level at present. However, this may need to be reviewed if the other measures in this report fail to have an impact” on children viewing inappropriate content. It would have helped to have had rather more quantifiable goals, but she leaves the test opaque enough to allow for political judgments (Para 4.60).
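For readers wondering what ‘blocking non-illegal material at a network level’ would actually involve, here is a minimal sketch in Python of an ISP-side filter that checks each requested URL against a blocklist before fetching it. Everything here is hypothetical – the hostnames, the blocklist and the handler – and real deployments are considerably more elaborate; the point is simply that the censorship decision moves from the home to the network.

# Illustrative sketch only: how an ISP-side URL filter might gate requests.
# The blocklist entries and hostnames below are hypothetical.
from urllib.parse import urlparse

# A hypothetical blocklist of hosts deemed "inappropriate" (not illegal).
BLOCKLIST = {
    "example-adult-site.test",
    "example-violence-site.test",
}

def is_blocked(url: str) -> bool:
    """Return True if the URL's host appears on the network-level blocklist."""
    host = urlparse(url).hostname or ""
    # Match the host itself or any subdomain of a blocked host.
    return any(host == h or host.endswith("." + h) for h in BLOCKLIST)

def handle_request(url: str) -> str:
    """Refuse blocked URLs; pass everything else upstream."""
    if is_blocked(url):
        return "403 Forbidden (blocked by network filter)"
    return "200 OK (fetched upstream)"

print(handle_request("http://example-adult-site.test/page"))  # blocked
print(handle_request("http://news.example.test/story"))       # allowed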

So what other measures are proposed? Well, in a blithe overriding of the E-Commerce Directive, she suggests that companies “should not hide behind the law” (P4.18) when they could monitor content beyond the Article 14 protections: “It seems fair for companies to balance the benefits of making their sites safer for children, and the added value this brings to their brand, against the risk of liability”. Yes, but what has it to do with better regulation? It is companies’ own decision, unless and until government drops the guillotine she holds over them in P4.60.

So what else if companies do decide to take advantage of protections against liability offered by a settled decade-old European policy? Well, “Having filters set on by default would not make parents engage” – phew! No censorship by default. But all computer buyers must receive the software pre-installed, as in France: “since 2004, the French government has required all ISPs to provide their customers with filtering software”. Note that no French evidence appears to have been presented to the Byron Review, so this appears to be second-hand (I am happy to be corrected if this is not so).

There is a stick to this voluntary system in P4.75:

"if these approaches, which seek to engage parents with the issues and available tools fail to have an impact on the number and frequency of children coming across harmful or inappropriate content online within a three year timeframe, I suggest that Government consider pursuing a policy of requiring content filters on new home computers to be switched on by default."

In Search, the review appears to go against the Information Commissioner's and Article 29 Working Party's attempts to prevent excessive tracking by search providers – specifically their recent recommendation that data be deleted or irrevocably anonymised after six months. By contrast, Byron wants ‘safe search’ settings applied BY DEFAULT, which would require the search provider either to keep a permanent record against that IP address or to maintain a permanent cookie. In particular she recommends (P4.81) that industry work towards systems that “give users the option of ‘locking on’ safe search on to a particular computer; and develop ways for parental control software to automatically communicate with search engines so that safe search is always on when the child uses the computer”.
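To make concrete what ‘locking on’ safe search implies for data retention, here is a minimal sketch, assuming a search engine that honours a safe=active query parameter (the convention Google used) and persists the lock in a long-lived cookie. The cookie name and functions are hypothetical; the point is that the lock only works if the provider recognises that browser, or that IP address, on every visit – precisely the permanent record the Working Party recommendation would cut across.

# Illustrative sketch: how a "locked-on" safe search might be persisted.
# The cookie name ("safesearch_lock") is hypothetical; safe=active mirrors
# the query-parameter convention some engines used for filtered results.
from http import cookies
from urllib.parse import urlencode

def build_search_url(query: str, locked: bool) -> str:
    """Append the safe-search flag to every query when the lock is on."""
    params = {"q": query}
    if locked:
        params["safe"] = "active"  # force filtered results
    return "https://search.example.test/?" + urlencode(params)

def lock_cookie_header() -> str:
    """A long-lived cookie the provider must keep to honour the lock –
    exactly the permanent record a six-month deletion rule would forbid."""
    c = cookies.SimpleCookie()
    c["safesearch_lock"] = "on"
    c["safesearch_lock"]["max-age"] = 10 * 365 * 24 * 3600  # ~10 years
    c["safesearch_lock"]["path"] = "/"
    return c["safesearch_lock"].OutputString()

print(build_search_url("homework help", locked=True))
print("Set-Cookie:", lock_cookie_header())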


Whatever the technical complexity for providers, the complexity for users is likely to increase, and the danger of broad abuse is high (it's very easy to imagine the child locking the computer so that he can access uncensored results but the parent cannot, “handcuffing” the censor with false information – boy hackers will be boy hackers).
