If you were to believe the UK government, there are two types of people. In one category you have law-abiding citizens, whose every movement, communication and social network activity must be monitored and digitally analyzed to keep them in check, for their own good. In the other category you have murderers, pedophiles and terrorists. If you object to belonging to the first category, you must therefore be part of the second, or at least a partner in crime of the scoundrels identified in category two. Such, at least, was the unbelievably backward rhetoric coming from parts of the UK government not too long ago. To make sure society runs smoothly, the government devised the Communications Data Bill, aka the “Snooper’s Charter”, which would enable mass surveillance of digital communications.
As Glyn Moody noted, the Snooper’s Charter has been declared effectively dead after Liberal Democrat leader Nick Clegg announced his party would not support the Bill following heavy scrutiny by two critical parliamentary committees. The debate on digital surveillance is far from over, however, as several branches of law enforcement will continue to push for ubiquitous interception because it is ‘useful’. And of course, conveniently forgetting about proportionality when dreaming up laws to use or control digital technology has become an all too common pattern worldwide.
The Open Rights Group, a UK-based sister organization of the EFF, has released a report and a series of particularly funny videos aimed at putting an end to the Snooper’s Charter, and at informing policy makers and the public at large about how the debate on digital surveillance should be conducted (disclaimer: I helped compile this report).
In the report, twelve experts from different fields explain clearly how and why digital surveillance has come about, what its intent is, and why mass surveillance such as that proposed by the Snooper’s Charter is probably the worst possible next step to take, considering the ability of current technology to effectively monitor everyone and everything.
Journalist and surveillance expert Duncan Campbell puts the Snooper's Charter in historical perspective and explains:
“The manner in which the new Bill has been introduced and managed falls full square within long British historical precedents that position privacy rights as an irritant to be managed by a combination of concealment, secrecy, information management, and misinformation.”
One of the most notable features of the Snooper’s Charter is its de facto centralized search engine – the “Filter” – which would scour numerous public and private datasets to analyze communications in depth. Cambridge University computer scientist Richard Clayton explains:
“It is fundamentally inherent to this proposal that Filter data should be collected on everyone’s activity and that this data should be made available en masse from the private companies, the Internet Service Providers and telephone companies that provide services, to government systems for the correlation processing.”
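To make the phrase “correlation processing” a little more concrete, here is a deliberately trivial sketch of how metadata handed over by different providers can be merged into a single searchable contact graph. The records, identifiers and names are all invented for illustration; this is in no way the actual design of the Filter.

```python
# Illustrative toy example only: merging communications metadata from
# several providers into one queryable social graph. All data is invented.
from collections import defaultdict
from itertools import chain

# Metadata records as each provider might hold them: (sender, recipient, timestamp)
isp_records = [
    ("alice@example.net", "bob@example.net", "2013-01-14T09:02"),
    ("bob@example.net", "carol@example.net", "2013-01-14T09:40"),
]
telco_records = [
    ("+441632960001", "+441632960002", "2013-01-14T10:15"),
]

# A subscriber lookup linking identifiers from different providers to people
subscribers = {
    "alice@example.net": "Alice", "+441632960001": "Alice",
    "bob@example.net": "Bob",     "+441632960002": "Bob",
    "carol@example.net": "Carol",
}

# Correlate: collapse every record into a single contact graph keyed by person
contacts = defaultdict(set)
for sender, recipient, _timestamp in chain(isp_records, telco_records):
    contacts[subscribers[sender]].add(subscribers[recipient])
    contacts[subscribers[recipient]].add(subscribers[sender])

# One query now reveals a person's social circle across all providers
print(sorted(contacts["Bob"]))  # ['Alice', 'Carol']
```

Even this toy version shows why the experts are worried: once identifiers from different providers have been linked, a single query exposes an entire contact network, and scaling that up to a whole population is only a matter of adding more records.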
Information privacy rights advocate Caspar Bowden does not mince any words:
“It ought to be obvious that continuously recording the pattern of interactions of every online social relationship, and analyzing them with the “Filter”, is simply tyrannical.”
Rachel Robinson from “Liberty”, the National Council for Civil Liberties, considers what this type of surveillance will likely lead to:
“If the present proposals for the collection of communications data become law, proposals for other types of blanket or random surveillance irrespective of suspicion “just in case” are a logical next step.”
Professor Peter Sommer explains one of the underlying problems:
“Legislators need knowledge of the technical capabilities of surveillance technologies”, because “the legal words need to reflect the reality of how the technology works.”
Joss Wright, computer scientist at the Oxford Internet Institute, notes a fundamental and frequently repeated mistake in thinking about regulating internet technology:
“Equating the Internet with historical technologies when making policy is not simply wrong, it is dangerously misleading.”
Together with Professor Emmenthal below, policy makers should finally start realizing that “technology’s interaction with the social ecology is such that technical developments frequently have environmental, social, and human consequences that go far beyond the immediate purposes of the technical devices and practices themselves […]” (Kranzberg, 1986). Fortunately, the Open Rights Group has drawn up ten clear recommendations for taking the discussion on digital surveillance law forward, many of which will be applicable in other countries as well.