Works of pure theory in Anglophone European internet law scholarship are fairly rare, and those that exist often come from scholars whose background lies in a field other than traditional law, e.g. sociology, politics or criminology. While some of this work is excellent, it can lack a full understanding both of the nuances of legal analysis and of the realities of commercial legal culture. For all these reasons, it is to be warmly welcomed that, in what one might call the second stage of his distinguished career, Chris Reed, one of Europe’s leading researchers into the more commercial and practical aspects of internet law, has turned his years of experience in helping both draft and critique European internet and e-commerce laws towards theorising how to regulate for the online world. He has done so in a series of pieces which so far include Taking Sides on Net Neutrality; The Law of Unintended Consequences – embedded models in IT regulation; and, most recently, How to Make Bad Law: Lessons from Cyberspace. The latest of these pieces (which are, I believe, destined eventually to form a book on regulation) ((Usefully, Reed has also posted on his blog his use of these pieces in teaching a coherent course on Internet law and regulation, along with extra conclusions and slides.)) appeared in late 2010 and takes on the near-cliché of internet law that “what is legal offline should also be legal online” – more formally, the principle of equivalence. While it is something of a kneejerk assumption in many domains, notably freedom of speech, that this approach is axiomatically mandatory, Reed dissects the desirability, the applicability and, perhaps most interestingly, the failures of the principle in the context of the history of (mainly European) internet regulation.
As a starting point, Reed defines equivalence as “an approach in which all laws and regulations should, so far as possible, be equivalent online and offline. In other words, the same legal principles should regulate an online technology activity as those which applied to the equivalent offline technology activity.” His first point is that this should not be confused with the similarly popular notion of technology neutrality. “Technology neutrality addresses the choice between the available substantive rules which could be used to implement … legal principles,” while equivalence, in his view, is about choosing those legal principles for regulating the online world in the first place. Equivalence therefore takes precedence in the regulatory toolkit and is arguably the more important issue to get right. Reed also muses as to whether a distinction is needed between “technology indifference” – an “attempt … to define a rule in such a way that it applies equally well to the activity whatever technology is used to undertake it” – and a concept he does not name but which I will call technology non-discrimination: “a legislative aim that the rules should not discriminate between technologies and should continue to apply effectively even if new technologies are developed.” A good example of problematic regulation which might have been elucidated by applying these concepts lies in the recent controversial redrafting of the cookies provision of the EU Privacy and Electronic Communications Directive (art 5(3)), where, despite frequent claims of technology neutrality, the results have been nothing of the kind, either initially or after reform.
Returning to equivalence, though, Reed makes a cogent distinction between “pure” equivalence and “result” equivalence (a concept which, one might hazard, draws partly on feminist legal theory and partly on the comparative law doctrines of e.g. Zweigert and Kötz). Applying exactly the same rules on- and offline will often simply produce a mess, given the huge differences in environment – one of the best examples being the attempt in early jurisprudence to map ISPs and hosts of unlawful defamatory material onto newspaper or TV publishers, with consequent full liability. Instead, Reed points us towards “functional equivalence,” where the idea is to get the new online rule right by making sure that, even if it is formally or even substantively quite different, it achieves the same result online as the existing rule does offline. This raises the further problem that, in Reed’s view, “equivalence” is often most neatly met by having one rule for both online and offline activities, with the practical result that the offline rule must be revised (and generalised?) to cover both domains. Where the rule brings in entirely new regulation, this may be politically plausible – Reed’s example is the UK Terrorism Act 2006, which introduced the new offence of disseminating terrorist publications and applied it to hard-copy and electronic versions simultaneously – but in other cases it may require political will or judicial happenstance, and may happen only very slowly, or never. One success story Reed cites is the adaptation of the English common law of fraud by the UK Fraud Act 2006 to deal with the problem that in online fraud the fraudster rarely knows the mental state of his victim (the offence was redrafted to pivot solely on the intention of the fraudster). This, however, depended on funded law reform by the English Law Commission, whose time and resources are finite (as are, one imagines, those of similar national bodies). Such holistic reform may often simply not be possible.
But the biggest problem with “functional equivalence” is how to define what is functionally the same scenario to be regulated. This is hard enough offline; online, it is a corker. Reed highlights one of the most obvious problems: that of categorisation. Is a search engine, for example, a piece of essential infrastructure, like water or gas supplies; a distributor of electronic content, like an ISP or host; an intentional recopier of copyright material, possibly without the permission of the rightsholder; a publisher, like a newspaper, with the freedom-of-speech privileges that implies; or the virtual equivalent of a physical trespasser? Get this wrong (as the Belgian courts notoriously did in Copiepresse) and you have a scenario where the internet disappears in a deluge of unparseable material and the digital society vanishes. One sleight of hand Reed doesn’t mention is to avoid the categorisation problem by explicitly regulating only functions, not those who undertake them. This is largely what the DMCA and the EU E-Commerce Directive do to deal with the problems of online intermediary liability: a strategy that has led to much testing of limits, yes, and which of course essentially passes the buck back to the courts (cf. Napster, Grokster, Google AdWords, L’Oreal v eBay at the ECJ, et al.), but which has at least had some durability about it.
But sometimes there simply is no functional equivalent between the online and offline worlds. What then? How can we tell when equivalence simply won’t work? Here Reed’s analysis does falter. In his view, for example, there are “no major theoretical obstacles” to regulating copyright online and offline by “equivalent” rules (p. 269): the problem is procedural, not substantive, namely the restraining influence of international treaties, which prevents states from going it alone with their own most appropriate solutions (a bit like Greece being stuck in the Eurozone). This writer would beg to differ: one of Reed’s own criteria for applying equivalence is that there be a balance of interests among stakeholders which can be identified offline and mirrored online, and it is hard to see how this is possible in the current online content wars, where the balances are entirely skewed from the offline ones by easy copying, easy distribution, anonymity and encryption (to name but a few factors). But, these cavils aside, this is a rare and enormously useful primer on how to regulate for the internet – one wishes some elected representatives could be forced to read a copy.