

Cybersecurity through Information Theory

Derek E. Bambauer, Conundrum, 96 Minn. L. Rev. ____ (forthcoming 2012), available at SSRN.

It is rare to find satisfying cybersecurity scholarship. This is not the fault of the talented scholars who have written in this field. I am a fan of the work of many who have tried to lead us to legal and geopolitical solutions to the problems of viruses, worms, botnets, cyberwar, and cyberterrorism. But these individuals have had their considerable talents stymied by cybersecurity’s fundamental knowledge problems. To make a useful contribution, an author must understand technical concepts famous for their complexity, from TCP/IP to BGP, and be able to untangle complex relationships like those between the FBI and the NSA, and between the United States and China. Even worse, cybersecurity scholars can never know whether they have the details right, because these topics are shrouded in layers of official and de facto secrecy.

For these reasons, I have never felt entirely satisfied by a single work about cybersecurity, at least not until now. Derek Bambauer has written a fine article about this topic entitled Conundrum, available on SSRN and forthcoming in the Minnesota Law Review. The article points toward a more interesting and more productive way forward for cybersecurity scholarship and discourse.

Reviewing the state of cybersecurity scholarship, Bambauer helpfully diagnoses an underappreciated narrowness in past approaches: scholars and policymakers have too often treated cybersecurity as a problem of infrastructure alone. They focus only on macro-level, technological concerns, asking questions like: How can we detect the source of a cyberattack? How easily can we quarantine a troublesome part of the network? And how well do domestic and international law provide tools to bring a cyberterrorist or cyberwarrior to justice?

As Bambauer demonstrates, this narrow focus pushes scholars to adopt correspondingly narrow legal and political frames, meaning cybersecurity is seen primarily as best addressed with “well-established, comfortable, yet poorly-fitting models from criminal law, national security law, and military law.” Id. at 10. Solutions built upon these frames focus, to the exclusion of almost anything else, on preventing, detecting, and stopping cyberthreats, which means they lead almost inexorably to calls to “fix the attribution problem” online, calls to bolt on some new protocol to the Internet to destroy the network’s inherent untraceability. Not only are these solutions unlikely ever to come to pass, but if they did, they would strike a blow against things we value, like generativity, privacy, and the ability to resist dictators and tyrants.

Bambauer breaks free of these narrow frames by connecting cybersecurity to information theory. He peels away the hard shells surrounding the fiber-optic cables that make up our international infrastructure to reveal their delightful chewy centers: the communications we are trying to protect. This move makes enormous sense. After all, we are not protecting cables because we like cables; rather, we are trying to ensure that after a crippling cyberattack, we can still send email and text messages, place telephone calls, and access websites, databases, and control systems.

This reframing also allows Bambauer to connect cybersecurity to a rich intellectual history, from Claude Shannon to George Akerlof, Joseph Stiglitz, Michael Spence, and beyond. Building on the work of these thinkers, Bambauer asks a critical question that too often goes unanswered in cybersecurity debates: what exactly are we trying to protect? To this essential and underexplored question, he provides three answers: access, alteration, and integrity.

Technical experts in information security won’t be very impressed with the novelty of this list, as it echoes the venerable information security triad: confidentiality, integrity, and availability. But Bambauer helps us realize that non-technical experts in law and policy have been proposing solutions that protect these goals only indirectly. Attribution allows us to trace the source of an attack—except when it doesn’t—which helps us find, stop, and bring to justice our attackers—except when we can’t—which deters others, thereby making our network safer—except when it won’t.

Bambauer’s singular focus on information allows him to find much more direct and narrow ways to protect access, alteration, and integrity, proposals that sound very different from those that have come before. His guiding principle is redundancy. (He calls it inefficiency, but more on that in a minute.) Data and networks should be made more redundant than an unregulated market would produce. As a tenet of national and international policy, we should encourage and sometimes mandate technological redundancy, forcing businesses by regulation to create and disperse more copies of their data and to establish more network interconnections than they would otherwise choose, perhaps paid for by government subsidy.

This is a sound prescription, and we should focus our energy on ways to recover quickly from an attack rather than think only about prevention or retaliation. What I like most about this focus is it helps cure cybersecurity’s knowledge problem, by focusing our attention more on facts that are readily available—how interconnected are the nation’s networks and critical data centers?—and less on facts that we civilians can never know—how powerful are China’s infowar capabilities?

The article, of course, isn’t perfect. First, when Bambauer talks of redundancy, he uses the term inefficiency. He does this, I think, to suggest that his proposals are counterintuitive and maybe even radical; it seems a heresy of the first order to argue for inefficiency in this law-and-economics-drenched age. But as he himself notes throughout his paper, computer security experts have recognized the importance of redundant systems for decades. The Internet itself was architected in part on the principle of robustness through redundant links. Bambauer isn’t being a radical here; he is simply importing neglected principles from information security, principles too long underappreciated by legal scholars and policymakers. By choosing the surprising term over the conventional one, Bambauer obscures his contribution.

Second, Bambauer is wrong if he suggests that the solution to cybersecurity will be found in information theory alone. Criminal law, national security law, and military law must play a role, and the “problem of attribution” isn’t irrelevant.

Despite these mostly cosmetic flaws, the important lesson of the article is that we need to think of cybersecurity as a problem we should view through two different lenses. But there is no reason to confine this lesson solely to cybersecurity. This article should help us remember that every cyberlaw policy dispute can be seen through the same dual lenses, one focused on infrastructure and the other on content and communications. This echoes but expands upon Orin Kerr’s important article about the “internal” and “external” views of cyberlaw. Orin S. Kerr, The Problem of Perspective in Internet Law, 91 Geo. L.J. 357 (2003). From net neutrality, to Wikileaks, to online privacy, to whatever we are worrying about tomorrow, we should always view our debates through these two lenses, the infrastructural and the informational, the technological and the human, the network and the social. Two lenses give us the stereoscopic vision to make better sense of cybersecurity, as Bambauer convincingly demonstrates; they can probably do the same for many other great problems of the day.