The Politics of Deep Packet Inspection: What Drives Surveillance by Internet Service Providers?

Today, I am happy to make my completed doctoral dissertation available to the public. The dissertation examines what drives, and hinders, the wireline network practices that are enabled by Deep Packet Inspection (DPI) routers. Such routers are in wide use by Internet service providers (ISPs) in Canada, the United States, and the United Kingdom, and offer the theoretical capacity for service providers to intrusively monitor, mediate, and modify their subscribers’ data packets in real or near-real time. Given the potential uses of the routers, I was specifically interested in how the politics of deep packet inspection intersected with the following issues: network management practices, content control and copyright, advertising, and national security/policing.

Based on the potential capabilities of deep packet inspection technologies – and the warnings that such technologies could herald the ‘end of the Internet’ as it is known by citizens of the West – I explored what has actually driven the uptake of the technology in Canada, the US, and the UK. I ultimately found that though there were variations in different states’ regulatory processes, regulators tended to arrive at common conclusions. Regulatory convergence stands in opposition to the divergence that arose as elected officials entered into the DPI debates: such officials have been guided by domestic politics, and tended to reach significantly different conclusions. In effect, while high-expertise regulatory networks reached common conclusions, elected political officials have demonstrated varying degrees of technical expertise and instead have focused on the politics of communications surveillance. In addition to regulators and elected officials, court systems have also been involved in adjudicating how, when, and under what conditions DPI can be used to mediate data traffic. Effectively, government institutions have served as the primary arenas in which DPI issues are taken up, though the involved government actors often exhibited their own interests in how issues were to be taken up or resolved. The relative role of these different state bodies in the case studies arguably reflects underlying political cultures: regulators are principally involved in the Canadian situation, elected officials and courts play a significant role in the US, and the UK has principally seen DPI debates settled by regulators and elected officials.

Ultimately, while there are important comparative public policy conclusions to the dissertation, such conclusions only paint part of the picture about the politics of deep packet inspection. The final chapter of the dissertation discusses why the concepts of surveillance and privacy are helpful, but ultimately insufficient, to appreciate the democratic significance of deep packet inspection equipment. In response, I suggest that deliberative democratic theory can provide useful normative critiques of DPI-based surveillance. Moreover, these critiques can result in practical policy proposals that can mitigate DPI-based practices capable of detrimentally stunting discourse between citizens using the Internet for communications. The chapter concludes with a discussion of how this research can be advanced in the future; while I have sought to clear away some of the murk concerning the technology, my research represents only the first of many steps to reorient Internet policies such that they support, as opposed to threaten, democratic values.

Formal Abstract:

Surveillance on the Internet today extends beyond collecting intelligence at the layer of the Web: major telecommunications companies use technologies to monitor, mediate, and modify data traffic in real time. Such companies functionally represent communicative bottlenecks through which online actions must pass before reaching the global Internet and are thus perfectly positioned to develop rich profiles of their subscribers and modify what they read, do, or say online. And some companies have sought to do just that. A key technology, deep packet inspection (DPI), facilitates such practices.
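The dissertation’s opening chapter explains how DPI functions in detail; as a toy illustration of the core idea – a DPI-capable device scans the application-layer payload of a packet, not just its headers, to classify or act on traffic – consider the following Python sketch. The signatures, packet layout, and labels here are invented for illustration and are not drawn from the dissertation or any vendor’s implementation.

```python
# Illustrative payload-signature matching, the step that distinguishes
# "deep" inspection from ordinary header-based routing. A real DPI
# appliance uses far richer heuristics; these byte patterns and the
# fixed 20-byte header are simplifying assumptions.

SIGNATURES = {
    b"BitTorrent protocol": "bittorrent",
    b"GET ": "http",
    b"\x16\x03": "tls-handshake",
}

def classify(packet: bytes) -> str:
    """Label a packet by scanning its payload for known byte patterns."""
    payload = packet[20:]  # skip the (pretend) 20-byte header
    for pattern, label in SIGNATURES.items():
        if pattern in payload:
            return label
    return "unknown"

# A header-only inspector could not tell these flows apart;
# payload inspection can.
print(classify(b"\x00" * 20 + b"\x13BitTorrent protocol..."))  # bittorrent
```

Once traffic is classified this way, the same device can throttle, block, log, or modify it – which is precisely why the technology raises the monitoring, mediation, and modification concerns discussed above.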

In the course of evaluating the practices, regulations, and politics that have driven DPI in Canada, the US, and UK, it has become evident that the adoption of DPI tends to be dependent on socio-political and economic conditions. Simply put, market or governmental demand is often a prerequisite for the technology’s adoption by ISPs. However, the existence of such demand is no guarantee of the success of such technologies; regulatory or political advocacy can lead to the restriction or ejection of particular DPI-related practices.

The dissertation proceeds by first outlining how DPI functions and then what has driven its adoption in Canada, the US, and UK. Three conceptual frameworks (path dependency, international governance, and domestic framing) are used to explain whether power structures embedded in technological systems themselves, international standards bodies, or domestic politics are principally responsible for the adoption of, or resistance to, the technology in each nation. After exploring how DPI has arisen as an issue in the respective states, I argue that though domestic conditions have principally driven DPI’s adoption, and though the domestic methods of governing DPI and its associated practices have varied across cases, the outcomes of such governance are often quite similar. More broadly, I argue that while the technology and its associated practices constitute surveillance and can infringe upon individuals’ privacy, the debates around DPI must more expansively consider how DPI raises existential risks to deliberative democratic states. I conclude by offering some suggestions on mitigating the risks DPI poses to such states.

Download ‘The Politics of Deep Packet Inspection: What Drives Surveillance by Internet Service Providers?’ (.pdf)

Controversial Changes to Public Domain Works

Photo credit: Muskingum University Library

A considerable number of today’s copyfight discussions revolve around the usage of DRM to prevent transformative uses of works, to prevent the sharing of works, and to generally limit how individuals engage with the cultural artefacts around them. This post takes a step back from that, thinking through the significance of transforming ‘classic’ works of the English literary canon instead of looking at how new technologies butt heads against free speech. Specifically, I want to argue that NewSouth, Inc.’s decision to publish Huckleberry Finn without the word “nigger” – replacing it with “slave” – demonstrates the importance of works entering the public domain. I refrain from providing a normative framework to evaluate NewSouth’s actual decision – whether changing the particular word is good – and instead use their decision to articulate the conditions constituting ‘bad’ transformations versus ‘good’ transformations of public domain works. I will argue that uniform, uncontested, and totalizing modifications of public domain works are ‘bad’, whereas localized, particular, and discrete transformations should be encouraged given their existence as free expressions capable of (re)generating discussions around topics of social import.

Copyright is intended to operate as an engine for generating expressive content. In theory, by providing a limited monopoly over expressions (not the ideas that are expressed), authors can receive some kind of restitution for the fixed costs that they invest in creating works. While it is true (especially in the digital era) that marginal costs trend towards zero, pricing based on marginal cost alone fails to adequately account for the sunk costs of actual writing. Admittedly, some do write for free (blogs and academic articles in journals might stand as examples) but many people still write with the hope of earning their riches through publications. There isn’t anything wrong with profit motivating an author’s desire to create.

Continue reading

Decrypting Blackberry Security, Decentralizing the Future

Photo credit: Honou

Countries around the globe have been threatening Research in Motion (RIM) for months now, publicly stating that they would ban BlackBerry services if RIM refuses to provide decryption keys to various governments. The tech press has generally focused on ‘governments just don’t get how encryption works’ rather than ‘this is how BlackBerry security works, and how government demands affect consumers and businesses alike.’ This post is an effort to more completely respond to the second focus in something approximating comprehensive detail.

I begin by writing openly and (hopefully!) clearly about the nature and deficiencies of BlackBerry security and RIM’s rhetoric around consumer security in particular. After sketching how the BlackBerry ecosystem secures communications data, I pivot to identify many of the countries demanding greater access to BlackBerry-linked data communications. Finally, I suggest RIM might overcome these kinds of governmental demands by transitioning from a 20th to 21st century information company. The BlackBerry server infrastructure, combined with the vertical integration of the rest of their product lines, limits RIM to being a ‘places’ company. I suggest that shifting to a 21st century ‘spaces’ company might limit RIM’s exposure to presently ‘enjoyed’ governmental excesses by forcing governments to rearticulate notions of sovereignty in the face of networked governance.

Continue reading

Draft: Code-Bodies and Algorithmic Voyeurism

I’ve recently been reading some of David Lyon’s work, and his idea of developing an ethic of voyeurism has managed to intrigue me. I don’t think that I necessarily agree with his position in its entirety, but I think that it’s an interesting position. This paper, entitled “Code-Bodies and Algorithmic Surveillance: Examining the impacts of encryption, rights of publicity, and code-specters,” is an effort to think through how voyeurism might be understood in the context of Deep Packet Inspection using the theoretical lenses of Kant and Derrida. This paper is certainly more ‘theoretical’ than the working paper that I’ve previously put together on DPI, but builds on that paper’s technical discussion of DPI to think about surveillance, voyeurism, and privacy.

As always, I welcome positive, negative, and ambivalent comments on the draft. Elements of it will be adopted for a paper that I’ll be presenting at a Critical Digital Studies workshop in a month or two – this is your chance to get me to reform positions to align with your own! *grin*

Conference Presentation: The Ontological Crisis of Melancholia

I’ll be presenting my paper “The Ontological Crisis of Melancholia: Searching for foundations in the ether of cyberspace” tomorrow at the (inter)disciplinarities: theory & crisis conference. If you have any thoughts or comments on the paper, feel free to drop me a line – I’m hoping to polish it over the next few months and then start shopping it around to a few journals. The abstract is below:

Abstract

In The Psychic Life of Power, Judith Butler argues that the power structures ordering individuals and states alike are predicated on a mourning that cannot be mourned; melancholia permeates the primary ordering structures of the individual and the state. Butler takes up this absence, and alerts us to the state’s reliance on citizens’ melancholia to support its continued being. The state, constituted by the melancholic, reasserts and normalizes the melancholia responsible for plunging the modern subject into its ontological crisis of Being; it perpetuates the subjects’ inability to authentically ground their selfhood.

In this paper, I ask whether digital environments are spaces that can facilitate the resolution of modern subjects’ ontological crisis, and thus might provoke the reconstitution of modern politics. In responding to this inquiry, I take up Butler’s analysis of mourning and melancholia and situate her politics of identity in the context of Cyberspace. Specifically, I investigate whether the modern subject can work through their crisis within the plasticity of digital spaces, or if these spaces only superficially present possibilities for working through crisis. In interrogating these possibilities, I consider how psychosocial norms of embodied life are (being) embedded throughout digital spaces, and reflect on the implications of state-held norms being reaffirmed in these new media environments. I conclude by adopting the stance that Cyberspace may enable some individuals to acknowledge their experience of melancholia, but stop short of claiming that the possibilities afforded by this space’s plasticity can or will provoke a widespread reconstitution of modern politics.

Who Decides ‘Analogue’ Citizenships?

Typically, when asked ‘who is responsible for setting citizenship rules’, two general answers fall out. On the one hand we might hear ‘the government is responsible for setting down citizenship regulations,’ and on the other we might hear ‘the people are responsible for establishing membership guidelines.’ The latter explicitly locates power in the hands of the people, whereas the former recognises that legitimised political bureaucracies and machinations are responsible for citizenship. In this post, I want to briefly look at some of the processes and theoretical discussions surrounding citizenship and immigration, and in particular how they relate to ‘Fortress Europe’ and a recent British controversy surrounding citizenship tests.

The Boundaries of Citizenship

Western nation-states have developed around liberal conceptions of citizenship. As a consequence, citizenship is associated with a particular legal status that requires members to fulfill a set of legally enforceable requirements. These requirements can include holding a certain amount of money, actively participating as a citizen (e.g. remaining active in the community into which one is being naturalised by volunteering, taking part in local politics, etc.), or being born in a particular geographic area.

Continue reading