I learned today that I was successful in winning a Social Sciences and Humanities Research Council (SSHRC) award. (Edit September 2009: I’ve been upgraded to a Joseph-Armand Bombardier Canada Graduate Scholarship.) Given how difficult I found it to find successful research statements (except through personal contacts), I wanted to post my own statement for others to look at (and download, if they so choose). Since writing the statement below, some of my thoughts on DPI have become more nuanced, and I’ll be interested in reflecting on how ethics might relate to surveillance/privacy practices. Comments and ideas are, of course, welcomed.
Interrogating Internet Service Provider Surveillance:
Deep Packet Inspection and the Confluence of International Privacy Regimes
Context and Research Question
Internet Service Providers (ISPs) are ideally situated to survey data traffic because all traffic to and from the Internet must pass through their networks. Using sophisticated data traffic monitoring technologies, these companies investigate and capture the content of unencrypted digital communications (e.g. MSN messages and e-mail). Despite their role as the digital era’s gatekeepers, very little work has been done in the social sciences to examine the relationship between the surveillance technologies that ISPs use to survey data flows and the regional privacy regulations that adjudicate permissible degrees of ISP surveillance. With my seven years of employment in the field of Information Technology (the last several in network operations), and my strong background in conceptions of privacy and their empirical realization from my master’s degree in philosophy and current doctoral work in political science, I am unusually well suited to investigate this relationship. I will bring this background to bear when answering the following interlinked questions in my dissertation: What are the modes and conditions of ISP surveillance in the privacy regimes of Canada, the US, and the European Union (EU)? Do common policy structures across these privacy regimes engender common realizations of ISP surveillance techniques and practices, or do regional privacy regulations pertaining to DPI technologies preclude any such harmonization?
Thesis – Literature
Given ISPs’ role in governing data networks, it is crucial to interrogate the methods and technologies they utilize to survey traffic flowing through their networks. Data sent on the Internet is separated into discrete packets that are shuttled to the message’s recipients and then reassembled at their destination. ISPs presently use Deep Packet Inspection (DPI) technologies to investigate each packet that enters and leaves their networks. These technologies effectively let ISPs open sealed letters (the packets), read their contents (the packets’ payload/message), reseal the letters, and then pass them to the recipients, so long as the contents are deemed ‘acceptable’ by ISPs’ evaluation heuristics.
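The letter metaphor can be made concrete with a toy sketch. The minimal example below (Python) parses an invented IPv4/TCP packet and matches the payload against a blocklist term; the packet layout is simplified (no IP options) and the banned string is hypothetical, so this illustrates the idea of shallow-versus-deep inspection rather than how any production DPI appliance actually works:

```python
def inspect_packet(packet: bytes, banned: bytes) -> str:
    """Toy 'deep' inspection: look past the headers into the payload.

    Assumes a minimal IPv4 + TCP packet with no IP options
    (20-byte IP header, 20-byte TCP header) -- a simplification
    for illustration only.
    """
    ip_header_len = (packet[0] & 0x0F) * 4  # IHL field, in 32-bit words
    tcp_offset = ip_header_len
    tcp_header_len = ((packet[tcp_offset + 12] >> 4) & 0x0F) * 4
    payload = packet[tcp_offset + tcp_header_len:]  # the 'letter' inside
    # Shallow inspection stops at the headers; DPI reads the payload too.
    return "blocked" if banned in payload else "forwarded"

# Build a fake packet: 20-byte IP header, 20-byte TCP header, payload.
ip_header = bytes([0x45]) + bytes(19)               # version 4, IHL 5
tcp_header = bytes(12) + bytes([0x50]) + bytes(7)   # data offset 5
packet = ip_header + tcp_header + b"GET /torrent-file HTTP/1.1"

print(inspect_packet(packet, b"torrent"))   # -> blocked
print(inspect_packet(packet, b"voip"))      # -> forwarded
```

The "resealing" in the metaphor corresponds to the fact that inspection leaves the bytes unchanged: a forwarded packet is passed on exactly as it arrived.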
ISPs operate in various privacy regimes. Each regime (e.g. Canada, America, the EU) has a unique legal and technological discourse, and a particular conception of the complexity, dynamics, and diversity of processing personal data (Bennett and Raab 2006). Accompanying these regimes are divergent understandings of permissible and impermissible degrees of surveillance, and Deibert et al. (2008) have shown that state mandates shape ISPs’ content filtering practices. Deibert et al. (2008) do not, however, address the expansive surveillance possibilities of DPI, instead limiting their work to (relatively) archaic methods of monitoring and blocking Internet content. Moreover, while some scholars address facets of privacy and surveillance regulation pertaining to digital networks (Haggerty 2006, Lace 2005, Lyon 2007, Solove 2004, 2008), they focus on theoretical abstractions and digital environments, such as social networking sites, without examining the surveillance capabilities of ISPs themselves.
Much of the empirical work pertaining to ISPs and data surveillance has surveyed people’s perceptions of surveillance and privacy. These surveys have concluded that people do not want their online activities tracked and conversations monitored (Pew Internet and American Life Project 2000), do not trust businesses to handle their personal information (Harris Interactive 2002), and that respondents generally perceive their privacy as ‘very important’ (EPIC 2005). These indicate what people think about surreptitious surveillance without investigating how ISPs inspect individuals’ data traffic. Recent articles interrogating DPI technologies and data traffic (Anderson 2008, Clayton et al. 2006, Rossenhovel 2008, Topolski 2008) have focused on overviews of DPI technologies, neglecting the particular privacy regimes these technologies function in. While some work exists that concerns the abstract regulation of digital systems (Lessig 2004, Galloway and Thacker 2008, Ohm 2008), it does not consider the specific regulatory situations in Canada, America, or the EU, nor does it reflect on the roles of various policy agents and their capacity to impact privacy and surveillance regulations in these privacy regimes.
By examining policy instruments, transnational actors, conceptual frameworks that motivate international privacy agreements, and legal decisions concerning privacy rights in Canada, the US, and the EU, commonalities and dissonant approaches to surveillance practices can be identified in privacy regimes. Sabatier’s (1988) advocacy coalition approach and Kingdon’s (2003) work on agendas can mutually assist in identifying how and why particular policies have been developed and whether they are motivated towards a common cross-regime understanding of permissible levels of ISP surveillance.
Thesis – Methodology
Drawing on my years of experience in Information Technology, I will initially focus on the technical capabilities of the DPI technologies used by ISPs. Using corporate, academic, and legal technical analyses from national, transnational, and global stakeholders in DPI, I will expose the uses and technical capabilities of these technologies. With an understanding of the technologies, I will extend my investigation to relevant Canadian, American, and EU privacy codes, regulations, and fair information practices. This information will be synthesized with privacy theory and surveillance studies literature, as well as with international agreements that influence acceptable national surveillance and data handling practices. The integration of empirical, technical, theoretical, and legal literature will provide me with a firm understanding of the Canadian, American, and EU privacy regimes and how they interrelate with the privacy and surveillance implications of DPI. On this foundation, I will break new ground in the social sciences by investigating whether these regimes pressure ISPs toward common surveillance practices, substantively harmonizing ISP surveillance across privacy regimes, or instead provoke ISPs to adopt dissonant surveillance practices.
Thesis – Preparation and Coursework
I began my doctoral studies in the Political Science Department at the University of Victoria this September, and will complete my dissertation by 2012. Recognizing this topic’s boldness, I am working under the supervision of Dr. Colin Bennett, a world expert in the governance of privacy. To prepare for my dissertation, I am taking graduate classes in comparative policy, international relations, multi-disciplinary theory, and surveillance studies. These will assist in refining my methodological approaches and sensitize my work to the global, transnational, national, and provincial/state challenges concerning the governance of privacy. In addition, I am attending the 10th Annual Privacy and Security Conference in February 2009 to discuss my research with government officials, technology experts, and academics. Next summer, I am attending the Surveillance Studies Summer Seminar, where leading international faculty in surveillance studies will lead seminars on the topics of surveillance and privacy.
Relevant Experience and Thesis Dissemination
I am a research assistant for the New Transparency Project (NewT), which in part aims to render transparent the flows of digital information as they pertain to surveillance. As part of my duties, I am preparing working papers on surveillance technologies, assisting faculty associated with this sub-branch of the project, and will be the major research assistant for the 2011 workshop on digitally mediated surveillance. Organizing a multidisciplinary graduate conference in May 2008 has provided me with experience that will be useful in assisting with this workshop. NewT’s resources offer me the opportunity to disseminate my research at annual workshops, through edited books, and in confirmed special journal issues. Additionally, Dr. Arthur Kroker has invited me to co-author and present a paper with Dr. Bennett in 2009 on the privacy and citizenship implications of ISP surveillance. Beyond these academic disseminations, I will continue providing research findings to government bodies investigating issues of digitally mediated surveillance, as well as to public legal groups and members of the media.
Over the course of the fellowship, I will pursue these lines of dissemination. In addition, I will establish an interactive website with the assistance of a (tentatively) contracted web development firm. This website will initially let Canadians identify how their ISP is using DPI technologies, alert them to its implications, and suggest ways of protesting the surveillance. In subsequent years, information on American and EU ISPs will be added. I will also continue to update my personal website, where I share my research through a collaborative wiki, as well as through blog posts concerning emerging technologies’ privacy and surveillance implications. My dissertation, together with this public outreach, will contribute to contemporary surveillance and privacy literature, and to public awareness, by interrogating ISP surveillance practices and uncovering their commonalities and dissonances across privacy regimes.
Anderson, Nate (2008). “Throttle 5 million P2P users with $800K DPI monster,” ArsTechnica. Published May 12, 2008, at http://arstechnica.com/news.ars/post/20080512-throttle-5m-p2p-users-in-real-time-with-800000-dpi-monster.html
Bennett, Colin J. and Charles Raab (2006). The Governance of Privacy: Policy Instruments in Global Perspective. Cambridge, Mass.: The MIT Press.
Clayton, Richard, et al. (2006). “Ignoring the Great Firewall of China,” from 6th Workshop on Privacy Enhancing Technologies, at http://www.cl.cam.ac.uk/~rnc1/ignoring.pdf
Deibert, Ronald et al. (eds.) (2008). Access Denied: The Practice and Policy of Global Internet Filtering. Cambridge, Mass.: The MIT Press.
Electronic Privacy Information Center (EPIC) (2005). Public Opinion on Privacy at http://epic.org/privacy/survey/
Galloway, Alexander R. and Eugene Thacker (2008). The Exploit: A Theory of Networks. Minneapolis: University of Minnesota Press.
Goldsmith, Jack and Tim Wu (2006). Who Controls the Internet? Illusions of a Borderless World. Toronto: Oxford University Press.
Haggerty, Kevin D. and Richard V. Ericson (eds.) (2006). The New Politics of Surveillance and Visibility. Toronto: University of Toronto Press.
Harris Interactive (2002). Privacy on and off the Internet: What consumers want. Hackensack, NJ.
Kingdon, John W. (2003). Agendas, Alternatives, and Public Policies (Second Edition). Toronto: Addison-Wesley Educational Publishers Inc.
Lace, Susanne (ed.) (2005). The Glass Consumer: Life in a surveillance society. Bristol: National Consumer Council.
Lyon, David (2007). Surveillance Studies: An Overview. Cambridge, UK: Polity.
Ohm, Paul (2008). “The Rise and Fall of Invasive ISP Surveillance,” SSRN eLibrary, at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1261344
Pew Internet and American Life Project (2000). Trust and Online Privacy: Why Americans want to rewrite the rules. Published August 20, 2000.
Rossenhovel, Carsten (2008). “Peer-to-Peer Filters: Ready for Internet Prime Time?” Internet Evolution, at http://www.internetevolution.com/document.asp?doc_id=148803&page_number=1
Sabatier, Paul A. (1988). “An advocacy coalition framework of policy change and the role of policy-oriented learning therein,” Policy Sciences, vol. 21, pp. 129-168.
Solove, Daniel J. (2004). The Digital Person: Technology and Privacy in the Information Age. New York: New York University Press.
Solove, Daniel J. (2008). Understanding Privacy. Cambridge, Mass.: Harvard University Press.
Topolski, Robert M. (2008). “NebuAd and Partner ISPs: Wiretapping, Forgery, and Browser Hijacking” Free Press and Public Knowledge. Published June 12, 2008.
Other Key Texts
Allot Communications Ltd. (2007). Digging Deeper Into Deep Packet Inspection, at http://www.getadvanced.net/learning/whitepapers/networkmanagement/Deep%20Packet%20Inspection_White_Paper.pdf
Anderson, Nate (2007). “Deep packet inspection meets ‘Net neutrality, CALEA,” ArsTechnica. Published June 25, 2007, at http://arstechnica.com/articles/culture/Deep-packet-inspection-meets-net-neutrality.ars
Bennett, Colin (1992). Regulating Privacy: Data Protection and Public Policy in Europe and the United States. Ithaca: Cornell University Press.
Bennett, Colin (1997). “Convergence Revisited: Toward a Global Policy for the Protection of Personal Data,” in Technology and Privacy: The New Landscape, Phillip E. Agre and Marc Rotenberg (eds). Cambridge, Mass.: The MIT Press. pp. 99-123.
Bennett, Colin (2001). “Cookies, Web Bugs, Webcams and Cue Cats: Patterns of Surveillance on the World Wide Web,” Ethics and Information Technology, vol. 3, pp. 197-210.
Bennett, Colin, and Rebecca Grant (1999). Visions of Privacy: Policy Choices for the Digital Age. Toronto: University of Toronto Press.
Bond, Jonathan C. (2008). “Defining Disclosure in a Digital Age: Updating the Privacy Act for the Twenty-First Century,” The George Washington Law Review, vol. 76(5), pp. 1233-1258.
Castells, Manuel (2000). The Rise of the Network Society. Malden, Mass.: Blackwell Publishing.
Clayton, Richard (2008). “The Phorm ‘Webwise’ System,” Light Blue Touchpaper: Security Research Computer Laboratory, University of Cambridge (Blog) at http://www.cl.cam.ac.uk/~rnc1/080404phorm.pdf
Dutrisac, James George (2007). Counter-Surveillance in an Algorithmic World. Unpublished master’s thesis, Queen’s University, Canada.
European Union (1995). Directive 95/46/EC of the European Parliament and the Council on the Protection of Individuals with Regard to the Processing of Personal Data and on the Free Movement of Such Data, Brussels, OJ no. L281 (24 October 1995).
European Union (2000). Commission Decision 2000/520/EC pursuant to Directive 95/46/EC of the European Parliament and of the Council on the adequacy of the protection provided by the safe harbour privacy principles and related frequently asked questions issued by the US Department of Commerce, Brussels, OJ no. L215 (August 25, 2000).
Gandy Jr., Oscar H (1993). The Panoptic Sort. Boulder, CO: Westview Press.
Howlett, Michael, M. Ramesh (2003). Studying Public Policy: Policy Cycles and Policy Subsystems (Second Edition). Toronto: Oxford University Press.
Industry Canada (1998). “The Protection of Personal Information: Building Canada’s Information Economy and Society,” Task Force on Electronic Commerce, Industry Canada, Justice Canada, at http://www.ifla.org/documents/infopol/canada/privacy.pdf
Lessig, Lawrence (2006). Code Version 2.0. New York: Basic Books.
Lyon, David (2001). Surveillance Society: Monitoring Everyday Life. Buckingham: Open University Press.
OECD (1985). Declaration on Transborder Data Flows. OECD: Paris, at http://www.oecd.org/document/25/0,3343,en_2649_34255_1888153_1_1_1_1,00.html
OECD (1992). Guidelines for the Security of Information Systems. OECD: Paris, at http://www.oecd.org/dataoecd/16/22/15582260.pdf
OECD (1997). Cryptography Policy: The Guidelines and the Issues. OECD: Paris, at http://www.oecd.org/document/11/0,3343,en_2649_34255_1814731_1_1_1_1,00.html
Raab, Charles and Colin Bennett (1996). “Taking the Measure of Privacy: Can Data Protection be Evaluated?,” International Review of Administrative Sciences, vol. 61, pp. 535-556.
Ponemon Institute and the Information and Privacy Commissioner/Ontario (2004). Cross-National Study of Canadian and US Corporate Privacy Practices, at http://www.ipc.on.ca/images/Resources/cross.pdf
Riphagen, David (2008). “Can the European Data Retention Directive Settle in the United States?” Published June 28, 2008. URL: http://web.mac.com/d.a.riphagen/davidriphagen.nl/Weblog_Washington/Entries/2008/7/7_Can_the_European_Data_Retention_Directive_Settle_in_the_United_States_of_America_files/EU%20Data%20Retention%20Act%20Policy%20Transplantation%20_final.pdf
Solove, Daniel J., Marc Rotenberg, and Paul M. Schwartz (2006). Privacy, Information, and Technology. New York: Aspen Publishers Inc.
Schoeman, Ferdinand David (ed.) (1984). Philosophical Dimensions of Privacy: An Anthology. New York: Cambridge University Press.
United Nations (1990). Guidelines Concerning Computerized Personal Data Files, Resolution A/RES/45/95 adopted by the General Assembly on 14 December 1990. URL: http://www.un.org/documents/ga/res/45/a45r095.htm
United States, Department of Commerce National Telecommunications and Information Administration (1997). Privacy and Self Regulation In The Information Age. Washington, DC: Department of Commerce.
United States, Privacy Protection Study Commission (1977). Protecting Privacy in an Information Society. Washington, DC: Government Printing Office.
Zhang, Jian, Phillip Porras, Johannes Ullrich (2008). “Highly Predictive Blacklisting,” from USENIX Security ’08, at http://www.usenix.org/events/sec/tech/full_papers/zhang/zhang_html/index.html
12 thoughts on “Deep Packet Inspection and the Confluence of Privacy Regimes”
Just wanted to say congrats.
Well deserved. You write some good stuff.
…and remember… keep your eyes open for spin land 😉
Makes a fellow Canadian proud.
Thanks – I’ll do what I can to see and avoid spin!
Now I’m certain I have to pay attention to you.
These technologies effectively let ISPs open sealed letters (the packets), read their contents (the packets’ payload/message), reseal the letters, and then pass them to the recipients, so long as the contents are deemed ‘acceptable’ by ISPs’ evaluation heuristics.
It wouldn’t surprise me if the following point has already occurred to you in your recent “nuanced thoughts.”
As I commented over at P2PNet, the concept of a data packet as “sealed” and needing to be opened is flawed. The envelope containing the packet is transparent. What is inside is readily visible to anyone or anything. (Hence the flaw in the really unfortunate name “Deep Packet Inspection”. It’s not that deep.)
Even encrypted data is in a transparent envelope. Its use or need is similar to the ciphers used by the military over radio waves. Anyone can capture the broadcast, but if you don’t have the key, the content is meaningless.
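The transparent-envelope point can be sketched in a few lines. The example below uses a toy XOR stream cipher (purely illustrative; not real cryptography, and the password string is invented): a middlebox sees the bytes on the wire either way, but without the key the encrypted content reveals nothing recognizable:

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy stream cipher (illustration only -- not real cryptography)."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

plaintext = b"password=hunter2"
ciphertext = xor_cipher(plaintext, b"secret")

# The 'envelope' (the bytes on the wire) is visible to any middlebox...
assert b"password" in plaintext        # readable in the clear
assert b"password" not in ciphertext   # opaque without the key

# ...but with the key the content is recoverable -- which is why
# key interception, not packet capture, is the interesting threat.
assert xor_cipher(ciphertext, b"secret") == plaintext
```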
Which does bring up a frightening concept (even to me). Newer DPI appliances are claiming to have the capability to decrypt packet streams and apply the optimizations to them. This is a hazy area at the moment where vendors’ claims and real-world application are two different things. It does have real application in the corporate world, as encrypted traffic still needs optimization on congested links. At the moment this is the one weakness in traffic shaping. The corporate world is where most of the demand for SSL shaping is coming from.
For consideration purposes, it could very well be possible to intercept the key exchange between two data points, and then use those keys later to break into an encrypted stream.
If that is already or eventually becomes true, then we have a true DPI device, with far reaching consequences in terms of expectations of privacy.
hehe – thanks! Nothing like knowing I’m ‘under surveillance’ 😛
The letter metaphor isn’t so hot, though I’ll admit that were I writing the grant again I’d probably use it (it’s simple, and there isn’t space in them to really nuance what is going on; max two pages, and you need to avoid much ‘tech talk’).
I’m actually not totally in agreement that ‘deep’ packet inspection isn’t appropriate. As I understand it, most manufacturers are pushing their devices (in PR) as using the OSI model; per that, the ‘deep’ analogy works reasonably well. It’s only really once you move to ‘real world’ understandings of how packets move that ‘deep’ becomes somewhat silly. That’s at least my initial thought – I’m sure you can correct it if I’m out to lunch!
It’s interesting/bit worrying that they’re talking of decrypting traffic. At the moment, as I’m sure you know, devices can identify particular encrypted traffic (presumably with the intent of them mediating it – Skype comes to mind, as does a Bittorrent connection); it gets at the application, rather than the content. At the moment, however, particular applications and modes of encrypted traffic are identified using application traffic heuristics (i.e. the pattern that suggests a Skype session, traffic types typically associated with BT, etc.)
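The heuristic identification described here can be sketched without any payload access at all. In the toy classifier below, the thresholds and labels are invented for illustration (real appliances use far richer fingerprints such as inter-packet timing and handshake patterns); the only assumption carried over from the discussion is that small steady packets suggest VoIP-like traffic while near-MTU packets suggest bulk transfer:

```python
from statistics import mean

def classify_flow(packet_sizes: list[int]) -> str:
    """Toy heuristic classifier: guess the application from packet sizes
    alone, without reading any payload. Thresholds are invented for
    illustration only."""
    avg = mean(packet_sizes)
    if avg < 200:
        return "voip-like"       # many small, regular packets (e.g. Skype)
    if avg > 1000:
        return "bulk-transfer"   # large, near-MTU packets (e.g. BitTorrent)
    return "unknown"

print(classify_flow([120, 140, 130, 125]))       # small steady packets
print(classify_flow([1400, 1380, 1420, 1400]))   # near-MTU packets
```

This is why encryption alone does not defeat this class of inspection: the statistical shape of a flow leaks through the cipher.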
Actually grabbing keys…that’s certainly something I hadn’t really thought about. What vendors are using the decrypt/encrypt talk? Have links to white papers?
Citrix was the first one that I heard this claim from (WANScaler, NetScaler), but I haven’t been able to get a clear answer on when or how encrypted traffic is managed.
I found a patent application by Microsoft, but they are looking at applying a rule before the traffic is encrypted, and then handling the encryption. So this would be gateway level encryption, not software level encryption, and it happens at the wrong point of the communication stream.
It’s really hazy as to what the actual capabilities are. But many encryption schemes are vulnerable to a “man-in-the-middle” attack (even quantum encryption! http://en.wikipedia.org/wiki/Quantum_cryptography#Man_in_the_middle_attack)
Since your ISP (or school, or business) has network devices that are increasing in intelligence, the man-in-the-middle capability is being built into the network. A few years ago, it required someone to hook up a computer and manually track the communications. This is where things will get really interesting.
Oh and I consider the OSI model irrelevant. Did you know that TCP/IP predates the model and is not actually OSI compliant? (IPX/SPX was supposed to be OSI compliant as it was developed with the OSI model in mind).
After layer 3 or 4 (somewhere around there), the OSI model is irrelevant in real world TCP or UDP/IP communications.
But yeah, before DPI, we only concerned ourselves with ports or other header information when applying security policies. Hence the new Deep Inspection term. But we always used network sniffers to see all packets. That required human eyeballs and was in a real sense Deep Packet Inspection. We were (and still are) inspecting the packets. It’s how you look for bad or malformed packets, or look for people using plain-text passwords when logging into their mailboxes…
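The shift described here, from port-based rules to payload signatures, can be sketched as two tiny policy functions (a toy example; the classic BitTorrent port 6881 and the handshake string "BitTorrent protocol" are real details, everything else is invented):

```python
def shallow_rule(dst_port: int) -> bool:
    """Pre-DPI policy: decide from header fields (the port) alone."""
    return dst_port not in {6881}   # classic BitTorrent port

def deep_rule(dst_port: int, payload: bytes) -> bool:
    """DPI-style policy: also match signatures inside the payload,
    catching traffic that hides on an 'innocent' port."""
    return shallow_rule(dst_port) and b"BitTorrent protocol" not in payload

# A BitTorrent handshake tunnelled over port 80 slips past the port rule...
handshake = b"\x13BitTorrent protocol" + bytes(48)
print(shallow_rule(80))           # True: allowed by the header-only rule
print(deep_rule(80, handshake))   # False: caught by the payload signature
```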
Once the next month or so is up (running around the country to workshops/conferences) I want to look into DPI appliances’ capacity to foil encryption. If it would effectively undermine typically used encryption schemes, then that seems particularly interesting.
I’m not surprised that you’re none too impressed with OSI – one of my tasks will be to take the work on DPI that approaches it from an OSI model and then try to translate it into TCP/IP in a way that is accessible to non-technical readers. OSI is convenient because literature from vendors is often in the OSI format, and I generally find it easier to explain OSI over TCP/IP. Still, it’s important to shift discussions into the terms of how networks are actually realized, rather than in highly theoretical and relatively unpractical terms.
Thanks for the vendor names – it’ll help when I really take up the hunt in a bit!
Here’s one you might want to look at:
The vendor claim for this product is this:
Featuring client- and server-side SSL sniffing, AppXcel provides complete transaction visibility and security of encrypted traffic, preventing SSL virus tunneling while guaranteeing end-to-end application-smart performance tuning for web-enabled, SSL-based applications.
AppXcel provides client- and server-side SSL security by expanding the capabilities of all network security devices to scan SSL encrypted traffic. AppXcel peels off the encryption from in-bound and out-bound SSL traffic, providing a clear-text copy to network security devices so they can detect real-time hacking or information leaks.
There are three potential ways they could be doing this, and two of them are troubling:
1) Brute forcing the key. Possible but unlikely. Requires a lot of horsepower and as keys continuously improve, this technology will be left behind.
2) Man-in-the-middle key exchange interception. Technically feasible. But is it being implemented? I have no idea.
3) Preloading of corporate keys to the device to protect corporate information. This is benign as it becomes part of the corporate security apparatus. Could not be used against encryption keys generated outside the corporate network (like your bank for example).
Thanks for the link. Methods (1) and (3) are (in my mind) the least worrying. I share your thoughts on brute force, and have thought about DPI for corporate security previously. Including keys would be a natural extension to that security apparatus.
It’s point (2) that I’m most interested in, and most nervous about. If it is possible to effectively use DPI for these kinds of interceptions, then I (hope?) expect we’ll see some quick innovations in security. At the same time, it’s my understanding that systems like the Diffie-Hellman key exchange were developed with the assumption that someone was listening in; are you suggesting that these exchanges are as vulnerable as any other mode of exchange? (Encryption is still a weak point; I have about 4 books beside me to read about it in far more depth, so sorry if this is a silly/stupid question.)
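For what it’s worth, my tentative understanding is that Diffie-Hellman defeats a passive listener, but not an active middlebox that can rewrite the exchanged values in flight. A toy sketch with deliberately tiny, made-up numbers (illustration only; real parameters are enormous, and authenticated exchanges, e.g. with certificates, are the standard defence):

```python
# Toy Diffie-Hellman man-in-the-middle (tiny numbers, illustration only).
p, g = 23, 5        # public modulus and generator (toy-sized)

a, b = 6, 15        # Alice's and Bob's private keys
m = 13              # Mallory's (the middlebox's) private key

A = pow(g, a, p)    # Alice sends g^a ...
B = pow(g, b, p)    # ... Bob sends g^b
M = pow(g, m, p)    # Mallory substitutes g^m in both directions

# Alice thinks she shares a key with Bob, but actually shares it with
# Mallory, and likewise for Bob:
key_alice = pow(M, a, p)
key_bob = pow(M, b, p)
assert key_alice == pow(A, m, p)   # Mallory can derive Alice's key...
assert key_bob == pow(B, m, p)     # ...and Bob's, decrypting both directions
```

So the exchange resists eavesdropping but not substitution, which is exactly the capability an in-path DPI appliance would have.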
Darn, forgot to include the website link:
Encryption methodologies are one of my weak links too! The mathematics behind them quickly leaves me behind.
Which may be another point: if the security is so complex that it’s above the average layman, how do we know what to trust?
Damn, you’re making me think about this stuff more than I have in a while! (That’s a good thing though.)