In many ways, I can credit the NSA, along with the excellent reporting of Nate Anderson, for why I’m so interested in surveillance technologies. In particular, when the story broke in 2005 that the NSA was likely engaged in massive wiretaps of domestic and international data traffic, I was drawn to the power and capacity of the ‘net to be used for truly broad-based surveillance efforts. This interest was heightened when Nate published the first of a series of articles on deep packet inspection (DPI) for Ars Technica. Without these two key moments, along perhaps with some interesting reporting on copyright, I’d probably still be thinking through the conditions of ontological psychology through a Heideggerian or Hegelian lens.
Given that I am engaged in research into surveillance technologies, and have the absolute pleasure of being associated with truly excellent scholars, activists, advocates, collaborators, and friends who share similar research interests, I wanted to take a moment to ask you, my readers, to help us map data traffic. As you may be aware, the NSA is reputed to have installed systems in various networking hubs that let them examine massive amounts of data traffic. It’s not entirely known how they inspect this traffic, or what algorithms are used to parse the fire hose of data they must be inundated with, but researchers at the University of Toronto have a decent idea of which ‘carrier hotels’, or major Internet exchange/collocation points, have likely been compromised by NSA surveillance instruments.
German Deep Packet Inspection (DPI) manufacturer ipoque has produced a white paper titled “Deep Packet Inspection: Technology, Applications & Network Neutrality.” In it, the company distinguishes between DPI as a technology and possible applications of the technology in a social environment. After this discussion they provide a differentiated ‘tiering’ of the impacts that various bandwidth management practices have on network neutrality. In this post I offer a summary of, and commentary on, the white paper, and ultimately wonder whether there is an effective theoretical model, grounded in empirical study, to frame or characterize network neutrality advocates.
The first thing that ipoque does is try to deflate the oft-heard ‘DPI analysis = opening a sealed envelope’ analogy, arguing that it is better to see packets as postcards, where DPI analysis involves looking for particular keywords or characters. On this account, because the technology cannot know the meaning of what is being searched for, DPI appliances cannot be said to violate one’s privacy, given the technology’s lack of contextual awareness. I’ve made a similar kind of argument – that contextual meaning escapes DPI appliances (though along different lines) – in a paper that I presented earlier this year titled “Moving Across the Internet: Code-Bodies, Code-Corpses, and Network Architecture,” though I think it’s important to recognize the difference between a machine understanding something itself and flagging particular words and symbols for a human operator to review. Ubiquitous, “non-aware” machine surveillance can have very real effects once a human is alerted to communications – it’s something of a misnomer to say that privacy isn’t infringed simply because the machine doesn’t know what it’s doing. We ban and regulate all kinds of technologies because of what they can be used for, rather than because the technology itself is inherently bad (e.g. wiretaps).
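The distinction matters in practice. A minimal sketch (with an entirely hypothetical rule set – real appliances use far more elaborate signatures) shows how an appliance can pattern-match payload bytes without any grasp of meaning, while still routing matches to a human reviewer – which is where the privacy impact actually arises:

```python
# Hypothetical DPI keyword-flagging sketch: the machine only pattern-matches
# bytes; it has no understanding of what the words mean. The privacy impact
# comes when a match puts a communication in front of a human operator.

WATCH_LIST = [b"bittorrent", b"confidential"]  # hypothetical signatures

def inspect(payload: bytes) -> list:
    """Return the signatures found in a payload (case-insensitive)."""
    lowered = payload.lower()
    return [sig for sig in WATCH_LIST if sig in lowered]

def process(payload: bytes, review_queue: list) -> None:
    hits = inspect(payload)
    if hits:
        # A human is alerted here, even though the machine 'understood' nothing.
        review_queue.append((payload, hits))

queue = []
process(b"GET /announce?info_hash=... BitTorrent client", queue)
process(b"just a postcard saying hello", queue)
print(len(queue))  # only the first payload is flagged
```

The appliance never “reads” anything in a meaningful sense, yet the second function is exactly the step that turns non-aware matching into human scrutiny.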
Over the past few days I’ve been able to attend to non-essential reading, which has given me the opportunity to start chewing through Bruce Schneier’s Beyond Fear. The book, in general, is an effort on Bruce’s part to get people thinking critically about security measures. It’s incredibly accessible and easy to read – I’d highly recommend it.
Early on in the text, Schneier provides a set of questions that ought to be asked before deploying a security system. I want to very briefly think through those questions as they relate to Deep Packet Inspection (DPI) in Canada, to begin narrowing a security-derived understanding of the technology here. My hope is that, by critically engaging with this technology, a model to capture these concerns and worries can start to emerge.
Question 1: What assets are you trying to protect?
- Network infrastructure from being overwhelmed by data traffic.
Question 2: What are the risks to these assets?
- Synchronous bandwidth-heavy applications running 24/7 that generate congestion and thus broadly degrade consumer experiences.
Question 3: How well does security mitigate those risks?
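One way to make the mitigation question concrete: the generic mechanism behind most bandwidth management is a rate limiter applied to an identified traffic class. A token-bucket sketch (my own illustration, not any vendor’s implementation) shows how congestion from bandwidth-heavy applications can be capped without blocking traffic outright:

```python
# Token-bucket rate limiter: a generic sketch of how an identified traffic
# class can be throttled. Illustrative only -- not any vendor's actual code.

class TokenBucket:
    def __init__(self, rate_bytes_per_s: float, burst_bytes: float):
        self.rate = rate_bytes_per_s      # sustained rate allowed
        self.capacity = burst_bytes       # maximum short-term burst
        self.tokens = burst_bytes         # bucket starts full
        self.last = 0.0                   # logical clock, in seconds

    def allow(self, packet_bytes: int, now: float) -> bool:
        """Forward the packet if enough tokens have accrued; else drop/delay it."""
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True
        return False

bucket = TokenBucket(rate_bytes_per_s=1000, burst_bytes=1500)
print(bucket.allow(1500, now=0.0))   # True: burst allowance covers it
print(bucket.allow(1500, now=0.5))   # False: only ~500 tokens accrued
print(bucket.allow(1500, now=2.0))   # True: tokens refilled by t=2.0
```

A scheme like this degrades, rather than terminates, a heavy flow – which is part of what makes throttling a contested middle ground between doing nothing and blocking applications outright.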
I’ll be chatting with Chris Cook tomorrow between 5:00-5:20 or so about deep packet inspection, network neutrality, and the Canadian situation. This will be the second time that I’ve had the opportunity to talk with Chris, and it’s always a pleasure. Hit Gorilla Radio’s posting for more information.
I’d just like to publicly thank the University of Victoria’s Communications department for their assistance these past few weeks. They’ve been wonderful in letting various media outlets around the country know about my research, which has let me disclose my research more widely than normal. UVic’s level of support for their graduate students is absolutely amazing – I’d highly recommend that any graduate student interested in a Canadian institution take a look at their offerings. UVic rocks!
All sorts of nasty things are said about ISPs that use Deep Packet Inspection (DPI): they aren’t investing enough in their networks, they just want to punish early adopters of new technologies, they’re looking to deepen their regulatory capacities, or they want to track what their customers do online. ISPs, in turn, tend to insist that P2P applications are causing undue network congestion, and that DPI is the only measure presently available to them to alleviate such congestion.
The constant focus on P2P over the past few years has resulted in various ‘solutions’, including the development of P4P and the shift to UDP. Unfortunately, the cat-and-mouse game between groups representing record labels, ISPs (to a limited extent), and end-users has ensured that most of the time and money is being put into ‘offensive’ and ‘defensive’ technologies and tactics online, rather than more extensively into bandwidth-limiting technologies. ‘Offensive’ technologies include those that enable mass analysis of data- and protocol-types to try to stop or delay particular modes of data sharing. While DPI can be factored into this set of technologies, a multitude of network technologies fit into this category just as easily. ‘Defensive’ technologies include port randomizers, superior encryption and anonymity techniques, and other measures primarily designed to evade particular analyses of network activity.
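The simplest of these defensive measures is port randomization, and a short sketch shows why it works against naive classifiers: a client that picks a random ephemeral port each session defeats port-based traffic identification, though not payload-based DPI (the port numbers below are illustrative):

```python
# Sketch of the 'port randomizer' idea: instead of listening on a protocol's
# well-known port, a client picks a random port from the dynamic/private
# range each session. This defeats classifiers that map port -> protocol,
# but does nothing against DPI that inspects payloads.
import random

EPHEMERAL_RANGE = (49152, 65535)  # IANA dynamic/private port range

def random_listen_port(avoid=frozenset()):
    """Pick a random ephemeral port, skipping any ports the caller reserves."""
    while True:
        port = random.randint(*EPHEMERAL_RANGE)
        if port not in avoid:
            return port

port = random_listen_port(avoid={51413})  # 51413: a commonly used P2P port
print(EPHEMERAL_RANGE[0] <= port <= EPHEMERAL_RANGE[1])  # True
```

This asymmetry – trivially cheap evasion versus increasingly expensive inspection – is part of why the arms race has absorbed so much investment.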
I should state up front that I don’t want to make myself out to be a technological determinist; neither ‘offensive’ nor ‘defensive’ technologies are in a necessary causal relationship with one another. Many of the ‘offensive’ technologies could have been developed in light of increasingly nuanced viral attacks and spam barrages, to say nothing of the heightening complexity of intrusion attacks and pressures from the copyright lobbies. Similarly, encryption and anonymity technologies would have continued to develop, given that in many nations it is impossible to trust local ISPs or governments.
The CRTC is listening to oral presentations concerning Canadian ISPs’ use of Deep Packet Inspection (DPI) appliances to throttle Canadians’ Internet traffic. Rather than talk about these presentations at any length, I thought that I’d step back a bit and try to outline some of the attention that DPI has received over the past few years. This should give people who are newly interested in the technology an appreciation for why DPI has become the focus of so much attention, and provide paths to learn about the politics of DPI. This post is meant to be a fast overview, and only attends to the North American situation, given that it’s what I’m most familiar with.
Massive surveillance of digital networks took off as an issue in 2005, when the New York Times published their first article on the NSA’s warrantless wiretapping operations. The concern about such surveillance brewed for years, but (in my eyes) really exploded as the public started to learn about the capacities of DPI technologies as potential tools for mass surveillance.
DPI began garnering headlines in a major way in 2007, largely as a result of Nate Anderson’s piece, “Deep packet inspection meets ‘Net neutrality, CALEA.” Anderson’s article is typically recognized as the popular news piece that put DPI on the scene, and the American public’s interest in the technology was reinforced by Comcast’s use of TCP RST packets, made possible using Sandvine equipment. These packets (which appear to have been first discussed in 1981) were used by Comcast to convince P2P clients that the other client(s) in a P2P session didn’t want to communicate with the Comcast subscriber’s P2P application, which led to the termination of the data transmission. Things continued to heat up in the US as the behavioural advertising company NebuAd began partnering with ISPs to deliver targeted ads to ISPs’ customers using DPI equipment. The Free Press hired Robert Topolski to perform a technical analysis of what NebuAd was doing, and he found that NebuAd was (in effect) performing a man-in-the-middle attack, altering packets as they coursed through ISP network hubs. This report, prepared for Congressional hearings into the surveillance of Americans’ data transfers, was key to driving American ISPs away from NebuAd in the face of political and customer revolt over targeted advertising practices. NebuAd has since shut its doors. In the US there is now talk of shifting towards agnostic throttling, rather than throttling that targets particular applications: discrimination is applied equally, instead of homing in on specific groups.
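For readers curious about the mechanics, a forged reset is just an ordinary 20-byte TCP header with the RST flag set and sequence numbers matching the victim connection. The sketch below builds such a header with Python’s `struct` module; it is an illustration of the wire format only (the port and sequence values are made up, and the checksum – which requires the IP pseudo-header – is left at zero, so a real injector would still need to fill it in):

```python
# Sketch of a forged TCP RST segment: a 20-byte TCP header with the RST
# flag (0x04) set. Illustrative values throughout; the checksum is omitted
# because computing it requires the IP pseudo-header.
import struct

def build_rst(src_port: int, dst_port: int, seq: int) -> bytes:
    data_offset = 5 << 4   # header length: 5 x 32-bit words, no options
    flags = 0x04           # the RST bit
    window = 0             # irrelevant for a reset
    checksum = 0           # left out of this sketch
    urgent = 0
    return struct.pack("!HHIIBBHHH",
                       src_port, dst_port,
                       seq, 0,              # seq, ack (ack unused here)
                       data_offset, flags,
                       window, checksum, urgent)

segment = build_rst(6881, 51413, seq=123456)
print(len(segment))        # 20: a bare TCP header
print(segment[13] & 0x04)  # 4: the RST flag is set
```

A host receiving such a segment with a plausible sequence number tears the connection down immediately – which is why injected RSTs were such an effective, and controversial, throttling tool.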
In Canada, there haven’t been (many) accusations of ISPs using DPI for advertising purposes, but throttling has been at the center of our discussions of how Canadian ISPs use DPI to delay P2P applications’ data transfers.