Deep Packet Inspection: What Innovation Will ISPs Encourage?

All sorts of nasty things are said about ISPs that use Deep Packet Inspection (DPI): they aren’t investing enough in their networks, they just want to punish early adopters of new technologies, they’re looking to expand their regulatory capacities, or they want to track what their customers do online. ISPs, in turn, tend to insist that P2P applications are causing undue network congestion, and that DPI is the only measure presently available to them to alleviate such congestion.

The constant focus on P2P over the past few years has resulted in various ‘solutions’, including the development of P4P and the shift to UDP. Unfortunately, the cat-and-mouse game between groups representing record labels, ISPs (to a limited extent), and end-users has ensured that most of the time and money is being put into ‘offensive’ and ‘defensive’ technologies and tactics online rather than more extensively into bandwidth-limiting technologies. Offensive technologies include those that enable mass analysis of data- and protocol-types to try to stop or delay particular modes of data sharing; while DPI can be factored into this set of technologies, a multitude of network technologies fit into this category just as easily. ‘Defensive’ technologies include port randomizers, stronger encryption and anonymity techniques, and other measures primarily designed to evade particular analyses of network activity.
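To make the ‘offensive’/‘defensive’ distinction a little more concrete, here is a minimal, purely illustrative sketch of signature-based protocol detection of the sort DPI appliances rely on. It only checks whether a payload begins with the plaintext BitTorrent handshake, which is exactly the kind of fingerprint that protocol encryption and obfuscation are designed to hide. The function name and sample payloads are my own invention, not any vendor’s implementation.

```python
# Illustrative sketch only: naive signature matching of the kind DPI gear uses.
# A plaintext BitTorrent handshake begins with the byte 0x13 followed by the
# ASCII string "BitTorrent protocol"; encrypted/obfuscated sessions do not.

BITTORRENT_HANDSHAKE = b"\x13BitTorrent protocol"

def looks_like_bittorrent(payload: bytes) -> bool:
    """Return True if the payload starts with the plaintext handshake."""
    return payload.startswith(BITTORRENT_HANDSHAKE)

if __name__ == "__main__":
    plaintext_session = b"\x13BitTorrent protocol" + b"\x00" * 8
    encrypted_session = b"\x8a\x41\x7f\x02\x9c"  # hypothetical obfuscated bytes

    print(looks_like_bittorrent(plaintext_session))  # True: easy to throttle
    print(looks_like_bittorrent(encrypted_session))  # False: signature evaded
```

The point of the sketch is simply that the ‘offensive’ side depends on recognizable patterns, and the ‘defensive’ side works by removing them.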

I should state up front that I don’t want to make myself out to be a technological determinist; neither ‘offensive’ nor ‘defensive’ technologies are in a necessary causal relationship with one another. Many of the ‘offensive’ technologies could have been developed in light of increasingly nuanced viral attacks and spam barrages, to say nothing of the heightening complexity of intrusion attacks and pressure from the copyright lobbies. Similarly, encryption and anonymity technologies would have continued to develop, given that in many nations it is impossible to trust local ISPs or governments.

Continue reading

Deep Packet Inspection and the Discourses of Censorship and Regulation

In the current CRTC hearings over Canadian ISPs’ use of Deep Packet Inspection (DPI) to manage bandwidth, I see two ‘win situations’ for the dominant carriers:

  1. They can continue to throttle ‘problem’ applications in the future;
  2. The CRTC decides to leave the wireless market alone right now.

I want to talk about the effects of throttling problem applications, and how people talking about DPI should focus on the negative consequences of regulation (something that is, admittedly, often done). In thinking about this, however, I want to first attend to the issues of censorship models to render transparent the difficulties in relying on censorship-based arguments to oppose uses of DPI. Following this, I’ll consider some of the effects of regulating access to content through protocol throttling. The aim is to suggest that individuals and groups who are opposed to the throttling of particular application-protocols should focus on the effects of regulation, given that it is a more productive space of analysis and argumentation, instead of focusing on DPI as an instrument for censorship.

Let’s first touch on the language of censorship itself. We typically understand this action in terms of a juridico-discursive model, or a model that relies on rules to permit or negate discourse. There are three common elements to this model-type:

Continue reading

Economics of Authenticity on Twitter

I’m on Twitter all the time; it’s central to how I learn about discussions taking place about Deep Packet Inspection, a good way of finding privacy-folk from around the world, and it lets me feel semi-socialized even though I’m somewhat reclusive. When I use the social networking service, I intersperse bits of ‘me’ (e.g. “This wine sucks!”) beside news articles I’ve found and believe would be useful to my colleagues, and add in some (attempts at) humor. In this sense, I try to make my Twitter feed feel ‘authentic’, meaning that it is reasonably reflective of how I want to present myself in digital spaces. Further, that presentation resonates (to varying extents) with how I behave in the flesh.

When you hear social-media enthusiasts talk about their media environment, authenticity (i.e. not pretending to be someone/something you’re really, absolutely, not) is the key thing to aim for. Ignoring the amusing Heideggerian implications of this use of authenticity (“How very They!”), I think that we can take this to mean that there is a ‘currency’ in social media called ‘authenticity’. There are varying ways of gauging this currency.

Continue reading

Facebook Got Off Easy: Third-Parties and Data Collection

I’m on Facebook, and have been for years. I also dislike Facebook, and have for several years. I don’t dislike the social networking service because it’s bad at what it aims to do, but because it’s far too good at what it does. Let’s be honest: Facebook does not exist to ‘help me connect to my friends’. Maybe that was its aim when it was first dreamt up, but the current goal of Facebook is to make money from my data. Part of this involves Facebook mining my data, and another (and more significant) part entails third-party developers mining my data. I want to think out loud about this latter group and their practices.

A core issue (amongst several others) that the Office of the Privacy Commissioner of Canada (OPC) raised in their recent findings about Facebook focused on the data that third-party application developers gain access to when an individual installs a Facebook application. Before getting into this in any depth, I just want to recognize the full range of information that application developers can call on using the Facebook API:

Continue reading
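As a rough illustration of how little friction there is in pulling profile data once an application has been installed, here is a minimal sketch of a request against Facebook’s present-day Graph API; it is not the specific API version the OPC examined, the access token is a placeholder, and the field list is illustrative of data an app must be granted permission to read.

```python
# Minimal sketch: once a user installs an app and grants permissions, a single
# HTTP request returns a wide slice of their profile. The token is a
# placeholder and the field list is illustrative, not exhaustive.
import json
import urllib.request

ACCESS_TOKEN = "HYPOTHETICAL_USER_ACCESS_TOKEN"
FIELDS = "id,name,birthday,hometown,friends,photos,likes"

url = (
    "https://graph.facebook.com/me"
    f"?fields={FIELDS}&access_token={ACCESS_TOKEN}"
)

with urllib.request.urlopen(url) as response:
    profile = json.loads(response.read())

print(profile.get("name"))
print(list(profile.keys()))  # every category of data the app can now retain
```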

Background to North American Politics of Deep Packet Inspection

The CRTC is listening to oral presentations concerning Canadian ISPs’ use of Deep Packet Inspection (DPI) appliances to throttle Canadians’ Internet traffic. Rather than talk about these presentations at any length, I thought that I’d step back a bit and try to outline some of the attention that DPI has received over the past few years. This should give people who are newly interested in the technology an appreciation for why DPI has become the focus of so much attention and provide paths to learn about the politics of DPI. This post is meant to be a fast overview, and only attends to the North American situation given that it’s what I’m most familiar with.

Massive surveillance of digital networks took off as an issue in 2005, when the New York Times published their first article on the NSA’s warrantless wiretapping operations. The concern about such surveillance brewed for years, but (in my eyes) really exploded as the public started to learn about the capacities of DPI technologies as potential tools for mass surveillance.

DPI began garnering headlines in a major way in 2007, largely as a result of Nate Anderson’s piece, “Deep packet inspection meets ‘Net neutrality, CALEA.” Anderson’s article is typically recognized as the popular news article that put DPI on the scene, and the American public’s interest in this technology was reinforced by Comcast’s use of TCP RST packets, made possible using Sandvine equipment. These packets (which appear to have been first discussed in 1981) were used by Comcast to convince P2P clients that the other client(s) in a P2P session didn’t want to communicate with the Comcast subscriber’s P2P application, which led to the termination of the data transmission. Things continued to heat up in the US as the behavioural advertising company NebuAd began partnering with ISPs to deliver targeted ads to ISPs’ customers using DPI equipment. The Free Press hired Robert Topolski to perform a technical analysis of what NebuAd was doing, and he found that NebuAd was (in effect) performing a man-in-the-middle attack to alter packets as they coursed through ISP network hubs. This report, prepared for Congressional hearings into the surveillance of Americans’ data transfers, was key to driving American ISPs away from NebuAd in the face of political and customer revolt over targeted advertising practices. NebuAd has since shut its doors. In the US there is now talk of shifting towards agnostic throttling, rather than throttling that targets particular applications: discrimination would be applied equally across traffic instead of homing in on specific protocols or groups of users.
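For readers unfamiliar with the mechanism, the sketch below shows in rough terms how a forged TCP reset can tear down someone else’s session. It uses the Scapy packet library, every address, port, and sequence number is an invented placeholder, and it is only meant to illustrate why a client receiving such a packet concludes the peer has hung up; it does not reproduce what Sandvine’s appliances actually do.

```python
# Rough illustration of TCP RST injection (the technique, not Comcast's gear).
# All addresses, ports, and sequence numbers below are invented placeholders;
# a real in-path appliance would copy these values from the live session.
from scapy.all import IP, TCP, send

peer_ip = "203.0.113.10"        # hypothetical remote P2P peer
subscriber_ip = "198.51.100.7"  # hypothetical subscriber being throttled
peer_port = 6881                # common BitTorrent port, for illustration
subscriber_port = 51413
observed_seq = 123456789        # must match what the subscriber expects next

# Forge a reset that appears to come from the remote peer. On receipt, the
# subscriber's TCP stack aborts the connection, so the P2P client concludes
# the peer no longer wants to talk and the transfer dies.
rst = IP(src=peer_ip, dst=subscriber_ip) / TCP(
    sport=peer_port,
    dport=subscriber_port,
    flags="R",
    seq=observed_seq,
)
send(rst)
```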

In Canada, there haven’t been (many) accusations of ISPs using DPI for advertising purposes, but throttling has been at the center of our discussions of how Canadian ISPs use DPI to delay P2P applications’ data transfers.

Continue reading

Solved: Apple SATA II 1.7 Firmware Problems

When something ‘just works’ 99.9% of the time, that 0.1% of downtime is particularly frustrating. This is what I recently experienced with my Time Capsule networking fiasco, and it was paralleled by another problem stemming from an Apple firmware update.

The new MacBook Pros were shipped with their SATA II data speeds crippled; they were limited to 1.5 Gbps rather than the 3.0 Gbps speed standardized by SATA II. While this had no real effect for HDD users, it did affect SSD users: SSDs are capable of taking advantage of the SATA II spec, and so SSD users rightly complained.
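If you want to check what your own machine has negotiated before and after a firmware update, a quick way on OS X is to read the SATA section of System Profiler. The short sketch below simply shells out to the built-in `system_profiler` tool and prints the link-speed lines; the exact label text (“Link Speed” / “Negotiated Link Speed”) is an assumption about the plain-text output, so skim the full output if nothing matches.

```python
# Quick check of the SATA link speed OS X actually negotiated. This runs the
# built-in system_profiler tool and prints any line mentioning link speed;
# the "Link Speed" label is an assumption about the plain-text output format.
import subprocess

output = subprocess.run(
    ["system_profiler", "SPSerialATADataType"],
    capture_output=True,
    text=True,
).stdout

for line in output.splitlines():
    if "Link Speed" in line:
        print(line.strip())  # e.g. "Negotiated Link Speed: 1.5 Gigabit"
```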

Apple heard these complaints, and released a firmware update for the MacBook Pro line; they warned that the update might not work with non-stock drives (!) but that it would restore SATA II speeds. I decided to update the firmware, just because having an up-to-date system is a good idea. This is right-minded thinking, right?

Wrong.

Continue reading