Deep Packet Inspection and Consumer Transparency

Image by David Clow

Rogers Communications modified their packet inspection systems last year, and ever since customers have experienced degraded download speeds. These are not isolated complaints about an (effectively) non-existent problem: Rogers’ own outreach staff have confirmed that the modifications took place and that the changes have negatively affected peer-to-peer (P2P) and non-P2P applications alike. Since then, a Rogers Communications senior vice-president, Ken Englehart, has suggested that any problems customers have run into are caused by P2P applications themselves; no mention is made of whether or how Rogers’ throttling systems have affected non-P2P traffic.

In this brief post, I want to quickly refresh readers on the changes that Rogers Communications made to their systems last year, and also note some of the problems that have subsequently arisen. Following this, I take up what Mr. Englehart recently stated in the media about Rogers’ throttling mechanisms. I conclude by noting that Rogers is likely in compliance with the CRTC’s transparency requirements (or at least soon will be), but that such requirements are ill suited to inform the typical consumer.

Continue reading

Is Iran Now Actually Using Deep Packet Inspection?

Photo by Hamed Saber

I’ve previously written about whether the Iranian government uses deep packet inspection systems to monitor and mediate data content. As a refresher, the spectre of DPI was initially raised by the Wall Street Journal in a seriously flawed article several years ago. In addition to critiquing that article, last year I spent a while pulling together various data sources to outline the nature of the Iranian network infrastructure and likely modes of detecting dissident traffic.

Since January 2010, the Iranian government may have significantly modified their network monitoring infrastructure. In short, the government seems to have moved from somewhat ham-fisted filtering systems (e.g. all encrypted traffic is throttled/blocked) to a granular system (where only certain applications’ encrypted traffic is blocked). In this post I’ll outline my past analyses of the Iranian Internet infrastructure and look at the new data on granular targeting of encrypted application traffic. I’ll conclude by raising some questions that need to be answered about the new surveillance system, and note potential dangers facing Iranian dissidents if DPI has actually been deployed.
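
To make the blanket-versus-granular distinction concrete, here is a toy Python sketch. Everything in it is hypothetical – the entropy threshold, the byte-pattern “signatures”, and the function names are invented for illustration – and real DPI equipment classifies flows using far richer features (packet sizes, timing, TLS handshake fields, and so on); nothing below describes the actual Iranian deployment.

    # Purely illustrative sketch: contrasting a blanket policy ("degrade anything
    # that looks encrypted") with a granular one ("block only flows that match
    # specific application fingerprints"). All signatures and thresholds are made up.
    import math
    import os
    from collections import Counter

    def entropy_bits_per_byte(data: bytes) -> float:
        """Empirical Shannon entropy of a payload, in bits per byte."""
        if not data:
            return 0.0
        counts = Counter(data)
        total = len(data)
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    TLS_HANDSHAKE = b"\x16\x03"  # first bytes of a TLS record carrying a handshake

    # Hypothetical fingerprint for a "targeted" application (e.g. a tunnelling tool).
    TARGETED_SIGNATURES = {
        "hypothetical_tunnel_tool": b"\x00\x0e\x38",
    }

    def blanket_policy(first_packet: bytes) -> str:
        """Ham-fisted: throttle anything that looks encrypted at all."""
        if first_packet.startswith(TLS_HANDSHAKE) or entropy_bits_per_byte(first_packet) > 6.5:
            return "throttle"
        return "allow"

    def granular_policy(first_packet: bytes) -> str:
        """Granular: leave ordinary TLS alone; block only fingerprinted applications."""
        for sig in TARGETED_SIGNATURES.values():
            if first_packet.startswith(sig):
                return "block"
        return "allow"

    if __name__ == "__main__":
        flows = {
            "TLS handshake": TLS_HANDSHAKE + b"\x01" + os.urandom(512),
            "tunnelled flow": b"\x00\x0e\x38" + os.urandom(512),
            "plain HTTP": b"GET / HTTP/1.1\r\nHost: example.org\r\n\r\n",
        }
        for label, pkt in flows.items():
            print(f"{label}: blanket={blanket_policy(pkt)}, granular={granular_policy(pkt)}")

The point of the sketch is only the shape of the change: under the first policy, ordinary HTTPS users get caught in the net, whereas under the second the collateral damage shrinks but the operator needs far more detailed knowledge of the applications it wants to reach.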

Continue reading

Agenda Denial and UK Privacy Advocacy

Funding, technical and political savvy, human resources, and time. These are just a few of the challenges standing before privacy advocates who want to make their case to the public, legislators, and regulators. Looking at the landscape, there are regularly cases where advocates are more successful than expected or markedly less so than anticipated; that advocates stopped BT from permanently deploying Phorm’s Webwise advertising system was impressive, whereas the failures to limit transfers of European airline passenger data to the US were somewhat surprising.[1] While there are regular analyses of how privacy advocates might get the issue of the day onto governmental agendas, seemingly less time is spent on how opponents resist advocates’ efforts. This post constitutes an early attempt to work through some of the politics of agenda-setting related to deep packet inspection and privacy for my dissertation project. Comments are welcome.

To be more specific, in this post I want to think about how items are kept off the agenda. Why are they kept off, who engages in the opposition(s), and what are some of the tactics employed? In responding to these questions I will rely significantly on theory from R. W. Cobb and M. H. Ross’s Cultural Strategies of Agenda Denial, linked with work by other prominent scholars and advocates. My goal is to evaluate whether the strategies that Cobb and Ross write about apply to the issues championed by privacy advocates in the UK who oppose the deployment of the Webwise advertising system. I won’t be working through the technical or political backstory of Phorm in this post and will assume that readers have at least a moderate familiarity with it – if you’re unfamiliar, I’d suggest a quick detour to the Wikipedia page devoted to the company.

Continue reading

Administrative Note: Website Refresh

Photo by Sanja Gjenero

I’ve been testing different ways to present the content of this website for about nine months now, trying out about 50 different themes and hundreds of plugins, and learning more about the loop than I thought possible. For some time the iNove theme that I used caused me a lot of headaches: it was slow, regularly crashed the publishing functions of my WordPress install, and hadn’t been updated by the developer in years. This last item was particularly annoying, given that the lack of updates prevented me from taking advantage of many new features in WordPress 3.0.

After spending (literally) countless hours trying to get iNove to work and searching for an alternative theme, I finally decided to take the plunge and switch themes on Sunday evening. The site came down for about twenty hours (two more than initially scheduled) and has since come back. I’m using the TwentyTen theme as the foundation of the site, with many of the CSS changes courtesy of the TwentyTen – Blogging Inside Edition child theme. The changes in the child theme have been supplemented with my own modifications. I want to quickly catalogue what I did, in case someone else wants to incorporate some of these changes into their own TwentyTen installation.

Continue reading

Controversial Changes to Public Domain Works

Image by Muskingum University Library

A considerable number of today’s copyfight discussions revolve around the use of DRM to prevent transformative uses of works, to prevent the sharing of works, and to generally limit how individuals engage with the cultural artefacts around them. This post takes a step back from that, thinking through the significance of transforming ‘classic’ works of the English literary canon instead of looking at how new technologies butt heads against free speech. Specifically, I want to argue that NewSouth, Inc.’s decision to publish Huckleberry Finn without the word “nigger” – replacing it with “slave” – demonstrates the importance of works entering the public domain. I refrain from providing a normative framework to evaluate NewSouth’s actual decision – whether changing the particular word is good – and instead use their decision to articulate the conditions constituting ‘bad’ versus ‘good’ transformations of public domain works. I will argue that uniform, uncontested, and totalizing modifications of public domain works are ‘bad’, whereas localized, particular, and discrete transformations should be encouraged, given their existence as free expressions capable of (re)generating discussions around topics of social import.

Copyright is intended to operate as an engine for generating expressive content. In theory, by providing a limited monopoly over expressions (not the ideas that are expressed), authors can receive some kind of compensation for the fixed costs that they invest in creating works. While it is true (especially in the digital era) that marginal costs trend towards zero, pricing based on marginal cost alone fails to adequately account for the sunk costs of actual writing. Admittedly, some do write for free (blogs and academic articles in journals might stand as examples), but many people still write in the hope of earning their riches through publication. There isn’t anything wrong with profit motivating an author’s desire to create.

Continue reading

Review of Telecommunications Policy in Transition

Image courtesy of the MIT Press

This first: the edited collection is a decade old. Given the rate at which communications technologies and information policies change, several of the articles are…outmoded. Don’t turn here for the latest, greatest, and most powerful analyses of contemporary communications policy. A book published in 2001 is good for anchoring subsequent reading in telecom policy, but less helpful for guiding present-day policy analyses.

Having said that, there are some genuine gems in this book, including one of the most forward-thinking essays on network neutrality of the past decade, by Blumenthal and Clark. Before getting to their piece, I want to touch on O’Donnell’s contribution, “Broadband Architectures, ISP Business Plans, and Open Access”. He reviews architectures and ISP service portfolios to demonstrate that open access is both technically and economically feasible, though he acknowledges that implementation is not a trivial task. In the chapter he argues that the FCC should encourage the deployment of open-access-ready networks to reduce the costs of future implementation; I think it’s pretty safe to say that that ship has sailed and open access is (largely) a dead issue in the US today. That said, he offers an excellent overview of the differences between ADSL and cable networks, and identifies the pain points of interconnection in each architecture.

Generally, O’Donnell sees interconnection as less of a hardware problem and more of a network management issue. In discussing the need for and value of open access, he does a good job of noting the dangers of throttling (at a time well ahead of ISPs’ contemporary throttling regimes), writing:

differential caching and routing need not be blatant to be effective in steering customers to preferred content. The subtle manipulation of the technical performance of the network can condition users unconsciously to avoid certain “slower” web sites. A few extra milliseconds’ delay strategically inserted here and there, for example, can effectively shepherd users from one web site to another (p. 53).

Continue reading