Deep Packet Inspection and Consumer Transparency

Image by David Clow

Rogers Communications modified their packet inspection systems last year, and ever since customers have experienced degraded download speeds. This is not a case of a few users complaining about an (effectively) non-existent problem: Rogers’ own outreach staff have confirmed that the modifications took place and that the changes have negatively affected peer-to-peer (P2P) and non-P2P applications alike. Since then, a Rogers senior vice-president, Ken Englehart, has suggested that any problems customers have run into are the result of P2P applications themselves; no mention is made of whether or how Rogers’ throttling systems have affected non-P2P traffic.

In this brief post, I want to quickly refresh readers on the changes that Rogers Communications made to their systems last year, and also note some of the problems that have subsequently arisen. Following this, I take up what Mr. Englehart recently stated in the media about Rogers’ throttling mechanisms. I conclude by noting that Rogers is likely in compliance with the CRTC’s transparency requirements (or at least soon will be), but that such requirements are ill-suited to inform the typical consumer.

Continue reading

Review of Telecommunications Policy in Transition

Image courtesy of the MIT Press

This first: the edited collection is a decade old. Given the rate at which communications technologies and information policies change, several of the articles are…outmoded. Don’t turn here for the latest, greatest, and most powerful analyses of contemporary communications policy. A book published in 2001 is good for anchoring subsequent reading in telecom policy, but less helpful for guiding present-day policy analysis.

Having said that: there are some genuine gems in this book, including one of the most forward-thinking essays on network neutrality of the past decade, by Blumenthal and Clark. Before getting to their piece, I want to touch on O’Donnell’s contribution, “Broadband Architectures, ISP Business Plans, and Open Access”. He reviews architectures and ISP service portfolios to demonstrate that open access is both technically and economically feasible, though he acknowledges that implementation is not a trivial task. In the chapter he argues that the FCC should encourage deployment of open-access-ready networks to reduce the costs of future implementation; I think it’s pretty safe to say that that ship has sailed and open access is (largely) a dead issue in the US today. That said, he offers an excellent overview of the differences between ADSL and cable networks, and identifies the pain points of interconnection in each architecture.

Generally, O’Donnell sees interconnection as less of a hardware problem and more of a network management issue. In discussing the need for and value of open access, O’Donnell does a good job of noting the dangers of throttling (at a time well ahead of ISPs’ contemporary throttling regimes), writing:

differential caching and routing need not be blatant to be effective in steering customers to preferred content. The subtle manipulation of the technical performance of the network can condition users unconsciously to avoid certain “slower” web sites. A few extra milliseconds’ delay strategically inserted here and there, for example, can effectively shepherd users from one web site to another (p. 53).
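
To put O’Donnell’s observation in concrete terms, here is a small, purely hypothetical calculation. The object counts and timings are my own invented figures, and the serial fetch model is a deliberate simplification; the point is only to show how a few extra milliseconds per fetch add up over a page load.

```python
# Hypothetical illustration of the quoted point: a small per-request delay,
# inserted "here and there," compounds into a user-visible gap between a
# preferred and a non-preferred site. All figures are invented, and the serial
# fetch model is a simplification (real browsers fetch objects in parallel).

OBJECTS_PER_PAGE = 80     # images, scripts, and stylesheets fetched per page view
BASE_FETCH_MS = 40        # nominal time to fetch one object, in milliseconds

def page_load_ms(extra_delay_ms: float) -> float:
    """Total load time if every object fetch carries an extra inserted delay."""
    return OBJECTS_PER_PAGE * (BASE_FETCH_MS + extra_delay_ms)

preferred = page_load_ms(0)   # no inserted delay
steered = page_load_ms(5)     # "a few extra milliseconds" per fetch

print(f"preferred site: {preferred / 1000:.1f} s")  # 3.2 s
print(f"steered site:   {steered / 1000:.1f} s")    # 3.6 s
```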

Continue reading

Rogers, Network Failures, and Third-Party Oversight

Photo credit: Faramarz Hashemi

Deep packet inspection (DPI) is a form of network surveillance and control that will remain in Canadian networks for the foreseeable future. It operates by examining data packets, determining their likely application-of-origin, and then delaying, prioritizing, or otherwise mediating the content and delivery of those packets. Ostensibly, ISPs have inserted it into their network architectures to manage congestion, mitigate unprofitable capital investment, and enhance billing regimes. These same companies routinely run tests of their DPI systems to refine the algorithmic identification and mediation of data packets. These tests are used to evaluate algorithmic enhancements to system productivity and efficiency at a micro level before new policies are rolled out to the entire network.
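
To make the mechanism concrete, here is a minimal, illustrative Python sketch of the kind of classify-then-mediate pass described above. The signatures, traffic classes, and delay values are my own assumptions for the example and do not describe Rogers’ (or any ISP’s) actual rules.

```python
# Illustrative sketch only: a toy deep-packet-inspection pass that guesses a
# packet's application-of-origin from its payload and applies a traffic policy.
# Signatures, class names, and delays are invented for the example.

import time

SIGNATURES = {
    b"\x13BitTorrent protocol": "p2p",  # BitTorrent handshake prefix
    b"GET ": "http",                    # start of a plain HTTP request
    b"\x16\x03": "tls",                 # TLS handshake record header
}

POLICY = {
    "p2p":     {"action": "throttle", "added_delay_s": 0.050},
    "http":    {"action": "pass",     "added_delay_s": 0.0},
    "tls":     {"action": "pass",     "added_delay_s": 0.0},
    "unknown": {"action": "throttle", "added_delay_s": 0.050},  # misclassification risk
}

def classify(payload: bytes) -> str:
    """Guess the application that produced this payload."""
    for signature, app in SIGNATURES.items():
        if payload.startswith(signature):
            return app
    return "unknown"

def mediate(payload: bytes) -> str:
    """Apply the (toy) traffic-management policy to one packet."""
    app = classify(payload)
    rule = POLICY[app]
    if rule["added_delay_s"]:
        time.sleep(rule["added_delay_s"])  # delay delivery of the packet
    return f"{app}: {rule['action']}"

if __name__ == "__main__":
    print(mediate(b"\x13BitTorrent protocol..."))  # p2p: throttle
    print(mediate(b"GET /index.html HTTP/1.1"))    # http: pass
    print(mediate(b"\x00\x00some-new-protocol"))   # unknown: throttle
```

The “unknown” bucket is where the trouble described in this post tends to start: traffic the classifier cannot confidently identify can end up treated like the class an ISP intends to throttle, which is how non-P2P applications get caught in a P2P throttling policy.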

Such tests are not publicly broadcast, nor are customers notified when ISPs update their DPI devices’ long-term policies. While notification must be provided to various bodies when material changes are made to the network, non-material changes can typically be deployed quietly. Few notice when a deployment of significant scale happens…unless it goes wrong. Based on user reports in the DSLreports forums, it appears that one of Rogers’ recent policy updates was poorly tested and then deployed en masse. The ill effects of that deployment remain unresolved more than sixty days later.

In this post, I first detail the issues facing Rogers customers, drawing heavily from forum threads at DSLreports. I then suggest that this incident demonstrates multiple failings around DPI governance: a failure to properly evaluate analysis and throttling policies; a failure to meaningfully acknowledge problems arising from DPI misconfiguration; and a failure to proactively alleviate the inconveniences of accidental throttling. Large ISPs’ ability to modify the conditions of data transit and discrimination is problematic because it increases the risks faced by innovators and developers, who cannot predict future data discrimination policies. Such increased risks threaten the overall generative nature of the ends of the Internet. To alleviate some of these risks, a trusted third party should be established. This party would monitor how ISPs themselves govern data traffic and alert citizens and regulators if ISPs discriminate against ‘non-problematic’ traffic types or violate their own terms of service. I ultimately suggest that an independent, though associated, branch of the CRTC responsible for watching over ISPs could improve trust between Canadians and the CRTC, and between customers and their ISPs.

Continue reading

Ole, Intellectual Property, and Taxing Canadian ISPs

Ole, a Canadian independent record label, has put forward an often-heard and much-disputed proposal to enhance record label revenues: Ole wants ISPs to surveil Canada’s digital networks for copyrighted works. In the record label’s July 12 filing for the Digital Economy Consultations, entitled “Building Delivery Systems at the Expense of Content Creators,” Ole asserts that ISPs function as “short circuits” that let music customers avoid purchasing music on the free market. Rather than go to the market, customers (behaving as rational economic actors…) instead use ISP networks to download music. That music is being downloaded is an unquestionable reality, but the stance that this indicates ISP liability for customers’ actions seems to be an effort to re-frame the record industry’s unwillingness to adopt contemporary business models as a matter for ISPs to now deal with. In this post, I want to briefly touch on Ole’s filing and the realities of network surveillance for network-grade content awareness in today’s market. I conclude by suggesting that many of the problems presently facing labels are of their own making, and that we should, at best, feel pity for them and, at worst, fear what they crush in the terror throes induced by disruptive technologies.

Ole asserts that there are two key infotainment revenue streams that content providers, such as ISPs, maintain: the $150 Cable TV stream and the $50 Internet stream. Given that content providers are required to redistribute some of the $150/month to content creators (often between $0.40 and $0.50 of every dollar collected), Ole argues that ISPs should be similarly required to distribute some of the $50/month to the content creators who make the Internet worth using for end-users; a back-of-the-envelope rendering of these numbers appears below. Unstated, but presumed, is a very 1995 understanding of both copyright and digital networks. In 1995 the American Information Infrastructure Task Force released its Intellectual Property and the National Information Infrastructure report, wherein they wrote:

…the full potential of the NII will not be realized if the education, information and entertainment products protected by intellectual property laws are not protected effectively when disseminated via the NII…the public will not use the services available on the NII and generate the market necessary for its success unless a wide variety of works are available under equitable and reasonable terms and conditions, and the integrity of those works is assured…What will drive the NII is the content moving through it.

Of course, the assertion that if commercial content creators don’t make their works available on the Internet then the Internet will collapse is patently false.
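
As a quick aside, here is the back-of-the-envelope rendering of the revenue figures promised above, sketched in Python. The 40 and 50 percent redistribution rates come from the range mentioned in the discussion of the filing; the arithmetic itself is mine, offered only to show what the proposal would imply, not anything Ole actually calculates.

```python
# Back-of-the-envelope version of Ole's argument as I read it: if cable TV
# revenue redistributes 40-50 cents of every dollar to content creators, what
# would the same rates imply for the $50/month Internet stream?

CABLE_MONTHLY = 150.0      # the "$150 Cable TV stream"
INTERNET_MONTHLY = 50.0    # the "$50 Internet stream"

for rate in (0.40, 0.50):
    cable_share = CABLE_MONTHLY * rate
    internet_share = INTERNET_MONTHLY * rate
    print(f"at {rate:.0%}: cable pays creators ${cable_share:.2f}/month; "
          f"an equivalent Internet levy would be ${internet_share:.2f}/month")

# at 40%: cable pays creators $60.00/month; an equivalent Internet levy would be $20.00/month
# at 50%: cable pays creators $75.00/month; an equivalent Internet levy would be $25.00/month
```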

Continue reading

Choosing Winners with Deep Packet Inspection

I see much of the network neutrality discussion as being about the conditions under which applications can, and cannot, be prevented from running. On one hand, there are advocates who maintain that telecommunications providers – ISPs such as Bell, Comcast, and Virgin – shouldn’t be responsible for ‘picking winners and losers,’ on the basis that consumers should make these choices. On the other hand, advocates for managed (read: functioning) networks insist that network operators have a duty and responsibility to fairly provision their networks in a way that doesn’t let one small group negatively impact the experiences of the larger consumer population. Deep Packet Inspection (DPI) has become a hot-button technology in light of the neutrality debates, given its potential to let ISPs determine which applications function ‘properly’ and which see their data rates delayed for purposes of network management. What is often missing in the network neutrality discussions is a comparison of the uses of DPI across jurisdictions and how those uses might affect ISPs’ abilities to prioritize or deprioritize particular forms of data traffic.
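
For readers who want a sense of what ‘seeing data rates delayed’ means mechanically, below is a toy token-bucket shaper of the sort a traffic-management system might apply to a flow once a DPI device has labelled it. The class label and rates are illustrative assumptions, not any ISP’s actual configuration.

```python
# Toy token-bucket shaper: once a DPI device has labelled a flow (here, "p2p"),
# a shaper like this caps how quickly that flow's packets are released, which a
# user experiences as delayed data rates. The label and rates are invented.

import time

class TokenBucket:
    def __init__(self, rate_bytes_per_s: float, burst_bytes: float):
        self.rate = rate_bytes_per_s   # sustained rate the flow is allowed
        self.capacity = burst_bytes    # allowance for short bursts above the rate
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def admit(self, packet_len: int) -> bool:
        """Refill tokens for elapsed time, then admit the packet if tokens remain."""
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_len:
            self.tokens -= packet_len
            return True
        return False  # packet waits (or is dropped) until enough tokens accumulate

# A flow labelled "p2p" capped at roughly 80 KB/s; unlabelled flows pass unshaped.
shapers = {"p2p": TokenBucket(rate_bytes_per_s=80_000, burst_bytes=16_000)}

def release(label: str, packet_len: int) -> bool:
    shaper = shapers.get(label)
    return True if shaper is None else shaper.admit(packet_len)

if __name__ == "__main__":
    admitted = sum(release("p2p", 1500) for _ in range(100))  # 100 x 1500-byte packets
    print(f"admitted immediately: {admitted} of 100")
```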

As part of an early bit of thinking on this, I want to direct our attention to Canada, the United States, and the United Kingdom to start framing how these jurisdictions are approaching the use of DPI. In the process, I will make the claim that Canada’s recent CRTC ruling on the use of the technology appears increasingly progressive in light of recent decisions in the US and the likelihood of the UK’s Digital Economy Bill (DEB) becoming law. Up front I should note that while I think Canada can be read as ‘progressive’ on the network neutrality front, this shouldn’t suggest that either the CRTC or Parliament has done enough: further clarity into the practices of ISPs, additional insight into the technologies they use, and an ongoing discussion of traffic management systems are all needed in Canada. Canadian communications increasingly pass through IP networks, and as a result our communications infrastructure should be seen as being as important as defence, education, and health care, each of which is tied to its own critical infrastructure but connected to the others and enabled through digital communications systems. Digital infrastructures draw together the fibres connecting the Canadian people, Canadian business, and Canadian security, and we need to elevate discussion of this infrastructure to make it a prominent part of the national agenda.

Continue reading

Thoughts on COUNTER: Counterfeiting and Piracy Research Conference

Last week I was a participant at the COUNTER: Counterfeit and Piracy Research Conference in Manchester, UK. I was invited by Joseph Savirimuthu to be part of a panel on deep packet inspection, as well as to enjoy the conference more generally. It was, without a doubt, one of the best conferences that I have attended – it was thought-provoking and (at points) anger-inducing, good food and accommodations were provided, and excellent discussions were had. What I want to do here is talk about some of the themes that resonated throughout the conference, and to situate a few of the positions and participants to give some insight into what was discussed.

The COUNTER project is a European research project exploring the consumption of counterfeit and pirated leisure goods. It has a series of primary research domains, including: (1) the frequency and distribution of counterfeits; (2) consumer attitudes toward counterfeit and pirated goods; (3) legal and ethical frameworks for intellectual property; (4) policy options for engaging with consumers of counterfeits; (5) the use of copyrighted goods in the creation of new cultural artifacts; and (6) the impacts of counterfeiting and the control of intellectual property.

Continue reading