Rogers, Network Failures, and Third-Party Oversight

Photo credit: Faramarz Hashemi

Deep packet inspection (DPI) is a form of network surveillance and control that will remain in Canadian networks for the foreseeable future. It operates by examining data packets, determining their likely application-of-origin, and then delaying, prioritizing, or otherwise mediating the content and delivery of those packets. Ostensibly, ISPs have inserted it into their network architectures to manage congestion, avoid unprofitable capital investment, and enhance billing regimes. These same companies routinely run tests of their DPI systems to refine the algorithmic identification and mediation of data packets. Such tests are used to evaluate algorithmic enhancements to system productivity and efficiency at the microlevel before new policies are rolled out to the entire network.
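To make the mechanics concrete, here is a minimal sketch of the classify-then-mediate loop a DPI appliance performs; the signature strings and the policy table are hypothetical stand-ins, not any ISP's or vendor's actual rules.

```python
# Hypothetical sketch of a DPI decision loop: classify a packet by payload
# signature, then apply that application's traffic policy. The signatures and
# policies below are illustrative only.

SIGNATURES = {
    "bittorrent": b"\x13BitTorrent protocol",  # BitTorrent handshake prefix
    "http":       b"GET ",                     # start of a plain HTTP request
}

POLICIES = {
    "bittorrent": {"action": "throttle", "rate_kbps": 80},
    "http":       {"action": "pass"},
    "unknown":    {"action": "pass"},          # default when nothing matches
}

def classify(payload: bytes) -> str:
    """Guess the application-of-origin from the first bytes of the payload."""
    for app, sig in SIGNATURES.items():
        if payload.startswith(sig):
            return app
    return "unknown"

def mediate(payload: bytes) -> dict:
    """Return the shaping decision a DPI policy engine might make."""
    app = classify(payload)
    return {"application": app, **POLICIES[app]}

if __name__ == "__main__":
    print(mediate(b"\x13BitTorrent protocol" + b"\x00" * 8))
    # -> {'application': 'bittorrent', 'action': 'throttle', 'rate_kbps': 80}
    print(mediate(b"GET /index.html HTTP/1.1\r\n"))
    # -> {'application': 'http', 'action': 'pass'}
```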

Such tests are not publicly broadcast, nor are customers notified when ISPs update their DPI devices’ long-term policies. While notification must be provided to various bodies when material changes are made to the network, non-material changes can typically be deployed quietly. Few notice when a deployment of significant scale happens…unless it goes wrong. Based on user reports in the DSLreports forums, it appears that one of Rogers’ recent policy updates was poorly tested and then massively deployed. The ill effects of that deployment remain unresolved more than sixty days later.

In this post, I first detail the issues facing Rogers customers, drawing heavily from forum threads at DSLreports. I then suggest that this incident demonstrates multiple failings around DPI governance: a failure to properly evaluate analysis and throttling policies; a failure to meaningfully acknowledge problems arising from DPI misconfiguration; and a failure to proactively alleviate the inconveniences of accidental throttling. Large ISPs’ ability to modify the conditions of data transit and discrimination is problematic because it increases the risks faced by innovators and developers, who cannot predict future data discrimination policies. Such increased risks threaten the overall generative nature of the ends of the Internet. To alleviate some of these risks, a trusted third party should be established. This party would monitor how ISPs govern data traffic and alert citizens and regulators if ISPs discriminate against ‘non-problematic’ traffic types or violate their own terms of service. I ultimately suggest that an independent, though associated, branch of the CRTC responsible for watching over ISPs could improve trust between Canadians and the CRTC, and between customers and their ISPs.
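As a rough illustration of what such third-party monitoring might involve, the sketch below compares per-application throughput measurements against an ISP's disclosed traffic-management policy and flags undeclared slowdowns; the policy entries, threshold, and measurement values are all invented for illustration.

```python
# Hypothetical sketch of a check an independent oversight body could run:
# compare measured per-application throughput against the ISP's own disclosed
# traffic-management policy and flag undeclared discrimination.

DISCLOSED_POLICY = {
    # Applications the ISP has publicly declared it shapes (illustrative).
    "p2p": {"declared_throttle_kbps": 80},
}

def audit(measurements: dict[str, float], baseline_kbps: float) -> list[str]:
    """Return alerts for applications slowed well below baseline without a
    matching disclosure in the ISP's published policy."""
    alerts = []
    for app, measured_kbps in measurements.items():
        declared = DISCLOSED_POLICY.get(app)
        if declared is None and measured_kbps < 0.5 * baseline_kbps:
            alerts.append(
                f"{app}: measured {measured_kbps:.0f} kbps vs baseline "
                f"{baseline_kbps:.0f} kbps, but no throttling is disclosed"
            )
    return alerts

if __name__ == "__main__":
    # e.g. encrypted chat traffic accidentally caught by an over-broad classifier
    print(audit({"p2p": 82.0, "vpn": 190.0, "chat": 55.0}, baseline_kbps=500.0))
```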

Continue reading

Review: Internet Architecture and Innovation

I want to highly recommend Barbara van Schewick’s Internet Architecture and Innovation. Various authors, advocates, scholars, and businesses have spoken about the economic impacts of the Internet, but to date there hasn’t been a detailed economic accounting of what may happen if/when ISPs monitor and control the flow of data across their networks. van Schewick has filled this gap by examining “how changes in the Internet’s architecture (that is, its underlying technical structure) affect the economic environment for innovation” and evaluating “the impact of these changes from the perspective of public policy” (van Schewick 2010: 2).

Her book traces the economic consequences of changing the Internet’s structure from one that enables any innovator to design an application or share content online to one where ISPs must first authorize access to content and design key applications in house (e.g. P2P, email, etc.). Barbara draws heavily from the Internet history literature and economic theory to buttress her position that a closed or highly controlled Internet not only constitutes a massive change to the original architecture of the ‘net, but that this change would be damaging to society’s economic, cultural, and political interests. She argues that an increasingly controlled Internet is the future that many ISPs prefer, and supports this conclusion with economic theory and the historical actions of American telecommunications corporations.

van Schewick begins by outlining two notions of the end-to-end principle undergirding the ‘net, a narrow conception and a broad one, and argues (successfully, to my mind) that ISPs and their critics often rely on different understandings of end-to-end in making their respective arguments to the public, regulators, and one another.

Continue reading

Analyzing the Verizon-Google Net Neutrality Framework

Technology is neither good nor bad. Nor is it neutral. Network neutrality, a political rallying cry meant to motivate free-speech, free-culture, and innovation advocates, was reportedly betrayed by Google following the release of a Verizon-Google policy document on network management/neutrality. What the document reveals is that the two corporations, facing a (seemingly) impotent FCC, have gotten the ball rolling by suggesting a set of policies that the FCC could use in developing a network neutrality framework. Unfortunately, there has been little even-handed analysis of this document from advocates of network neutrality; instead we have witnessed vitriol and over-the-top rhetoric. This is disappointing. While sensational headlines attract readers, they do little to actually inform the public about network neutrality in a detailed, granular, reasonable fashion. Verizon-Google have provided advocates with an opportunity to pointedly articulate their views while the public is watching, and this is not an opportunity that should be squandered with bitter and unproductive criticism.

I intend this to be the first of a few posts on network neutrality.[1] In this post, I work exclusively through the principles suggested by Verizon-Google. In this first, preliminary analysis I draw on existing American regulatory language and on lessons that might be taken from the Canadian experience with network management. My overall sense of the document published by Verizon-Google is that, in many ways, it is quite conservative insofar as it adheres to dominant North American regulatory approaches. My key suggestion is that, instead of rejecting the principles outright, we should carefully consider each in turn. Through this examination, I hope to identify which principles, or elements of them, could usefully be taken up into a government-backed regulatory framework that recognizes the technical, social, and economic potentials of America’s broadband networks.

Continue reading

Forrester Needs to Rethink on Privacy

Forrester has come out with a report that, in Susana Schwartz’s summary, “suggests that more should be done to integrate data about [ISPs’] customers’ online behaviours to offline systems.” In effect, to help ISPs monetize their networks, they need to aggregate a lot more data, in very intelligent ways. The killer section of the actual report is summarized by a Forrester researcher as follows:

“By integrating online and offline data, operators and their enterprise customers could add information about customers’ online behaviors to existing customer profiles so that CSRs could more efficiently handle calls and provide more relevant cross sell/upsell opportunities,” Stanhope said. “So much of the customer experience now comes from online activities that there is a huge repository of data that should be pushed deeper into enterprises for insights about interactions; enterprises collect so much data about what people do and see on their Web sites, yet they do little to draw insight.”

The aim is to ‘help’ customers find services they don’t know they are interested in, while making ‘more intelligence’ available to customer service representatives when customers call in. We’re talking about a genuinely massive aggregation of the data that passes through ISP gateways, and a dissolution of the Chinese walls that presently segregate network logs from (most) subscriber information. Just so you don’t think that I’m reading into this too deeply, Stanhope (a senior analyst of consumer intelligence with Forrester Research) said to Schwartz:

Our clients are starting to plan for and lay the technical foundational by looking at how to bring together disparate environments, like CRM databases and customer databases, and then what they have to do to gather Web data, social media and search data so they can leverage what they already have … Many are now starting to look at how that can be a hub for Web data, which can be leveraged by other systems.

It’s this kind of language that gets privacy advocates both annoyed and worried. Annoyed, because such a massive aggregation and use of personal data would constitute a gross privacy violation, both in terms of national laws and social norms; and worried, because of the relatively opaque curtain separating their investigations from the goings-on of ISPs. When we read words such as Stanhope’s, correlate them with the vendor-speak surrounding deep packet inspection, and look at the technology’s use in developing consumer profiles, there is a sense that everyone is saying DPI won’t and can’t be used for massive data aggregation as currently configured…but that it could, and (Stanhope hopes) likely will, once the time is right.
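To make concrete what this sort of online/offline integration would involve, here is a minimal sketch of folding DPI-derived behavioural summaries into CRM records; the subscriber data, field names, and merge logic are invented for illustration and reflect no actual ISP system.

```python
# Hypothetical sketch of the integration Stanhope describes: joining
# behavioural summaries derived from network logs with offline CRM records to
# build a richer profile for marketing or customer-service systems.

crm_records = {
    "subscriber-001": {"name": "A. Customer", "plan": "basic", "tenure_years": 4},
}

network_logs = {
    # Per-subscriber categories inferred from inspected traffic (hours/month).
    "subscriber-001": {"video_streaming": 31.5, "online_gaming": 12.0},
}

def merge_profiles(crm: dict, logs: dict) -> dict:
    """Fold behavioural summaries into CRM profiles; precisely the kind of
    aggregation that collapses the wall between network data and account data."""
    merged = {}
    for sub_id, profile in crm.items():
        merged[sub_id] = {**profile, "online_behaviour": logs.get(sub_id, {})}
    return merged

if __name__ == "__main__":
    print(merge_profiles(crm_records, network_logs))
```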

Canada has a strong regulatory position against the use of DPI or other network forensics for the kinds of actions Stanhope is encouraging. That said, given that ‘research’ groups like Forrester, along with other parties that pitch products to ISPs, are making similar noises (as demonstrated at last year’s Canadian Telecom Summit), a nagging pit in my stomach reminds me that constant vigilance is required to maintain those regulatory positions and to keep ISPs from biting into an apple that is very profitable, but poisonous for Canadians’ privacy.

Deep Packet Inspection Canada

Last week my advisor, Dr. Colin Bennett, and I launched a new website that is meant to provide Canadians with information about how their Internet service providers (ISPs) monitor data traffic and manage their networks. This website, Deep Packet Inspection Canada, aggregates information that has been disclosed on the public record about how the technology is used, why, and which uses of it ISPs regard as ‘off limits’. The research has been funded through the Office of the Privacy Commissioner of Canada’s contributions program.

Deep packet inspection is a technology that facilitates a heightened awareness of what is flowing across ISP networks. It can determine the protocols responsible for shuttling information to and from the Internet and the applications used to transmit the data, and (in test conditions) it can even extract elements of the application layer of data traffic in real time and compare them against packet signatures to block particular flows based on the content being accessed. The technology can also be used to modify packet flows – something done by Rogers – but it should be noted that DPI is not presently used to prevent Canadians from accessing particular content on the web, nor to stop them from using P2P services to download copyrighted works.
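As a rough sketch of that ‘test condition’ capability, the example below pulls a field out of the application layer (a plaintext HTTP Host header) and checks it against a signature list; the blocklist and parsing are hypothetical, and, as noted above, Canadian ISPs do not presently block content this way.

```python
# Hypothetical sketch of application-layer extraction and comparison: read the
# Host header from a plaintext HTTP request and decide whether the flow would
# be blocked. Blocklist and parsing are illustrative only.

BLOCKED_HOSTS = {"blocked.example.org"}

def extract_host(payload: bytes) -> str | None:
    """Return the Host header from a plaintext HTTP request, if present."""
    for line in payload.split(b"\r\n"):
        if line.lower().startswith(b"host:"):
            return line.split(b":", 1)[1].strip().decode(errors="replace")
    return None

def should_block(payload: bytes) -> bool:
    """Compare the extracted host against the signature list."""
    host = extract_host(payload)
    return host is not None and host in BLOCKED_HOSTS

if __name__ == "__main__":
    request = b"GET /page HTTP/1.1\r\nHost: blocked.example.org\r\n\r\n"
    print(should_block(request))  # True: this flow would be dropped in the sketch
```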

Continue reading

Thoughts on COUNTER: Counterfeiting and Piracy Research Conference

Last week I was a participant at the COUNTER: Counterfeit and Piracy Research Conference in Manchester, UK. I was invited by Joseph Savirimuthu to be part of a panel on deep packet inspection, as well as to enjoy the conference more generally. It was, without a doubt, one of the best conferences I have attended: it was thought-provoking and (at points) anger-inducing, good food and accommodations were provided, and excellent discussions were had. In this post I want to talk about some of the themes that resonated through the conference and try to situate a few of the positions and participants, to give a sense of what was discussed.

The COUNTER project is a European research project exploring the consumption of counterfeit and pirated leisure goods. It has a series of primary research domains, including: (1) the frequency and distribution of counterfeits; (2) consumer attitudes toward counterfeit and pirated goods; (3) legal and ethical frameworks for intellectual property; (4) policy options for engaging with consumers of counterfeits; (5) the use of copyrighted goods in the creation of new cultural artifacts; and (6) the impacts of counterfeiting and the control of intellectual property.

Continue reading