Agenda Denial and UK Privacy Advocacy

Funding, technical and political savvy, human resources, and time: these are just a few of the challenges standing before privacy advocates who want to make their case to the public, legislators, and regulators. Looking across the landscape, there are regularly cases where advocates are more successful than expected or markedly less so than anticipated; that advocates stopped BT from permanently deploying Phorm’s Webwise advertising system was impressive, whereas the failures to limit transfers of European airline passenger data to the US were somewhat surprising.[1] While there are regular analyses of how privacy advocates might get the issue of the day onto governmental agendas, seemingly less time is spent on how opponents resist advocates’ efforts. This post constitutes an early attempt to work through some of the politics of agenda-setting related to deep packet inspection and privacy for my dissertation project. Comments are welcome.

To be more specific, in this post I want to think about how items are kept off the agenda. Why are they kept off, who engages in the opposition(s), and what are some of the tactics employed? In responding to these questions I will rely significantly on theory from R. W. Cobb and M. H. Ross’ Cultural Strategies of Agenda Denial, linked with work by other prominent scholars and advocates. My goal is to evaluate whether the strategies that Cobb and Ross write about apply to the issues championed by privacy advocates in the UK who oppose the deployment of the Webwise advertising system. I won’t be working through the technical or political backstory of Phorm in this post and will assume that readers have at least a moderate familiarity with it – if you’re unfamiliar with Phorm, I’d suggest a quick detour to the Wikipedia page devoted to the company.


Distinguishing Between Mobile Congestions

Photo credit: Simon Tunbridge

There is an ongoing push to ‘better’ monetize the mobile marketplace. In this near-future market, wireless providers use DPI and other Quality of Service equipment to charge subscribers for each and every action they take online. The past few weeks have seen Sandvine and other vendors talk about this potential, and Rogers has begun testing the market to determine if mobile customers will pay for data prioritization. The prioritization of data is a network neutrality issue proper, and one that demands careful consideration and examination.

In this post, I’m not talking about network neutrality. Instead, I’m going to talk about what supposedly drives prioritization schemes in Canada’s wireless marketplace: congestion. Consider this a riposte to the oft-touted position that ‘wireless is different’: ISPs assert that wireless differs from wireline for their own regulatory ends, but blur the distinction between the two when pitching ‘congestion management’ schemes to customers. In this post I suggest that the congestion faced by AT&T and other wireless providers has far less to do with data congestion than with signal congestion, and that carriers have to own responsibility for the latter.


Rogers, Network Failures, and Third-Party Oversight

Photo credit: Faramarz Hashemi

Deep packet inspection (DPI) is a form of network surveillance and control that will remain in Canadian networks for the foreseeable future. It operates by examining data packets, determining their likely application-of-origin, and then delaying, prioritizing, or otherwise mediating the content and delivery of those packets. Ostensibly, ISPs have inserted it into their network architectures to manage congestion, mitigate unprofitable capital investment, and enhance billing regimes. These same companies routinely run tests of their DPI systems to better nuance the algorithmic identification and mediation of data packets. Such tests evaluate algorithmic enhancements to system productivity and efficiency at micro-levels before new policies are rolled out to the entire network.
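The classify-then-mediate loop described above can be sketched in highly simplified form. The byte signatures and per-application policies below are invented for illustration; real appliances rely on far richer heuristics (flow state, packet sizing, timing) than simple payload matching:

```python
# A minimal, illustrative sketch of DPI-style classify-then-mediate logic.
# Signatures and policies here are invented examples, not any vendor's
# actual rule set.

# Byte patterns loosely associated with an application protocol.
SIGNATURES = {
    b"BitTorrent protocol": "bittorrent",
    b"GET ": "http",
    b"SSH-": "ssh",
}

# Per-application policy: what the appliance does with matching packets.
POLICIES = {
    "bittorrent": "throttle",  # delay or deprioritize delivery
    "http": "allow",
    "ssh": "allow",
}

def classify(payload: bytes) -> str:
    """Guess the packet's application-of-origin from its payload."""
    for signature, app in SIGNATURES.items():
        if signature in payload:
            return app
    return "unknown"

def mediate(payload: bytes) -> str:
    """Return the action applied to a packet under the current policy."""
    app = classify(payload)
    # Unrecognized traffic falls through to a default policy.
    return POLICIES.get(app, "allow")

print(mediate(b"\x13BitTorrent protocol"))   # throttle
print(mediate(b"GET /index.html HTTP/1.1"))  # allow
```

The point of the sketch is the asymmetry it illustrates: the subscriber never sees this table, yet a quiet edit to `POLICIES` changes how their traffic is delivered.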

Such tests are not publicly broadcast, nor are customers notified when ISPs update their DPI devices’ long-term policies. While notification must be provided to various bodies when material changes are made to the network, non-material changes can typically be deployed quietly. Few notice when a deployment of significant scale happens…unless it goes wrong. Based on user reports in the DSLreports forums, it appears that one of Rogers’ recent policy updates was poorly tested and then massively deployed. The ill effects of this deployment remain unresolved more than sixty days later.

In this post, I first detail issues facing Rogers customers, drawing heavily from forum threads at DSLreports. I then suggest that this incident demonstrates multiple failings around DPI governance: a failure to properly evaluate analysis and throttling policies; a failure to meaningfully acknowledge problems arising from DPI misconfiguration; and a failure to proactively alleviate the inconveniences of accidental throttling. Large ISPs’ ability to modify the conditions of data transit and discrimination is problematic because it increases the risks faced by innovators and developers, who cannot predict future data discrimination policies. Such increased risks threaten the overall generative nature of the ends of the Internet. To alleviate some of these risks, a trusted third party should be established. This party would monitor how ISPs govern data traffic and alert citizens and regulators if ISPs discriminate against ‘non-problematic’ traffic types or violate their own terms of service. I ultimately suggest that an independent, though associated, branch of the CRTC responsible for watching over ISPs could improve trust between Canadians and the CRTC, and between customers and their ISPs.


Ole, Intellectual Property, and Taxing Canadian ISPs

Ole, a Canadian independent record label, has put forward an often-heard and much-disputed proposal to enhance record label revenues: Ole wants ISPs to surveil Canada’s digital networks for copyrighted works. In the record label’s July 12 filing for the Digital Economy Consultations, entitled “Building Delivery Systems at the Expense of Content Creators,” Ole asserts that ISPs function as “short circuits” that let music customers avoid purchasing music on the free market. Rather than go to the market, customers are (behaving as rational economic actors…) instead using ISP networks to download music. That music is being downloaded is an unquestionable reality, but the stance that this creates ISP liability for customers’ actions seems to be an effort to re-frame record industries’ unwillingness to adopt contemporary business models as a matter for ISPs to now deal with. In this post, I want to briefly touch on Ole’s filing and the realities of network-grade content awareness in today’s market. I’ll conclude by suggesting that many of the problems presently facing labels are of their own making, and that we should at best pity them and at worst fear what they crush in the terror throes induced by disruptive technologies.

Ole asserts that there are two key infotainment revenue streams that content providers, such as ISPs, maintain: the $150 Cable TV stream and the $50 Internet stream. Given that content providers are required to redistribute some of that $150/month to content creators (often between $0.40 and $0.50 of every dollar collected), Ole argues that ISPs should similarly be required to distribute some of the $50/month to the content creators who make the Internet worth using for end-users. Unstated, but presumed, is a very 1995 understanding of both copyright and digital networks. In 1995 the American Information Infrastructure Task Force released its Intellectual Property and the National Information Infrastructure report, wherein it wrote:

…the full potential of the NII will not be realized if the education, information and entertainment products protected by intellectual property laws are not protected effectively when disseminated via the NII…the public will not use the services available on the NII and generate the market necessary for its success unless a wide variety of works are available under equitable and reasonable terms and conditions, and the integrity of those works is assured…What will drive the NII is the content moving through it.

Of course, the assertion that if commercial content creators don’t make their works available on the Internet then the Internet will collapse is patently false.
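To make the arithmetic of Ole’s analogy explicit, here is a minimal sketch applying the 40–50% redistribution share that Ole attributes to cable to both revenue streams in the filing. The per-subscriber figures are Ole’s; the calculation is only illustrative of the proposal, not of current practice:

```python
# Ole's argument by analogy, as arithmetic. The redistribution share is
# the $0.40-0.50 per dollar that Ole says cable providers pass to content
# creators; applying it to the $50 Internet stream is Ole's proposal.

cable_monthly = 150.00    # Ole's "$150 Cable TV stream", per subscriber
internet_monthly = 50.00  # Ole's "$50 Internet stream", per subscriber
share_low, share_high = 0.40, 0.50

cable_to_creators = (cable_monthly * share_low, cable_monthly * share_high)
internet_to_creators = (internet_monthly * share_low, internet_monthly * share_high)

print(cable_to_creators)     # (60.0, 75.0) per subscriber per month
print(internet_to_creators)  # (20.0, 25.0) per subscriber per month
```

Under Ole’s logic, then, each broadband subscriber would generate $20–25 per month for content creators, a figure that treats ISPs as content distributors rather than carriers.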


Traffic Management on Mobile Gets Regulated

Shortly before Canada Day, the Canadian Radio-television and Telecommunications Commission (CRTC) released its decision on whether to modify the forbearance framework for mobile wireless data services. To date, the CRTC has used a light hand when it comes to wireless data communications: it has generally left wireless providers alone so that the providers could expand their networks in the (supposedly) competitive wireless marketplace. As of decision 2010-445, the Commission’s powers and duties are extended and the spectre of traffic management on mobile networks is re-raised.

In this post I’m going to spell out what the changes actually mean – specifically, what duties and responsibilities the CRTC has taken on – and what traffic management on mobile networks would entail. This will involve significant reference to portions of the Canadian Telecommunications Act; if you work in telecommunications in Canada you’ll be familiar with much of what’s below (and might find my earlier post on deep packet inspection and mobile discrimination more interesting), but for everyone else this will expose you to some of the actual text of the Act.

In amending the forbearance framework the CRTC enters the regulatory domain on several topics pertaining to wireless data communications. Specifically, wireless providers are now subject to section 24 and subsections 27(2), 27(3), and 27(4) of the Act. Section 24 states that the “offering and provision of telecommunications service by a Canadian carrier are subject to any conditions imposed by the Commission or included in a tariff approved by the Commission.” In effect, the CRTC can now intervene in the conditions of service that carriers make available to other carriers and to the public. Under 27(2), carriers can no longer unjustly discriminate against, or give unreasonable preference to, any person. This limitation includes the telecommunications carrier itself, and thus means that neither fees nor management of the network can be excessively leveraged to the benefit of the carrier and the detriment of other parties.


Journal Publication: Moving Across the Internet

I recently had an article published in CTheory, one of the world’s leading journals of theory, technology, and culture. The article is titled “Moving Across the Internet: Code-Bodies, Code-Corpses, and Network Architecture.” It emerged from a presentation I gave at last year’s Critical Digital Studies Workshop titled “Moving Online: Your Packets, Your ISP, Your Identity.”


Across the Internet, an arms race between agents supporting and opposing network-based surveillance techniques has quietly unfolded over the past two decades. Whereas the 1990s might be characterized as hosting the first round of the encryption wars, this paper focuses on the contemporary battlescape. Specifically, I consider how ISPs “secure” and “manage” their digital networks using contemporary DPI appliances and the ramifications that these appliances may have on the development, and our understanding, of the code-body. DPI networking appliances operate as surveillance devices that render the digital subject constituted by data packets bare to heuristic analyses, but, despite the ingenuity of these devices, some encryption techniques successfully harden otherwise soft digital flesh and render it opaque. Drawing on Kant and Derrida, I suggest that ISPs’ understanding of the Internet as one of packets arguably corresponds with a Kantian notion of reality-as-such and offers a limited and problematic conception of the code-body. Turning to Derrida, we move beyond protocol alone to consider the specters that are always before, and always after, the code-body; Derrida provides a way of thinking beyond Kantian conceptions of space and time and the reality-as-such code-body and lets us consider the holistic identity of the code-being. Further, Derrida lets us interrogate the nature of DPI networking appliances and see that they resemble thrashing zombie-like code-corpses that always try, but perpetually fail, to become fully self-animated. While Derridean insights suggest that ISPs are unlikely to be successful in wholly understanding or shaping code-bodies, these corporate juggernauts do incite identity transformations that are inculcated in cauldrons of risk and fear. Not even Derridean specters can prevent the rending of digital flesh or act as a total antidote to ISPs’ shaping of consumers’ packet-based bodily identity.

Link to article.