Rogers, Network Failures, and Third-Party Oversight

Photo credit: Faramarz Hashemi

Deep packet inspection (DPI) is a form of network surveillance and control that will remain in Canadian networks for the foreseeable future. It operates by examining data packets, determining their likely application-of-origin, and then delaying, prioritizing, or otherwise mediating the content and delivery of those packets. Ostensibly, ISPs have inserted it into their network architectures to manage congestion, mitigate unprofitable capital investment, and enhance billing regimes. These same companies routinely run tests of their DPI systems to refine the algorithmic identification and mediation of data packets. These tests evaluate algorithmic enhancements to system productivity and efficiency at the micro level before new policies are rolled out to the entire network.
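To make the mechanism above concrete, here is a toy sketch of how a DPI appliance matches payload bytes against application signatures and then applies a traffic policy. The signatures and policy table are illustrative simplifications, not any ISP's or vendor's actual rules:

```python
# Toy DPI sketch: classify a packet payload by signature, then apply a policy.
# Signatures here are simplified; real appliances use far richer heuristics.

SIGNATURES = {
    b"\x13BitTorrent protocol": "bittorrent",  # BitTorrent handshake prefix
    b"GET ": "http",                           # plaintext HTTP request
    b"\x16\x03": "tls",                        # TLS record header
}

# Hypothetical per-application policies of the kind a throttling deployment uses.
POLICIES = {
    "bittorrent": "throttle",
    "http": "allow",
    "tls": "allow",
}

def classify(payload: bytes) -> str:
    """Guess the likely application-of-origin from the first payload bytes."""
    for signature, app in SIGNATURES.items():
        if signature in payload[:64]:
            return app
    return "unknown"

def mediate(payload: bytes) -> str:
    """Return the action the policy table dictates for this packet."""
    return POLICIES.get(classify(payload), "default")
```

A misclassification at the `classify` step, or a bad entry in the policy table, is enough to throttle traffic the ISP never intended to touch, which is precisely the kind of misconfiguration at issue here.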

Such tests are not publicly broadcast, nor are customers notified when ISPs update their DPI devices’ long-term policies. While notification must be provided to various bodies when material changes are made to the network, non-material changes can typically be deployed quietly. Few notice when a deployment of significant scale happens…unless it goes wrong. Based on user reports in the DSLreports forums, it appears that one of Rogers’ recent policy updates was poorly tested and then massively deployed. The ill effects of that deployment remain unresolved more than sixty days later.

In this post, I first detail the issues facing Rogers customers, drawing heavily from forum threads at DSLreports. I then suggest that this incident demonstrates multiple failings around DPI governance: a failure to properly evaluate analysis and throttling policies; a failure to adequately acknowledge problems arising from DPI misconfiguration; and a failure to proactively alleviate the inconveniences of accidental throttling. Large ISPs’ ability to modify the conditions of data transit and discrimination is problematic because it increases the risks faced by innovators and developers, who cannot predict future data discrimination policies. Such increased risks threaten the overall generative nature of the ends of the Internet. To alleviate some of these risks, a trusted third party should be established. This party would monitor how ISPs themselves govern data traffic and alert citizens and regulators if ISPs discriminate against ‘non-problematic’ traffic types or violate their own terms of service. I ultimately suggest that an independent, though associated, branch of the CRTC that is responsible for watching over ISPs could improve trust between Canadians and the CRTC, and between customers and their ISPs.

Continue reading

Ole, Intellectual Property, and Taxing Canadian ISPs

Ole, a Canadian independent record label, has put forward an often-heard and much-disputed proposal to enhance record label revenues: Ole wants ISPs to surveil Canada’s digital networks for copyrighted works. In the record label’s July 12 filing for the Digital Economy Consultations, entitled “Building Delivery Systems at the Expense of Content Creators,” Ole asserts that ISPs function as “short circuits” that let music customers avoid purchasing music on the free market. Rather than go to the market, customers (behaving as rational economic actors…) are instead using ISP networks to download music. That music is being downloaded is an unquestionable reality, but the stance that this indicates ISP liability for customers’ actions seems to be an effort to re-frame record industries’ unwillingness to adopt contemporary business models as a matter for ISPs to now deal with. In this post, I want to briefly touch on Ole’s filing and the realities of network surveillance for network-grade content awareness in today’s market. I’ll conclude by suggesting that many of the problems presently facing labels are of their own making, and that we should at best feel pity and at worst fear what they crush in their terror throes induced by disruptive technologies.

Ole asserts that there are two key infotainment revenue streams that content providers, such as ISPs, maintain: the $150 Cable TV stream and the $50 Internet stream. Given that content providers are required to redistribute some of the $150/month to content creators (often between 40 and 50 cents of every dollar collected), Ole argues that ISPs should be similarly required to distribute some of the $50/month to the content creators who make the Internet worth using for end-users. Unstated, but presumed, is a very 1995 understanding of both copyright and digital networks. In 1995 the American Information Infrastructure Task Force released its Intellectual Property and the National Information Infrastructure report, wherein they wrote:

…the full potential of the NII will not be realized if the education, information and entertainment products protected by intellectual property laws are not protected effectively when disseminated via the NII…the public will not use the services available on the NII and generate the market necessary for its success unless a wide variety of works are available under equitable and reasonable terms and conditions, and the integrity of those works is assured…What will drive the NII is the content moving through it.

Of course, the assertion that if commercial content creators don’t make their works available on the Internet then the Internet will collapse is patently false.
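For a sense of scale, the redistribution arithmetic in Ole’s proposal can be sketched in a few lines. This is a back-of-envelope calculation using only the figures summarized above (the $150 and $50 monthly streams and the 40-50 cent share), not numbers taken directly from the filing:

```python
# Back-of-envelope: apply the cable TV redistribution share to the
# $50/month Internet stream, as Ole's proposal implies.

CABLE_MONTHLY = 150.0     # per-subscriber cable TV revenue stream
INTERNET_MONTHLY = 50.0   # per-subscriber Internet revenue stream

def creator_share(monthly_fee: float, share: float) -> float:
    """Dollars per subscriber per month redistributed to content creators."""
    return monthly_fee * share

# Under Ole's logic, ISPs would owe creators roughly $20-$25
# per subscriber per month:
low = creator_share(INTERNET_MONTHLY, 0.40)
high = creator_share(INTERNET_MONTHLY, 0.50)
```

That is a substantial transfer per subscriber, which helps explain why the proposal is as disputed as it is.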

Continue reading

The Consumable Mobile Experience

We are rapidly shifting towards a ubiquitous networked world, one that promises to accelerate our access to information and to each other, but this network requires a few key elements. Bandwidth must be plentiful, mobile devices that can engage with this world must be widely deployed, and some kind of normative-regulatory framework that encourages creation and consumption must be in place. As it stands, backhaul bandwidth is plentiful, though front-line cellular towers in America and (possibly) Canada are largely unable to accommodate the growing ubiquity of smart devices. In addition to this challenge, we operate in a world where the normative-regulatory framework for the mobile world is threatened by regulatory capture, which encourages limited consumption that maximizes revenues while simultaneously discouraging rich, mobile, creative actions. Without a shift to fact-based policy decisions and pricing systems, North America risks becoming the new tech ghetto of the mobile world: rich in talent and the ability to innovate, but poor in the actual infrastructure to locally enjoy those innovations.

At the Canadian Telecom Summit this year, mobile operators such as TELUS, Wind Mobile, and Rogers Communications were all quick to pounce on the problems facing AT&T in the US. AT&T regularly suffers voice and data outages for its highest-revenue customers: those who own and use smartphones built on Android, WebOS (i.e. the Palm Pre and Pixi), and iOS. Each of these Canadian mobile companies used AT&T’s weaknesses to hammer home that unlimited bandwidth cannot be offered along mobile networks, and suggested that AT&T’s shift from unlimited to limited data plans is indicative of the backhaul and/or spectrum problems caused by smart devices. While I do not want to entirely contest the claim that there are challenges in managing exponential increases in mobile data growth, I do want to suggest that technical analysis, rather than rhetorical ‘obviousness’, should be applied to understand the similarities and differences between Canadian telcos/cablecos and AT&T.

Continue reading

Choosing Winners with Deep Packet Inspection

I see a lot of the network neutrality discussion as one surrounding the conditions under which applications can, and cannot, be prevented from running. On one hand there are advocates who maintain that telecommunications providers – ISPs such as Bell, Comcast, and Virgin – shouldn’t be responsible for ‘picking winners and losers’ on the basis that consumers should make these choices. On the other hand, advocates for managed (read: functioning) networks insist that network operators have a duty and responsibility to fairly provision their networks in a way that doesn’t see one small group negatively impact the experiences of the larger consumer population. Deep Packet Inspection (DPI) has become a hot-button technology in light of the neutrality debates, given its potential to let ISPs determine what applications function ‘properly’ and which see their data rates delayed for purposes of network management. What is often missing in the network neutrality discussions is a comparison between the uses of DPI across jurisdictions and how these uses might impact ISPs’ abilities to prioritize or deprioritize particular forms of data traffic.

As part of an early bit of thinking on this, I want to direct our attention to Canada, the United States, and the United Kingdom to start framing how these jurisdictions are approaching the use of DPI. In the process, I will make the claim that Canada’s recent CRTC ruling on the use of the technology appears more and more progressive in light of recent decisions in the US and the likelihood of the UK’s Digital Economy Bill (DEB) becoming law. Up front I should note that while I think Canada can be read as ‘progressive’ on the network neutrality front, this shouldn’t suggest that either the CRTC or Parliament has done enough: further clarity into the practices of ISPs, additional insight into the technologies they use, and an ongoing discussion of traffic management systems are all needed in Canada. Canadian communications increasingly pass through IP networks, and as a result our communications infrastructure should be seen as being as important as defence, education, and health care, each of which is tied to its own critical infrastructure but connected to the others and enabled through digital communications systems. Digital infrastructures draw together the fibres connecting the Canadian people, Canadian business, and Canadian security, and we need to elevate discussions about this infrastructure to make it a prominent part of the national agenda.

Continue reading

Deep Packet Inspection and Mobile Discrimination

Throughout the 2009 Canadian Telecommunications Summit, presenter after presenter, and session after session, spoke to the Canadian situation concerning growth in mobile data. In essence, there is a worry that the wireless infrastructure cannot cope with the high volumes of data expected to accompany the increasing use and penetration of mobile technologies. Such worries persist even though we’ve recently seen the launch of another high-speed wireless network, jointly invested in by Bell and Telus, and despite the fact that new wireless competitors are promising to enter the national market as well.

The result of the wireless competition in Canada is this: Canadians actually enjoy pretty fast wireless networks. We can certainly complain about the high costs of such networks, about the conditions under which wireless spectrum was purchased and is used, and so forth, but the fact is that pretty impressive wireless networks exist…for Canadians with cash. As any network operator knows, however, speed is only part of the equation; it’s just as important to have sufficient data provisioning so your user base can genuinely take advantage of the network. It’s partially on the grounds of data provisioning that we’re seeing vendors develop and offer deep packet inspection (DPI) appliances for the mobile environment.

I think that provisioning is the Trojan horse, however, and that DPI is really being presented by vendors as a solution to a pair of ‘authentic’ issues: first, the need to improve customer billing, and second, the need to participate efficiently in the advertising and marketing ecosystem. I would suggest that ‘congestion management’ is, right now, more of a spectre than an authentic concern (and I’ll get to defending that claim in just a moment).

Continue reading

Crown, Copyright, and the CRTC

I’m in the middle of a large project (for one person), and as part of it I wanted to host some CRTC documents on the project’s web server to link to. You see, if you’ve ever been involved in one of the CRTC’s public notices you’ll know that there is a literal deluge of documents, many of which are zipped together. For the purposes of disseminating documents over email this works well – it puts all of the documents from, say, Bell, into a single zipped file – but it makes for a user-unfriendly linking structure: expecting casual readers to download and open zip archives is unreasonable. Given that part of this project is to facilitate ease of access to resources, it’s important that users can link to the documents themselves, not to zip archives.
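As an aside, the mechanical part of this is easy to script around. A minimal sketch of unpacking the zipped filings so each document gets its own linkable path (the directory names here are illustrative, not my actual project layout):

```python
# Extract every zipped filing into a web-servable folder so individual
# documents can be hyperlinked directly, instead of linking to zip files.

import zipfile
from pathlib import Path

def unpack_filings(archive_dir: str, web_root: str) -> list:
    """Extract each *.zip in archive_dir into web_root and return the
    paths of the extracted documents, suitable for building links."""
    links = []
    for archive in sorted(Path(archive_dir).glob("*.zip")):
        # One folder per filer, named after the archive (e.g. 'bell').
        dest = Path(web_root) / archive.stem
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(dest)
            links.extend(str(dest / name) for name in zf.namelist())
    return links
```

Scripting solves the linking problem; it is the Crown copyright question below that determines whether I may host the extracted documents at all.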

While I pay attention to copyright developments in Canada and abroad, and have strong stances on how academics and the Canadian government should licence their publications, I’m not a lawyer. I do, however, know that government documents in Canada are governed by Crown copyright – unlike in the US, the Canadian government maintains copyright over its publications – and thus I wanted to check with the CRTC whether there were any problems with hosting documents from their site, including those presumably under Crown copyright, such as the CRTC’s decisions.

Continue reading