Call for Assistance: A Broadband Analysis Tool

Communications systems are integral to emerging and developed democracies; the capability to rapidly transmit information from one point to another can help fuel revolutions and launch information campaigns about unpopular decisions to ‘meter’ the Internet. In foreign nations, and at home in Canada, we regularly see ISPs interfere with transmissions of data content. Both abroad and at home, researchers and advocates often have difficulty decoding what telecom and cableco providers are up to: What systems are examining data traffic? How is Internet access distributed through the nation? Are contractually similar data plans that are sold in different geographic regions providing customers with similar levels of service?

To date, Canadian advocates and researchers have been limited in their ability to draw on empirical data during major hearings at the CRTC. This makes research and advocacy challenging. Over the past several years, researchers, advocates, counsel, and members of industry I’ve spoken with have complained that they need hard data. (It’s a gripe that I’ve voiced personally, as well.) With your help, numbers will be on the way.
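What might the measurement side of such a tool look like? Below is a minimal sketch in Python, using only the standard library, that times TCP connection setup and a single file download. The test URL and host are placeholders; a real tool would need ISP-neutral, geographically distributed measurement servers and many more samples before comparing plans across regions.

```python
import socket
import statistics
import time
import urllib.request

# Hypothetical test endpoint -- a real tool would use dedicated,
# ISP-neutral measurement servers, not an arbitrary web host.
TEST_URL = "http://example.com/testfile.bin"
TEST_HOST = "example.com"

def tcp_connect_latency_ms(host: str, port: int = 80, samples: int = 10) -> float:
    """Median time to open a TCP connection, in milliseconds."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass
        times.append((time.perf_counter() - start) * 1000)
    return statistics.median(times)

def download_throughput_kbps(url: str) -> float:
    """Approximate downstream throughput for a single fetch, in kbit/s."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as resp:
        size_bytes = len(resp.read())
    elapsed = time.perf_counter() - start
    return (size_bytes * 8 / 1000) / elapsed

if __name__ == "__main__":
    print(f"latency: {tcp_connect_latency_ms(TEST_HOST):.1f} ms")
    print(f"throughput: {download_throughput_kbps(TEST_URL):.0f} kbit/s")
```

Run from different plans and regions, numbers like these are the raw material for answering whether contractually similar plans actually deliver similar service.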

Review of Telecommunications Policy in Transition

Image courtesy of the MIT Press

First things first: the edited collection is a decade old. Given the rate at which communications technologies and information policies change, this means that several of the articles are…outmoded. Don’t turn here for the latest, greatest, and most powerful analyses of contemporary communications policy. A book published in 2001 is good for anchoring subsequent reading in telecom policy, but less helpful for guiding present-day policy analyses.

Having said that: there are some genuine gems in this book, including one of the most forward-thinking essays on network neutrality of the past decade, by Blumenthal and Clark. Before getting to their piece, I want to touch on O’Donnell’s contribution, “Broadband Architectures, ISP Business Plans, and Open Access”. He reviews architectures and ISP service portfolios to demonstrate that open access is both technically and economically feasible, though he acknowledges that implementation is not a trivial task. In the chapter he argues that the FCC should encourage deployment of open-access-ready networks to reduce the costs of future implementation; I think it’s safe to say that that ship has sailed, and open access is (largely) a dead issue in the US today. That said, he offers an excellent overview of the differences between ADSL and cable networks, and identifies the pain points of interconnection in each architecture.

Generally, O’Donnell sees interconnection as less of a hardware problem and more of a network management issue. In discussing the need for, and value of, open access, O’Donnell does a good job of noting the dangers of throttling (at a time well ahead of ISPs’ contemporary throttling regimes), writing:

differential caching and routing need not be blatant to be effective in steering customers to preferred content. The subtle manipulation of the technical performance of the network can condition users unconsciously to avoid certain “slower” web sites. A few extra milliseconds’ delay strategically inserted here and there, for example, can effectively shepherd users from one web site to another (p. 53).
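That passage reads almost like a testing protocol. As a rough illustration, the Python sketch below repeatedly fetches two sites and compares their median fetch times; the URLs are placeholders, and a real measurement would have to control for server-side differences (same content, same hosting, multiple vantage points) before attributing any gap to the network rather than to the sites themselves.

```python
import statistics
import time
import urllib.request

# Hypothetical pair of sites: one the ISP allegedly "prefers", one it does not.
SITES = {
    "preferred": "http://example.com/",
    "other": "http://example.org/",
}

def fetch_time_ms(url: str) -> float:
    """Wall-clock time for one full HTTP fetch, in milliseconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()
    return (time.perf_counter() - start) * 1000

def median_latency(url: str, samples: int = 20) -> float:
    return statistics.median(fetch_time_ms(url) for _ in range(samples))

if __name__ == "__main__":
    results = {name: median_latency(url) for name, url in SITES.items()}
    for name, ms in results.items():
        print(f"{name}: {ms:.1f} ms")
    # A consistent gap of even a few milliseconds, sustained across many
    # samples and vantage points, is the kind of signal O'Donnell describes.
    print(f"gap: {abs(results['preferred'] - results['other']):.1f} ms")
```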


Analyzing the Verizon-Google Net Neutrality Framework

Technology is neither good nor bad. It’s also not neutral. Network neutrality, a political rallying cry meant to motivate free-speech, free-culture, and innovation advocates, was reportedly betrayed by Google following the release of a Verizon-Google policy document on network management/neutrality. What the document reveals is that the two corporations, facing a (seemingly) impotent FCC, have gotten the ball rolling by suggesting a set of policies that the FCC could use in developing a network neutrality framework. Unfortunately, there has been little even-handed analysis of this document from the advocates of network neutrality; instead we have witnessed vitriol and over-the-top rhetoric. This is disappointing. While sensational headlines attract readers, they do little to actually inform the public about network neutrality in a detailed, granular, reasonable fashion. Verizon-Google have provided advocates with an opportunity to pointedly articulate their views while the public is watching, and this is not an opportunity that should be squandered with bitter and unproductive criticism.

I’m intending this to be the first of a few posts on network neutrality.[1] In this post, I exclusively work through the principles suggested by Verizon-Google. In this first, preliminary analysis I will draw on existing American regulatory language and lessons that might be drawn from the Canadian experience surrounding network management. My overall sense of the document published by Verizon-Google is that, in many ways, it’s very conservative insofar as it adheres to dominant North American regulatory approaches. My key suggestion is that, instead of rejecting the proposed principles in their entirety, we should carefully consider each in turn. During my examination, I hope to identify which principles, or elements of them, could usefully be taken up into a government-backed regulatory framework that recognizes the technical, social, and economic potentials of America’s broadband networks.


Beyond Fear and Deep Packet Inspection

Over the past few days I’ve been able to attend to non-essential reading, which has given me the opportunity to start chewing through Bruce Schneier’s Beyond Fear. The book, in general, is an effort on Bruce’s part to get people thinking critically about security measures. It’s incredibly accessible and easy to read – I’d highly recommend it.

Early on in the text, Schneier provides a set of questions that ought to be asked before deploying a security system. I want to very briefly think through those questions as they relate to Deep Packet Inspection (DPI) in Canada, to begin developing a security-derived understanding of the technology. My hope is that, by critically engaging with the technology, a model to capture concerns and worries can start to emerge.
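Before working through the questions, it may help to ground what DPI actually does. The toy Python sketch below classifies a packet payload by naive signature matching; real DPI appliances track flow state and use large signature libraries, but the basic move is the same. (The BitTorrent handshake really does begin with the byte 0x13 followed by the ASCII string "BitTorrent protocol"; everything else here is purely illustrative, not a description of any ISP’s actual system.)

```python
# Toy illustration of payload-based classification, the basic move behind
# DPI-driven traffic management. Real appliances use large signature sets
# and per-flow state; this checks a single well-known pattern.
BITTORRENT_HANDSHAKE = b"\x13BitTorrent protocol"

def classify_payload(payload: bytes) -> str:
    """Label a packet payload by naive signature matching."""
    if payload.startswith(BITTORRENT_HANDSHAKE):
        return "p2p"  # candidate for throttling under a DPI regime
    if payload.startswith(b"GET ") or payload.startswith(b"POST "):
        return "http"
    return "unknown"

if __name__ == "__main__":
    sample = BITTORRENT_HANDSHAKE + b"\x00" * 8 + b"...infohash..."
    print(classify_payload(sample))                 # -> p2p
    print(classify_payload(b"GET / HTTP/1.1\r\n"))  # -> http
```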

Question 1: What assets are you trying to protect?

  • Network infrastructure from being overwhelmed by data traffic.

Question 2: What are the risks to these assets?

  • Synchronous bandwidth-heavy applications running 24/7 that generate congestion and thus broadly degrade consumer experiences.

Question 3: How well does security mitigate those risks?
