I’m happy to let my readers know that The Internet Tree: The State of Telecom Policy in Canada 3.0, edited by Marita Moll and Leslie Shade, is now available for purchase. The book interrogates how Canada’s digital future does, and should, look by discussing present policies and proposing new ones to enhance Canada’s position in the digitally connected world. The editors have done an excellent job of bringing together academics, advocates, and lawyers from across Canada to develop an exciting and accessible edited collection on the Internet and broadband in Canada. It includes scholars such as Dwayne Winseck, Michael Geist, Catherine Middleton, and Richard Smith, along with contributions from Steve Anderson (Open Media), Michael Janigan (PIAC), and a host of graduate students and researchers.
The book is published through the Canadian Centre for Policy Alternatives (CCPA). The publisher and editors describe the book as a collection in which:
… committed public interest advocates and academics present primers on provocative digital policy issues: broadband access, copyright, net neutrality, privacy, and security, along with a consideration of structures of participation in policy-making and communication rights.
Contributors to The Internet Tree argue for a digital economy strategy that casts a winning vote for openness, broadband as an essential service, and community engagement and inclusion.
The Internet Tree is available for just $14.95 and supports digital economy strategies guided by the principles of openness, broadband as an essential service, community engagement and inclusion, national sovereignty, and digital literacy programs. My own contribution (“Is Your ISP Snooping On You?”) explains the technical and social concerns raised by deep packet inspection to someone who doesn’t know a coaxial cable from a fibre node; other authors similarly work to explain issues to the layperson while offering suggestions to alleviate, mediate, or overcome the challenges facing Canada’s digital ecosystem. It’s got a great set of authors and I’d highly recommend it as a complement to Open Media’s recently published report on digital networks in Canada.
I spend an exorbitant amount of time reading about the legacies of today’s telecommunications networks. This serves to historically ground my analyses of today’s telecommunications ecosystem: why have certain laws, policies, and politics developed as they have; how do contemporary actions break from (or conform to) past events; and what cycles are detectable in telecommunications discussions? After reading a host of accounts detailing the telegraph and telephone, I’m certain that Richard John’s Network Nation: Inventing American Telecommunications is the most accessible and thorough discussion of these communications systems that I’ve come across to date.
Eschewing an anachronistic view of the telegraph and telephone – seeing neither through the lens that they are simply precursors to contemporary digital communications systems – John offers a granular account of how both technologies developed in the US. His analysis is decidedly neutral towards the technologies and technical developments themselves, attending instead to the role(s) of political economy in shaping how the telegraph and telephone grew as services, political objects, and zones of popular contention. He has carefully pored through original source documents and so can offer insights into the actual machinations of politicians, investors, municipal aldermen, and communications companies’ CEOs and engineers, weaving a comprehensive account of the telegraph and telephone industries. Importantly, John emphasizes the role of civic ideals and governmental institutions in shaping technical innovations; contrary to the popular understanding that government only ‘caught up’ to technicians after WWI, the two have long locked horns.
In the domain of telecom policy, it seems like a series of bad ideas (re)arise alongside major innovations in communications systems and technologies. In this post, I want to turn to the telegraph to shed light on issues of communication bandwidth, security, and privacy that are being (re)addressed by regulators around the world as they grapple with the Internet. I’ll speak to the legacy of data retention in analogue and digital communicative infrastructures, congestion management, protocol development, and encryption policies to demonstrate how these issues have arisen in the past, and conclude by suggesting a few precautionary notes about the future of the Internet. I do want to acknowledge, before getting into the meat of this post, that while the telegraph can usefully be identified as a precursor to the digital Internet because of the strong analogies between the two technological systems, it relied on different technological scaffolding. Thus, the lessons drawn here rest on analogical similarities rather than technical homogeneity between the systems.
The telegraph took years to develop. Standardization was a particular issue, perhaps best epitomized by the French having an early telegraph system of (effectively) high-tech signal towers, while other nations struggled to develop interoperable, cross-continental, electrically based systems. Following the French communication innovation (which was largely used to coordinate military endeavours), inventors in other nations such as Britain and the United States spent considerable time learning how to send electrical pulses along various kinds of cables to communicate information at high speed across vast distances.
Technology is neither good nor bad. It’s also not neutral. Network neutrality, a political rallying cry meant to motivate free-speech, free-culture, and innovation advocates, was reportedly betrayed by Google following the release of a Verizon-Google policy document on network management/neutrality. What the document reveals is that the two corporations, facing a (seemingly) impotent FCC, have gotten the ball rolling by suggesting a set of policies that the FCC could use in developing a network neutrality framework. Unfortunately, there has been little even-handed analysis of this document from the advocates of network neutrality; instead we have witnessed vitriol and over-the-top rhetoric. This is disappointing. While sensational headlines attract readers, they do little to actually inform the public about network neutrality in a detailed, granular, reasonable fashion. Verizon-Google have provided advocates with an opportunity to pointedly articulate their views while the public is watching, and this is not an opportunity that should be squandered with bitter and unproductive criticism.
I intend this to be the first of a few posts on network neutrality. In this post, I work exclusively through the principles suggested by Verizon-Google. In this first, preliminary analysis I draw on existing American regulatory language and on lessons from the Canadian experience with network management. My overall sense of the document published by Verizon-Google is that, in many ways, it is quite conservative insofar as it adheres to dominant North American regulatory approaches. My key suggestion is that instead of rejecting the principles outright, we should carefully consider each in turn. Through this examination, I hope to identify which principles, or elements of them, could usefully be taken up into a government-backed regulatory framework that recognizes the technical, social, and economic potentials of America’s broadband networks.
I see much of the network neutrality discussion as centring on the conditions under which applications can, and cannot, be prevented from running. On one hand, advocates maintain that telecommunications providers – ISPs such as Bell, Comcast, and Virgin – shouldn’t be responsible for ‘picking winners and losers’, on the basis that consumers should make these choices. On the other hand, advocates for managed (read: functioning) networks insist that network operators have a duty and responsibility to fairly provision their networks in a way that doesn’t let one small group negatively impact the experiences of the larger consumer population. Deep Packet Inspection (DPI) has become a hot-button technology in light of the neutrality debates, given its potential to let ISPs determine which applications function ‘properly’ and which see their data rates delayed for purposes of network management. What is often missing from the network neutrality discussions is a comparison of the uses of DPI across jurisdictions, and how those uses might affect ISPs’ abilities to prioritize or deprioritize particular forms of data traffic.
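To make the mechanism concrete: at its simplest, DPI works by reading into a packet’s payload (not just its headers) and matching byte patterns associated with particular applications, then applying a traffic-management policy to the result. The sketch below is a deliberately toy illustration of that idea; the signature list and throttling policy are hypothetical simplifications of my own, not any vendor’s or ISP’s actual rules.

```python
# Toy sketch of DPI-style traffic classification. Real DPI appliances use far
# more sophisticated signature databases and behavioural heuristics; the
# signatures and policy here are hypothetical illustrations only.

SIGNATURES = {
    b"BitTorrent protocol": "bittorrent",  # string from the BitTorrent handshake
    b"GET ": "http",                       # start of a plain HTTP request
    b"SSH-2.0": "ssh",                     # SSH version banner
}

def classify(payload: bytes) -> str:
    """Label a packet payload by matching known byte signatures."""
    for signature, protocol in SIGNATURES.items():
        if signature in payload:
            return protocol
    return "unknown"

def management_action(protocol: str) -> str:
    """A hypothetical policy: deprioritize peer-to-peer traffic, pass the rest."""
    return "throttle" if protocol == "bittorrent" else "forward"

# Example: a payload resembling a BitTorrent handshake gets deprioritized.
pkt = b"\x13BitTorrent protocol" + b"\x00" * 8
proto = classify(pkt)
print(proto, management_action(proto))  # prints: bittorrent throttle
```

Even this crude sketch shows why the technology is politically charged: the classification step and the policy step are independent, so the same inspection machinery that enables ‘fair’ congestion management can just as easily pick winners and losers among applications.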
As part of some early thinking on this, I want to direct our attention to Canada, the United States, and the United Kingdom to start framing how these jurisdictions are approaching the use of DPI. In the process, I will argue that Canada’s recent CRTC ruling on the use of the technology appears increasingly progressive in light of recent decisions in the US and the likelihood of the UK’s Digital Economy Bill (DEB) becoming law. Up front I should note that while Canada can be read as ‘progressive’ on the network neutrality front, this shouldn’t suggest that either the CRTC or Parliament has done enough: further clarity into the practices of ISPs, additional insight into the technologies they use, and an ongoing discussion of traffic management systems are all needed in Canada. Canadian communications increasingly pass through IP networks, and as a result our communications infrastructure should be seen as being as important as defence, education, and health care, each of which is tied to its own critical infrastructure but connected to the others and enabled through digital communications systems. Digital infrastructures draw together the fibres connecting Canadian people, Canadian business, and Canadian security, and we need to elevate discussion of this infrastructure to make it a prominent part of the national agenda.