Agenda Denial and UK Privacy Advocacy

Funding, technical and political savvy, human resources, and time: these are just a few of the challenges standing before privacy advocates who want to make their case to the public, legislators, and regulators. Looking across the landscape, there are regularly cases where advocates are more successful than expected or markedly less so than anticipated; that advocates stopped BT from permanently deploying Phorm’s Webwise advertising system was impressive, whereas the failures to limit transfers of European airline passenger data to the US were somewhat surprising.[1] While there are regular analyses of how privacy advocates might get the issue of the day onto governmental agendas, seemingly less time is spent on how opponents resist advocates’ efforts. This post constitutes an early attempt to work through some of the politics of agenda-setting related to deep packet inspection and privacy for my dissertation project. Comments are welcome.

To be more specific, in this post I want to think about how items are kept off the agenda. Why are they kept off, who engages in the opposition(s), and what are some of the tactics employed? In responding to these questions I will rely significantly on theory from R. W. Cobb’s and M. H. Ross’ Cultural Strategies of Agenda Denial, linked with work by other prominent scholars and advocates. My goal is to evaluate whether the strategies that Cobb and Ross write about apply to the issues championed by privacy advocates in the UK who oppose the deployment of the Webwise advertising system. I won’t be working through the technical or political backstory of Phorm in this post and will assume that readers have at least a moderate familiarity with it – if you’re unfamiliar, I’d suggest a quick detour to the Wikipedia page devoted to the company.

Continue reading

Publication – Digital Inflections: Post-Literacy and the Age of Imagination

Earlier this year I was contacted by CTheory to find and interview interesting people who are doing work at the intersection of theory, digitality, and information. Michael Ridley, the Chief Information Officer and Chief Librarian at the University of Guelph, was the first person who came to mind. I met with Michael earlier this year for a face-to-face discussion, and our conversation has since been transcribed and published at CTheory. Below is the full introduction to the interview.

“… [O]ne of the things about librarians is that they’re subversive in the nicest possible ways. They’ve been doing the Wikileak thing for centuries, but just didn’t get the credit for it. This is what we try to do all the time; we try to reduce the barriers and open up that information.”
— Michael Ridley

Self-identifying as the University’s Head Geek and Chief Dork, Michael Ridley leads a life of the future by reconfiguring access to the past. As Chief Librarian and Chief Information Officer of the University of Guelph, Ridley spends his days integrating digital potentialities and the power of imagination with the cultural and historical resources of the library. Seeing the digital as a liminal space between the age of the alphabet and an era of post-literacy, he is transforming the mission of libraries: gone are the days where libraries primarily focus on developing collections. Today, collections are the raw materials fueling the library as a dissonance engine, an engine enabling collaborative, cross-disciplinary imaginations.

With a critical attitude towards the hegemony of literacy, combined with a prognostication of digitality’s impending demise, Ridley’s position at the University of Guelph facilitates radical reconsiderations of the library’s present and forthcoming roles. He received his M.L.S. from the University of Toronto and his M.A. from the University of New Brunswick, and has been a professional librarian since 1979. So far, Michael has served as President of the Canadian Association for Information Science, President of the Ontario Library Association, Board member of the Canadian Association of Research Libraries, and Chair of the Ontario Council of Universities. He is presently a board member of the Canadian Research Knowledge Network and of the Canadian University Council of CIOs. He has received an array of awards, and was most recently awarded the Miles Blackwell Award for Outstanding Academic Librarians by the Canadian Association of College and University Libraries. Ridley has published extensively about the intersection of networks, digital systems, and libraries, including “The Online Catalogue and the User,” “Providing Electronic Library Reference Service: Experiences from the Indonesia-Canada Tele-Education Project,” “Computer-Mediated Communications Systems,” and “Community Development in the Digital World.” He has also co-edited volumes one and two of The Public-Access Computer Systems Review. Lately, his work has examined the potentials of post-literacy, which has seen him teach an ongoing undergraduate class on literacy and post-literacy as well as giving presentations and publishing on the topic.

Read the full conversation at CTheory

Review: Internet Architecture and Innovation

I want to very highly recommend Barbara van Schewick’s Internet Architecture and Innovation. Various authors, advocates, scholars, and businesses have spoken about the economic impacts of the Internet, but to date there hasn’t been a detailed economic accounting of what may happen if/when ISPs monitor and control the flow of data across their networks. van Schewick has filled this gap by examining “how changes in the Internet’s architecture (that is, its underlying technical structure) affect the economic environment for innovation” and evaluating “the impact of these changes from the perspective of public policy” (van Schewick 2010: 2).

Her book traces the economic consequences associated with changing the Internet’s structure from one enabling any innovator to design an application or share content online to a structure where ISPs must first authorize access to content and design key applications in-house (e.g. P2P, email, etc.). Barbara draws heavily from Internet history literature and economic theory to buttress her position that a closed or highly controlled Internet not only constitutes a massive change in the original architecture of the ‘net, but that this change would be damaging to society’s economic, cultural, and political interests. She argues that an increasingly controlled Internet is the future that many ISPs prefer, and supports this conclusion with economic theory and the historical actions of American telecommunications corporations.

van Schewick begins by outlining two notions of the end-to-end principle undergirding the ‘net, a narrow and a broad conception, and argues (successfully, in my mind) that ISPs and their critics often rely on different end-to-end understandings in making their respective arguments to the public, regulators, and each other.

Continue reading

Analyzing the Verizon-Google Net Neutrality Framework

Technology is neither good nor bad. It’s also not neutral. Network neutrality, a political rallying cry meant to motivate free-speech, free-culture, and innovation advocates, was reportedly betrayed by Google following the release of a Verizon-Google policy document on network management/neutrality. What the document reveals is that the two corporations, facing a (seemingly) impotent FCC, have gotten the ball rolling by suggesting a set of policies that the FCC could use in developing a network neutrality framework. Unfortunately, there has been little even-handed analysis of this document from the advocates of network neutrality; instead we have witnessed vitriol and over-the-top rhetoric. This is disappointing. While sensational headlines attract readers, they do little to actually inform the public about network neutrality in a detailed, granular, reasonable fashion. Verizon-Google have provided advocates with an opportunity to pointedly articulate their views while the public is watching, and this is not an opportunity that should be squandered with bitter and unproductive criticism.

I’m intending this to be the first of a few posts on network neutrality.[1] In this post, I exclusively work through the principles suggested by Verizon-Google. In this first, and preliminary, analysis I will draw on existing American regulatory language and lessons that might be drawn from the Canadian experience surrounding network management. My overall sense of the document published by Verizon-Google is that, in many ways, it’s very conservative insofar as it adheres to dominant North American regulatory approaches. My key suggestion is that instead of rejecting the principles in their entirety we should carefully consider each in turn. During my examination, I hope to identify which principles and/or their elements could be usefully taken up into a government-backed regulatory framework that recognizes the technical, social, and economic potentials of America’s broadband networks.

Continue reading

The Consumable Mobile Experience

We are rapidly shifting towards a ubiquitous networked world, one that promises to accelerate our access to information and to each other, but this network requires a few key elements. Bandwidth must be plentiful, mobile devices that can engage with this world must be widely deployed, and some kind of normative-regulatory framework that encourages creation and consumption must be in place. As it stands, backhaul bandwidth is plentiful, though front-line cellular towers in America and (possibly) Canada are largely unable to accommodate the growing ubiquity of smart devices. In addition to this challenge, we operate in a world where the normative-regulatory framework for the mobile world is threatened by regulatory capture that encourages limited consumption that maximizes revenues while simultaneously discouraging rich, mobile, creative actions. Without a shift to fact-based policy decisions and pricing systems, North America risks becoming the new tech ghetto of the mobile world: rich in talent and the ability to innovate, but poor in the actual infrastructure to locally enjoy those innovations.

At the Canadian Telecom Summit this year, mobile operators such as TELUS, Wind Mobile, and Rogers Communications were all quick to pounce on the problems facing AT&T in the US. AT&T regularly suffers voice and data outages for its highest-revenue customers: those who own and use smartphones built on Android, WebOS (i.e. the Palm Pre and Pixi), and iOS. Each of these Canadian mobile companies used AT&T’s weaknesses to hammer home that unlimited bandwidth cannot be offered along mobile networks, and suggested that AT&T’s shift from unlimited to limited data plans is indicative of the backhaul and/or spectrum problems caused by smart devices. While I do not want to entirely contest the claim that there are challenges in managing exponential increases in mobile data growth, I do want to suggest that technical analysis rather than rhetorical ‘obviousness’ should be applied to understand the similarities and differences between Canadian telcos/cablecos and AT&T.

Continue reading

Draft – Deep Packet Inspection: Privacy, Mash-ups, and Dignities

This is a draft of the paper that I’ll be presenting at the Counter: Piracy and Counterfeit conference in Manchester in a few days. It’s still rough around some edges, but feels like a substantial piece. Comments, as always, are welcome.

Abstract:

Privacy operates as an umbrella-like concept that shelters liberal citizens’ capacity to enjoy autonomy, secrecy, and liberty, values that are key to citizens enjoying their psychic and civil dignity. As digitisation sweeps through the post-industrial information economy, these same citizens are increasingly sharing and disseminating copyrighted files using peer-to-peer file sharing networks. In the face of economic challenges posed by these networks, some members of the recording industries have sought agreements with Internet Service Providers (ISPs) to govern the sharing of copyrighted data. In Britain, file-sharing governance has recently manifested in the form of Virgin Media inserting deep packet inspection (DPI) appliances into their network to monitor levels of infringing file sharing. In this presentation, I argue that ISPs and vendors must demonstrate technical and social transparency over their use of DPI to assuage worries that communications providers are endangering citizens’ psychic and civil dignities. Drawing on recent Canadian regulatory processes concerning Canadian applications of DPI, I suggest that transparency between civil advocacy groups, ISPs, and vendors can garner the trust required to limit harms to citizens’ psychic dignity. Further, I maintain that using DPI appliances to detect copyright infringement and apply three-strikes proposals unduly threatens citizens’ civil dignities; alternate governance strategies must be adopted to preserve citizens’ civil dignity.

Download paper