Security, Hierarchy, and Networked Governance

The Internet’s capacity to route around damage and censorship depends on there being multiple pathways along which data can travel. What happens when there are very few pathways, and when many of the existing paths contain hidden traps that undermine communications security and privacy? This question is always relevant to communications, but it has become particularly topical given recent events that compromised some of the Internet’s key security infrastructure and trust networks.

On March 22, 2011, Tor researchers disclosed a vulnerability in the certificate authority (CA) system. Certificates are used to encrypt data traffic between parties and to guarantee that the parties presenting them are who they claim to be. The CA system underpins a massive number of the Internet’s trust relationships: when individuals log into their banks, some social networking services, and many online email services, their data traffic is encrypted to prevent a third party from listening in on the content of the communication. Those encrypted sessions are made possible by the certificates that CAs issue. The Tor researchers announced that an attacker had compromised a CA and issued certificates that let the attacker impersonate the security credentials associated with many of the world’s most prominent websites. Few individuals would ever detect this subterfuge. In effect, the Tor researchers discovered that a central element of the Internet’s trust network was broken.
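The weakness is easier to see once the trust model is made concrete: a TLS client accepts a certificate if it chains to any of the hundreds of CAs in its local trust store, so a single compromised CA can vouch for an impostor anywhere. A minimal sketch using Python’s standard `ssl` module (the helper names are mine, invented for illustration):

```python
import socket
import ssl

def flatten_name(name):
    """Flatten the nested-tuple form ssl.getpeercert() uses for X.509 names."""
    return {key: value for rdn in name for (key, value) in rdn}

def issuer_of(hostname, port=443, timeout=10):
    """Connect over TLS and report which CA signed the server's certificate.

    Validation succeeds if the chain ends at *any* CA in the local trust
    store -- which is why one compromised CA undermines every site.
    """
    context = ssl.create_default_context()  # loads the OS bundle of trusted CAs
    with socket.create_connection((hostname, port), timeout=timeout) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()  # parsed fields of the leaf certificate
    return flatten_name(cert["issuer"])
```

Nothing in this exchange tells the user *which* CA ought to have issued the certificate; any issuer in the store passes validation, which is precisely the single point of failure the attack exploited.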

In this post I want to do a few things. First, I’ll briefly describe the attack and its accompanying risks; this will, in part, involve discussing modes of surveillance and the motivations behind different gradients of surveillance. I next address a growing problem for today’s Internet users: the points of trust we depend on, such as CAs and the DNS infrastructure, are increasingly unreliable. As a result, states can overtly or subtly manipulate these points to disrupt or monitor their citizens’ communications. Finally, I suggest that in spite of these points of control, states are increasingly limited in their capacity to unilaterally enforce their will. As a consequence of networked governance, and its accompanying power structures, citizens can impose accountability on states and limit states’ ability to (re)distribute power across and between the nodes of networks. Thus, networked governance not only transforms state power but also redistributes (some) power to non-state actors, empowering them to resist illegitimate state actions.


Technology and Politics in Tunisia and Iran: Deep Packet Surveillance

For some time, I’ve been keeping an eye on how the Iranian government monitors, mediates, and influences data traffic on public networks. This has seen me write several posts, here and elsewhere, about the government’s use of deep packet inspection, the implications of Iranian government surveillance, and the challenges posed by Iranian ISPs’ most recent network updates. Last month I was invited to give a talk at the Pacific Centre for Technology and Culture about the use of deep packet inspection by the Iranian and Tunisian governments.

Abstract

Faced with growing unrest that is (at least in part) facilitated by digital communications, repressive nation-states have integrated powerful new surveillance systems into the depths of their nations’ communications infrastructures. In this presentation, Christopher Parsons first discusses the capabilities of a technology, deep packet inspection, which is used to survey, analyze, and modify communications in real-time. He then discusses the composition of the Iranian and Tunisian telecommunications infrastructure, outlining how deep packet inspection is used to monitor, block, and subvert encrypted and private communications. The presentation concludes with a brief reflection on how this same technology is deployed in the West, with a focus on how we might identify key actors, motivations, and drivers of the technology in our own network ecologies.
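To give a sense of what “inspecting” a packet means in practice, here is a toy classifier in the spirit of deep packet inspection: it reads into the application payload rather than stopping at the IP/TCP headers. The signature list is illustrative only; production systems combine thousands of signatures with behavioural and statistical heuristics:

```python
# Toy payload classifier in the spirit of DPI: match byte signatures in
# application data instead of only reading packet headers.
SIGNATURES = {
    b"\x13BitTorrent protocol": "p2p",  # BitTorrent handshake prefix
    b"GET ": "http",                    # plaintext web request
    b"\x16\x03": "tls",                 # TLS record header: handshake, SSL3+
}

def classify(payload: bytes) -> str:
    """Return a coarse label for a packet payload, 'unknown' if nothing matches."""
    for signature, label in SIGNATURES.items():
        if payload.startswith(signature):
            return label
    return "unknown"
```

Once a middlebox can attach labels like these to flows in real time, the same vantage point supports monitoring, selective blocking, and degradation, which is why the technology figures in both censorship and traffic-management debates.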

Note: For more information on the Iranian use of deep packet inspection, see ‘Is Iran Now Actually Using Deep Packet Inspection?’

Call for Assistance: A Broadband Analysis Tool

Communications systems are integral to emerging and developed democracies alike; the capacity to rapidly transmit information from one point to another can help fuel revolutions and launch information campaigns about unpopular decisions to ‘meter’ the Internet. Both abroad and at home in Canada we regularly see ISPs interfere with the transmission of data content, and researchers and advocates often have difficulty decoding what telecom and cableco providers are up to: What systems are examining data traffic? How is Internet access distributed throughout the nation? Are contractually similar data plans, sold in different geographic regions, providing customers with similar levels of service?
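That last question is, in principle, empirically answerable: time a transfer of known size, convert it to megabits per second, and repeat across regions, times of day, and test servers. A naive sketch of the core measurement (the function names are mine; the sampling methodology a real tool would need is omitted):

```python
import time
import urllib.request

def to_mbits(received_bytes: int, elapsed_seconds: float) -> float:
    """Convert a byte count over a duration into megabits per second."""
    return (received_bytes * 8) / (elapsed_seconds * 1_000_000)

def measure_throughput(url: str, max_bytes: int = 5_000_000) -> float:
    """Download up to max_bytes from url and return the observed Mbit/s.

    A single run is noisy: a credible comparison of plans or regions needs
    repeated runs against multiple well-connected servers, contributed by
    many subscribers.
    """
    start = time.monotonic()
    received = 0
    with urllib.request.urlopen(url, timeout=30) as response:
        while received < max_bytes:
            chunk = response.read(64 * 1024)
            if not chunk:
                break
            received += len(chunk)
    return to_mbits(received, time.monotonic() - start)
```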

To date, Canadian advocates and researchers have been limited in their ability to draw on empirical data during major hearings at the CRTC, and this makes research and advocacy challenging. Over the past several years, the researchers, advocates, counsel, and members of industry I’ve spoken to have complained that they need hard data. (It’s a gripe I’ve voiced personally as well.) With your help, numbers will be on the way.

Deep Packet Inspection and Consumer Transparency

Image by David Clow

Rogers Communications modified their packet inspection systems last year, and ever since customers have experienced degraded download speeds. It’s not that random users happen to be complaining about an (effectively) non-existent problem: Rogers’ own outreach staff have confirmed that the modifications took place and that the changes have negatively impacted peer-to-peer (P2P) and non-P2P applications alike. Since then, a Rogers Communications senior vice-president, Ken Englehart, has suggested that any problems customers have run into are the result of P2P applications themselves; no mention is made of whether or how Rogers’ throttling systems have affected non-P2P traffic.
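One plausible mechanism for collateral damage of this sort (my speculation, sketched for illustration, not a description of Rogers’ actual configuration) is a shaping policy that throttles anything the inspection engine fails to identify, which sweeps up encrypted or uncommon non-P2P protocols along with P2P:

```python
def shaping_decision(label: str, policy: str = "throttle-unknown") -> str:
    """Decide what a traffic shaper does with a flow the DPI engine has labelled.

    Under a 'throttle-unknown' policy, flows the engine cannot identify
    (VPNs, encrypted games, niche protocols) are slowed alongside P2P, so
    non-P2P applications suffer even though only P2P is nominally targeted.
    """
    if label == "p2p" or (label == "unknown" and policy == "throttle-unknown"):
        return "throttled"
    return "full-speed"
```

On a policy like this, the vendor can truthfully say the system “targets P2P” while unrecognized traffic is degraded all the same.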

In this brief post, I want to quickly refresh readers on the changes Rogers Communications made to their systems last year, and note some of the problems that have subsequently arisen. Following this, I take up what Mr. Englehart recently stated in the media about Rogers’ throttling mechanisms. I conclude by noting that Rogers is likely in compliance with the CRTC’s transparency requirements (or at least soon will be), but that such requirements are ill-suited to inform the typical consumer.


Agenda Denial and UK Privacy Advocacy

Funding, technical and political savvy, human resources, and time: these are just a few of the challenges standing before privacy advocates who want to make their case to the public, legislators, and regulators. Looking across the landscape, there are regularly cases where advocates are more successful than expected or markedly less so than anticipated; that advocates stopped BT from permanently deploying Phorm’s Webwise advertising system was impressive, whereas the failure to limit transfers of European airline passenger data to the US was somewhat surprising.[1] While there are regular analyses of how privacy advocates might get the issue of the day onto governmental agendas, seemingly less time is spent on how opponents resist advocates’ efforts. This post constitutes an early attempt to work through some of the politics of agenda-setting related to deep packet inspection and privacy for my dissertation project. Comments are welcome.

To be more specific, in this post I want to think about how items are kept off the agenda. Why are they kept off, who engages in the opposition, and what are some of the tactics employed? In responding to these questions I rely significantly on theory from R. W. Cobb’s and M. H. Ross’ Cultural Strategies of Agenda Denial, linked with work by other prominent scholars and advocates. My goal is to evaluate whether the strategies that Cobb and Ross write about apply to the issues championed by privacy advocates in the UK who oppose the deployment of the Webwise advertising system. I won’t be working through the technical or political backstory of Phorm in this post and will assume that readers have at least a moderate familiarity with it – if you’re unfamiliar, I’d suggest a quick detour to the Wikipedia page devoted to the company.


Review of Telecommunications Policy in Transition

Image courtesy of the MIT Press

First things first: this edited collection is a decade old. Given the rate at which communications technologies and information policies change, several of the articles are…outmoded. Don’t turn here for the latest, greatest, and most powerful analyses of contemporary communications policy. A book published in 2001 is good for anchoring subsequent reading in telecom policy, but less helpful for guiding present-day policy analyses.

Having said that, there are some genuine gems in this book, including one of the most forward-thinking essays on network neutrality of the past decade, by Blumenthal and Clark. Before getting to their piece, I want to touch on O’Donnell’s contribution, “Broadband Architectures, ISP Business Plans, and Open Access”. He reviews network architectures and ISP service portfolios to demonstrate that open access is both technically and economically feasible, though he acknowledges that implementation is not a trivial task. In the chapter he argues that the FCC should encourage the deployment of open-access-ready networks to reduce the costs of future implementation; I think it’s safe to say that that ship has sailed, and open access is (largely) a dead issue in the US today. That said, he offers an excellent overview of the differences between ADSL and cable networks, and identifies the pain points of interconnection in each architecture.

Generally, O’Donnell sees interconnection as less of a hardware problem and more of a network management issue. In discussing the need for, and value of, open access, O’Donnell does a good job of noting the dangers of throttling (at a time well ahead of ISPs’ contemporary throttling regimes), writing:

differential caching and routing need not be blatant to be effective in steering customers to preferred content. The subtle manipulation of the technical performance of the network can condition users unconsciously to avoid certain “slower” web sites. A few extra milliseconds’ delay strategically inserted here and there, for example, can effectively shepherd users from one web site to another (p. 53).
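O’Donnell’s point is that steering requires no blocking at all, only asymmetric latency. A toy simulation of the idea (the hostnames and penalty values are invented for illustration):

```python
import time

# Invented per-destination penalties, in milliseconds. Values this small sit
# below most users' threshold of conscious notice yet still shape which site
# "feels" faster over repeated visits.
PENALTY_MS = {
    "disfavoured-video.example": 40,
    "partner-video.example": 0,
}

def forwarding_delay_ms(host: str, base_ms: float = 5.0) -> float:
    """Total delay a discriminating router would impose on traffic to host."""
    return base_ms + PENALTY_MS.get(host, 0)

def forward(host: str, payload: bytes) -> bytes:
    """Hold the payload for the host's delay, then pass it through unchanged."""
    time.sleep(forwarding_delay_ms(host) / 1000.0)
    return payload
```

Because the payload is delivered intact, nothing here looks like censorship from the user’s side; the discrimination lives entirely in timing, which is what makes it hard to detect and regulate.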
