Journal Publication: Moving Across the Internet

I recently had an article published through CTheory, one of the world’s leading journals of theory, technology, and culture. The article, titled “Moving Across the Internet: Code-Bodies, Code-Corpses, and Network Architecture,” emerged from a presentation I gave at last year’s Critical Digital Studies Workshop, “Moving Online: Your Packets, Your ISP, Your Identity.”

Abstract:

Across the Internet, an arms race between agents supporting and opposing network-based surveillance techniques has quietly unfolded over the past two decades. Whereas the 1990s might be characterized as hosting the first round of the encryption wars, this paper focuses on the contemporary battlescape. Specifically, I consider how ISPs “secure” and “manage” their digital networks using contemporary DPI appliances and the ramifications that these appliances may have on the development, and our understanding, of the code-body. DPI networking appliances operate as surveillance devices that render the digital subject constituted by data packets bare to heuristic analyses, but, despite the ingenuity of these devices, some encryption techniques successfully harden otherwise soft digital flesh and render it opaque. Drawing on Kant and Derrida, I suggest that ISPs’ understanding of the Internet as one of packets arguably corresponds with a Kantian notion of reality-as-such and offers a limited and problematic conception of the code-body. Turning to Derrida, we move beyond protocol alone to consider the specters that are always before, and always after, the code-body; Derrida provides a way of thinking beyond Kantian conceptions of space and time and the reality-as-such code-body and lets us consider the holistic identity of the code-being. Further, Derrida lets us interrogate the nature of DPI networking appliances and see that they resemble thrashing zombie-like code-corpses that always try, but perpetually fail, to become fully self-animated. While Derridean insights suggest that ISPs are unlikely to be successful in wholly understanding or shaping code-bodies, these corporate juggernauts do incite identity transformations that are inculcated in cauldrons of risk and fear. Not even Derridean specters can prevent the rending of digital flesh or act as a total antidote to ISPs’ shaping of consumers’ packet-based bodily identity.

Link to article.

Privacy Issues Strike Street View (Again)

Google Street View has come under fire again, this time for collecting wireless router information and some packets of data whilst wandering the globe and collecting pictures of our streets. It looks like the German authorities, in particular, may come down hard on Google, though I’m of two minds about the ‘calibre’ of the privacy violation – router information is fair game as far as I’m concerned, though data packets are a little dicier. But before I dig into that, let me outline what’s actually gone on.

Last Friday, Google announced that they had been inadvertently collecting some data packets sent via unencrypted wireless access points for the past three years. This admission came after the Street View program (again) came under criticism from German data protection authorities, following Google’s original admission that they had only been collecting information about wireless routers as they drove their cars around towns. Specifically, that original admission saw Google reveal they had collected the SSIDs and MAC addresses of routers. In layman’s terms, the SSID is the name of a wireless network, usually assigned to the device when it is first configured (e.g. Apartment 312, Pablo14, or any of the other names that are shown when you scan for wireless networks from your computer). The MAC address is a unique number associated with each piece of Internet networking equipment; the wireless card in your computer, your LAN card, your router, and your iPhone all have unique numbers. After collecting both the SSID and MAC address of a wireless router, the company identified the physical location of the device using a GPS system.
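To make the shape of that data concrete, here is a rough sketch of the kind of record such a collection pass might produce. The field names and values below are hypothetical stand-ins, not Google’s actual schema.

```python
# Hypothetical illustration of a wifi "sighting" record: the router's advertised
# SSID, its MAC address, and the GPS fix of the collection vehicle at the moment
# of capture. Field names and values are invented for illustration.
from dataclasses import dataclass


@dataclass
class AccessPointSighting:
    ssid: str        # human-chosen network name, e.g. "Apartment 312"
    mac: str         # hardware identifier of the router itself
    latitude: float  # where the collection vehicle was when it saw the router
    longitude: float


sighting = AccessPointSighting(
    ssid="Pablo14",
    mac="00:1a:2b:3c:4d:5e",  # made-up address
    latitude=48.4284,          # made-up coordinates
    longitude=-123.3656,
)
print(sighting)
```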

Google collects information about wireless networks and (almost more importantly) their physical locations to provide a wifi-based geolocation system. Once the company knows where wireless routers are, and plots them on a map, you don’t need GPS to plan and trace a route through a city; a wireless card and a low-powered computer will suffice. There are claims that this constitutes a privacy infringement, insofar as the correlation of a router’s SSID, MAC address, and physical location constitutes personal information. I’m not sure that I agree with this, as the Google service stands now.
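For readers curious how a wifi-based geolocation service can work at all, here is a deliberately simplified sketch: a device scans for nearby routers, looks their MAC addresses up in a database of previously mapped locations, and estimates its own position from the matches. The database, addresses, and plain averaging below are hypothetical; real systems weight results by signal strength and draw on far larger datasets.

```python
# Toy wifi positioning: map the MAC addresses of routers a device can currently
# see onto previously surveyed coordinates, then average the matches to guess
# where the device is. All data here is invented for illustration.
KNOWN_ROUTERS = {
    "00:1a:2b:3c:4d:5e": (48.4284, -123.3656),
    "11:22:33:44:55:66": (48.4290, -123.3649),
}


def estimate_position(visible_macs):
    """Average the known coordinates of whichever scanned routers we recognise."""
    matches = [KNOWN_ROUTERS[mac] for mac in visible_macs if mac in KNOWN_ROUTERS]
    if not matches:
        return None  # no surveyed routers in range, so no estimate
    lat = sum(point[0] for point in matches) / len(matches)
    lon = sum(point[1] for point in matches) / len(matches)
    return (lat, lon)


# A device seeing two mapped routers and one unmapped one gets a rough position.
print(estimate_position(["00:1a:2b:3c:4d:5e", "11:22:33:44:55:66", "aa:bb:cc:dd:ee:ff"]))
```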

Continue reading

Deep Packet Inspection Canada

Last week my advisor, Dr. Colin Bennett, and I launched a new website that is meant to provide Canadians with information about how their Internet Service Provider (ISP) monitors data traffic and manages its network. This website, Deep Packet Inspection Canada, aggregates information that has been disclosed on the public record about how the technology is used, why, and what uses of it are seen as ‘off limits’ by ISPs. The research has been funded through the Office of the Privacy Commissioner of Canada’s contributions program.

Deep packet inspection is a technology that facilitates a heightened awareness of what is flowing across ISP networks. It can determine the protocols responsible for shuttling information to and from the Internet and the applications used to transmit that data, and (in test conditions) it can even extract elements of data from the application layer of the traffic in real time and compare them against packet signatures to block particular data flows based on the content being accessed. The technology can also be used to modify packet flows – something done by Rogers – but it should be noted that DPI is not presently used to prevent Canadians from accessing particular content on the web, nor is it stopping them from using P2P services to download copyrighted works.
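To give a rough sense of what ‘comparing traffic against packet signatures’ means, here is a toy sketch of payload-based classification. The signatures and matching logic below are simplified illustrations of the idea, not how any commercial DPI appliance is actually built.

```python
# Toy payload classification: look past the packet headers at the application
# payload and match it against a few well-known byte patterns. Real DPI
# appliances use far richer heuristics, per-flow state, and dedicated hardware.
SIGNATURES = {
    "BitTorrent": lambda p: p[:20] == b"\x13BitTorrent protocol",  # handshake prefix
    "HTTP":       lambda p: p.startswith((b"GET ", b"POST ", b"HTTP/1.")),
    "TLS":        lambda p: p[:2] == b"\x16\x03",  # handshake record header
}


def classify_payload(payload: bytes) -> str:
    """Return the first application signature that matches, else 'unknown'."""
    for app, matches in SIGNATURES.items():
        if matches(payload):
            return app
    return "unknown"


print(classify_payload(b"\x13BitTorrent protocol" + b"\x00" * 8))  # -> BitTorrent
print(classify_payload(b"GET /index.html HTTP/1.1\r\n"))           # -> HTTP
```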

Continue reading

Choosing Winners with Deep Packet Inspection

I see much of the network neutrality discussion as revolving around the conditions under which applications can, and cannot, be prevented from running. On one hand there are advocates who maintain that telecommunications providers – ISPs such as Bell, Comcast, and Virgin – shouldn’t be responsible for ‘picking winners and losers’ on the basis that consumers should make these choices. On the other hand, advocates for managed (read: functioning) networks insist that network operators have a duty and responsibility to fairly provision their networks in a way that doesn’t see one small group negatively impact the experiences of the larger consumer population. Deep Packet Inspection (DPI) has become a hot-button technology in light of the neutrality debates, given its potential to let ISPs determine which applications function ‘properly’ and which see their data rates delayed for purposes of network management. What is often missing from the network neutrality discussions is a comparison of the uses of DPI across jurisdictions and how these uses might affect ISPs’ abilities to prioritize or deprioritize particular forms of data traffic.

As part of an early bit of thinking on this, I want to direct our attention to Canada, the United States, and the United Kingdom to start framing how these jurisdictions are approaching the use of DPI. In the process, I will make the claim that Canada’s recent CRTC ruling on the use of the technology appears increasingly progressive in light of recent decisions in the US and the likelihood of the UK’s Digital Economy Bill (DEB) becoming law. Up front I should note that while I think Canada can be read as ‘progressive’ on the network neutrality front, this shouldn’t suggest that either the CRTC or Parliament has done enough: further clarity into the practices of ISPs, additional insight into the technologies they use, and an ongoing discussion of traffic management systems are all needed in Canada. Canadian communications increasingly pass through IP networks, and as a result our communications infrastructure should be seen as being as important as defence, education, and health care, each of which is tied to its own critical infrastructure but connected to the others and enabled through digital communications systems. Digital infrastructures draw together the fibres connecting the Canadian people, Canadian business, and Canadian security, and we need to elevate discussions about this infrastructure to make it a prominent part of the national agenda.

Continue reading

Thoughts on COUNTER: Counterfeiting and Piracy Research Conference

Last week I was a participant at the COUNTER: Counterfeit and Piracy Research Conference in Manchester, UK. I was invited by Joseph Savirimuthu to be part of a panel on deep packet inspection, as well as to enjoy the conference more generally. It was, without a doubt, one of the best conferences that I have attended – it was thought-provoking and (at points) anger-inducing, good food and accommodations were provided, and excellent discussions were had. What I want to do here is talk about some of the resonating themes that coursed through the conference and try to situate a few of the positions and participants, to give an insight into what was talked about.

The COUNTER project is a European research initiative exploring the consumption of counterfeit and pirated leisure goods. It has a series of primary research domains, including: (1) the frequency and distribution of counterfeits; (2) consumer attitudes to counterfeit and pirated goods; (3) legal and ethical frameworks for intellectual property; (4) policy options for engaging with consumers of counterfeit goods; (5) the use of copyrighted goods in the creation of new cultural artifacts; and (6) the impacts of counterfeiting and the control of intellectual property.

Continue reading