Last year I spent some time and put together a working paper entitled, “Deep Packet Inspection in Perspective: Tracing its lineage and surveillance potentials,” for the New Transparency Project (of which I’m a student member). The document has gone live as of today – if you have any comments/thoughts concerning it feel free to send them my way! The abstract is below:
Internet Service Providers (ISPs) are responsible for transmitting and delivering their customers’ data requests, whether for website content, file-sharing transfers, or Voice over Internet Protocol (VoIP) chat sessions. Using contemporary packet inspection and capture technologies, ISPs can investigate and record the content of unencrypted digital communications data packets. This paper explains the structure of these packets, and then describes the packet inspection technologies that monitor their movement and extract information from them as they flow across ISP networks. After discussing the potency of contemporary packet inspection devices relative to their predecessors, and their potential uses in improving network operators’ network management systems, I argue that they should be identified as surveillance technologies with the potential to be incredibly invasive. Drawing on Canadian examples, I argue that Canadian ISPs are using DPI technologies to implicitly ‘teach’ their customers norms about which data transfer programs are ‘inappropriate’, and about the appropriate levels of ISP manipulation of consumer data traffic.
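To make the distinction the abstract draws a little more concrete, here is a minimal, illustrative Python sketch of my own (a toy example, not any vendor’s actual implementation, and with an invented packet for demonstration): a “shallow” device reads only the IP and TCP headers, while deep packet inspection additionally examines the unencrypted application payload those headers carry.

```python
import struct

def inspect_packet(packet: bytes) -> dict:
    """Parse a raw IPv4/TCP packet. The header fields are what a
    'shallow' inspection device reads; the payload is what deep
    packet inspection additionally examines."""
    ihl = (packet[0] & 0x0F) * 4                 # IPv4 header length, in bytes
    protocol = packet[9]                         # 6 = TCP
    src_ip = ".".join(str(b) for b in packet[12:16])
    dst_ip = ".".join(str(b) for b in packet[16:20])

    tcp = packet[ihl:]
    src_port, dst_port = struct.unpack("!HH", tcp[:4])
    data_offset = (tcp[12] >> 4) * 4             # TCP header length, in bytes

    return {
        "src_ip": src_ip, "dst_ip": dst_ip, "protocol": protocol,
        "src_port": src_port, "dst_port": dst_port,
        "payload": tcp[data_offset:],            # unencrypted application data
    }

# Build a synthetic packet: 20-byte IPv4 header + 20-byte TCP header,
# carrying a made-up BitTorrent-style tracker request as the payload.
payload = b"GET /announce?info_hash=abc123 HTTP/1.1"
ip_header = struct.pack("!BBHHHBBH4s4s",
                        0x45, 0, 40 + len(payload), 0, 0, 64, 6, 0,
                        bytes([10, 0, 0, 1]), bytes([203, 0, 113, 7]))
tcp_header = struct.pack("!HHIIBBHHH",
                         51234, 80, 0, 0, 5 << 4, 0x18, 8192, 0, 0)

result = inspect_packet(ip_header + tcp_header + payload)

# To headers alone this looks like ordinary port-80 web traffic; a DPI
# rule matching on the payload can flag it as file-sharing-related.
flagged = b"info_hash" in result["payload"]
```

The point of the sketch is simply that the payload match (`flagged`) is invisible to header-only inspection: both a web page request and this tracker request arrive on port 80, and only reading the application data distinguishes them.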
Copyright is becoming an increasingly important part of the contemporary lexicon; in Canada, it’s so important that we now have a ‘citizen’s guide‘ to help ‘regular folk’ with their copyright-related concerns. While most eyes are presently focused on the Pirate Bay trial (Ernesto has been blogging about it regularly since the trial started, Jesse Brown’s recent podcast addresses it, etc.), a major ‘success’ in the copyright wars has actually been ‘won’ by Big Media. Ireland’s Eircom has announced that it will block access to peer-to-peer websites in an effort to limit its users’ access to spaces hosting copyrighted content. This blocking is in addition to Eircom’s agreement to cut off users who are repeatedly found infringing copyright (a three-strikes rule).
This development sharpens the question: what role(s) do telecommunications companies play in today’s virtualized world and global digital economy? Self-imposed corporate policies now threaten to substantially normalize ‘permissible’ modes of accessing data and to determine which accesses are ‘legitimate’ and which are not.
Deep Packet Inspection is being deployed by an increasing number of operators for a host of purposes, including content analysis, flow analysis, network management (broadly stated), network management as integrated with policy management, and behavioural advertising (to name a few). While BT, in the UK, has openly admitted to working with Phorm to bring behavioural advertising to its customers, it now appears as though network owners are going to be analyzing Internet traffic from mobiles, as well as desktop and notebook computers.
The Guardian is reporting that in a recent GSMA trial to collect information on where mobile users are browsing, “the UK’s five networks – 3, O2, Orange, T-Mobile and Vodafone – used deep packet inspection technology to collect data covering about half the UK’s entire mobile web traffic” (Source). There is no indication that this data is presently being associated with customers’ geolocation, but it does suggest that DPI is gaining increasing acceptance in the UK as a means of tracking what people are doing. Apparently the weak regulatory responses in the UK are spurring companies to deploy DPI before they are left behind the rest of the pack.
The Canadian Press is reporting that the EDL database that was part of the Phase 1 Trial of the BC EDL program is coming home. Specifically, they write,
The database with details about several hundred British Columbians was turned over to the U.S. Customs and Border Protection agency last year as part of a controversial project to issue “enhanced driver’s licences” instead of passports for land border crossings. (Source)
What strikes me as interesting/weird about this is that under Phase 1 of the BC EDL program no Canadian data was turned over to the American authorities! This was revealed in the BC EDL Phase 1 Post Implementation Review (Redacted), for which I’ve provided a ‘best hits’ document. Only in Phase 2 was any data sharing actually set to begin, and it was last November (’08) that Ottawa totally dropped plans to locate the Canadian EDL database in the US. This seems to suggest one of two things about the Canadian Press’ article: