If you spend much time working with computers then you’re likely familiar with metadata: data about data. In the digital era, metadata underpins many of the tagging and categorization systems found in popular web environments such as Twitter, Digg, Delicious, and Facebook, and is more generally used to define, structure, and administer data across all digital environments. I should state, up front, that metadata is incredibly valuable: nothing I write here should suggest that metadata could, or should, be removed from the digital landscape. Instead, I’m advocating for a responsible use of metadata.
In this post I draw on a pair of examples to underscore just how much data is contained in popular metadata structures: the information divulged every time a person tweets on Twitter, and what your mobile phone operator may be giving up to third parties when you browse the web on your phone. In the latter case especially, we see that metadata is not just important for routing data traffic but is also responsible for disclosing a considerable amount of personal information. I’ll conclude by noting, once again, that our privacy regulators, commissioners, advocates, and researchers need additional funding if citizens are to have those parties regularly identify ‘bad’ metadata practices and seek rapid remedies before the data ends up being mined for illicit or unjustifiable reasons.
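To make the tweet example concrete, the sketch below walks a tweet-like object and lists every field that is metadata rather than the message text itself. The field names follow the general shape of Twitter’s public API payloads of the era, but the exact names and values here are illustrative assumptions, not a definitive account of what Twitter discloses.

```python
import json

# Illustrative only: a tweet carries far more than its 140 characters
# of text. The structure below approximates a Twitter API payload;
# treat the exact field names as assumptions for the sketch.
tweet = json.loads("""
{
  "text": "Heading to the conference!",
  "created_at": "Mon Jul 12 09:14:02 +0000 2010",
  "source": "web",
  "coordinates": {"type": "Point", "coordinates": [-123.36, 48.43]},
  "user": {
    "screen_name": "example_user",
    "location": "Victoria, BC",
    "time_zone": "Pacific Time (US & Canada)"
  }
}
""")

def metadata_fields(obj, prefix=""):
    """Walk the object and return the dotted path of every field
    that is metadata rather than the message text itself."""
    fields = []
    for key, value in obj.items():
        path = prefix + key
        if isinstance(value, dict):
            fields.extend(metadata_fields(value, path + "."))
        elif path != "text":
            fields.append(path)
    return fields
```

Running `metadata_fields(tweet)` surfaces paths such as `created_at`, `coordinates.coordinates`, and `user.location` — timing, geolocation, and profile details all travel alongside a short public message.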
Last week my advisor, Dr. Colin Bennett, and I launched a new website that is meant to provide Canadians with information about how their Internet Service Provider (ISP) monitors data traffic and manages its network. This website, Deep Packet Inspection Canada, aggregates information that has been disclosed on the public record about how the technology is used, why, and what uses of it are seen as ‘off limits’ by ISPs. The research has been funded through the Office of the Privacy Commissioner of Canada’s contributions program.
Deep packet inspection is a technology that gives ISPs a heightened awareness of what is flowing across their networks. It can determine the protocols responsible for shuttling information to and from the Internet and the applications used in transmitting the data, and (in test conditions) can even extract elements of data from the application layer of the traffic in real time, comparing them against packet signatures to block particular data flows based on the content being accessed. The technology can also be used to modify packet flows – something done by Rogers – but it should be noted that DPI is not presently used to prevent Canadians from accessing particular content on the web, nor to stop them from using P2P services to download copyrighted works.
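The classification step described above can be sketched as a toy signature matcher. This is a minimal illustration of the principle only: the byte signatures below are simplified stand-ins, and real DPI appliances operate on live traffic at line rate with far richer heuristics than prefix matching.

```python
# Toy sketch of signature-based payload classification, in the spirit
# of DPI. Signatures are simplified: the BitTorrent handshake begins
# with byte 0x13 followed by "BitTorrent protocol", and TLS records
# begin 0x16 0x03; the rest is assumption for illustration.
SIGNATURES = {
    b"\x13BitTorrent protocol": "bittorrent",
    b"GET ": "http",
    b"POST ": "http",
    b"\x16\x03": "tls",
}

def classify_payload(payload):
    """Label a packet's application-layer payload by matching known
    byte signatures against its opening bytes."""
    for signature, label in SIGNATURES.items():
        if payload.startswith(signature):
            return label
    return "unknown"

def apply_policy(payload, throttled):
    """Decide how to treat a packet: 'throttle' if its classified
    protocol is on the managed list, otherwise 'forward'."""
    label = classify_payload(payload)
    return "throttle" if label in throttled else "forward"
```

With `{"bittorrent"}` as the managed set, a BitTorrent handshake would be marked for throttling while an HTTP request passes through untouched – a crude analogue of the traffic-management decisions at issue in the neutrality debates.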
I see much of the network neutrality discussion as revolving around the conditions under which applications can, and cannot, be prevented from running. On one hand there are advocates who maintain that telecommunications providers – ISPs such as Bell, Comcast, and Virgin – shouldn’t be responsible for ‘picking winners and losers’, on the basis that consumers should make these choices. On the other hand, advocates for managed (read: functioning) networks insist that network operators have a duty and responsibility to fairly provision their networks in a way that doesn’t let one small group negatively impact the experiences of the larger consumer population. Deep Packet Inspection (DPI) has become a hot-button technology in light of the neutrality debates, given its potential to let ISPs determine which applications function ‘properly’ and which see their data rates delayed for purposes of network management. What is often missing in the network neutrality discussions is a comparison of how DPI is used across jurisdictions, and how these uses might impact ISPs’ abilities to prioritize or deprioritize particular forms of data traffic.
As part of an early bit of thinking on this, I want to direct our attention to Canada, the United States, and the United Kingdom to start framing how these jurisdictions are approaching the use of DPI. In the process, I will make the claim that Canada’s recent CRTC ruling on the use of the technology appears increasingly progressive in light of recent decisions in the US and the likelihood of the UK’s Digital Economy Bill (DEB) becoming law. Up front I should note that while I think Canada can be read as ‘progressive’ on the network neutrality front, this shouldn’t suggest that either the CRTC or parliament has done enough: further clarity into the practices of ISPs, additional insight into the technologies they use, and an ongoing discussion of traffic management systems are needed in Canada. Canadian communications increasingly pass through IP networks, and as a result our communications infrastructure should be seen as being as important as defence, education, and health care, each of which is tied to its own critical infrastructure but connected to the others and enabled through digital communications systems. Digital infrastructures draw together the fibres connecting the Canadian people, Canadian business, and Canadian security, and we need to elevate the discussions about this infrastructure to make it a prominent part of the national agenda.
Last week I was a participant at the COUNTER: Counterfeit and Piracy Research Conference in Manchester, UK. I was invited by Joseph Savirimuthu to join a panel on deep packet inspection, and to enjoy the conference more generally. It was, without a doubt, one of the best conferences I have attended – it was thought-provoking and (at points) anger-inducing, good food and accommodations were provided, and excellent discussions were had. What I want to talk about here are some of the themes that resonated throughout the conference, and I’ll try to situate a few of the positions and participants to give some insight into what was discussed.
The COUNTER project is a European research project exploring the consumption of counterfeit and pirated leisure goods. It has a series of primary research domains, including: (1) the frequency and distribution of counterfeits; (2) consumer attitudes to counterfeit and pirated goods; (3) legal and ethical frameworks for intellectual property; (4) policy options for engaging with consumers of counterfeits; (5) the use of copyrighted goods in the creation of new cultural artifacts; (6) the impacts of counterfeiting and the control of intellectual property.