Beyond Fear and Deep Packet Inspection

Over the past few days I’ve been able to attend to non-essential reading, which has given me the opportunity to start chewing through Bruce Schneier’s Beyond Fear. The book, in general, is an effort on Bruce’s part to get people thinking critically about security measures. It’s incredibly accessible and easy to read – I’d highly recommend it.

Early on in the text, Schneier provides a set of questions that ought to be asked before deploying a security system. I want to very briefly think through those questions as they relate to Deep Packet Inspection (DPI) in Canada, to begin narrowing down a security-derived understanding of the technology. My hope is that, by critically engaging with this technology, a model for capturing concerns and worries can start to emerge.

Question 1: What assets are you trying to protect?

  • Network infrastructure from being overwhelmed by data traffic.

Question 2: What are the risks to these assets?

  • Synchronous bandwidth-heavy applications running 24/7 that generate congestion and thus broadly degrade consumer experiences.

Question 3: How well does security mitigate those risks?


Talking About Deep Packet Inspection Tomorrow

I’ll be chatting with Chris Cook tomorrow between 5:00-5:20 or so about deep packet inspection, network neutrality, and the Canadian situation. This will be the second time that I’ve had the opportunity to talk with Chris, and it’s always a pleasure. Hit Gorilla Radio’s posting for more information.

I’d just like to publicly thank the University of Victoria’s Communications department for their assistance these past few weeks. They’ve been wonderful in letting various media outlets around the country know about my research, which has let me share my research more widely than normal. UVic’s level of support for their graduate students is absolutely amazing – I’d highly recommend that any graduate student interested in a Canadian institution take a look at their offerings. UVic rocks!

Review: The Long Tail (Revised and Updated)

I’m in the middle of a massive reading streak for my comprehensive exams, but I’m trying to sneak in some personal reading at the same time. The first book in that ‘extra’ reading is Anderson’s “The Long Tail”, which focuses on the effect that shifting to digital systems has on economic scarcities, producers, aggregators, and consumers.

The key insight that Anderson brings to the table is this: with the birth of digital retail and communication systems, customers can find niche goods that appeal to their personal interests and tastes, rather than exclusively focusing on goods that retailers expect will be hits. This means that customers can follow the ‘long tail’, or the line of niche goods that are individually less and less likely to sell in a mass retail environment.

There are several ‘drivers’ of the long tail:

  1. There are far more niche goods than ‘hits’ (massively popular works), and more and more niche goods are being produced with the falling costs of production and distribution in various fields.
  2. Filters are more and more effective, which means that consumers can find niches they are interested in.
  3. There are so many niche items that, collectively, they can comprise a market rivaling hits (a rough numerical sketch of this point follows the list).
  4. Without distribution bottlenecks, the ‘true’ elongated tail of the present Western economic reality is made apparent.
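
To make the third driver concrete, here is a rough numerical sketch in Python. The catalogue size, the Zipf exponent, and the cutoff for what counts as a ‘hit’ are all my own illustrative assumptions rather than figures from Anderson’s book; the point is only that, under a plausible heavy-tailed demand curve, the aggregated niches can approach the sales of the hits.

```python
import numpy as np

# Hypothetical catalogue: 1,000,000 titles with Zipf-like demand (exponent 1).
# Catalogue size, exponent, and the 'hit' cutoff are illustrative assumptions,
# not figures taken from the book.
n_titles = 1_000_000
ranks = np.arange(1, n_titles + 1)
demand = 1.0 / ranks            # relative demand per title (Zipf, s = 1)
demand /= demand.sum()          # normalise to shares of total sales

# Treat the top 1,000 titles as the 'hits' a physical retailer might stock.
hits_share = demand[:1000].sum()
tail_share = demand[1000:].sum()

print(f"Share of sales from the top 1,000 'hits': {hits_share:.1%}")
print(f"Share of sales from the remaining tail:   {tail_share:.1%}")
```

With these particular assumptions, the top 1,000 titles and the million-title tail each end up with roughly half of total sales; steeper or shallower demand curves shift that split, which is exactly why the falling cost of stocking and finding the tail matters.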


Deep Packet Inspection: What Innovation Will ISPs Encourage?

All sorts of nasty things are said about ISPs that use Deep Packet Inspection (DPI): ISPs aren’t investing enough in their networks, they just want to punish early adopters of new technologies, they’re looking to deepen their regulatory capacities, or they want to track what their customers do online. ISPs, in turn, tend to insist that P2P applications are causing undue network congestion, and that DPI is the only measure presently available to them to alleviate such congestion.

The constant focus on P2P over the past few years has resulted in various ‘solutions’, including the development of P4P and the shift to UDP. Unfortunately, the cat and mouse game between groups representing record labels, ISPs (to a limited extent), and end-users has led to conflict that ensures most of the time and money is being put into ‘offensive’ and ‘defensive’ technologies and tactics online, rather than more extensively into bandwidth-limiting technologies. ‘Offensive’ technologies include those that enable mass analysis of data- and protocol-types to try to stop or delay particular modes of data sharing. While DPI can be factored into this set of technologies, a multitude of network technologies can just as easily fit into this category. ‘Defensive’ technologies include port randomizers, superior encryption and anonymity techniques, and other techniques that are primarily designed to evade particular analyses of network activity.
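
To make the ‘offensive’/‘defensive’ distinction a bit more concrete, here is a toy Python sketch of my own (not any vendor’s implementation) contrasting naive port-based classification with the kind of payload-signature matching that DPI gear generalizes. The BitTorrent handshake prefix is a documented protocol marker; the function names and the sample ‘packet’ are purely hypothetical.

```python
# Toy illustration only: real DPI appliances use far richer signature sets,
# flow statistics, and behavioural heuristics than this sketch.

WELL_KNOWN_PORTS = {80: "http", 443: "https", 6881: "bittorrent"}

# The BitTorrent handshake begins with a length byte (19) followed by the
# literal string "BitTorrent protocol" -- a documented protocol marker.
BITTORRENT_HANDSHAKE = b"\x13BitTorrent protocol"

def classify_by_port(dst_port: int) -> str:
    """'Shallow' classification: trust the destination port."""
    return WELL_KNOWN_PORTS.get(dst_port, "unknown")

def classify_by_payload(payload: bytes) -> str:
    """'Deep' classification: look for a protocol signature in the payload."""
    if payload.startswith(BITTORRENT_HANDSHAKE):
        return "bittorrent"
    return "unknown"

# A hypothetical BitTorrent connection using a randomised high port:
packet = {"dst_port": 51413, "payload": BITTORRENT_HANDSHAKE + b"..."}

print(classify_by_port(packet["dst_port"]))    # -> "unknown"
print(classify_by_payload(packet["payload"]))  # -> "bittorrent"
```

Port randomization defeats only the first check, which is part of why shallow, port-based management gave way to payload inspection; encrypting or obfuscating the stream then hides the signature itself, and the cat and mouse game continues.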

I should state up front that I don’t want to make myself out to be a technological determinist; neither ‘offensive’ nor ‘defensive’ technologies are in a necessary causal relationship with one another. Many of the ‘offensive’ technologies could have been developed in light of increasingly nuanced viral attacks and spam barrages, to say nothing of the heightening complexity of intrusion attacks and pressures from the copyright lobbies. Similarly, encryption and anonymity technologies would have continued to develop, given that in many nations it is impossible to trust local ISPs or governments.


Deep Packet Inspection and the Discourses of Censorship and Regulation

In the current CRTC hearings over Canadian ISPs’ use of Deep Packet Inspection (DPI) to manage bandwidth, I see two ‘win situations’ for the dominant carriers:

  1. They can continue to throttle ‘problem’ applications in the future;
  2. The CRTC decides to leave the wireless market alone right now.

I want to talk about the effects of throttling problem applications, and how people talking about DPI should focus on the negative consequences of regulation (something that is, admittedly, often done). In thinking about this, however, I want to first attend to the issues of censorship models to render transparent the difficulties in relying on censorship-based arguments to oppose uses of DPI. Following this, I’ll consider some of the effects of regulating access to content through protocol throttling. The aim is to suggest that individuals and groups who are opposed to the throttling of particular application-protocols should focus on the effects of regulation, given that it is a more productive space of analysis and argumentation, instead of focusing on DPI as an instrument for censorship.

Let’s first touch on the language of censorship itself. We typically understand this action in terms of a juridico-discursive model, or a model that relies on rules to permit or negate discourse. There are three common elements to this model-type:


Economics of Authenticity on Twitter

I’m on Twitter all the time; it’s central to how I learn about discussions taking place around Deep Packet Inspection, it’s a good way of finding privacy-folk from around the world, and it lets me feel semi-socialized even though I’m somewhat reclusive. When I use the social networking service, I intersperse bits of ‘me’ (e.g. This wine sucks!) beside news articles I’ve found and believe would be useful to my colleagues, and add in some (attempts at) humor. In this sense, I try to make my Twitter feed feel ‘authentic’, meaning that it is reasonably reflective of how I want to present myself in digital spaces. Further, that presentation resonates (to varying extents) with how I behave in the flesh.

When you hear social-media enthusiasts talk about their media environment, authenticity (i.e. not pretending to be someone/something you’re really, absolutely, not) is the key thing to aim for. Ignoring the amusing Heideggerian implications of this use of authenticity (“How very They!”), I think that we can take this to mean that there is a ‘currency’ in social media called ‘authenticity’. There are varying ways of gauging this currency.