Analysis: ipoque, DPI, and bandwidth management

In 2008, ipoque released a report titled “Bandwidth Management Solutions for Network Operators”. Using Deep Packet Inspection appliances, it is possible to establish a priority management system that privileges certain applications’ traffic over others; VoIP traffic can be dropped last, whereas P2P packets are given the lowest priority on the network. ipoque proposes two modes of management:

  1. Advanced Priority Management: where multi-tiered priorities maintain Quality of Experience (rather than Service) by identifying some packet-types as more important than others (e.g. VoIP is more important than BitTorrent packets). Under this system, less important packets are only dropped as needed, rather than being dropped once a bandwidth cap is met.
  2. Tiered Service Model: This uses a volume-service system, where users can purchase a set amount of bandwidth for particular services. This is the ‘cell-phone’ model: you sign up for packages that cover certain services, and if you exceed your package limitations extra charges may apply. Under this model you might pay for a file-sharing option, as well as a VoIP and/or streaming HTTP bundle.
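The first mode can be sketched in a few lines of code. Below is a minimal, hypothetical illustration of priority-based dropping: packets already labeled by a DPI classifier are ranked by application class, and the lowest-priority traffic (P2P) is shed first once the link's capacity is exceeded. The class names and priority values are my own assumptions for illustration, not ipoque's actual configuration.

```python
# Hypothetical priority table: lower number = higher priority.
# Values are illustrative only, not taken from ipoque's products.
PRIORITY = {"voip": 0, "http": 1, "streaming": 1, "bittorrent": 3}

def schedule(packets, capacity):
    """Keep the `capacity` highest-priority packets; drop the rest.

    `packets` is a list of (app_label, payload) tuples, where the
    app label is assumed to come from a DPI classifier. Packets are
    only dropped as needed, lowest priority first.
    """
    ranked = sorted(packets, key=lambda p: PRIORITY.get(p[0], 2))
    return ranked[:capacity], ranked[capacity:]

kept, dropped = schedule(
    [("bittorrent", b"..."), ("voip", b"..."), ("http", b"...")],
    capacity=2,
)
```

Note that under this scheme nothing is dropped when the link is under capacity; the BitTorrent packet only loses out when VoIP and HTTP traffic together saturate the link.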

The danger with filtering by application (from ipoque’s position) is that while local laws can be enforced, it opens the ISP to dissatisfaction if legitimate websites are blocked. Thus, while an ISP might block Mininova, it can’t block Fedora repositories as well – the first might conform to local laws, whereas blocking the second would infringe on consumers’ freedoms. In light of this challenge, ipoque suggests that ISPs could adopt Saudi Arabia-style white-lists, where consumers can send a message to their ISP when they find sites being illegitimately blocked. Once the ISP checks out the site, it can either remove the site from the black-list, or inform the customer of why the site must remain listed.

Continue reading

Three-Strikes to Banish Europeans and Americans from the ‘net?

Throughout the Global North there are discussions on the table for introducing what are called ‘three-strikes’ rules, designed to cut off, or hinder, people’s access to the Internet should they be caught infringing on copyright. In the EU, the big content cartel is trying to get ISPs to inspect consumer data flows and, when copyrighted content is identified, ‘punish’ the individual in some fashion. Fortunately, it looks as though the EU Parliament, at least, is against imposing such rules, on the basis that disconnecting individuals from the Internet would infringe on EU citizens’ basic rights. In an era where we are increasingly digitizing our records and basic communications infrastructure, it’s delightful to see a body in a major world power recognize how detrimental and over-reactionary the copyright cartel’s demands are. Copyright infringement does not trump basic civil liberties.

Now, I expect that many readers would respond along these lines: I don’t live in the EU, and the EU Parliament has incredibly limited powers. Who cares? This (a) doesn’t affect me, and (b) is unlikely to have a real impact on EU policy.

Continue reading

Summary: CRTC PN 2008-19; ISP Traffic Management in Canada

As someone who is academically invested in how the ‘net is being regulated in Canada, I’ve been following the recent CRTC investigation into Internet management practices and regulation with considerable interest. Given that few people are likely to dig through the hundreds of pages that were in the first filing, I’ve summarized the responses from ISPs (save for Videotron’s submissions; I don’t read French) into a more manageable 50 pages. Enjoy!

Update: Thanks to Eric Samson and Daniel for translating Videotron’s filings – you guys rock!

Review: Access Denied

The OpenNet Initiative’s (ONI) mission is to “identify and document Internet filtering and surveillance, and to promote and inform wider public dialogs about such practices.” Access Denied: The Practice and Policy of Global Internet Filtering is one of their texts that effectively draws together years of their research, and presents it in an accessible and useful manner for researchers, activists, and individuals who are simply interested in how the Internet is shaped by state governments.

The text is separated into two broad parts – the first is a series of essays that situate the data that has been collected into a quickly accessible framework. The authors of each essay manage to retain a reasonable level of technical acumen, even when presenting their findings and the techniques of filtering to a presumably non-technical audience. It should be noted that the data collected extends only to 2007 – if you’re reading the text in the hope that the authors will directly address filtering technologies that have recently been in the news, such as Deep Packet Inspection, you’re going to be disappointed (though they do allude to Deep Packet technologies, without explicitly focusing on them, in a few areas).

Throughout the text there are references to human rights and, while I’m personally a proponent of them, I wish that the authors had endeavored to lay out more of the complexities of human rights discourse – while they don’t present these rights as unproblematic, I felt that more depth would have been rewarding both for their analysis and for the benefit of the reader. This having been said, I can’t begrudge the authors of the essays for drawing on human rights at various points in their respective pieces – doing so fits perfectly within ONI’s mandate, and their arguments surrounding the use of human rights are sound.

Continue reading