Comment: Canadian ISPs and Internet Traffic Management

I’ve recently put up a document that summarizes most of the first round of filings for the CRTC’s investigation of Canadian ISP traffic management practices (PN 2008-19), and thought that I’d post a few of the points I found most interesting. Keep in mind that many of my interests revolve around deep packet inspection.

Network Use Averages

  1. Bell filed their specific data points in confidence, though from what they provided we can see that the share of network traffic generated by the top 5% of users has declined from 61.1% to 46.6%, and the share generated by the top 10% has declined from 77.1% to 62.6%.
  2. In TELUS’ case, we find that their retail customers have decreased the amount of content they are uploading, though they are downloading more. Their wholesale customers are both downloading and uploading more than in 2006. Specific traffic data was filed in confidence to the CRTC.
  3. Bell finds that P2P and HTTP/Streaming traffic are the most commonly used end-user categories that contribute to bandwidth usage.

Canadian ISPs Admitting to Traffic Management

  1. Bell Wireline (excludes Bell Mobility and Bell Aliant Atlantic). DPI technology is used, though the vendor and products are filed in confidence.
  2. Cogeco uses DPI, but has filed the vendor and products in confidence.
  3. Rogers filed their comments in confidence, but from past information that has emerged we know that they are using DPI equipment.
  4. Shaw Communications Inc. uses Arbor-Ellacoya devices, though the particular products are filed in confidence.
  5. Barrett Xplore Inc. uses VoIP prioritization, provisioning of modems, and DPI. Specifics are filed in confidence.
  6. While not explicitly stated, it appears as though Bragg Communications Ltd. also uses DPI.

Canadian ISPs Not Using Traffic Management

  1. MTS Allstream Inc.
  2. SaskTel (though they do use Arbor Peakflow SP, dominantly for network security purposes)
  3. Primus Telecommunications Canada Inc.
  4. Telus

What is Being Filtered/Throttled?

  1. Bell acknowledges that they do throttle traffic between 1630 and 0200 each day by limiting bandwidth available to P2P applications. A detailed listing of applications is not publicly mentioned.
  2. Cogeco currently uses management technologies against: eDonkey/eMule, EmuleEncrypted, Kazaa, Fast Track KaZaA Networking, Napster, Bittorrent, Dijjer, Manolito, Hotline, Share, Soulseek, v-share, Zattoo, Joost, KuGoo, Kuro, DHT, Commercial File Sharing, Baidu Movie, Club Box, Winny, Gnitella, Gnutella Networking, WinMX, Direct Connect, PeerEnabler, Exosee, Further, Filtopia, Mute, NodeZilla, waste, Warez, NeoNet, PPLiveStream Misc, BAIBAO, POCO, Entropy, Rodi, Guruguru, Pando, Soribada, Freenet, PacketiX, Feidian, AntsP@P, Sony Location Free, thunder, Web Thunder. They only look at the specific signature of P2P applications.
  3. Rogers “looks at header information embedded in the payload and session establishment procedures.” What is unclear to me is how header information could be embedded in the payload itself – as I understand networking 101, these are two separate spaces in a packet. The specific P2P applications that are filtered are not mentioned, though Rogers only concentrates on uploaded content.
  4. Shaw doesn’t say – they’ve filed their findings in confidence.
  5. Barrett doesn’t say – they’ve filed their findings in confidence.
  6. Bragg targets: Bittorrent, News, DirectConnect, Blubster, gnutella, KaZaA, WinMX, eDonkey, Filetopia, Hotline, GuruGuru, Soribada, Soulseek, Ares, JoltID, eMule, Waste, Konspire2b, ExoSee, FurtherNet, MUTE, GNUnet, Nodezilla. Bragg focuses on packet headers and the behaviour of packet exchanges, and avoids learning about the content of packet flows.
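The distinction these filings keep circling – inspecting header fields versus matching signatures inside the payload – can be illustrated with a minimal sketch. This is not any vendor’s actual implementation: the general approach and the signature table are my own illustration, though the BitTorrent handshake genuinely does begin with the byte 0x13 followed by the string “BitTorrent protocol”.

```python
# Minimal sketch of signature-based DPI, assuming we already have the raw
# TCP payload of a flow's first packet. Shallow inspection stops at header
# fields (addresses, ports); DPI matches byte patterns inside the payload.
SIGNATURES = {
    # The BitTorrent handshake really starts with 0x13 + "BitTorrent protocol";
    # the Gnutella entry reflects the 0.6 connection handshake.
    "BitTorrent": b"\x13BitTorrent protocol",
    "Gnutella": b"GNUTELLA CONNECT",
}

def classify_payload(payload: bytes) -> str:
    """Return the application whose signature prefixes the payload."""
    for app, sig in SIGNATURES.items():
        if payload.startswith(sig):
            return app
    return "unclassified"

# A BitTorrent handshake is identified even on a non-standard port --
# something port-based (header-only) classification would miss.
handshake = b"\x13BitTorrent protocol" + bytes(8) + bytes(40)
print(classify_payload(handshake))              # -> BitTorrent
print(classify_payload(b"GET / HTTP/1.1\r\n"))  # -> unclassified
```

This is also why Bragg’s claim of looking only at headers and exchange behaviour describes a meaningfully less invasive technique than payload signature matching.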

Under What Conditions Non-Management ISPs Would Manage Their Networks

  1. MTS Allstream notes that they would invest in traffic management technologies only if a capital investment analysis found that such technologies would lead to enhanced revenue.
  2. SaskTel has three conditions that would lead them to adopt management technologies: (a) customer demand outstrips capacity and augmentation could not be economically accomplished; (b) if competitive forces require the introduction of alternate service definitions; (c) if there was a need to enforce the AUP (Acceptable Use Policy) so that there was sufficient network capacity for end-users.
  3. TELUS does not currently use management technologies such as DPI, and has no plans to do so.

There is more in the document that is of note, but insofar as it pertains to DPI I thought that these were probably core points that people would be interested in.

Summary: CRTC PN 2008-19; ISP Traffic Management in Canada

As someone who is academically invested in how the ‘net is being regulated in Canada, I’ve been following the recent CRTC investigation into Internet management practices and regulation with considerable interest. Given that few people are likely to dig through the hundreds of pages that were in the first filing, I’ve summarized the responses from ISPs (save for Videotron’s submissions; I don’t read French) to a more manageable 50 pages. Enjoy!

Update: Thanks to Eric Samson and Daniel for translating Videotron’s filings – you guys rock!

Review: Privacy On The Line

This updated edition of Diffie and Landau’s text is a must-have for anyone who is interested in how encryption and communicative privacy politics have developed in the US over the past century or so. Privacy On The Line moves beyond a ‘who did what’ in politics; instead, the authors bring their considerable expertise in cryptography to bear in order to give the reader a strong understanding of the actual methods of securing digital transactions. After reading this text, the reader will have a good grasp of the types of encryption methods that have been used through history, a strong understanding of the value of, and distinction between, digital security and digital privacy, as well as an understanding of why and how data communications are tracked.

The only disappointment is the relative lack of examination of how the US has operated internationally – there is very little mention of the OECD or of European data protection, to say nothing of APEC. While the authors do talk about the role of encryption in the context of export control, I was a bit disappointed at just how little they talked about perceptions of American efforts abroad – while this might have extended slightly beyond the American-centric lens of the book, it would have added depth of analysis (though perhaps at the expense of making the book too long for traditional publication). One of the great elements of this book is its absolutely stunning bibliography, references, and glossary – 106 pages of useful reference material ‘fleshes out’ the already excellent analysis of encryption in the US.

Ultimately, if you are interested in American spy politics, or in encryption in contemporary times, or in how these two intersect in the American political arena, then this text is for you.

Technology: CBC’s Search Engine and Traffic Shaping


The CBC’s Jesse Brown has a nice piece that tries to respond to the question, “Is Throttling Necessary?” I won’t spoil the answer (or possible lack of an answer), but I will note that Jesse incorporated a few pieces of information that I’ve posted about here. If you’re not already subscribed to his Search Engine podcast, you should – it’s amongst the best Canadian tech journalism (that is accessible to non-tech people).

P2P and Complicity in Filesharing

I think about peer to peer (P2P) filesharing on a reasonably regular basis, for a variety of reasons (digital surveillance, copyright analysis and infringement, legal cases, value in efficiently mobilizing data, etc.). Something that always nags at me is the defense that P2P websites offer when they are sued by groups like the Recording Industry Association of America (RIAA). The defense goes something like this:

“We, the torrent website, are just a search engine. We don’t actually host the infringing files; we are just responsible for directing people to them. We’re no more guilty of copyright infringement than Google, Yahoo!, or Microsoft are.”

Let’s set aside the fact that Google has been sued for infringing on copyright on the basis that it scrapes information from other websites, and instead turn our attention to the difference between what are termed ‘public’ and ‘private’ trackers. ‘Public’ trackers are available to anyone with a web connection and a torrent program. These sites do not require users to upload a certain amount of data to access the website – they are public, insofar as there are few or no requirements placed on users to access the torrent search engine and associated index. Registration is rarely required. Good examples are thepiratebay.org and mininova.org. ‘Private’ trackers require users to sign up and log into the website before they can access the search engine and associated index of .torrent files. Moreover, private trackers usually require users to maintain a particular share ratio – they must upload an amount of data that equals or exceeds the amount of data that they download. Failure to maintain the correct share ratio results in users being kicked off the site – they can no longer log into it and access the engine and index.
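The ratio arithmetic behind that enforcement is simple enough to sketch. Note that the 1.0 threshold and the function names here are my own assumptions – real private trackers set their own thresholds and grace rules.

```python
# Hypothetical sketch of private-tracker share-ratio enforcement.
# MIN_RATIO is an assumed threshold, not any real tracker's policy.
MIN_RATIO = 1.0  # must upload at least as much as you download

def share_ratio(uploaded_bytes: int, downloaded_bytes: int) -> float:
    """Uploaded/downloaded; a user who has downloaded nothing is unbounded."""
    if downloaded_bytes == 0:
        return float("inf")
    return uploaded_bytes / downloaded_bytes

def in_good_standing(uploaded_bytes: int, downloaded_bytes: int) -> bool:
    """True if the user's ratio meets the tracker's minimum."""
    return share_ratio(uploaded_bytes, downloaded_bytes) >= MIN_RATIO

print(in_good_standing(12_000, 10_000))  # -> True  (ratio 1.2)
print(in_good_standing(4_000, 10_000))   # -> False (ratio 0.4)
```

The point of the mechanism, for the complicity argument, is that a private tracker doesn’t merely index content: it actively conditions access on users continuing to upload.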

Continue reading

Ownership of Public Clouds

I’ve recently been chewing through BlueMountainLab’s podcasts on Cloud Computing. I’ll be honest – I’m a skeptic when it comes to cloud computing, but I’m developing a better understanding of it after listening to ‘casts on this topic for about 2 hours (or maybe I’ve just been brainwashed?). If you’re not immediately familiar with what this term means, check out the below video – you’ll see some of the biggest and brightest minds in digital technologies explain in simple terms what ‘cloud computing’ is.


Unless you’ve been living under a rock, or away from a digital connection, for the past couple of years, you’ve likely experienced cloud computing. Have you hopped into Google Docs, Zimbra, or any other environment where you perform standard tasks in a web-based environment? If so, you’ve been ‘in the cloud’. What we’re seeing is a shift away from centrally owned company infrastructure toward infrastructure that is owned and operated by another company. To picture it: rather than host your own mail servers, you shift your corporation over to Google Apps, and at the same time take advantage of the word processing, chat, and page creation features that accompany the Google solution. Should you need to increase storage, or alter your current feature set, you can have it set up in a few hours – this contrasts with spending corporate resources on acquiring a solution, installing it, educating your users, etc. By outsourcing high-cost, high-time-sink operations you can realign your IT staff so that they can focus on corporate issues: designing unique solutions for unique problems, focusing their skill sets in more cost-effective areas, etc.

Continue reading