Technology, Thoughts & Trinkets

Touring the digital through type

Rogers, Network Failures, and Third-Party Oversight

Photo credit: Faramarz Hashemi

Deep packet inspection (DPI) is a form of network surveillance and control that will remain in Canadian networks for the foreseeable future. It operates by examining data packets, determining their likely application of origin, and then delaying, prioritizing, or otherwise mediating the packets’ content and delivery. Ostensibly, ISPs have inserted DPI into their network architectures to manage congestion, avoid unprofitable capital investment, and enhance their billing regimes. These same companies routinely run tests of their DPI systems to refine the algorithmic identification and mediation of data packets; such tests evaluate algorithmic enhancements to the system’s productivity and efficiency at the micro level before new policies are rolled out to the entire network.
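To make the mechanism concrete, the following is a minimal sketch, in Python, of the kind of classify-then-mediate loop a DPI appliance performs. Everything here is an illustrative assumption: real appliances rely on proprietary signature databases and run at line rate in dedicated hardware, and the application labels, signatures, and rate limits below are hypothetical rather than anything Rogers has disclosed.

    import re

    # Hypothetical payload signatures mapping byte patterns to a likely
    # application of origin. The BitTorrent pattern is that protocol's
    # well-known handshake prefix; the HTTP pattern matches request verbs.
    SIGNATURES = {
        "bittorrent": re.compile(rb"\x13BitTorrent protocol"),
        "http": re.compile(rb"^(GET|POST|HEAD) "),
    }

    # Hypothetical per-application policies; a rate of None means "do not shape".
    POLICIES = {
        "bittorrent": {"action": "throttle", "rate_kbps": 80},
        "http": {"action": "pass", "rate_kbps": None},
        "unknown": {"action": "pass", "rate_kbps": None},
    }

    def classify(payload: bytes) -> str:
        """Guess a packet's application of origin from its payload bytes."""
        for app, pattern in SIGNATURES.items():
            if pattern.search(payload):
                return app
        return "unknown"

    def mediate(payload: bytes) -> dict:
        """Return the shaping decision a DPI middlebox might apply."""
        app = classify(payload)
        return {"application": app, **POLICIES[app]}

    print(mediate(b"\x13BitTorrent protocol"))   # classified and throttled
    print(mediate(b"GET /index.html HTTP/1.1"))  # classified and passed through

The fragility described in the rest of this post lives in tables like these: a policy update that mislabels traffic, or whose signatures match too broadly, will silently throttle applications it was never meant to touch.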

Such tests are not publicly announced, nor are customers notified when ISPs update their DPI devices’ long-term policies. While notification must be provided to various bodies when material changes are made to the network, non-material changes can typically be deployed quietly. Few notice when a large-scale deployment happens…unless it goes wrong. Based on user reports in the DSLreports forums, it appears that one of Rogers’ recent policy updates was poorly tested and then deployed across the whole network. The ill effects of that deployment remain unresolved more than sixty days later.

In this post, I first detail the issues facing Rogers customers, drawing heavily from forum threads at DSLreports. I then suggest that this incident demonstrates multiple failings around DPI governance: a failure to properly evaluate analysis and throttling policies; a failure to meaningfully acknowledge problems arising from DPI misconfiguration; and a failure to proactively alleviate the inconveniences of accidental throttling. Large ISPs’ ability to modify the conditions of data transit and discrimination is problematic because it increases the risks faced by innovators and developers, who cannot predict future data discrimination policies. Such increased risks threaten the generative nature of the Internet’s ends. To alleviate some of these risks, a trusted third party should be established. This party would monitor how ISPs govern data traffic and alert citizens and regulators if ISPs discriminate against ‘non-problematic’ traffic types or violate their own terms of service. I ultimately suggest that an independent, though associated, branch of the CRTC responsible for watching over ISPs could improve trust between Canadians and the CRTC, and between customers and their ISPs.

Genealogy and the ‘Net

I’ve recently had the pleasure of reading some of Foucault’s Society Must Be Defended. Over the course of the book, Foucault radically revises his earlier positions, and I hope to note and discuss these changes as I come across them. That said, having just finished the first lecture, I want to reflect on the power of genealogies and the fragmented character of the ‘net, and to synthesize those reflections with Wu and Goldsmith’s account of the Internet and Foucault’s own thoughts on power as repression. There’s a lot to do, but I think it might be very profitable to at least toy around with this for a bit.

Genealogy

There is a tendency to try to capture knowledge in unitary architectures. Foucault likens this to trying to develop a unifying concept that explains the behaviour of each droplet of water exploding around a sperm whale when it breaches. In the very process of establishing a complex formula to receive this information, the act itself is lost.
