Technology, Thoughts & Trinkets

Touring the digital through type

Tag: Rogers (page 1 of 2)

Transparent Practices Don’t Stop Prejudicial Surveillance

In February I’m attending iConference 2012, where I’m helping to organize a workshop titled “Networked Surveillance: Access Control, Transparency, Power, and Circumvention in the 21st Century.” The workshop’s participants will consider whether networked surveillance challenges notions of privacy and neutrality, exploits the openness of data protocols, or requires critical investigations into how these surveillance technologies are developed and regulated. Participants will be arriving from around the world, and each will speak to one (or more) of the workshop’s four themes: Access Control, Transparency, Power, and Circumvention. As part of the workshop, all participants must prepare a short position statement that identifies their interest in networked surveillance and establishes grounds to launch a conversation. My contribution, titled “Transparent Practices Don’t Stop Prejudicial Surveillance,” follows.

Transparent Practices Don’t Stop Prejudicial Surveillance

Controversies around computer processing and data analysis technologies led to the development of the Fair Information Practice Principles (FIPs), principles that form the bedrock of today’s privacy codes and laws. Drawing on lessons from privacy codes and from Canadian ISPs’ surveillance practices, I argue that transparency is a necessary but insufficient measure for mitigating prejudicial surveillance practices and technologies. We must go further and inject public values into development cycles while also intentionally hobbling surveillance technologies to rein in their most harmful potentialities.


Deep Packet Inspection and Consumer Transparency

Image by David Clow

Rogers Communications modified their packet inspection systems last year, and ever since customers have experienced degraded download speeds. It’s not that random users happen to be complaining about an (effectively) non-problem: Rogers’ own outreach staff has confirmed that the modifications took place and that these changes have negatively impacted peer-to-peer (P2P) and non-P2P applications alike. Since then, a Rogers Communications senior vice-president, Ken Englehart, has suggested that any problems customers have run into result from P2P applications themselves; no mention is made of whether or how Rogers’ throttling systems have affected non-P2P traffic.

In this brief post, I want to quickly refresh readers on the changes that Rogers Communications made to their systems last year, and also note some of the problems that have subsequently arisen. Following this, I take up what Mr. Englehart recently stated in the media about Rogers’ throttling mechanisms. I conclude by noting that Rogers is likely in compliance with the CRTC’s transparency requirements (or at least soon will be), but that such requirements are ill suited to inform the typical consumer.


Distinguishing Between Mobile Congestions

Image by Simon Tunbridge

There is an ongoing push to ‘better’ monetize the mobile marketplace. In this near-future market, wireless providers would use DPI and other Quality of Service equipment to charge subscribers for each and every action they take online. The past few weeks have seen Sandvine and other vendors talk about this potential, and Rogers has begun testing the market to determine whether mobile customers will pay for data prioritization. The prioritization of data is squarely a network neutrality issue, and one that demands careful consideration and examination.
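To give a rough sense of what such a scheme could look like in practice, below is a deliberately simplified sketch of a per-application rate plan: classified traffic is assigned a queueing priority and a per-megabyte charge. The application labels, tiers, and prices are my own illustrative assumptions, not any carrier’s actual offering.

```python
# Hypothetical per-application rate plan: each detected application class is
# mapped to a queueing priority and a per-megabyte charge. All values here
# are invented for illustration.

PLAN = {
    "voip":      {"priority": 1, "cents_per_mb": 3.0},
    "streaming": {"priority": 2, "cents_per_mb": 2.0},
    "web":       {"priority": 3, "cents_per_mb": 1.0},
    "p2p":       {"priority": 4, "cents_per_mb": 0.5},
}
DEFAULT = {"priority": 4, "cents_per_mb": 1.0}

def monthly_bill(usage_mb_by_app: dict) -> float:
    """Total charge, in dollars, for per-application metered usage.

    The 'priority' field would govern queueing order in the network's
    scheduler; only the per-MB rate matters for this billing sketch.
    """
    cents = sum(
        mb * PLAN.get(app, DEFAULT)["cents_per_mb"]
        for app, mb in usage_mb_by_app.items()
    )
    return round(cents / 100, 2)

print(monthly_bill({"voip": 50, "streaming": 400, "web": 300, "p2p": 1000}))  # 17.5
```

It is precisely this sort of differential treatment and pricing of classified traffic that raises the network neutrality concerns noted above.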

In this post, I’m not talking about network neutrality. Instead, I’m going to talk about what supposedly drives prioritization schemes in Canada’s wireless marketplace: congestion. Consider this a riposte to the oft-touted position that ‘wireless is different’: ISPs assert that wireless is different from wireline for their own regulatory ends, but blur distinctions between the two when pitching ‘congestion management’ schemes to customers. I suggest that the congestion faced by AT&T and other wireless providers stems far less from data congestion than from signal congestion, and that carriers must take responsibility for the latter.


Rogers, Network Failures, and Third-Party Oversight

Photo credit: Faramarz Hashemi

Deep packet inspection (DPI) is a form of network surveillance and control that will remain in Canadian networks for the foreseeable future. It operates by examining data packets, determining their likely application-of-origin, and then delaying, prioritizing, or otherwise mediating the content and delivery of those packets. Ostensibly, ISPs have inserted it into their network architectures to manage congestion, avoid unprofitable capital investment, and enhance billing regimes. These same companies routinely run tests of their DPI systems to refine the algorithmic identification and mediation of data packets, evaluating enhancements to system performance and efficiency at a small scale before rolling new policies out to the entire network.
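For readers unfamiliar with the mechanism, the sketch below captures the general idea: match a packet’s payload against known application signatures, then look up a traffic-management policy for the guessed application. The signatures and policies are illustrative assumptions only; Rogers’ actual classification rules and rate limits are proprietary and far more involved.

```python
# Toy illustration of DPI-style classification: inspect a packet's payload,
# guess the application that produced it, and look up a traffic policy.
# Signatures and policies are illustrative assumptions, not any ISP's
# actual configuration.

APP_SIGNATURES = {
    b"\x13BitTorrent protocol": "bittorrent",  # BitTorrent handshake prefix
    b"GET ": "http",                           # start of a plain HTTP request
    b"\x16\x03": "tls",                        # TLS handshake record header
}

POLICIES = {
    "bittorrent": {"action": "shape", "cap_kbps": 80},  # throttle to a cap
    "http":       {"action": "pass"},
    "tls":        {"action": "pass"},
    "unknown":    {"action": "pass"},
}

def classify(payload: bytes) -> str:
    """Return a best-guess application label for a packet payload."""
    for signature, app in APP_SIGNATURES.items():
        if payload.startswith(signature):
            return app
    return "unknown"

def policy_for(payload: bytes) -> dict:
    """Map a packet to the traffic-management policy applied to it."""
    return POLICIES[classify(payload)]

sample = b"\x13BitTorrent protocol" + b"\x00" * 8
print(classify(sample), policy_for(sample))  # bittorrent {'action': 'shape', 'cap_kbps': 80}
```

Production DPI engines go well beyond this kind of prefix matching, combining per-flow state, heuristics, and statistical signatures, which is precisely why ISPs test new classification policies at a small scale before deploying them widely.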

Such tests are not publicly broadcast, nor are customers notified when ISPs update their DPI devices’ long-term policies. While notification must be provided to various bodies when material changes are made to the network, non-material changes can typically be deployed quietly. Few notice when a deployment of significant scale happens… unless it goes wrong. Based on user reports in the DSLreports forums, it appears that one of Rogers’ recent policy updates was poorly tested and then massively deployed. The ill effects of this deployment remain unresolved more than sixty days later.

In this post, I first detail issues facing Rogers customers, drawing heavily from forum threads at DSLreports. I then suggest that this incident demonstrates multiple failings around DPI governance: a failure to properly evaluate analysis and throttling policies; a failure to meaningfully acknowledge problems arising from DPI misconfiguration; and a failure to proactively alleviate the inconveniences of accidental throttling. Large ISPs’ ability to modify the conditions of data transit and discrimination is problematic because it increases the risks faced by innovators and developers, who cannot predict future data discrimination policies. Such increased risks threaten the generative nature of the Internet’s ends. To alleviate some of these risks, a trusted third party should be established. This party would monitor how ISPs themselves govern data traffic and alert citizens and regulators if ISPs discriminate against ‘non-problematic’ traffic types or violate their own terms of service. I ultimately suggest that an independent, though associated, branch of the CRTC responsible for watching over ISPs could improve trust between Canadians and the CRTC, and between customers and their ISPs.

