The Consumable Mobile Experience

We are rapidly shifting towards a ubiquitous networked world, one that promises to accelerate our access to information and to each other, but this network requires a few key elements. Bandwidth must be plentiful, mobile devices that can engage with this world must be widely deployed, and some kind of normative-regulatory framework that encourages creation and consumption must be in place. As it stands, backhaul bandwidth is plentiful, though front-line cellular towers in America and (possibly) Canada are largely unable to accommodate the growing ubiquity of smart devices. In addition to this challenge, we operate in a world where the normative-regulatory framework for the mobile world is threatened by regulatory capture, which encourages limited consumption that maximizes revenues while simultaneously discouraging rich, mobile, creative actions. Without a shift to fact-based policy decisions and pricing systems, North America risks becoming the new tech ghetto of the mobile world: rich in talent and the ability to innovate, but poor in the actual infrastructure to locally enjoy those innovations.

At the Canadian Telecom Summit this year, mobile operators such as TELUS, Wind Mobile, and Rogers Communications were all quick to pounce on the problems facing AT&T in the US. AT&T regularly suffers voice and data outages for its highest-revenue customers: those who own and use smartphones built on Android, WebOS (i.e. the Palm Pre and Pixi), and iOS. Each of these Canadian mobile companies used AT&T’s weaknesses to hammer home that unlimited bandwidth cannot be offered over mobile networks, and suggested that AT&T’s shift from unlimited to limited data plans is indicative of the backhaul and/or spectrum problems caused by smart devices. While I do not want to entirely contest the claim that there are challenges in managing exponential increases in mobile data growth, I do want to suggest that technical analysis, rather than rhetorical ‘obviousness’, should be applied to understand the similarities and differences between Canadian telcos/cablecos and AT&T.

Choosing Winners with Deep Packet Inspection

Much of the network neutrality discussion surrounds the conditions under which applications can, and cannot, be prevented from running. On one hand, there are advocates who maintain that telecommunications providers – ISPs such as Bell, Comcast, and Virgin – shouldn’t be responsible for ‘picking winners and losers’, on the basis that consumers should make these choices. On the other hand, advocates for managed (read: functioning) networks insist that network operators have a duty and responsibility to fairly provision their networks in a way that doesn’t let one small group negatively impact the experiences of the larger consumer population. Deep Packet Inspection (DPI) has become a hot-button technology in light of the neutrality debates, given its potential to let ISPs determine which applications function ‘properly’ and which see their traffic delayed for purposes of network management. What is often missing from the network neutrality discussions is a comparison of the uses of DPI across jurisdictions, and how these uses might impact ISPs’ abilities to prioritize or deprioritize particular forms of data traffic.
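
To make that mechanism concrete, here is a minimal sketch of the kind of payload classification a DPI appliance performs before prioritizing or deprioritizing a flow. The signatures and the policy table are simplified, hypothetical examples for illustration – not any vendor’s actual rule set:

```python
# Illustrative signature-based traffic classification of the sort a DPI
# appliance performs. Signatures and policies here are simplified
# examples, not a real vendor's rules.

SIGNATURES = [
    ("bittorrent", b"\x13BitTorrent protocol"),  # BitTorrent handshake prefix
    ("http", b"GET "),                           # start of an HTTP GET request
    ("http", b"POST "),                          # start of an HTTP POST request
]

# Hypothetical per-application policy: which flows get deprioritized.
POLICY = {"bittorrent": "deprioritize", "http": "normal"}

def classify(payload: bytes) -> str:
    """Return an application label by matching known payload prefixes."""
    for app, signature in SIGNATURES:
        if payload.startswith(signature):
            return app
    return "unknown"

def shaping_decision(payload: bytes) -> str:
    """Map a classified packet to a traffic-management action."""
    return POLICY.get(classify(payload), "normal")
```

Real appliances match far richer signatures, track per-flow state, and apply behavioural heuristics, but the basic pattern – classify the payload, then apply a per-application policy – is the core of what lets an operator decide which applications run ‘properly’.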

As part of an early bit of thinking on this, I want to direct our attention to Canada, the United States, and the United Kingdom to start framing how these jurisdictions are approaching the use of DPI. In the process, I will make the claim that Canada’s recent CRTC ruling on the use of the technology appears more and more progressive in light of recent decisions in the US and the likelihood of the UK’s Digital Economy Bill (DEB) becoming law. Up front, I should note that while I think Canada can be read as ‘progressive’ on the network neutrality front, this shouldn’t suggest that either the CRTC or Parliament has done enough: further clarity into the practices of ISPs, additional insight into the technologies they use, and an ongoing discussion of traffic management systems are all needed in Canada. Canadian communications increasingly pass through IP networks, and as a result our communications infrastructure should be seen as being as important as defence, education, and health care, each of which is tied to its own critical infrastructure but connected to the others and enabled through digital communications systems. Digital infrastructures draw together the fibres connecting the Canadian people, Canadian business, and Canadian security, and we need to elevate the discussions about this infrastructure to make it a prominent part of the national agenda.

Deep Packet Inspection and Mobile Discrimination

Throughout the 2009 Canadian Telecommunications Summit, presenter after presenter, and session after session, spoke to the Canadian situation concerning growth in mobile data. In essence, there is a worry that the wireless infrastructure cannot cope with the high volumes of data that are expected to accompany the increasing use and penetration of mobile technologies. Such worries persist even though we’ve recently seen the launch of another high-speed wireless network, jointly invested in by Bell and Telus, and despite the fact that new wireless competitors are promising to enter the national market as well.

The result of the wireless competition in Canada is this: Canadians actually enjoy pretty fast wireless networks. We can certainly complain about the high costs of such networks, about the conditions under which wireless spectrum was purchased and is used, and so forth, but the fact is that pretty impressive wireless networks exist…for Canadians with cash. As any network operator knows, however, speed is only part of the equation; it’s just as important to have sufficient data provisioning so your user base can genuinely take advantage of the network. It’s partially on the grounds of data provisioning that we’re seeing vendors develop and offer deep packet inspection (DPI) appliances for the mobile environment.

I think that provisioning is the Trojan horse, however, and that DPI is really being presented by vendors as a solution to a pair of ‘authentic’ issues: first, the need to improve customer billing, and second, the need to efficiently participate in the advertising and marketing ecosystem. I would suggest that ‘congestion management’, right now, is more of a spectre-like issue than an authentic concern (and I’ll get into defending that claim in just a moment).
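
To illustrate the billing case, here is a minimal sketch of how DPI-derived application labels could feed per-application usage accounting. The subscriber names, application categories, and per-megabyte rates are hypothetical examples for illustration, not any carrier’s actual pricing:

```python
# Sketch of per-application usage metering for billing, assuming a DPI
# appliance has already labelled each flow with a category. Rates and
# categories are made-up examples.
from collections import defaultdict

RATE_PER_MB = {"video": 0.05, "web": 0.01, "p2p": 0.10}  # hypothetical $/MB

class UsageMeter:
    def __init__(self):
        # bytes counted per (subscriber, application-category) pair
        self.usage = defaultdict(int)

    def record(self, subscriber: str, category: str, nbytes: int) -> None:
        """Accumulate bytes observed for a subscriber in a category."""
        self.usage[(subscriber, category)] += nbytes

    def bill(self, subscriber: str) -> float:
        """Total charge for a subscriber across all categories, in dollars."""
        total = 0.0
        for (sub, category), nbytes in self.usage.items():
            if sub == subscriber:
                total += (nbytes / 1_000_000) * RATE_PER_MB.get(category, 0.0)
        return round(total, 2)
```

The point of the sketch is that once traffic is classified, billing by application becomes a simple lookup – which is precisely why vendors can market DPI as a billing improvement rather than a congestion tool.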

Canadian Telecom Summit and DPI

For the past little while I’ve been (back) in Ontario trying to soak up as much information as I could about telecommunications and deep packet inspection. I was generously given the opportunity to attend the Canadian Telecommunications Summit by Mark Goldberg a while ago, and it was an amazing experience. I found that the new media panel, where broadcasters and carriers came together to discuss their (often contrasting) modes of disseminating content, offered some real insights into approaches to media on the ‘net. It brought the very clear contrasts in how new media might operate, and how it might be seen by the dominant carriers, into focus for me, and really began to provide a broader image of the actual strategies of various parties.

A huge element of the conference surrounded the development of wireless as the new space for innovation. Often unspoken, save for in informal discussions, was that wireline is seen as increasingly outmoded. Most statistics that were formally presented saw wireless overtaking wireline broadband by 2014 or so. This has me wondering how important it is to examine capital expenses by major broadband providers – while we read that there is massive investment (totaling in the hundreds of millions, or billions, per year across all carriers), how much is in wireless and how much is in wireline infrastructure?

Shield the Sources, Shield the Telecoms

The past couple of days have been interesting, to say the least, when looking at recent shifts and decisions in American legislatures. Specifically, the House is looking to shield bloggers from federal investigations by providing them with the same protections as reporters, and the telecommunications companies that ‘theoretically’ (read: actually) cooperated with NSA spying activities have been let off the hook after refusing to cooperate with Congressional investigations. Let’s get into it.

Federal Journalists and Professional Bloggers Shielded

The US has had a long history of journalistic freedoms, but in the face of recent technological advances it has refused to extend those freedoms to users of new journalistic mediums. Bloggers, in particular, are becoming an increasingly important source of information in the US – some dedicate their lives to blogging and use it for professional gain. Until recently, they have typically been refused the same status as traditional journalists, which has made it risky for bloggers to refuse to disclose their sources if hauled into courts of law.
