Over the next 48-72 hours I’ll be doing some (extensive) work on my site. I’m simultaneously trying to renovate some features, dispose of others, and generally repair some long-standing problems on the backend. This site – and the database behind it – started as an experiment many years ago, and I made a large number of fairly boneheaded mistakes over the years that I’ve tried (I think successfully) to cover up with bandages and duct tape during the last three years. It’s time, however, to amputate these festering areas and rebuild them.
I’ve begun fixing up some of the problems over the past month, including migrating to a better hosting company that has located my data in Canada. Uptime has been more reliable and access speeds have generally improved, but more needs to be done. By the end of the weekend I hope to have performed the work needed to correct the bits and pieces of the site that are becoming increasingly problematic to deal with.
One of the more significant changes will be that the “/blog” in my URL will largely be removed. I’ll be trying to remedy internal links over the coming weeks, to limit internal breakage, but this might mean that some inbound links are broken. Significantly, those who use RSS readers to read what is written here will likely need to adjust their feed. By the end of the weekend, the feed should have moved to: https://christopher-parsons.com/?feed=rss2
I’ll post an update, to this post, once the transition is complete. See you on the other side!
The move has concluded. In addition to considerable visual modifications, I’ve also remedied some rotten links and tried to improve page response speed. The URL structure has changed, though old links should successfully redirect to the new structure. Text should remain easy to read (ideally as good as, if not better than, before) and I’ve presently adopted a ‘reading-for-mobile’ theme. The analytics engine that I use is, at present, Piwik, which stores data on my server instead of providing it to a third party. The privacy notice has been updated as a result.
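For readers curious how old links can keep working after a structure change like this, the following is a minimal, hypothetical sketch of the kind of rule that handles it. It assumes an Apache server with mod_rewrite enabled; the site’s actual server software and configuration are not described in this post, so treat the details as illustrative only.

```apache
# Hypothetical .htaccess fragment: permanently redirect old "/blog/..."
# URLs to the same path at the site root. Assumes Apache + mod_rewrite;
# this is a sketch, not the site's actual configuration.
RewriteEngine On

# e.g. /blog/2013/02/some-post/  ->  /2013/02/some-post/
RewriteRule ^blog/(.*)$ /$1 [R=301,L]
```

The `R=301` flag issues a permanent redirect, which tells browsers, feed readers, and search engines to update their stored URLs rather than keep requesting the old path.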
As noted in the earlier note, the RSS feed has moved to: https://christopher-parsons.com/?feed=rss2
As mentioned previously, for the past several months I’ve been conducting research with academics at the University of Victoria to understand the relationship(s) between social networking companies’ data access, retention, and disclosure policies. One aspect of our work addresses the concept of jurisdiction: what systems of rules mediate or direct how social media companies collect, retain, use, and disclose subscribers’ personal information? To address this question we have taken up how major social networking companies comply, or not, with some of the most basic facets of Canadian privacy law: the right to request one’s own data from these companies. Our research has been supported by funding provided through the Office of the Privacy Commissioner of Canada’s contributions program. All our research has been conducted independently of the Office, and none of our findings necessarily reflect the Commissioner’s positions. As part of our methodology, while we may report on our access requests being stymied, we are not filing complaints with the federal Commissioner’s office.
Colin Bennett first presented a version of this paper, titled “Real and Substantial Connections: Enforcing Canadian Privacy Laws Against American Social Networking Companies” at an Asian Privacy Scholars event and, based on comments and feedback, we have revised that work for a forthcoming conference presentation in Malta. Below is the abstract of the paper, as well as a link to the Social Science Research Network site that is hosting the paper.
Any organization that captures personal data in Canada for processing is deemed to have a “real and substantial connection” to Canada and to fall within the jurisdiction of the Personal Information Protection and Electronic Documents Act (PIPEDA) and of the Office of the Privacy Commissioner of Canada. What has been the experience of enforcing Canadian privacy protection law on US-based social networking services? We analyze some of the high-profile enforcement actions by the Privacy Commissioner. We also test compliance through an analysis of the privacy policies of the top 23 SNSs operating in Canada, with the use of access to personal information requests. Most of these companies have failed to implement some of the most elementary requirements of data protection law. We conclude that an institutionalization of non-compliance is widespread, explained by the countervailing conceptions of jurisdiction inherent in corporate policy and technical system design.
Download the paper at SSRN
Lawful access was a contentious issue on the Canadian agenda when it was initially introduced by the Martin government, and has become even more disputed as subsequent governments have introduced their own iterations of the Liberal legislation. Last year the current majority government introduced Bill C-30, the Protecting Children from Internet Predators Act. In the face of public outcry the government sent the bill to committee prior to a vote on second reading, and most recently declared the bill dead.
Last year I began research concerning alternate means of instituting lawful access powers in Canada. Specifically, I explored whether a ‘backdoor’ had been found to advance various lawful access powers: was Industry Canada, through the 700MHz spectrum consultation, and Public Safety, through its changes to how communications are intercepted, effectively establishing the necessary conditions for lawful access by compliance fiat?
In this post I try to work through aspects of this question. I begin by briefly unpacking some key elements of Bill C-30 and then proceed to give an overview of the spectrum consultation. This overview will touch on proposed changes to lawful intercept standards. I then suggest how changes to the intercept standards could affect Canadians, as well as (re)iterate the importance of publicly discussing expansions to lawful access and intercept powers instead of expanding these powers through regulatory and compliance backdoors.
My formal dissertation research focuses on deep packet inspection technologies and how they serve as a nexus for competing political interests. Today, I’m making available a draft chapter from my dissertation. In this first chapter I trace the lineage of deep packet inspection (DPI) systems: how do shallow and medium packet inspection systems function, what were their limitations, and what is novel about DPI itself?
Chapter one serves as an introduction to the theoretical capabilities of these systems; I am not making a claim that all DPI appliances are capable of achieving all, or even half, of the various use cases that I outline. This writing builds on a much earlier working paper that I produced several years ago; the core differences between that work and the current chapter are the greater detail given to various uses of DPI and a more limited argumentative position. This limit was imposed because this is the first chapter of the dissertation; my analysis and broader theoretical conclusions about the technology and its applications will come in the last two chapters (six and seven).
Comments and feedback are welcomed. Should you choose to cite this draft, please reference it as follows:
Parsons, Christopher. (2013). “(Draft) Chapter One: Deep Packet Inspection and Its Predecessors, v. 3.5,” Technology, Thoughts, and Trinkets (blog). Published February 6, 2013. URL: http://www.christopher-parsons.com/Main/wp-content/uploads/2013/02/DPI-and-Its-Predecessors-3.5.pdf.
This chapter traces the lineage of contemporary packet inspection systems that monitor data traffic flowing across the Internet in real time. After discussing how shallow, medium, and deep packet inspection systems function, I outline the significance of this technology’s most recent iteration, deep packet inspection, and how it could be used to fulfill technical, economic, and political goals. Achieving these goals, however, requires that deep packet inspection be regarded as a surveillance practice. Indeed, deep packet inspection is, at its core, a surveillance-based technology that is used by private actors, such as Internet service providers, to monitor and mediate citizens’ communications. Given the importance of Internet-based communications to every facet of Western society, from personal communications, to economic, cultural and political exchanges, deep packet inspection must be evaluated not just in the abstract but with attention towards how society shapes its deployment and how it may shape society.
Download .pdf (alternate link)
There have been lots of good critiques and comments concerning Facebook’s recently announced “Graph Search” product. Graph Search lets individuals semantically query large datasets that are associated with data shared by their friends, friends-of-friends, and the public more generally. Greg Satell tries to put the product in context – Graph Search is really a way for corporations to peer into our lives – and a series of articles have tried to unpack the privacy implications of Facebook’s newest product.
I want to talk less directly about privacy and more about how Graph Search threatens to further limit discourse on the network. While privacy is clearly implicated throughout this post, we can think of privacy not just as a loss for the individual but in terms of the broader social impacts of its loss. Specifically, I want to briefly reflect on how Graph Search (further?) transforms Facebook into a hostile discursive domain, and what this might mean for Facebook users.
This is a guest post from my colleague, Adam Molnar, who has been conducting research on the BC Services Card. Adam is a PhD Candidate in the Department of Political Science at the University of Victoria and a member of the New Transparency Project. His dissertation research focuses on security and policing legacies associated with mega-events. You can find him on Twitter at @admmo
In just two weeks, the province of British Columbia will be launching the new BC Services Card. If you haven’t already heard about the new province-wide identity management initiative, it’s not your fault; the government only began its public relations campaign for the Services Card initiative six weeks before the card was set to hit wallets and hospitals across the province. In fact, the government has been so unforthcoming about the new Cards that, just six weeks before their release, the British Columbia Office of the Information and Privacy Commissioner is racing to adequately review the program. To be clear: this isn’t a new initiative, but one going back several years. The unwillingness to disclose the documents necessary for the Commissioner’s review is particularly troubling since the Services Card is just one component in a much larger transformation: the province’s move to an integrated identity management program. Will similar tardiness in assisting the province’s privacy czar pervade this entire transition? Will the public be as excluded from future debates as they have been from the Services Card development and deployment regime?
The Services Cards feature a host of security enhancements, including layered polycarbonate plastics, embedded holography, laser etchings for images and text appearing on the card, and the integration of a Near Field Communications (NFC) chip. For this post, I focus exclusively on the NFC chip, which is meant to ‘secure’ your identity when the card is presented to government agencies, either in person or online.
The BC government has been touting NFC as an enhanced security feature of the Services Card initiative. While this technical feature might enhance the perception of privacy (especially when buttressed by official provincial political rhetoric), it actually entails serious flaws. These flaws could leave the personal information of BC residents and government databases vulnerable to attack; the security ‘features’ could be the beachhead that leads to serious privacy breaches.