Just before Christmas, Swikar Oli published an article in the National Post that discussed how the Public Health Agency of Canada (PHAC) obtained aggregated and anonymized mobility data for 33 million Canadians. From the story, we learn that the contract was awarded in March to TELUS, and that PHAC used the mobility data to “understand possible links between movement of populations within Canada and spread of COVID-19.”
Around the same time as the article was published, PHAC posted a notice of tender to continue collecting aggregated and anonymized mobility data that is associated with Canadian cellular devices. The contract would remain in place for several years and be used to continue providing mobility-related intelligence to PHAC.
Separate from either of these means of collecting data, PHAC has also been purchasing mobility data “… from companies who specialize in producing anonymized and aggregated mobility data based on location-based services that are embedded into various third-party apps on personal devices.” There has also been little discussion of PHAC’s collection and use of data from these kinds of third parties, which tend to be advertising and data surveillance companies that consumers have no idea are collecting, repackaging, and monetizing their personal information.
There are, at first glance, at least four major issues that arise out of how PHAC has obtained and can use the aggregated and anonymized information to which they have had, and plan to have, access.
On January 14, 2016, the Ontario Superior Court ruled that “tower dumps” – the mass release of data collected by cellphone towers at the request of law enforcement agencies – violate privacy rights under the Canadian Charter of Rights and Freedoms. In response, Justice Sproat outlined a series of guidelines for authorities to adhere to when requesting tower dump warrants in the future.
I wrote about this case for PEN Canada. I began by summarizing the issue of the case and then proceeded to outline some of the highlights of Justice Sproat’s decision. The conclusion of the article focuses on the limits of that decision: it does not promote statutory reporting of tower dumps and thus Canadians will not learn how often such requests are made; it does not require notifying those affected by tower dumps; it does not mean Canadians will know if data collected in a tower dump is used in a subsequent process against them. Finally, the guidelines are not precedent-setting and so do not represent binding obligations on authorities requesting the relevant production orders.
In the wake of a stunning data breach, the University of Victoria campus community could only hope that the institution would do everything it could to regain lost trust. One such opportunity arose this week, when Google’s controversial Streetview vehicles were scheduled to canvass the campus. Unfortunately the opportunity was squandered: it is largely by accident that the campus community has learned – or will learn – that Google is capturing images and wireless access point information.
In this short post I want to discuss how serious the University’s failure to disclose Google’s surveillance of the campus really was. I begin by providing a quick overview of Streetview’s privacy controversies. I then describe the serious data breach that UVic suffered earlier this year, which has left the institution with a significant trust deficit. A discussion of the institution’s failure to disclose Google’s presence to the community, and its attempts to chill speech around Google’s presence, follows. I conclude by suggesting how institutions can learn from UVic’s failures and disclose the presence of controversial, potentially privacy-invasive, actors in order to rebuild flagging trust.
Google Streetview and Privacy
Streetview has been a controversial product since its inception. There were serious concerns when it captured images of people in sensitive places or engaged in indiscreet actions. Initially the company provided a non-trivial means for individuals to remove images from the Google Streetview database; this process has subsequently been replaced with an option to blur sensitive information. Various jurisdictions have challenged Google’s conceptual and legal argument that taking images of public spaces with a Streetview vehicle is equivalent to a tourist taking pictures in a public space.
One of the largest network vendors in the world is planning to offer its ISP partners an opportunity to modify HTTP headers and thereby get ISPs into the advertising racket. Juniper Networks, which sells routers to ISPs, is partnering with Feeva, an advertising solutions company, to modify data packets’ header information so that packets include geographic information. These modified packets will be transmitted to any and all websites that a customer visits, and individuals will receive targeted advertisements according to their geographic location. Effectively, Juniper’s proposal may see ISPs leverage their existing customer service information to modify customers’ data traffic in order to enhance the geographic relevance of online advertising. This poses an extreme danger to citizens’ locational and communicative privacy.
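To make the mechanism concrete, here is a minimal sketch of what in-path header injection looks like in principle: an intermediary splices a location field into an intercepted HTTP request before forwarding it, so every destination site receives the customer’s region. The header name `X-Geo` and its value format are assumptions for illustration only, not Feeva’s actual scheme.

```python
# Hypothetical sketch of ISP-side header injection, for illustration only.
# "X-Geo" and the region string are assumed names, not Feeva's real format.

def inject_geo_header(raw_request: bytes, region_code: str) -> bytes:
    """Insert a location header into an intercepted HTTP/1.1 request."""
    # Split the request at the blank line separating headers from the body.
    head, sep, body = raw_request.partition(b"\r\n\r\n")
    # Append the injected header to the existing header block.
    head += b"\r\nX-Geo: " + region_code.encode("ascii")
    return head + sep + body

original = b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n"
modified = inject_geo_header(original, "CA-BC-Victoria")
print(modified.decode())
```

The key point the sketch illustrates is that the customer’s browser never sends this header and never sees it; the location is appended in transit, which is why the practice is invisible to users and reaches every site they visit.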
The Canadian SIGINT Summaries includes downloadable copies, along with summary, publication, and original source information, of leaked CSE documents.
Parsons, Christopher; and Molnar, Adam. (2021). “Horizontal Accountability and Signals Intelligence: Lesson Drawing from Annual Electronic Surveillance Reports,” David Murakami Wood and David Lyon (Eds.), Big Data Surveillance and Security Intelligence: The Canadian Case.
Parsons, Christopher. (2015). “Stuck on the Agenda: Drawing lessons from the stagnation of ‘lawful access’ legislation in Canada,” Michael Geist (Ed.), Law, Privacy and Surveillance in Canada in the Post-Snowden Era (University of Ottawa Press).
Parsons, Christopher. (2015). “The Governance of Telecommunications Surveillance: How Opaque and Unaccountable Practices and Policies Threaten Canadians,” Telecom Transparency Project.
Parsons, Christopher. (2015). “Beyond the ATIP: New methods for interrogating state surveillance,” in Jamie Brownlee and Kevin Walby (Eds.), Access to Information and Social Justice (Arbeiter Ring Publishing).
Bennett, Colin; Parsons, Christopher; Molnar, Adam. (2014). “Forgetting and the right to be forgotten” in Serge Gutwirth et al. (Eds.), Reloading Data Protection: Multidisciplinary Insights and Contemporary Challenges.
Bennett, Colin, and Parsons, Christopher. (2013). “Privacy and Surveillance: The Multi-Disciplinary Literature on the Capture, Use, and Disclosure of Personal information in Cyberspace” in W. Dutton (Ed.), Oxford Handbook of Internet Studies.
McPhail, Brenda; Parsons, Christopher; Ferenbok, Joseph; Smith, Karen; and Clement, Andrew. (2013). “Identifying Canadians at the Border: ePassports and the 9/11 legacy,” in Canadian Journal of Law and Society 27(3).
Parsons, Christopher; Savirimuthu, Joseph; Wipond, Rob; McArthur, Kevin. (2012). “ANPR: Code and Rhetorics of Compliance,” in European Journal of Law and Technology 3(3).