Finding You: The Network Effect of Telecommunications Vulnerabilities for Location Disclosure

Last week, I published a report with Gary Miller and the Citizen Lab entitled “Finding You: The Network Effect of Telecommunications Vulnerabilities for Location Disclosure.” I undertook this research while still employed by the Citizen Lab and was delighted to see it made available to the public. In it, we discuss how the configuration and vulnerabilities of contemporary telecommunications networks enable surveillance actors to surreptitiously monitor the location of mobile phone users.

The report provides a high-level overview of the geolocation-related threats associated with contemporary networks that depend on the protocols used by 3G, 4G, and 5G network operators, followed by evidence of the proliferation of these threats. Part 1 provides the historical context of unauthorized location disclosures in mobile networks and explains the importance of the target identifiers used by surveillance actors. Part 2 explains how the signaling protocols used for international roaming render mobile networks vulnerable, and how networks are made available to surveillance actors to carry out attacks. An overview of the mobile ecosystem lays the foundation for the technical details of domestic versus international network surveillance, while a discussion of active versus passive surveillance techniques, with evidence of attacks, shows how location information is presented to the actor. Part 3 provides details of a case study, drawn from a media report, that shows evidence of widespread state-sponsored surveillance, followed by threat intelligence data revealing the network sources attributed to attacks detected in 2023. These case studies underscore the significance and prevalence of these kinds of surveillance operations.

Deficiencies in the oversight and accountability of network security are discussed in Part 4, including the incentives and enablers that industry organizations and government regulatory agencies provide to surveillance actors. Part 5 makes clear that the adoption of 5G technologies will not mitigate future surveillance risks unless policymakers quickly move to compel telecommunications providers to adopt the security features that are available in 5G standards and equipment. If policymakers do not move swiftly, surveillance actors may continue to prey upon mobile phone users by tracking their physical location. Such a future paints a bleak picture for user privacy and must be avoided.

Minding Your Business: A Critical Analysis of the Collection of De-identified Mobility Data and Its Use Under Socially Beneficial and Legitimate Business Exemptions in Canadian Privacy Law

Earlier this month Amanda Cutinha and I published a report, entitled “Minding Your Business: A Critical Analysis of the Collection of De-identified Mobility Data and Its Use Under Socially Beneficial and Legitimate Business Exemptions in Canadian Privacy Law.” In it, we examine how the Government of Canada obtained and used mobility data over the course of the COVID-19 pandemic, and use that recent history to analyse and critique the Consumer Privacy Protection Act (CPPA).

The report provides a detailed summary of how mobility information was collected as well as a legal analysis of why the collection and use of this information likely conformed with the Privacy Act as well as the Personal Information Protection and Electronic Documents Act (PIPEDA). We use this conformity to highlight a series of latent governance challenges in PIPEDA, namely:

  1. PIPEDA fails to adequately protect the privacy interests at stake with de-identified and aggregated data despite risks that are associated with re-identification.
  2. PIPEDA lacks requirements that individuals be informed of how their data is de-identified or used for secondary purposes.
  3. PIPEDA does not enable individuals or communities to substantively prevent harmful impacts of data sharing with the government.
  4. PIPEDA lacks sufficient checks and balances to ensure that meaningful consent is obtained to collect, use, or disclose de-identified data.
  5. PIPEDA does not account for Indigenous data sovereignty nor does it account for Indigenous sovereignty principles in the United Nations Declaration on the Rights of Indigenous Peoples, which has been adopted by Canada.
  6. PIPEDA generally lacks sufficient enforcement mechanisms.
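The re-identification risk flagged in the first challenge above can be illustrated with a toy sketch. All of the neighbourhood names and counts below are invented for illustration; the point is simply that "de-identified" aggregates with small cell counts can still single out an individual to anyone with auxiliary knowledge.

```python
# Invented example: aggregated, "de-identified" counts of devices that
# travelled between two neighbourhoods, bucketed by hour of day. When a
# bucket's count is very small, someone who knows, say, that only one
# person commutes on that route at that hour can re-identify them.

aggregate_counts = {
    ("NeighbourhoodA", "NeighbourhoodB", "08:00"): 412,  # safely anonymous in the crowd
    ("NeighbourhoodA", "NeighbourhoodB", "03:00"): 1,    # singles out one device
}

# Flag buckets below a minimum-count threshold (a common disclosure-control rule)
risky_buckets = [bucket for bucket, n in aggregate_counts.items() if n < 5]
print(risky_buckets)  # [('NeighbourhoodA', 'NeighbourhoodB', '03:00')]
```

This is why disclosure-control regimes typically suppress or coarsen small cells before release; aggregation alone does not guarantee anonymity.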

We then leverage these governance challenges to analyse the CPPA and to suggest amendments to it. Our report’s 19 amendments would affect:

  1. Governance of de-identified data
  2. Enhancing knowledge and consent requirements surrounding the socially beneficial purposes exemption and legitimate interest exemption
  3. Meaningful consent for secondary uses
  4. Indigenous sovereignty
  5. Enforcement mechanisms
  6. Accessibility and corporate transparency

While we frankly believe that the legislation should be withdrawn and re-drafted with human rights as its guiding principle, we also recognise that this is unlikely to happen. As such, our amendments are meant to round off some of the legislation’s sharp edges, though further amendments to other parts of the legislation are likely required as well.

Ultimately, if the Government of Canada is truly serious about ensuring that individuals and communities are involved in developing policies that affect them and their communities, ameliorating disadvantages faced by marginalized residents of Canada, and committing to reconciliation with Indigenous populations, it will commit to serious amendments of Bill C-27 and the CPPA. Our recommendations are made in the spirit of addressing the gaps in this new legislation that are laid bare when assessing how it intersects with Health Canada’s historical use of locational information. They are, however, only a start toward the necessary amendments for this legislation.

Executive Summary

The Government of Canada obtained de-identified and aggregated mobility data from private companies for the socially beneficial purpose of trying to understand and combat the spread of COVID-19. This collection began as early as March 2020, and the information was provided by Telus and BlueDot. It wasn’t until December 2021, after the government issued a request for proposals for cellular tower information that would extend the collection of mobility information, that the public became widely aware of the practice. Parliamentary meetings into the government’s collection of mobility data began shortly thereafter, and a key finding was that Canada’s existing privacy legislation is largely ineffective in managing the collection, use, and disclosure of data in a manner that recognizes the privacy rights of individuals. In spite of this finding, in June 2022 the federal government introduced Bill C-27: An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts, which, if passed into law, will fail to correct existing deficiencies in Canada’s federal commercial privacy law. In particular, Bill C-27 would make explicit that the government can continue collecting information, including mobility data, from private organizations so long as the uses are socially beneficial, without clearly demarcating what will or will not constitute such uses in the future.

This report, “Minding Your Business: A Critical Analysis of the Collection of De-identified Mobility Data and Its Use Under the Socially Beneficial and Legitimate Interest Exemptions in Canadian Privacy Law,” critically assesses the government’s existing practice of collecting mobility information for socially beneficial purposes as well as private organizations’ ability to collect and use personal information without first obtaining consent from individuals or providing them with knowledge of the commercial activities. It uses examples raised during the COVID-19 pandemic to propose 19 legislative amendments to Bill C-27. These amendments would enhance corporate and government accountability for the collection, use, and disclosure of information about Canadian residents and communities, including for so-called de-identified information.

Part 1 provides a background of key privacy issues that were linked to collecting mobility data during the COVID-19 pandemic. We pay specific attention to the implementation of new technologies to collect, use, and disclose data, such as those used for contact-tracing applications and those that foreign governments used to collect mobility information from telecommunications carriers. We also attend to the concerns that are linked to collecting location information and why there is a consequent need to develop robust governance frameworks.

Part 2 focuses on the collection of mobility data in Canada. It outlines what is presently known about how Telus and BlueDot collected the mobility information that was subsequently disclosed to the government in aggregated and de-identified formats, and it discusses the key concerns raised in meetings held by the Standing Committee on Access to Information, Privacy and Ethics. The Committee’s meetings and final report make clear that there was an absence of appropriate public communication from the federal government about its collection of mobility information as well as a failure to meaningfully consult with the Office of the Privacy Commissioner of Canada. The Government of Canada also failed to verify that Telus and BlueDot had obtained meaningful consent prior to receiving data that was used to generate insights into Canadian residents’ activities during the pandemic.

Part 3 explores the lawfulness of the collection of mobility data by BlueDot and Telus and the disclosure of the data to the Public Health Agency of Canada under existing federal privacy law. Overall, we find that BlueDot and Telus likely complied with current privacy legislation. The assessment of the lawfulness of BlueDot and Telus’ activities serves to reveal deficiencies in Canada’s two pieces of federal privacy legislation, the Privacy Act and the Personal Information Protection and Electronic Documents Act (PIPEDA).

In Part 4, we identify six thematic deficiencies in Canada’s commercial privacy legislation:

  1. PIPEDA fails to adequately protect the privacy interests at stake with de-identified and aggregated data despite risks that are associated with re-identification.
  2. PIPEDA lacks requirements that individuals be informed of how their data is de-identified or used for secondary purposes.
  3. PIPEDA does not enable individuals or communities to substantively prevent harmful impacts of data sharing with the government.
  4. PIPEDA lacks sufficient checks and balances to ensure that meaningful consent is obtained to collect, use, or disclose de-identified data.
  5. PIPEDA does not account for Indigenous data sovereignty nor does it account for Indigenous sovereignty principles in the United Nations Declaration on the Rights of Indigenous Peoples, which has been adopted by Canada.
  6. PIPEDA generally lacks sufficient enforcement mechanisms.

The Government of Canada has introduced the Consumer Privacy Protection Act (CPPA) in Bill C-27 to replace PIPEDA. Part 5 demonstrates that Bill C-27 does not adequately ameliorate the deficiencies of PIPEDA as discussed in Part 4. Throughout, Part 5 offers corrective recommendations to the Consumer Privacy Protection Act that would alleviate many of the thematic issues facing PIPEDA and, by extension, the CPPA.

The federal government and private organizations envision the Consumer Privacy Protection Act as permitting private individuals’ and communities’ data to be exploited for the benefit of the economy and society alike. The legislation includes exceptions to consent and sometimes waives the protections that would normally be associated with de-identified data, where such exemptions could advance socially beneficial purposes or legitimate business interests. While neither the government nor private business necessarily intend to use de-identified information to injure, endanger, or negatively affect the persons and communities from whom the data is obtained, the breadth of potential socially beneficial purposes means that future governments will have a wide ambit to define the conceptual and practical meaning of these purposes. Some governments, as an example, might analyze de-identified data to assess how far people must travel to obtain abortion-care services and, subsequently, recognize that more services are required. Other governments could use the same de-identified mobility data and come to the opposite conclusion and selectively adopt policies to impair access to such services. This is but one of many examples. There are similar, though not identical, dangers that may arise should private organizations be able to collect or use an individual’s personal information without their consent under the legitimate interest exemption in the CPPA. Specifically, this exemption would let private organizations determine whether the collection or use of personal information outweighs the adverse effects of doing so, with the individuals and communities affected being left unaware of how personal information was collected or used, and thus unable to oppose collections or uses with which they disagree.

Parliamentary committees, the Office of the Privacy Commissioner of Canada, Canadian academics, and civil society organizations have all called for the federal government to amend federal privacy legislation. As presently drafted, however, the Consumer Privacy Protection Act would entrench deficiencies that already exist in Canadian law, open the door to expanded data collection, use, and disclosure by private organizations to the federal government without sufficient accountability or transparency safeguards, and simultaneously empower private organizations to collect and use personal information without prior consent or knowledge. Such safeguards must be added through legislative amendments or Canada’s new privacy legislation will continue the trend of inadequately protecting individuals and communities from the adverse effects of using de-identified data to advance so-called socially beneficial purposes or using personal information for ostensibly legitimate business purposes.

Public and Privacy Policy Implications of PHAC’s Use of Mobility Information

Last week I appeared before the House of Commons’ Standing Committee on Access to Information, Privacy and Ethics to testify about the public and privacy policy implications of PHAC’s use of mobility information since March 2020. My oral comments to the committee were, substantially, a truncated version of the brief I submitted; if interested, they are available to download. What follows in this post is the content of the brief as submitted.

Introduction

  1. I am a senior research associate at the Citizen Lab, Munk School of Global Affairs & Public Policy at the University of Toronto. My research explores the intersection of law, policy, and technology, and focuses on issues of national security, data security, and data privacy. While I submit these comments in a professional capacity they do not necessarily represent the full views of the Citizen Lab.

Canadian Government’s Pandemic Data Collection Reveals Serious Privacy, Transparency, and Accountability Deficits

Photo by Keira Burton on Pexels.com

Just before Christmas, Swikar Oli published an article in the National Post that discussed how the Public Health Agency of Canada (PHAC) obtained aggregated and anonymized mobility data for 33 million Canadians. From the story, we learn that the contract was awarded in March to TELUS, and that PHAC used the mobility data to “understand possible links between movement of populations within Canada and spread of COVID-19.”

Around the same time as the article was published, PHAC posted a notice of tender to continue collecting aggregated and anonymized mobility data that is associated with Canadian cellular devices. The contract would remain in place for several years and be used to continue providing mobility-related intelligence to PHAC.

Separate from either of these means of collecting data, PHAC has also been purchasing mobility data “… from companies who specialize in producing anonymized and aggregated mobility data based on location-based services that are embedded into various third-party apps on personal devices.” There has also been little discussion of PHAC’s collection and use of data from these kinds of third parties, which tend to be advertising and data surveillance companies that consumers have no idea are collecting, repackaging, and monetizing their personal information.

There are, at first glance, at least four major issues that arise out of how PHAC has obtained and can use the aggregated and anonymized information to which they have had, and plan to have, access.


The Limits of Tower Dump Privacy Protections in Canada

On January 14, 2016, the Ontario Superior Court ruled that “tower dumps” – the mass release of data collected by cellphone towers at the request of law enforcement agencies – violate privacy rights under the Canadian Charter of Rights and Freedoms. In response, Justice Sproat outlined a series of guidelines for authorities to adhere to when requesting tower dump warrants in the future.

I wrote about this case for PEN Canada. I began by summarizing the issues in the case and then outlined some of the highlights of Justice Sproat’s decision. The conclusion of the article focuses on the limits of that decision: it does not mandate statutory reporting of tower dumps, so Canadians will not learn how often such requests are made; it does not require notifying those affected by tower dumps; and it does not ensure Canadians will know if data collected in a tower dump is used in a subsequent proceeding against them. Finally, the guidelines are not precedent-setting and so do not represent binding obligations on authorities requesting the relevant production orders.

Read the Article [NOTE: PEN Canada website no longer contains this article — see it, below]


The Limits of Tower Dump Privacy Protections

By Christopher Parsons

On January 14, 2016, the Ontario Superior Court ruled that “tower dumps” – the mass release of data collected by cellphone towers at the request of law enforcement agencies – violate privacy rights under the Canadian Charter of Rights and Freedoms. Christopher Parsons is a postdoctoral fellow and managing director of the telecom transparency project at Citizen Lab, Munk School of Global Affairs, at the University of Toronto. Read on for his break-down of this decision and its limits.


When you travel with your mobile phone, it routinely — often a few times a second — communicates with neighbouring cellular towers so that it can send and receive communications. Each such communication geolocates the mobile device and sends unique identifying information.

Authorities use production orders to compel telecommunications companies to disclose retained data related to mobile towers. Data from these so-called ‘tower dump warrants’ can be used to identify persons suspected of committing a crime. But they can also result in significant infringements of Canadians’ privacy because of the sheer volume of information that can be disclosed, which includes affected persons’ subscriber information and billing records. It was exactly this issue of overbreadth that led TELUS and Rogers to challenge a tower dump order covering an aggregate total of 43,000 persons’ information. The challenge was finally decided in January of 2016.[1]

Decision Highlights

Justice Sproat declared that the Peel Regional Police’s production orders “authorized unreasonable searches and so breached the s. 8 Charter rights of the Rogers and Telus subscribers.” He also outlined the following guidelines for authorities to adhere to when requesting tower dump warrants in the future:

  1. Provide a statement or explanation that demonstrates the officer seeking the order is aware of the principles of incrementalism and minimal intrusion, and tailored the requested order with that in mind.
  2. Explain why all the named locations or cell towers, and all the requested date and time parameters, are relevant to the investigation.
  3. Explain why all the types of records sought are relevant.
  4. Identify details or parameters which could be used to target the production order to conduct narrower searches and produce fewer records.
  5. Request a report based on the specific data instead of requesting the underlying data itself.
  6. Justify any requests for underlying data, when it is requested.
  7. Confirm that the types and amounts of data being requested can be meaningfully reviewed.

Justice Sproat declined to prohibit authorities from requesting ‘large’ amounts of data, on the basis that authorities and the authorizing judge alike may be uncertain about how much data an investigation will require. He also declined to offer guidelines addressing how long authorities could retain data provided by telecommunications companies; legislatures, not courts, had to make that decision. Moreover, he maintained that legislatures, not courts, had to determine whether tower dumps should be ‘last resort’ investigative techniques.

Importantly, the guidance Justice Sproat provided does not set precedent. As such, the guidelines are not binding obligations on authorities requesting production orders.

Sproat’s Limitations

The decision may limit authorities’ requests for Canadians’ personal information. Such narrowed targeting will constitute a victory for Canadians and their privacy interests.

The decision and guidelines will not improve Canadians’ understanding of how often such requests are actually made. Authorities needn’t publicly report on how often, or to what effect, tower dump orders are useful for investigating or resolving criminal incidents. Moreover, those affected by tower dumps will not be notified of their data being collected by authorities unless charged with a crime. And finally, Canadians will not know if their data is used, later, for purposes unrelated to the original tower dump investigation: the unique identifiers and billing information might, as an example, be subsequently used to identify persons later detected at public events or protests by combining newly collected surveillance data with that previously disclosed by telecommunications providers.
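The downstream-use risk described above can be sketched in a few lines of Python. All of the identifiers and names below are invented for illustration; the point is that once a tower dump ties unique device identifiers to subscriber records, any later capture of the same identifiers can be joined back to named individuals.

```python
# Hypothetical sketch: data disclosed under an original tower dump order
# maps device identifiers (e.g. IMSIs) to subscriber information. A later,
# unrelated capture of identifiers — say, near a protest — can be joined
# against that earlier disclosure to re-identify attendees.

tower_dump = {  # identifier -> subscriber info from the original disclosure
    "imsi-001": {"name": "Subscriber A", "billing_city": "Brampton"},
    "imsi-002": {"name": "Subscriber B", "billing_city": "Mississauga"},
}

# Identifiers captured later at an unrelated public event
later_capture = ["imsi-002", "imsi-999"]

# A simple lookup re-identifies anyone present in both datasets
identified = [tower_dump[i]["name"] for i in later_capture if i in tower_dump]
print(identified)  # ['Subscriber B']
```

Nothing about this join is technically sophisticated, which is precisely why retention and secondary-use limits matter more than the difficulty of the analysis.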

So while Canadians have enjoyed a significant victory concerning their privacy rights, they are no more aware of actually being affected by such requests unless charged with a crime. And this data might ultimately be used against them in subsequent investigations or government surveillance. Consequently, Canadians are still left to trust, without being able to verify, that their personal information is being accessed and retained appropriately by authorities. This privacy victory, in other words, has not come with an ounce of real transparency for the public at large.

Citations

[1] R. v. Rogers Communications, 2016 ONSC 70.

Photo credit: cell tower next to the casita by dasroofless (CC BY-NC-ND 2.0) https://flic.kr/p/rGxgj

UVic, Google, and Trust Deficits

In the wake of a stunning data breach, the University of Victoria campus community could only hope that the institution would do everything it could to regain lost trust. One such opportunity arose this week, when controversial Google Streetview vehicles were scheduled to canvass the campus. Unfortunately the opportunity was squandered: it is largely by accident that the campus community has learned, or will learn, that Google is capturing images and wireless access point information.

In this short post I want to discuss how seriously the University failed in not disclosing Google’s surveillance of the campus. I begin by providing a quick overview of Streetview’s privacy controversies. I then describe the serious data breach that UVic suffered earlier this year, which has left the institution with a significant trust deficit. A discussion of the institution’s failure to disclose Google’s presence to the community, and of its attempts to chill speech around Google’s presence, follows. I conclude by suggesting how institutions can learn from UVic’s failures and disclose the presence of controversial, potentially privacy-invasive actors in order to rebuild flagging trust.

Google Streetview and Privacy

Streetview has been a controversial product since its inception. There were serious concerns when it captured images of people in sensitive places or engaged in indiscreet acts. Initially, the company offered only a cumbersome process for individuals to remove images from the Google Streetview database; this process has since been replaced with an option to blur sensitive information. Various jurisdictions have challenged Google’s conceptual and legal argument that taking images of public spaces with a Streetview vehicle is equivalent to a tourist taking pictures in a public space.
