The Policy and Political Implications of ‘Securing Canada’s Telecommunications Systems’

Photo by Troy Squillaci on Pexels.com

Many of Canada’s closest allies have either firmly or softly blocked Huawei and ZTE from selling telecommunications equipment to Internet service providers in their countries over the past several years. After repeated statements from Canadian government officials that a review of Huawei equipment was ongoing, on May 19, 2022, the government announced its own bans on Huawei and ZTE equipment. The government published an accompanying policy statement from Innovation, Science, and Economic Development (ISED) Canada on the same day.

This post begins by summarizing the possible risks that Chinese vendors might pose to Canadian networks. Next, it discusses the current positions of Canada’s closest allies, as well as Canada’s actions and statements pertaining to Chinese telecommunications vendors leading up to the May 2022 announcement. It then unpacks the government’s “Securing Canada’s Telecommunications System” policy statement. Key findings include:

  • The government is unclear when it refers to “supply chain breaches”;
  • The government may be banning Huawei and ZTE principally on the basis of American export restrictions placed on Chinese vendors and, thus, be following the same model as the United Kingdom, which was forced to ban Huawei in the wake of American actions; and
  • Establishing the security and protection of telecommunications systems as an “overriding objective” of Canadian telecommunications policy could have long-term implications for Canadians’ privacy interests.

The post concludes by discussing the policy and political implications of the policy statement, why any telecommunications security reforms must not be accompanied by broader national security and law enforcement reforms, and why the Canadian government should work with allied and friendly countries to collectively assess telecommunications equipment.

Continue reading

Findings and Absences in Canada’s (Draft) International Cybersecurity Strategy

Photo by Andre Furtado on Pexels.com

For several years there have been repeated calls by academics and other experts for the Government of Canada to develop and publish a foreign policy strategy. There have also been recent warnings about the implications of lacking such a strategy. Broadly, a foreign policy strategy is needed for Canada to promote and defend its interests effectively.

Not only has the Government of Canada failed to produce a foreign policy strategy, it has also failed to produce even a more limited strategy that expresses how Canada will develop or implement the cyber dimensions of its foreign policy. The government itself has been aware of the need to develop a cyber foreign policy since at least 2010.1

As I have previously written with colleagues, an articulation of such a cybersecurity strategy is necessary because it is “inherently a discussion of political philosophy; not all actors share the same understanding of what is, or should be, the object of security, nor is there necessarily a shared understanding of what constitutes a threat.” To clearly and explicitly assert its underlying political values, Canada needs to produce a coherent and holistic cyber foreign policy strategy.

On May 18, 2021, the Chief of the Communications Security Establishment, Shelly Bruce, stated that Global Affairs Canada (GAC) was leading the development of “Canada’s International Cybersecurity Strategy and our Diplomacy Initiative.” I subsequently filed an ATIP request for it and received the relevant documents on March 31, 2022.2 GAC’s response included successive drafts of “Canada’s International Cybersecurity Strategy and our Diplomacy Initiative” (hereafter the ‘Strategy’ or ‘CICSDI’) from January 2021 to May 2021.

Some of my key findings from the CICSDI include:

  1. The May 2021 draft links the scope of the Strategy to order and prosperity as opposed to advancing human rights or Canadian values.
  2. The May 2021 draft struck language that Canadians and Canadian organisations “should not be expected to independently defend themselves against state or state-backed actors. There are steps only government can take to reduce cyber threats from state actors”. The effect may be to reduce the explicit expectation or requirement that government organisations assist in mitigating nation-state operations against private individuals and organisations.
  3. The May 2021 draft struck language that GAC would create a cyber stakeholder engagement action plan as well as language that GAC would leverage its expertise to assist other government departments and agencies on engagement priorities and to coordinate international outreach.
  4. None of the drafts include explicit reference to pressing international issues, including: availability of strong encryption, proliferation of cyber mercenaries, availability and use of dual-use technologies, online harms and disinformation, authoritarian governments’ attempts to lead and influence standards bodies, establishing a unit in GAC dealing with cyber issues that would be equivalent to the US State Department’s Bureau of Cyberspace and Digital Policy, or cyber operations and international law.
  5. None of the drafts make a positive case for what would constitute an appropriate or responsible use of malware in cyber operations.

In this post I summarise the highlights of the drafts of the Strategy and then point to larger language and/or policy shifts across successive drafts of the CICSDI. I conclude by discussing some policy issues that were not mentioned in the drafts I obtained. While the draft has never been promulgated, and consequently does not formally represent Canada’s foreign cybersecurity strategy, it does show how GAC and the government more broadly conceptualised elements of such a strategy as of early- to mid-2021.

Continue reading

Public and Privacy Policy Implications of PHAC’s Use of Mobility Information

Last week I appeared before the House of Commons’ Standing Committee on Access to Information, Privacy, and Ethics to testify about the public and privacy policy implications of PHAC’s use of mobility information since March 2020. The oral comments I provided to the committee were, substantially, a truncated version of the brief I submitted, and are available to download. What follows in this post is the content of the brief as it was submitted.

Introduction

  1. I am a senior research associate at the Citizen Lab, Munk School of Global Affairs & Public Policy at the University of Toronto. My research explores the intersection of law, policy, and technology, and focuses on issues of national security, data security, and data privacy. While I submit these comments in a professional capacity, they do not necessarily represent the full views of the Citizen Lab.
Continue reading

Lawful Access Returns: Online Harms and Warrantless Access to Subscriber and Transmission Data

Photo by cottonbro on Pexels.com

For the better part of twenty years, law enforcement agencies in Canada have sought warrantless access to subscriber data that is held by telecommunications service providers and other organizations. The rationale has been that some baseline digital identifiers are needed to open investigations into alleged harms or criminal activities that have a digital nexus. Only once these identifiers are in hand can an investigation bloom. However, due to the time that it takes to obtain a relevant court order, as well as challenges in satisfying a judge or justice that there is a legitimate need to obtain these identifiers in the first place, these same agencies recurrently assert that an initial set of seed digital identifiers should be disclosed to officers absent a court order.

The Government of Canada has, once more, raised the prospect of law enforcement officers obtaining subscriber or transmission data without a warrant when undertaking activities intended to “enhance efforts to curb child pornography.” This time, the argument that such information should be made available arises in the context of combatting online harms. The government has heard that companies should include basic subscriber or transmission data in their child pornography-related reports to law enforcement, with the effect that law enforcement agencies would avoid the need to obtain a warrant prior to receiving this information.

In this post I start by discussing the context in which this proposal to obtain information without a warrant has been raised, as well as why subscriber and transmission data can be deeply revelatory. With that out of the way, I outline a series of challenges that government agencies regularly experience but which tend not to be publicly acknowledged in the warrantless access debates associated with child sexual abuse material (CSAM). Only with this broader context, and an awareness of the challenges facing government agencies, does it become apparent that warrantless access to subscriber or transmission data cannot ‘solve’ the issues faced by agencies responsible for investigating CSAM offences. To develop appropriate policy, then, we must begin by acknowledging all of the current obstacles to investigating these offences; only then can we hope to craft proportionate solutions.

Continue reading

Canadian Government’s Pandemic Data Collection Reveals Serious Privacy, Transparency, and Accountability Deficits

Photo by Keira Burton on Pexels.com

Just before Christmas, Swikar Oli published an article in the National Post that discussed how the Public Health Agency of Canada (PHAC) obtained aggregated and anonymized mobility data for 33 million Canadians. From the story, we learn that the contract was awarded in March to TELUS, and that PHAC used the mobility data to “understand possible links between movement of populations within Canada and spread of COVID-19.”

Around the same time as the article was published, PHAC posted a notice of tender to continue collecting aggregated and anonymized mobility data that is associated with Canadian cellular devices. The contract would remain in place for several years and be used to continue providing mobility-related intelligence to PHAC.

Separate from either of these means of collecting data, PHAC has also been purchasing mobility data “… from companies who specialize in producing anonymized and aggregated mobility data based on location-based services that are embedded into various third-party apps on personal devices.” There has also been little discussion of PHAC’s collection and use of data from these kinds of third parties, which tend to be advertising and data surveillance companies that consumers have no idea are collecting, repackaging, and monetizing their personal information.
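
For readers unfamiliar with the terminology, the sketch below illustrates one way raw location pings can be turned into “aggregated and anonymized” mobility counts: distinct devices are counted per region and time window, and groups below a minimum size are suppressed. Neither TELUS nor the third-party brokers have published their actual methods, so this is only a hedged illustration of the general idea, using entirely hypothetical data; it is not a description of what PHAC received.

```python
# Illustrative sketch only: one simple way to aggregate and anonymize
# mobility data. The raw records, region names, and minimum-group-size
# rule are hypothetical; real providers' methods are not public.
from collections import defaultdict

# Hypothetical raw pings: (device_id, region, hour_of_day)
raw_pings = [
    ("device-001", "Ottawa", 9),
    ("device-002", "Ottawa", 9),
    ("device-003", "Ottawa", 9),
    ("device-004", "Toronto", 9),
]

MIN_GROUP_SIZE = 3  # suppress groups small enough to risk re-identification

def aggregate(pings, k=MIN_GROUP_SIZE):
    """Count distinct devices per (region, hour) and drop groups smaller than k."""
    groups = defaultdict(set)
    for device_id, region, hour in pings:
        groups[(region, hour)].add(device_id)
    return {key: len(devices) for key, devices in groups.items() if len(devices) >= k}

print(aggregate(raw_pings))
# {('Ottawa', 9): 3} -- the Toronto group is suppressed because it is too small
```

The minimum-group-size rule here is simply a stand-in for the kinds of thresholding that providers gesture at when they describe data as “aggregated and anonymized”; it is not a claim about the specific safeguards that TELUS or the data brokers applied.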

There are, at first glance, at least four major issues that arise out of how PHAC has obtained, and can use, the aggregated and anonymized information to which it has had, and plans to have, access.

Continue reading

The Problems and Complications of Apple Monitoring for Child Sexual Abuse Material in iCloud Photos

Photo by Mateusz Dach on Pexels.com

On August 5, 2021, Apple announced that it would soon begin conducting pervasive surveillance of the devices that it sells, with the stated intent of expanding protections for children. The company announced three new features. The first will monitor for children sending or receiving sexually explicit images using the Messages application. The second will monitor for the presence of Child Sexual Abuse Material (CSAM) in iCloud Photos. The third will monitor for searches pertaining to CSAM. These features are planned to be activated in the United States in the next versions of Apple’s operating systems, which will ship to end-users in the fall of 2021.

In this post I focus exclusively on the surveillance of iCloud Photos for CSAM content. I begin with background on Apple’s efforts to monitor for CSAM content on its services before describing the newly announced CSAM surveillance system. I then outline some problems, complications, and concerns with this new child safety feature. In particular, I discuss the challenges facing Apple in finding reputable child safety organizations with whom to partner, the potential ability to region-shift to avoid the surveillance, the prospect of the surveillance system leading to ongoing harms towards CSAM survivors, the likelihood that Apple will expand the content which is subject to the company’s surveillance infrastructure, and the weaponization of the CSAM surveillance infrastructure against journalists, human rights defenders, lawyers, opposition politicians, and political dissidents. I conclude with a broader discussion of the problems associated with Apple’s new CSAM surveillance infrastructure.

A previous post focused on the surveillance of children’s messages to monitor for sexually explicit photos. Future posts will address the third child safety feature that Apple has announced, as well as the broader implications of Apple’s child safety initiatives.

Background to Apple Monitoring for CSAM

Apple has previously worked with law enforcement agencies to combat CSAM, though the full contours of that assistance are largely hidden from the public. In May 2019, Mac Observer noted that the company had modified its privacy policy to read, “[w]e may also use your personal information for account and network security purposes, including in order to protect our services for the benefit of all our users, and pre-screening or scanning uploaded content for potentially illegal content, including child sexual exploitation material” (emphasis not in original). Per Forbes, Apple places email messages under surveillance when they are routed through its systems. Mail is scanned and, if CSAM content is detected, Apple automatically prevents the email from reaching its recipient and assigns an employee to confirm the CSAM content of the message. If the employee confirms the existence of CSAM content, the company subsequently provides subscriber information to the National Center for Missing and Exploited Children (NCMEC) or a relevant government agency.1
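
The general flow described above (scan an item, compare it against known CSAM fingerprints, hold matches for human review, then report confirmed matches along with subscriber information) can be illustrated with a short sketch. To be clear, this is not Apple’s implementation, which is not public: production systems rely on perceptual hashes such as PhotoDNA rather than the cryptographic hash used here for simplicity, and every class, function, and identifier below is hypothetical.

```python
# Hedged, illustrative sketch of a hash-matching scanning pipeline.
# Not Apple's system; names and data structures are hypothetical.
import hashlib
from dataclasses import dataclass, field

@dataclass
class ScanResult:
    blocked: bool
    needs_human_review: bool = False

@dataclass
class MailScanner:
    known_bad_hashes: set          # fingerprints supplied by a child-safety organisation
    review_queue: list = field(default_factory=list)

    def scan_attachment(self, attachment: bytes) -> ScanResult:
        """Hash the attachment and hold it for human review if it matches a known fingerprint."""
        digest = hashlib.sha256(attachment).hexdigest()
        if digest in self.known_bad_hashes:
            # Withhold delivery and queue the item for a human reviewer.
            self.review_queue.append(attachment)
            return ScanResult(blocked=True, needs_human_review=True)
        return ScanResult(blocked=False)

    def confirm_and_report(self, attachment: bytes, subscriber_id: str) -> dict:
        """After a reviewer confirms the match, assemble a report for the reporting body (e.g., NCMEC)."""
        return {
            "subscriber": subscriber_id,
            "content_hash": hashlib.sha256(attachment).hexdigest(),
            "destination": "NCMEC",
        }

# Example: a message whose attachment matches a known fingerprint is blocked
# and queued for review rather than being delivered.
scanner = MailScanner(known_bad_hashes={hashlib.sha256(b"known-bad-example").hexdigest()})
print(scanner.scan_attachment(b"known-bad-example"))  # ScanResult(blocked=True, needs_human_review=True)
```

The sketch is only meant to show where the automated match, the human confirmation, and the report sit relative to one another in the process Forbes describes; it says nothing about how Apple actually detects matches.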

Continue reading