Photo by Marco Verch (CC BY 2.0) https://flic.kr/p/RjMXMP
The Government of Canada has historically opposed the calls of its western allies to undermine the encryption protocols and associated applications that secure Canadians’ communications and devices from criminal and illicit activities. In particular, over the past two years the Minister of Public Safety, Ralph Goodale, has communicated to Canada’s Five Eyes allies that Canada will neither adopt nor advance an irresponsible encryption policy that would compel private companies to deliberately inject weaknesses into cryptographic algorithms or the applications that facilitate encrypted communications. This year, however, the tide may have turned, with the Minister apparently deciding to adopt the very irresponsible encryption policy position he had previously steadfastly opposed. To be clear, should the Government of Canada, along with its allies, compel private companies to deliberately sabotage strong and robust encryption protocols and systems, then basic rights and freedoms, cybersecurity, economic development, and foreign policy goals will all be jeopardized.
This article begins by briefly outlining the history and recent developments in the Canadian government’s thinking about strong encryption. Next, the article showcases how government agencies have failed to produce reliable information which supports the Minister’s position that encryption is significantly contributing to public safety risks. After outlining the government’s deficient rationales for calling for the weakening of strong encryption, the article shifts to discuss the rights which are enabled and secured as private companies integrate strong encryption into their devices and services, as well as why deliberately weakening encryption will lead to a series of deeply problematic policy outcomes. The article concludes by summarizing why it is important that the Canadian government walk back from its newly adopted irresponsible encryption policy.
Last week I appeared before the Standing Committee on Public Safety and National Security (SECU) to testify about cybersecurity in the financial sector as a national economic security issue. I provided oral comments to the committee which were, substantially, a truncated version of the brief I submitted. If so interested, my oral comments are available to download, and what follows in this post is the actual brief which was submitted.
I am a research associate at the Citizen Lab, Munk School of Global Affairs & Public Policy at the University of Toronto. My research explores the intersection of law, policy, and technology, with a focus on national security, data security, and data privacy issues. I submit these comments in a professional capacity representing my views and those of the Citizen Lab.
The State of Computer Insecurity
Canadian government agencies, private businesses and financial institutions, and private individuals rely on common computing infrastructures. Apple iPhones and Android-based devices are used for professional and private life alike, as are Microsoft Windows and macOS. Vulnerabilities in such mobile and personal computing operating systems can prospectively be leveraged to obtain access to data on the targeted devices themselves, or utilized to move laterally in networked computing environments for reconnaissance, espionage, or attack purposes. Such threats are accentuated in a world where individuals routinely bring their own devices to the workplace, raising the prospect that personal devices can be compromised to obtain access to more securitized professional environments.
The applications that we rely on to carry out business, similarly, tend to be used across the economy. Vulnerabilities in customer service applications, such as mobile banking applications, affect all classes of businesses, government departments, and private individuals. Also, underlying many of our commonly used programs are shared libraries, application programming interfaces (API), and random number generators (RNG); vulnerabilities in such codebases are shared by all applications incorporating these pieces of code, thus prospectively endangering dozens, hundreds, or thousands of applications and systems. This sharing of software between the public and private sectors, and between professional and private life, is becoming more common with the growth of common messaging, database, and storage systems, and will only become more routine over time.
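To illustrate why a flaw in a shared RNG endangers every application that incorporates it, consider a hypothetical application that derives “secret” keys from a non-cryptographic generator seeded with a guessable value (such as a timestamp). This sketch, using only Python’s standard library, shows how an attacker can recover such a key by brute-forcing candidate seeds; the function names and seed values are illustrative assumptions:

```python
import random
import secrets

def weak_key(seed: int) -> bytes:
    """A hypothetical flawed key generator: a non-cryptographic RNG
    (Mersenne Twister) seeded with a guessable value, e.g. a timestamp."""
    rng = random.Random(seed)
    return bytes(rng.getrandbits(8) for _ in range(16))

def strong_key() -> bytes:
    """The sound alternative: 16 bytes from the OS's CSPRNG."""
    return secrets.token_bytes(16)

# Attacker model: the victim seeded the RNG with a timestamp somewhere
# in a known one-thousand-second window.
victim_key = weak_key(1_700_000_123)
recovered = next(
    weak_key(s)
    for s in range(1_700_000_000, 1_700_001_000)
    if weak_key(s) == victim_key
)
assert recovered == victim_key      # the weak key is trivially recovered
assert strong_key() != strong_key() # CSPRNG output is not reproducible
```

Every application linking the flawed generator inherits the same weakness, which is why a single vulnerable shared library can endanger thousands of systems at once.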
Furthermore, all sectors of the economy are increasingly reliant on third-party cloud computing services to process, retain, and analyze data which is essential to business and government operations, as well as personal life. The servers powering these cloud computing infrastructures are routinely found to have serious vulnerabilities either in the code powering them or, alternately, as a result of insufficient isolation of virtual servers from one another. The result is that vulnerabilities or errors in setting up cloud infrastructures prospectively enable third-parties to inappropriately access, modify, or exfiltrate information.
In summary, the state of computer insecurity is profound. New vulnerabilities are discovered — and remediated — every day. Each week new and significant data breaches are reported on by major media outlets. And such breaches can be used for spearphishing — to obtain privileged access to information possessed by well-placed executives, employees, or other persons — for blackmail — as was threatened in the case of the Ashley Madison disclosures — or for other nefarious activities. Vulnerabilities affecting computer security, writ large, threaten the financial sector and all other sectors of the economy, with the potential for information to be abused to the detriment of Canada’s national security interests.
Responsible Encryption Policies
Given the state of computer (in)security, it is imperative that the Government of Canada adopt and advocate for responsible encryption policies. Such policies entail commitments to preserving the right of all groups in Canada — government, private enterprises, and private individuals — to use computer software using strong encryption. Strong encryption can be loosely defined as encryption algorithms for which no weakness or vulnerability is known or has been injected, as well as computer applications that do not deliberately contain weaknesses designed to undermine the effectiveness of the aforementioned algorithms.
There have been calls in Canada,1 and by law enforcement agencies in allied countries,2 to ‘backdoor’ or otherwise weaken the protections that encryption provides. Succumbing to such calls will fundamentally endanger the security of all users of the affected computer software3 and, more broadly, threaten the security of any financial transactions which rely upon the affected applications, encryption algorithms, or software libraries.
Some of Canada’s closest allies, such as Australia, have adopted irresponsible encryption policies which run the risk of introducing systemic vulnerabilities into the software used by the financial sector, as well as other elements of the economy and government functions.4 Once introduced, these vulnerabilities might be exploited by Australian intelligence, security, or law enforcement agencies in the course of their activities but, also, by actors holding adversarial interests towards Canada or the Canadian economy. Threat activities might be carried out against the SWIFT network, as just one example.5
It is important to note that even Canada’s closest allies monitor Canadian banking information, often in excess of agreed upon surveillance mechanisms such as FINTRAC. As one example, information which was publicly disclosed by the Globe and Mail revealed that the United States of America’s National Security Agency (NSA) was monitoring Royal Bank of Canada’s Virtual Private Network (VPN) tunnels. The story suggested that the NSA’s activities could be a preliminary step in broader efforts to “identify, study and, if deemed necessary, ‘exploit’ organizations’ internal communications networks.”6
Access to strong, uncompromised encryption technology is critical to the economy. In a technological environment marked by high financial stakes, deep interdependence, and extraordinary complexity, ensuring digital security is of critical importance and extremely difficult. Encryption helps to ensure the security of financial transactions and preserves public trust in the digital marketplace. The cost of a security breach, theft, or loss of customer or corporate data can have devastating impacts for private sector interests and individuals’ rights. Any weakening of the very systems that protect against these threats would represent irresponsible policymaking. Access to strong encryption encourages consumer confidence that the technology they use is safe.
Given the aforementioned threats, I recommend that the Government of Canada adopt a responsible encryption policy. Such a policy would entail a firm and perhaps legislative commitment to require that all sectors of the economy have access to strong encryption products, and would stand in opposition to irresponsible encryption policies, such as those calling for ‘backdoors’.
Vulnerabilities Equities Program
The Canadian government presently has a process in place, whereby the Communications Security Establishment (CSE) obtains computer vulnerabilities and ascertains whether to retain them or disclose them to private companies or software maintainers to remediate the vulnerabilities. The CSE is motivated to retain vulnerabilities to obtain access to foreign systems as part of its signals intelligence mandate and, also, to disclose certain vulnerabilities to better secure government systems. To date, the CSE has declined to make public the specific process by which it weighs the equities in retaining or disclosing these vulnerabilities.7 It remains unclear if other government agencies have their own equities processes. The Canadian government’s current policy stands in contrast to that of the United States of America, where the White House has published how all federal government agencies evaluate whether to retain or disclose the existence of a vulnerability.8
When agencies such as the CSE keep discovered vulnerabilities secret to later use them against specific targets, the unpatched vulnerabilities leave critical systems open to exploitation by other malicious actors who discover them. Vulnerability stockpiles kept by our agencies can be uncovered and used by adversaries. The NSA’s and Central Intelligence Agency’s (CIA) vulnerabilities have been leaked in recent years,9 with one of the NSA vulnerabilities used by malicious actors to cause at least $10B in commercial harm.10
As it stands, it is not clear what considerations guide Canada’s intelligence agencies’ decision-making process when they decide whether to keep a discovered vulnerability for future use or to disclose it so that it is fixed. There is also no indication that potentially impacted entities such as private companies or civil society organizations are involved in the decision-making process.
To reassure Canadian businesses, and to make evident that Canadian intelligence and security agencies are not retaining vulnerabilities which non-government actors could exploit to endanger Canada’s financial sector, I would recommend that the Government of Canada publicize its existing vulnerabilities equities program(s) and hold consultations on its effectiveness in protecting Canadian software and hardware that is used in the course of financial activities, amongst other economic activities.
Furthermore, I would recommend that the Government of Canada include the business community and civil society stakeholders in the existing, or reformed, vulnerabilities equities program. Such stakeholders would be able to identify the risks of retaining certain vulnerabilities for the Canadian economy, such as prospectively facilitating ransomware, data deletion, data modification, identity theft for commercial or espionage purposes, or data access and exfiltration to other nation-states’ advantage.
Vulnerability Disclosure Programs
Security researchers routinely discover vulnerabilities in systems and software that are used in all walks of life, including in the financial sector. Such vulnerabilities can, in some cases, be used to inappropriately obtain access to data, modify data, exfiltrate data, or otherwise tamper with computer systems in ways which are detrimental to the parties controlling the systems and associated computer information. Relatively few organizations, however, have explicit procedures that guide researchers in how to responsibly disclose such vulnerabilities to the affected companies. Disclosing vulnerabilities absent a disclosure program can lead companies to inappropriately threaten whitehat security researchers with litigation, and such threats reduce the willingness of researchers to disclose vulnerabilities absent a vulnerability disclosure program.11
Responsible disclosure of vulnerabilities typically involves the following. First, companies make clear to whom vulnerabilities can be reported, assure researchers they will not be legally threatened for disclosing vulnerabilities, and explain the approximate period of time the company will take to remediate a reported vulnerability. Second, researchers commit to not publicly disclosing the vulnerability until either a certain period of time (e.g. 30-90 days) has elapsed since the report, or until the vulnerability is patched, whichever occurs first. The delimitation of a time period before the vulnerability is publicly reported is designed to encourage companies to quickly remediate reported vulnerabilities, as opposed to waiting for excessive periods of time before doing so.
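The “whichever occurs first” rule above can be sketched as a small helper. This is a minimal illustration only — the 90-day window and the function names are illustrative assumptions, not drawn from any specific company’s disclosure policy:

```python
from datetime import date, timedelta
from typing import Optional

# Illustrative assumption: a 90-day disclosure window, in the middle of
# the 30-90 day range commonly seen in disclosure policies.
DISCLOSURE_WINDOW = timedelta(days=90)

def public_disclosure_date(reported: date, patched: Optional[date]) -> date:
    """Earliest date the researcher may publish: when the vendor ships a
    patch, or when the disclosure window lapses, whichever comes first."""
    deadline = reported + DISCLOSURE_WINDOW
    if patched is not None and patched < deadline:
        return patched
    return deadline

# Vendor patches quickly: publication may follow the patch.
assert public_disclosure_date(date(2019, 1, 1), date(2019, 2, 1)) == date(2019, 2, 1)
# Vendor never patches: the 90-day deadline governs.
assert public_disclosure_date(date(2019, 1, 1), None) == date(2019, 4, 1)
```

The fixed deadline is the incentive mechanism: a vendor that patches early controls the publication date, while a vendor that stalls does not.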
I would recommend that the Government of Canada undertake, first, to establish a draft policy that financial sector companies, along with other sector companies, could adopt and which would establish the terms under which computer security researchers could report vulnerabilities to financial sector companies. Such a disclosure policy should establish to whom vulnerabilities are reported, how reports are treated internally, how long it will take for a vulnerability to be remediated, and insulate the security researchers from legal liability so long as they do not publicly disclose the vulnerability ahead of the established delimited period of time.
I would also recommend that the Government of Canada ultimately move to mandate the adoption of vulnerability disclosure programs for its own departments given that they could be targeted by adversaries for the purposes of financially advantaging themselves to Canada’s detriment. Such policies have been adopted by the United States of America’s Department of Defense12 and explored by the State Department,13 to the effect of having hundreds of vulnerabilities reported and subsequently remediated. Encouraging persons to report vulnerabilities to the Government of Canada will reduce the likelihood that the government’s own infrastructures are successfully exploited to the detriment of Canada’s national interests.
Finally, I would recommend that our laws around unauthorized access be studied with an eye towards determining whether they are too broad in their chilling effect and impact on legitimate security research.
Two Factor Authentication Processes
Login and password pairs are routinely exfiltrated from private companies’ databases. Given that many individuals either use the same pair across multiple services (e.g. for social media as well as for professional accounts) and, also, that many passwords are trivially guessed, it is imperative that private companies’ online accounts incorporate two factor authentication (2FA). 2FA refers to a situation where an individual must be in possession of at least two ‘factors’ to obtain access to their accounts. The ‘factors’ most typically used for authentication include something that you know (e.g. a PIN or password), something you have (e.g. hardware token or random token generator), or something that you are (biometric, e.g. fingerprint or iris scan).14
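The “something you have” factor is most commonly implemented as a time-based one-time password (TOTP, RFC 6238), which builds on HMAC-based one-time passwords (HOTP, RFC 4226). The following minimal sketch uses only Python’s standard library; the helper names are illustrative, and a production implementation would also handle clock drift and rate limiting:

```python
import hmac
import hashlib
import struct
import time
from typing import Optional

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HOTP (RFC 4226): HMAC-SHA1 over an 8-byte counter, then
    dynamic truncation to a short decimal code."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, at: Optional[int] = None,
         step: int = 30, digits: int = 6) -> str:
    """TOTP (RFC 6238): HOTP with the counter derived from Unix time,
    so the code changes every `step` seconds."""
    t = int(time.time()) if at is None else at
    return hotp(secret, t // step, digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", T = 59 s.
assert totp(b"12345678901234567890", at=59, digits=8) == "94287082"
```

Because the shared secret never travels over the network at login time, intercepting a single code yields at most a 30-second window of usefulness, unlike a static password.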
While many financial sector companies use 2FA before employees can obtain access to professional systems, the same is less commonly true of customer-facing login systems. It is important for these latter systems to also have strong 2FA to preclude unauthorized third parties from obtaining access to personal financial accounts. Such access can help a foreign adversary assess whether a person could be targeted for espionage recruitment; cause personal financial chaos (e.g. transferring monies to a third party, cancelling automated bill payments, etc.) designed to distract a person while a separate cyber activity is undertaken (e.g. distracting a systems administrator with personal financial problems while attempting to penetrate sensitive systems or accounts the individual administers); or direct money to parties on terrorist watchlists.
Some Canadian financial institutions do offer 2FA but typically default to SMS as the second factor. This is problematic because SMS is a weak communications medium that can be subverted by a variety of means.15 This is why entities such as the United States’ National Institute of Standards and Technology no longer recommend SMS as a two factor authentication channel.16
To improve the security of customer-facing accounts, I recommend that financial institutions should be required to offer 2FA to all clients and, furthermore, that such authentication utilize hardware or software tokens (e.g. one time password or random token generators). Implementing this recommendation will reduce the likelihood that unauthorized parties will obtain access to accounts for the purposes of recruitment or disruption activities.
The views I have presented are my own and based out of research that I and my colleagues have carried out at my place of employment, the Citizen Lab. The Citizen Lab is an interdisciplinary laboratory based at the Munk School of Global Affairs and Public Policy, University of Toronto, focusing on research, development, and high-level strategic policy and legal engagement at the intersection of information and communication technologies, human rights, and global security.
We use a “mixed methods” approach to research combining practices from political science, law, computer science, and area studies. Our research includes: investigating digital espionage against civil society, documenting Internet filtering and other technologies and practices that impact freedom of expression online, analyzing privacy, security, and information controls of popular applications, and examining transparency and accountability mechanisms relevant to the relationship between corporations and state agencies regarding personal data and other surveillance activities.
1 RCMP’s ability to police digital realm ‘rapidly declining,’ commissioner warned, https://www.cbc.ca/news/politics/lucki-briefing-binde-cybercrime-1.4831340. 2 In the dark about ‘going dark’, https://www.cyberscoop.com/fbi-going-dark-encryption-ari-schwartz-op-ed/. 3 See: Keys Under Doormats: Mandating insecurity by requiring government access to all data and communications, https://dspace.mit.edu/handle/1721.1/97690; Shining A Light On The Encryption Debate: A Canadian Field Guide, https://citizenlab.ca/2018/05/shining-light-on-encryption-debate-canadian-field-guide/. 4 Civil Society Letter to Australian Government, February 21, 2019, https://newamericadotorg.s3.amazonaws.com/documents/Coalition_comments_Australia_Assistance_and_Access_Law_2018_Feb_21_2019.pdf;
Australia’s Encryption Law Deals a Serious Blow to Privacy and Security, https://nationalinterest.org/feature/australia’s-encryption-law-deals-serious-blow-privacy-and-security-39212. 5 That Insane, $81M Bangladesh Bank Heist? Here’s What We Know, https://www.wired.com/2016/05/insane-81m-bangladesh-bank-heist-heres-know/. 6 NSA trying to map Rogers, RBC communications traffic, leak shows, https://www.theglobeandmail.com/news/national/nsa-trying-to-map-rogers-rbc-communications-traffic-leak-shows/article23491118/. 7 When do Canadian spies disclose the software flaws they find? There’s a policy, but few details, https://www.cbc.ca/news/technology/canada-cse-spies-zero-day-software-vulnerabilities-1.4276007. 8 Vulnerabilities Equities Policy and Process for the United States Government (November 15, 2017), https://www.whitehouse.gov/sites/whitehouse.gov/files/images/External%20-%20Unclassified%20VEP%20Charter%20FINAL.PDF. 9 Who Are the Shadow Brokers?, https://www.theatlantic.com/technology/archive/2017/05/shadow-brokers/527778/; WikiLeaks Starts Releasing Source Code For Alleged CIA Spying Tools, https://motherboard.vice.com/en_us/article/qv3xxm/wikileaks-vault-7-vault-8-cia-source-code. 10 The Untold Story of NotPetya, the Most Devastating Cyberattack in History, https://www.wired.com/story/notpetya-cyberattack-ukraine-russia-code-crashed-the-world/. 11 Vulnerability Disclosure Policies (VDP): Guidance for Financial Services, https://www.hackerone.com/sites/default/files/2018-07/VDP%20for%20Financial%20Services_Guide%20%281%29.pdf. 12 The Department of Defense wants more people to ‘hack the Pentagon’ — and is willing to pay them too, https://www.businessinsider.com/department-defense-wants-people-hack-pentagon-2018-10; DoD Vulnerability Disclosure Policy, https://hackerone.com/deptofdefense. 13 House panel approves bill to ‘hack’ the State Department, https://thehill.com/policy/cybersecurity/386897-house-panel-approves-bill-to-hack-the-state-department.
14 Office of the Privacy Commissioner of Canada Privacy Tech-Know Blog – Your Identity: Ways services can robustly authenticate you, https://www.priv.gc.ca/en/blog/20170105/. 15 Cybercriminals intercept codes used for banking to empty your accounts, https://www.kaspersky.com/blog/ss7-hacked/25529/; AT&T gets sued over two-factor security flaws and $23M cryptocurrency theft, https://www.fastcompany.com/90219499/att-gets-sued-over-two-factor-security-flaws-and-23m-cryptocurrency-theft. 16 Standards body warned SMS 2FA is insecure and nobody listened, https://www.theregister.co.uk/2016/12/06/2fa_missed_warning/.
The Citizen Lab and the Canadian Internet Policy and Public Interest Clinic (CIPPIC) have released a joint collaborative report, “Shining a Light on the Encryption Debate: A Canadian Field Guide,” which was written by Lex Gill, Tamir Israel, and myself. We argue that access to strong encryption is integral to the defense of human rights in the digital era. Encryption technologies are also essential to securing digital transactions, securing public safety, and protecting national security interests. Unfortunately, many state agencies have continued to argue that encryption poses insurmountable or unacceptable barriers to their investigative and intelligence-gathering activities. In response, some governments have advanced irresponsible encryption policies that would limit the public availability and use of secure, uncompromised encryption technologies.
Our report examines this encryption debate, paying particular attention to the Canadian context. It provides insight and analyses for policy makers, lawyers, academics, journalists, and advocates who are trying to understand encryption technologies and the potential viability and consequences of different policies pertaining to encryption.
Section One provides a brief primer on key technical principles and concepts associated with encryption in the service of improving policy outcomes and enhancing technical literacy. In particular, we review the distinction between encryption at rest and in transit, the difference between symmetric and asymmetric encryption systems, the issue of end-to-end encryption, and the concept of forward secrecy. We also identify some of the limits of encryption in restricting the investigative or intelligence-gathering objectives of the state, including in particular the relationship between encryption and metadata.
Section Two explains how access to strong, uncompromised encryption technology serves critical public interest objectives. Encryption is intimately connected to the constitutional protections guaranteed by the Canadian Charter of Rights and Freedoms as well as those rights enshrined in international human rights law. In particular, encryption enables the right to privacy, the right to freedom of expression, and related rights to freedom of opinion and belief. In an era where signals intelligence agencies operate with minimal restrictions on their foreign facing activities, encryption remains one of the few practical limits on mass surveillance. Encryption also helps to guarantee privacy in our personal lives, shielding individuals from abusive partners, exploitative employers, and online harassment. The mere awareness of mass surveillance exerts a significant chilling effect on freedom of expression. Vulnerable and marginalized groups are both disproportionately subject to state scrutiny and may be particularly vulnerable to these chilling effects. Democracies pay a particularly high price when minority voices and dissenting views are pressured to self-censor or refrain from participating in public life. The same is true when human rights activists, journalists, lawyers, and others whose work demands the ability to call attention to injustice, often at some personal risk, are deterred from leveraging digital networks in pursuit of their activities. Unrestricted public access to reliable encryption technology can help to shield individuals from these threats. Efforts to undermine the security of encryption in order to facilitate state access, by contrast, are likely to magnify these risks. Uncompromised encryption systems can thus foster the security necessary for meaningful inclusion, democratic engagement, and equal access in the digital sphere.
Section Three explores the history of encryption policy across four somewhat distinct eras, with a focus on Canada to the extent the Canadian government played an active role in addressing encryption. The first era is characterized by the efforts of intelligence agencies such as the United States National Security Agency (NSA) to limit the public availability of secure encryption technology. In the second era of the 1990s, encryption emerged as a vital tool for securing electronic trust on the emerging web. In the third era—between 2000 and 2010—the development and proliferation of strong encryption technology in Canada, the United States, and Europe progressed relatively unimpeded. The fourth era spans 2011 to the present day, in which calls to compromise, weaken, and restrict access to encryption technology have steadily reemerged.
Section Four reviews the broad spectrum of legal and policy responses to government agencies’ perceived encryption “problem,” including historical examples, international case studies, and present-day proposals. The section provides an overview of factors which may help to evaluate these measures in context. In particular, it emphasizes questions related to: (1) whether the proposed measure is truly targeted and avoids collateral or systemic impacts on uninvolved parties; (2) whether there is an element of conscription or compelled participation which raises an issue of self-incrimination or unfairly impacts the interests of a third party; and (3) whether, in considering all the factors, the response remains both truly necessary and truly proportionate. The analysis of policy measures in this section proceeds in three categories. The first category includes measures designed to limit the broad public availability of effective encryption tools. The second category reviews measures that are directed at intermediaries and service providers. The third category focuses on efforts that target specific encrypted devices, accounts, or individuals.
Section Five examines the necessity of proposed responses to the encryption “problem.” A holistic and contextual analysis of the encryption debate makes clear that the investigative and intelligence costs imposed by unrestricted public access to strong encryption technology are often overstated. At the same time, the risks associated with government proposals to compromise encryption in order to ensure greater ease of access for state agencies are often grossly understated. When weighed against the profound costs to human rights, the economy, consumer trust, public safety, and national security, such measures will rarely—if ever—be proportionate and almost always constitute an irresponsible approach to encryption policy. In light of this, rather than finding ways to undermine encryption, the Government of Canada should make efforts to encourage the development and adoption of strong and uncompromised technology.
This research was led by the Citizen Lab at the Munk School of Global Affairs, University of Toronto, as well as the Canadian Internet Policy and Public Interest Clinic (CIPPIC) at the University of Ottawa. This project was funded, in part, by the John D. and Catherine T. MacArthur Foundation and the Ford Foundation.
The authors would like to extend their deepest gratitude to a number of individuals who have provided support and feedback in the production of this report, including (in alphabetical order) Bram Abramson, Nate Cardozo, Masashi Crete-Nishihata, Ron Deibert, Mickael E.B., Andrew Hilts, Jeffrey Knockel, Adam Molnar, Christopher Prince, Tina Salameh, Amie Stepanovich, and Mari Jing Zhou. Any errors remain the fault of the authors alone.
We are also grateful to the many individuals and organizations who gave us the opportunity to share early versions of this work, including Lisa Austin at the Faculty of Law (University of Toronto); Vanessa Rhinesmith and David Eaves at digital HKS (Harvard Kennedy School); Ian Goldberg and Erinn Atwater at the Cryptography, Security, and Privacy (CrySP) Research Group (University of Waterloo); Florian Martin-Bariteau at the Centre for Law, Technology and Society (University of Ottawa); and the Citizen Lab Summer Institute (Munk School of Global Affairs, University of Toronto).
Lex Gill is a Citizen Lab Research Fellow. She has also served as the National Security Program Advocate to the Canadian Civil Liberties Association, as a CIPPIC Google Policy Fellow and as a researcher to the Berkman Klein Center for Internet & Society at Harvard University. She holds a B.C.L./LL.B. from McGill University’s Faculty of Law.
Tamir Israel is Staff Lawyer at the Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic at the University of Ottawa, Faculty of Law. He leads CIPPIC’s privacy, net neutrality, electronic surveillance and telecommunications regulation activities and conducts research and advocacy on a range of other digital rights-related topics.
Christopher Parsons is currently a Research Associate at the Citizen Lab, in the Munk School of Global Affairs with the University of Toronto as well as the Managing Director of the Telecom Transparency Project at the Citizen Lab. He received his Bachelor’s and Master’s degrees from the University of Guelph, and his Ph.D from the University of Victoria.
I co-authored a comment to the editors of the Globe and Mail, “More Surveillance Powers Won’t Prevent Intelligence Failures,” in response to Christian Leuprecht’s article “Pointing fingers won’t prevent intelligence failures”. Leuprecht asserts that further intelligence sharing is critical to prevent and avoid attacks such as those in Paris, that more trust between intelligence agencies to facilitate international intelligence sharing is needed, and that more resources are needed if particular individuals subject to state suspicion are to be monitored. He also asserted that governments need the powers to act against targeted individuals, and that unnamed ‘critics’ are responsible for the weakening of intelligence agencies and, by extension, for the senseless deaths of innocents that result from agencies’ inabilities to share, monitor, and engage suspicious persons.
The co-authored comment rebuts Leuprecht’s assertions. We point out that more intelligence is collected now than ever before. We note that some of the attackers were already known to intelligence and security services, and that it was intelligence sharing itself that led to the targeting and torture of Maher Arar. In effect, the intelligence community is failing in spite of having the capabilities and powers that Leuprecht calls for; what is missing, if anything, is the ability to transform the intelligence collected today into something actionable.
The full comment, first published at the Globe and Mail, is reproduced below:
The horrific attacks in Paris have led to a wave of finger-pointing – often powerfully disassociated from the realities of the failures (Pointing Fingers Won’t Prevent Intelligence Failures – Nov 25). The answer from security agencies is inevitably to request more surveillance and more capacity to intrude into citizens’ lives.
These requests are made despite the historically unprecedented access to digital information that security agencies already enjoy and repeated expansions of security powers. Clearly “more security” is not the answer to preventing all future attacks.
The intelligence failure in Paris painted a familiar picture. Many of the attackers were known to French officials, and Turkish intelligence agencies sent repeated warnings of another. Yet in their rush to blame communications technologies such as iPhone encryption and the PlayStation (claims since discredited), security agencies neglect the lack of adequate human intelligence resources and capacities needed to translate this digital knowledge into threat prevention. Also absent is attention to agency accountability – the unaddressed information-sharing problems that caused the mistaken targeting and torture of Maher Arar.
The targets of terror are not only physical, but also ideological. Introducing a laundry list of new powers in response to every incident without regard to the underlying causes will not prevent all attacks, but will leave our democracy in tatters.
Vincent Gogolek, Executive Director, BC Freedom of Information and Privacy Association (BCFIPA)
Tamir Israel, Staff Lawyer, Canadian Internet Policy & Public Interest Clinic (CIPPIC), University of Ottawa
Monia Mazigh, National Coordinator, International Civil Liberties Monitoring Group (ICLMG)
Christopher Parsons, Postdoctoral Fellow, Citizen Lab at Munk School of Global Affairs, University of Toronto
Sukanya Pillay, Executive Director & General Counsel, Canadian Civil Liberties Association (CCLA)
Laura Tribe, Digital Rights Specialist, OpenMedia
Micheal Vonn, Policy Director, British Columbia Civil Liberties Association (BCCLA)
The Canadian SIGINT Summaries include downloadable copies of leaked CSE documents, along with summary, publication, and original source information.
Parsons, Christopher; and Molnar, Adam. (2021). “Horizontal Accountability and Signals Intelligence: Lesson Drawing from Annual Electronic Surveillance Reports,” David Murakami Wood and David Lyon (Eds.), Big Data Surveillance and Security Intelligence: The Canadian Case.
Parsons, Christopher. (2015). “Stuck on the Agenda: Drawing lessons from the stagnation of ‘lawful access’ legislation in Canada,” Michael Geist (Ed.), Law, Privacy and Surveillance in Canada in the Post-Snowden Era (University of Ottawa Press).
Parsons, Christopher. (2015). “The Governance of Telecommunications Surveillance: How Opaque and Unaccountable Practices and Policies Threaten Canadians,” Telecom Transparency Project.
Parsons, Christopher. (2015). “Beyond the ATIP: New methods for interrogating state surveillance,” in Jamie Brownlee and Kevin Walby (Eds.), Access to Information and Social Justice (Arbeiter Ring Publishing).
Bennett, Colin; Parsons, Christopher; Molnar, Adam. (2014). “Forgetting and the right to be forgotten” in Serge Gutwirth et al. (Eds.), Reloading Data Protection: Multidisciplinary Insights and Contemporary Challenges.
Bennett, Colin; and Parsons, Christopher. (2013). “Privacy and Surveillance: The Multi-Disciplinary Literature on the Capture, Use, and Disclosure of Personal Information in Cyberspace” in W. Dutton (Ed.), Oxford Handbook of Internet Studies.
McPhail, Brenda; Parsons, Christopher; Ferenbok, Joseph; Smith, Karen; and Clement, Andrew. (2013). “Identifying Canadians at the Border: ePassports and the 9/11 legacy,” in Canadian Journal of Law and Society 27(3).
Parsons, Christopher; Savirimuthu, Joseph; Wipond, Rob; McArthur, Kevin. (2012). “ANPR: Code and Rhetorics of Compliance,” in European Journal of Law and Technology 3(3).