Technology, Thoughts & Trinkets

Touring the digital through type

Tag: encryption (page 1 of 4)

Shining a Light on the Encryption Debate: A Canadian Field Guide

The Citizen Lab and the Canadian Internet Policy and Public Interest Clinic (CIPPIC) have released a joint report, “Shining a Light on the Encryption Debate: A Canadian Field Guide,” which was written by Lex Gill, Tamir Israel, and myself. We argue that access to strong encryption is integral to the defense of human rights in the digital era. Encryption technologies are also essential to securing digital transactions, ensuring public safety, and protecting national security interests. Unfortunately, many state agencies have continued to argue that encryption poses insurmountable or unacceptable barriers to their investigative and intelligence-gathering activities. In response, some governments have advanced irresponsible encryption policies that would limit the public availability and use of secure, uncompromised encryption technologies.

Our report examines this encryption debate, paying particular attention to the Canadian context. It provides insight and analyses for policy makers, lawyers, academics, journalists, and advocates who are trying to understand encryption technologies and the potential viability and consequences of different policies pertaining to encryption.

Section One provides a brief primer on key technical principles and concepts associated with encryption in the service of improving policy outcomes and enhancing technical literacy. In particular, we review the distinction between encryption at rest and in transit, the difference between symmetric and asymmetric encryption systems, the issue of end-to-end encryption, and the concept of forward secrecy. We also identify some of the limits of encryption in restricting the investigative or intelligence-gathering objectives of the state, including in particular the relationship between encryption and metadata.
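
For readers who want to see the symmetric/asymmetric distinction in practice, the following is a minimal illustrative sketch (it does not appear in the report) using Python’s third-party “cryptography” package; the message and key names are invented for the example.

    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    message = b"meet at the usual place"

    # Symmetric encryption: a single shared secret both encrypts and decrypts,
    # so that key must itself be exchanged securely between the parties.
    shared_key = Fernet.generate_key()
    f = Fernet(shared_key)
    assert f.decrypt(f.encrypt(message)) == message

    # Asymmetric encryption: anyone may encrypt to the public key, but only
    # the holder of the matching private key can decrypt.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    oaep = padding.OAEP(
        mgf=padding.MGF1(algorithm=hashes.SHA256()),
        algorithm=hashes.SHA256(),
        label=None,
    )
    ciphertext = private_key.public_key().encrypt(message, oaep)
    assert private_key.decrypt(ciphertext, oaep) == message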

Section Two explains how access to strong, uncompromised encryption technology serves critical public interest objectives. Encryption is intimately connected to the constitutional protections guaranteed by the Canadian Charter of Rights and Freedoms as well as those rights enshrined in international human rights law. In particular, encryption enables the right to privacy, the right to freedom of expression, and related rights to freedom of opinion and belief. In an era where signals intelligence agencies operate with minimal restrictions on their foreign facing activities, encryption remains one of the few practical limits on mass surveillance. Encryption also helps to guarantee privacy in our personal lives, shielding individuals from abusive partners, exploitative employers, and online harassment. The mere awareness of mass surveillance exerts a significant chilling effect on freedom of expression. Vulnerable and marginalized groups are both disproportionately subject to state scrutiny and may be particularly vulnerable to these chilling effects. Democracies pay a particularly high price when minority voices and dissenting views are pressured to self-censor or refrain from participating in public life. The same is true when human rights activists, journalists, lawyers, and others whose work demands the ability to call attention to injustice, often at some personal risk, are deterred from leveraging digital networks in pursuit of their activities. Unrestricted public access to reliable encryption technology can help to shield individuals from these threats. Efforts to undermine the security of encryption in order to facilitate state access, by contrast, are likely to magnify these risks. Uncompromised encryption systems can thus foster the security necessary for meaningful inclusion, democratic engagement, and equal access in the digital sphere.

Section Three explores the history of encryption policy across four somewhat distinct eras, with a focus on Canada to the extent that the Canadian government played an active role in addressing encryption. The first era is characterized by the efforts of intelligence agencies such as the United States National Security Agency (NSA) to limit the public availability of secure encryption technology. In the second era of the 1990s, encryption emerged as a vital tool for securing electronic trust on the emerging web. In the third era, between 2000 and 2010, the development and proliferation of strong encryption technology in Canada, the United States, and Europe progressed relatively unimpeded. The fourth era runs from 2011 to the present day, during which calls to compromise, weaken, and restrict access to encryption technology have steadily reemerged.

Section Four reviews the broad spectrum of legal and policy responses to government agencies’ perceived encryption “problem,” including historical examples, international case studies, and present-day proposals. The section provides an overview of factors which may help to evaluate these measures in context. In particular, it emphasizes questions related to: (1) whether the proposed measure is truly targeted and avoids collateral or systemic impacts on uninvolved parties; (2) whether there is an element of conscription or compelled participation which raises an issue of self-incrimination or unfairly impacts the interests of a third party; and (3) whether, in considering all the factors, the response remains both truly necessary and truly proportionate. The analysis of policy measures in this section proceeds in three categories. The first category includes measures designed to limit the broad public availability of effective encryption tools. The second category reviews measures that are directed at intermediaries and service providers. The third category focuses on efforts that target specific encrypted devices, accounts, or individuals.

Section Five examines the necessity of proposed responses to the encryption “problem.” A holistic and contextual analysis of the encryption debate makes clear that the investigative and intelligence costs imposed by unrestricted public access to strong encryption technology are often overstated. At the same time, the risks associated with government proposals to compromise encryption in order to ensure greater ease of access for state agencies are often grossly understated. When weighed against the profound costs to human rights, the economy, consumer trust, public safety, and national security, such measures will rarely—if ever—be proportionate and almost always constitute an irresponsible approach to encryption policy. In light of this, rather than finding ways to undermine encryption, the Government of Canada should make efforts to encourage the development and adoption of strong and uncompromised technology.

DOWNLOAD THE FULL REPORT

Project Support

This research was led by the Citizen Lab at the Munk School of Global Affairs, University of Toronto, as well as the Canadian Internet Policy and Public Interest Clinic (CIPPIC) at the University of Ottawa. This project was funded, in part, by the John D. and Catherine T. MacArthur Foundation and the Ford Foundation.

The authors would like to extend their deepest gratitude to a number of individuals who have provided support and feedback in the production of this report, including (in alphabetical order) Bram Abramson, Nate Cardozo, Masashi Crete-Nishihata, Ron Deibert, Mickael E.B., Andrew Hilts, Jeffrey Knockel, Adam Molnar, Christopher Prince, Tina Salameh, Amie Stepanovich, and Mari Jing Zhou. Any errors remain the fault of the authors alone.

We are also grateful to the many individuals and organizations who gave us the opportunity to share early versions of this work, including Lisa Austin at the Faculty of Law (University of Toronto); Vanessa Rhinesmith and David Eaves at digital HKS (Harvard Kennedy School); Ian Goldberg and Erinn Atwater at the Cryptography, Security, and Privacy (CrySP) Research Group (University of Waterloo); Florian Martin-Bariteau at the Centre for Law, Technology and Society (University of Ottawa); and the Citizen Lab Summer Institute (Munk School of Global Affairs, University of Toronto).

Authors

Lex Gill is a Citizen Lab Research Fellow. She has also served as the National Security Program Advocate to the Canadian Civil Liberties Association, as a CIPPIC Google Policy Fellow, and as a researcher at the Berkman Klein Center for Internet & Society at Harvard University. She holds a B.C.L./LL.B. from McGill University’s Faculty of Law.

Tamir Israel is Staff Lawyer at the Samuelson-Glushko Canadian Internet Policy & Public Interest Clinic at the University of Ottawa, Faculty of Law. He leads CIPPIC’s privacy, net neutrality, electronic surveillance and telecommunications regulation activities and conducts research and advocacy on a range of other digital rights-related topics.

Christopher Parsons is currently a Research Associate at the Citizen Lab at the Munk School of Global Affairs, University of Toronto, as well as the Managing Director of the Telecom Transparency Project at the Citizen Lab. He received his Bachelor’s and Master’s degrees from the University of Guelph, and his Ph.D. from the University of Victoria.

More Surveillance Powers Won’t Prevent Intelligence Failures

I co-authored a comment to the editors of the Globe and Mail, “More Surveillance Powers Won’t Prevent Intelligence Failures,” in response to Christian Leuprecht’s article “Pointing fingers won’t prevent intelligence failures“. Leuprecht asserts that further intelligence sharing is critical to preventing attacks such as those in Paris, that more trust between intelligence agencies is needed to facilitate international intelligence sharing, and that more resources are needed if particular individuals subject to state suspicion are to be monitored. He also asserts that governments need the power to act against targeted individuals, and that unnamed ‘critics’ are responsible for the weakening of intelligence agencies and, by extension, for the senseless deaths of innocents that result from agencies’ inability to share, monitor, and engage suspicious persons.

The co-authored comment rebuts Leuprecht’s assertions. We point out that more intelligence is collected now than ever before. We note that some of the attackers were already known to intelligence and security services. And we note that it was intelligence sharing, itself, that led to the targeting and torture of Maher Arar. In effect, the intelligence community is failing in spite of having the capabilities and powers that Leuprecht calls for; what is missing, if anything, is the ability to transform the intelligence collected today into something that is actionable.

The full comment, first published at the Globe and Mail, is reproduced below:

More Surveillance Powers Won’t Prevent Intelligence Failures
Re: “Pointing Fingers Won’t Prevent Intelligence Failures” (Nov 25):

The horrific attacks in Paris have led to a wave of finger-pointing – often powerfully disassociated from the realities of the failures (Pointing Fingers Won’t Prevent Intelligence Failures – Nov 25). The answer from security agencies is inevitably to request more surveillance and more capacity to intrude into citizens’ lives.

These requests are made despite the historically unprecedented access to digital information that security agencies already enjoy and repeated expansions of security powers. Clearly “more security” is not the answer to preventing all future attacks.

The intelligence failure in Paris painted a familiar picture. Many of the attackers were known to French officials, and Turkish intelligence agencies sent repeated warnings of another. Yet in their rush to blame communications technologies such as iPhone encryption and the PlayStation (claims since discredited), security agencies neglect the lack of adequate human intelligence resources and capacities needed to translate this digital knowledge into threat prevention. Also absent is attention to agency accountability – the unaddressed information-sharing problems that caused the mistaken targeting and torture of Maher Arar.

The targets of terror are not only physical, but also ideological. Introducing a laundry list of new powers in response to every incident without regard to the underlying causes will not prevent all attacks, but will leave our democracy in tatters.

Vincent Gogolek, Executive Director, BC Freedom of Information and Privacy Association (BCFIPA)

Tamir Israel, Staff Lawyer, Canadian Internet Policy & Public Interest Clinic (CIPPIC), University of Ottawa

Monia Mazigh, National Coordinator, International Civil Liberties Monitoring Group (ICLMG)

Christopher Parsons, Postdoctoral Fellow, Citizen Lab at Munk School of Global Affairs, University of Toronto

Sukanya Pillay, Executive Director & General Counsel, Canadian Civil Liberties Association (CCLA)

Laura Tribe, Digital Rights Specialist, OpenMedia

Micheal Vonn, Policy Director, British Columbia Civil Liberties Association (BCCLA)

Photo credit: Newspapers B&W (5) by Jon S (CC BY 2.0) https://flic.kr/p/ayGkBN

Canada’s Quiet History Of Weakening Communications Encryption

American and British officials have been warning with an increasing sense of purported urgency that their inability to decrypt communications could have serious consequences. American authorities have claimed that if they cannot demand decrypted communications from telecommunications providers then serious crimes may go unsolved. In the UK this danger is often accentuated by the threat of terrorism. In both nations, security and policing services warn that the increased use of encryption is causing communications to ‘go dark’ and thus become inaccessible to policing and security services. These dire warnings about criminals and terrorists ‘going dark’ have been matched over the years with proposals that would regulate encryption or mandate backdoors into otherwise secure systems. Comparatively little has been said about Canada, despite the federal government’s long-standing efforts to restrict the security that encryption provides to Canadians.

This article outlines some of the federal government of Canada’s successful and unsuccessful attempts to weaken cryptographic standards. It starts by explaining (in brief) what communications encryption is, why it matters, and the implications of enabling unauthorized parties to decrypt communications. With this primer out of the way, we discuss why all of Canada’s mobile telecommunications carriers agree to implement cryptographic weaknesses in their service offerings. Next, we discuss the legislation that can be used to compel telecommunications service providers to disclose decryption keys to government authorities. We then briefly note how Canada’s premier cryptologic agency, the Communications Security Establishment (CSE), successfully compromised global encryption standards. We conclude the post by arguing that, though Canadian officials have not been as publicly vocal about a perceived need to undermine cryptographic standards, the government of Canada nevertheless has a history of successfully weakening the encryption available to and used by Canadians.


Advancing Encryption for the Masses

Edward Snowden’s revelations have made it incredibly obvious that signals intelligence agencies have focused a great deal of their time and energy on tracking people as they browse the web. Such tracking is often possible at a global scale because so much of the data that crosses the Internet is unencrypted. Fortunately, the ease of such surveillance is being curtailed by large corporations and advocacy organizations alike.

Today, WhatsApp and Open Whisper Systems announced that they have been providing, and will continue to deploy, what’s called ‘end-to-end’ encryption to WhatsApp users. This form of encryption ensures that the contents of subscribers’ communications are secured from third-party monitoring as they transit from a sender’s phone to a recipient’s device.
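
To make the end-to-end idea concrete, here is a minimal, hypothetical sketch (not WhatsApp’s or Open Whisper Systems’ actual protocol) using Python’s “cryptography” package: the two endpoints derive a shared key from ephemeral X25519 key pairs, so a relaying server only ever handles ciphertext. All names and parameters are illustrative.

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def derive_key(own_private, peer_public):
        # Both endpoints arrive at the same 32-byte key from the X25519 shared secret.
        shared_secret = own_private.exchange(peer_public)
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=b"e2e-demo").derive(shared_secret)

    # Each party generates an ephemeral key pair; only the public halves travel
    # across the network, so the relaying server never learns the derived key.
    sender_priv = X25519PrivateKey.generate()
    recipient_priv = X25519PrivateKey.generate()
    sender_key = derive_key(sender_priv, recipient_priv.public_key())
    recipient_key = derive_key(recipient_priv, sender_priv.public_key())
    assert sender_key == recipient_key

    # The sender encrypts, the server relays only (nonce, ciphertext), and the
    # recipient decrypts on their own device.
    nonce = os.urandom(12)
    ciphertext = AESGCM(sender_key).encrypt(nonce, b"hello", None)
    assert AESGCM(recipient_key).decrypt(nonce, ciphertext, None) == b"hello"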

As a result of these actions, WhatsApp users will enjoy a massive boost in their communications security. And it demonstrates that Facebook, the owner of WhatsApp, is willing to enhance the security of its users even when such actions are likely to provoke and upset surveillance-hawks around the world who are more interested in spying on Facebook and WhatsApp subscribers than in protecting them from surveillance.

In a separate, but thematically related, blog post the Electronic Frontier Foundation announced the creation of a new Certificate Authority (CA) initiative called ‘Let’s Encrypt’. Partnering with the Electronic Frontier Foundation are Mozilla, Cisco, Akamai, IdenTrust, and researchers at the University of Michigan. CAs issue the certificates that are used to cryptographically secure communications between clients (like your web browser) and servers (like EFF.org). Such encryption makes it more challenging for another party to monitor what you are sending to, and receiving from, a server you are visiting.

Key to the ‘Let’s Encrypt’ initiative is that the issued certificates will be free and installable using a script. The script is meant to automate the process of requesting, configuring, and installing a certificate. Ideally, this will mean that people with relatively little experience will be able to safely and securely set up SSL-protected websites. Academic studies have shown that even those with experience routinely fail to properly configure SSL protections.

The aim of both of these initiatives is to increase the ‘friction’, or relative difficulty, in massively monitoring chat and web-based communications. However, it is important to recognize that neither initiative can be considered a perfect solution to surveillance.

In the case of WhatsApp and Open Whisper Systems, end-to-end encryption does not fix the broader problems of mobile security: if an adversary can take control of a mobile device, or has a way of capturing text as it is typed into or displayed by WhatsApp, then any message sent or received by the device could be susceptible to surveillance. However, there is no evidence that any government agency in the world has monitored, or is currently capable of monitoring, millions or billions of devices simultaneously. There is evidence, however, of government agencies aggressively trying to monitor the servers and Internet infrastructure that applications like WhatsApp use to deliver messages between mobile devices.

Moreover, it’s unclear what Facebook’s or WhatsApp’s reaction would be if a government agency tried to force the delivery of a cryptographically broken or weakened version of WhatsApp to particular subscribers using orders issued by American, European, or Canadian courts. And, even if the companies in question fought back, what would they do if they lost the court case?

Similarly, the ‘Let’s Encrypt’ initiative relies on a mode of securing the Internet that is potentially susceptible to state interference. Governments, or parties affiliated with governments, have had certificates falsely issued in order to monitor communications between client devices (e.g. smartphones) and servers (e.g. Gmail). Moreover, professional developers have misconfigured commerce backends so that they fail to check whether the certificate used to encrypt a communication belongs to the right organization (i.e. whether the certificate used to communicate with PayPal actually belongs to PayPal). There are other issues with SSL, including a poor revocation checking mechanism, historical challenges in configuring it properly, and more. Some of these issues may be mitigated by the ‘Let’s Encrypt’ initiative because of the members’ efforts to work with the Decentralized SSL Observatory, scans.io, and Google’s Certificate Authority logs, but the initiative and its accompanying proposals are not a panacea for all of the world’s online encryption problems. They will, however, hopefully make global-scale surveillance that is largely predicated on monitoring unencrypted communications between servers and clients more difficult.
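
As one small illustration of the verification step described above, the sketch below (an illustrative example, not part of the ‘Let’s Encrypt’ tooling) uses Python’s standard ssl module, whose default settings check both the certificate chain and that the certificate actually belongs to the host being contacted; the hostname is purely illustrative.

    import socket
    import ssl

    hostname = "www.paypal.com"  # illustrative host only

    # create_default_context() enables chain validation and hostname checking.
    context = ssl.create_default_context()

    with socket.create_connection((hostname, 443)) as sock:
        # server_hostname drives both SNI and the hostname check; a mismatched
        # or untrusted certificate raises ssl.SSLCertVerificationError rather
        # than silently proceeding.
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            print(tls.version(), tls.getpeercert()["subject"])

    # Setting context.check_hostname = False and context.verify_mode =
    # ssl.CERT_NONE reproduces exactly the misconfiguration described above.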

Edward Snowden was deeply concerned that the documents he brought to light would be treated with indifference and that nothing would change despite their presence in the public record. While people may be more interested in secure and private communications following his revelations, that interest does not necessarily translate into an ability to actually secure their communications. And the position that people must either embark on elaborate training regimes to communicate securely, or simply refrain from saying sensitive things or visiting sensitive places online, will not work: information security needs to work with at least some of the tools that people already use in their daily lives while developing new and secure ones. It does not make sense to abandon the public to their own devices while the ‘professionals’ use hard-to-use ’secured’ systems amongst themselves.

The work of WhatsApp, Facebook, Open Whisper Systems, the Electronic Frontier Foundation, and the other members of the ‘Let’s Encrypt’ initiative can massively reduce the challenges people face when trying to communicate more responsibly. These initiatives also demonstrate how the cryptographic and communications landscape is shifting in the wake of Snowden’s revelations concerning the reality of global-scale surveillance. While encryption was ultimately thrown out of the original design specifications for the Internet, it is great to see that cryptography is starting to get bolted onto the existing Internet in earnest.
