I’ve published a report with the Citizen Lab, entitled “Huawei and 5G: Clarifying the Canadian Equities and Charting a Strategic Path Forward.” The report first provides a background to 5G and the Chinese telecommunications vendor, Huawei, as well as the activities undertaken by Canada’s closest allies, before delving into issues that have been raised about Huawei, its products, and its links to the Chinese government. At its core, the report argues that Canada doesn’t have a ‘Huawei problem’ per se, so much as a desperate need to develop a principled and integrated set of industrial, cybersecurity, and foreign policy strategies. The report concludes by suggesting some elements of such strategies, including how Canada might develop and protect its intellectual property, better manage trade issues, and develop stronger cybersecurity postures.
Equity, inclusion and Canada’s COVID Alert app
The governments of Canada and Ontario announced the release of their COVID Alert exposure notification app on July 31. The application has been developed with privacy protection in mind, and has undergone governmental and private-sector reviews of its security and privacy. It has received high praise from many notable members of Canada’s privacy community, many of whom—myself included—have installed the application.
Despite this, the app still raises concerns of a non-technical nature – particularly when it comes to equity and inclusion.
COVID Alert App 101
COVID Alert can currently be used by residents of Ontario to receive exposure notifications. Canadian residents outside of Ontario can download the app, but it won’t gain full functionality until their provincial health authority joins the project. The application uses the exposure notification framework that was created by Google and Apple, and integrated into the companies’ respective operating systems.
COVID Alert does not collect:
- Your name or address;
- Your phone’s contacts;
- Your health information;
- The health information of people around you; or
- Your location.
A smartphone with the app installed will generate random codes every five minutes and transmit them using Bluetooth to any phone within two metres that also has the app installed. Your smartphone will retain a log of all the codes that have been received for 14 days; information is deleted after that period. If the code of a person who has tested positive for COVID-19, and has uploaded their status to a government server, is found to be proximate to your device for 15 minutes or more, your device will notify you. At no point does the app collect any person’s name or the places they have visited; if you receive an exposure notification, neither the app nor the government can tell you who tested positive for COVID-19 or where you were potentially exposed to the disease. (For a far more detailed overview of how Apple and Google’s exposure notification framework operates, see Hussein Nasser’s explainer video.)
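The broadcast-and-match behaviour described above can be illustrated with a short, highly simplified sketch. This is not the actual Google-Apple exposure notification protocol (which uses rotating keys derived cryptographically, signal-strength weighting, and on-device matching of downloaded diagnosis keys); the names and constants here are illustrative only.

```python
import secrets
from datetime import datetime, timedelta

ROTATION_MINUTES = 5             # a fresh random code is broadcast every 5 minutes
RETENTION_DAYS = 14              # received codes are kept for 14 days, then deleted
EXPOSURE_THRESHOLD_MINUTES = 15  # cumulative proximity needed to trigger a notification

def new_code() -> bytes:
    """Generate a random, unlinkable broadcast code (illustrative stand-in)."""
    return secrets.token_bytes(16)

class ExposureLog:
    """Simplified model of the on-device log of codes heard over Bluetooth."""

    def __init__(self):
        self._seen = {}  # code -> timestamp when first heard

    def record(self, code: bytes, when: datetime):
        self._seen.setdefault(code, when)

    def prune(self, now: datetime):
        """Delete codes older than the 14-day retention window."""
        cutoff = now - timedelta(days=RETENTION_DAYS)
        self._seen = {c: t for c, t in self._seen.items() if t >= cutoff}

    def check_exposure(self, positive_codes: set) -> bool:
        """Each matched code represents ~5 minutes of proximity, so three or
        more matches against a positive person's codes crosses the 15-minute
        threshold and would trigger a notification."""
        matches = sum(1 for c in self._seen if c in positive_codes)
        return matches * ROTATION_MINUTES >= EXPOSURE_THRESHOLD_MINUTES
```

Note that matching happens entirely on the device against codes downloaded from the server, which is why neither the app nor the government learns who was near whom.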
The server will normally retain logs of device connections for three months, or up to two years if suspicious activity is identified. Access to these logs is highly restricted to authorized users, who are bound by security obligations to protect, and not misuse, the data.
In addition to strong technical safeguards associated with the Apple-Google framework, the federal and Ontario privacy commissioners conducted their own privacy reviews of the app. The app’s developers spent a significant amount of time ensuring it was maximally accessible to Canadians who may have visual, auditory or other physical impairments. Both the Canadian Centre for Cyber Security and BlackBerry Security have assessed the application’s security, and a formal vulnerability disclosure process for the application has been created. Finally, the Canadian government has established an Advisory Council composed of members of industry, academia and civil society, and is developing a framework to define and evaluate the app’s effectiveness, which will include an audit by the Office of the Privacy Commissioner of Canada and Health Canada later this year. If the app is found to be ineffective it will be decommissioned.
Considering all of this, the Canadian government and its provincial partner are to be congratulated on learning from many of the lessons of their international peers: collecting a minimum amount of data, developing a secure app, and subjecting themselves and the COVID Alert app to substantial accountability checks.
Access and Equity Issues Remain
As I wrote at the onset of the pandemic, any COVID-19 apps must be developed with social inclusivity in mind. Technologies are inherently political in nature, and their design, in part, defines what is and isn’t normal behaviour, what their use cases are, and what social norms govern their use. Inclusive policy design should accompany technologies that are intended to be used throughout society; at minimum, policy-makers should ask: Who is this technology designed for? What is this technology specifically intended to do or change in society? Who is included or excluded from using this technology? And how might this technology detrimentally affect some members of society? It is this set of questions that brings some of the limitations of the COVID Alert app to the fore.
The COVID Alert application is designed for Canadians who own sufficiently recent smartphones; this means that people lacking such smartphones are excluded from using the app. A June 2020 study from Ryerson University’s Cybersecure Policy Exchange showed that 26 per cent of households earning less than $20,000, and about the same percentage of people over 60 years old, lack a smartphone. Similarly, people who identify as Black, Indigenous and people of colour tend to be less affluent and, as such, are less likely to own smartphones capable of installing the application. All of the aforementioned groups — the less economically advantaged, the elderly and racialized communities — have tended to disproportionately suffer the effects of COVID-19.
The COVID Alert app is designed to achieve positive social goods — to mitigate the spread of disease — but there are live questions about an app’s ability to accomplish this goal. A team from Oxford University developed a model in April 2020 that found that approximately 60 per cent of the U.K.’s general population would need to install an app for it to be fully effective; this works out to approximately 80 per cent of all smartphone users in that country. A lower adoption rate may still help to inhibit the spread of COVID-19, but at less dramatic rates.
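The relationship between those two figures is worth making explicit. If 60 per cent of the whole population corresponds to 80 per cent of smartphone users, the model implicitly assumes smartphone penetration of about 75 per cent; every person without a capable smartphone raises the bar for everyone who has one. A back-of-the-envelope check (the figures are taken from the Oxford model as reported, not re-derived from the study itself):

```python
# 60% of the total population must install the app, and that equals
# 80% of smartphone users; the implied smartphone penetration is the ratio.
population_share_needed = 0.60
smartphone_user_share_needed = 0.80

implied_smartphone_penetration = population_share_needed / smartphone_user_share_needed
print(f"{implied_smartphone_penetration:.0%}")  # prints "75%"
```

This is why the exclusion of the 26 per cent of low-income households without smartphones is not just an equity problem but an efficacy problem.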
Beyond questions of the actual efficacy of any given app, there are also potential unintended consequences that might disproportionately affect those who enjoy less privilege in Canadian society. First, carding is a pernicious problem in Canada, and there is a risk that law enforcement officers, or other public officers, might demand to see a person’s app to assess whether that person has been exposed to COVID-19. With an unlocked device in hand, officers could search through the device for potentially incriminating materials they otherwise would not have been able to access; these kinds of activities would be a continuation of the enhanced, and often illegal, searches that Black-identifying Canadians are often subjected to. A recent report from the Canadian Civil Liberties Association found that law enforcement agencies have disproportionately enforced the law throughout the pandemic against “Black, Indigenous, and other racialized groups, those with precarious housing, recent immigrants, youth, members of the LGBTQ2S community, and certain religious minorities.” It is reasonable to worry that over-policing will extend to so-called “exposure checks” that then turn into smartphone fishing expeditions.
Second, private organizations, such as businesses, may also demand that individuals reveal their COVID-19 exposure status before entering workplaces. Some individuals, such as those who cannot afford a sufficiently up-to-date smartphone or who have lost their phone and cannot afford to replace it, may be denied access to employment. Similarly, if showing one’s COVID-19 status is a prerequisite to entering a shop, these same people may be denied access to grocery stores, pharmacies or other essential businesses.
Some Canadians may regard the aforementioned risks as merely theoretical, or as too demanding to address in a time of crisis. Such a response, however, misses the very point: the potential harms are linked to implicit social biases and structural inequality that mean some in Canadian society have to worry about these risks, whereas others do not. When Canadian leaders assert that they want to build more inclusive societies, the aforementioned issues associated with the COVID app lay bare social inequity and demonstrate the need for government to explain how it expects to ameliorate these inequities through policy and law. Ignoring these inequities is not an option for a truly inclusive society.
COVID Alert and Inclusive Policy
In the excellent accessibility documentation that accompanies the COVID Alert app, the Canadian Digital Service acknowledges that:
“Some people may have phones or operating systems that do not support downloading the app. And some people may not have smart phones at all. Many people may not have affordable access to the Internet, and the app needs an Internet connection at least once a day to work. … COVID Alert is one part of our public health effort to limit COVID-19. The app does not replace manual contact tracing by local public health authorities. Manual contact tracing is available to everyone in Canada, along with other important resources.”
This acknowledgement is important, and positive, insofar as it showcases that the developers recognize the app’s shortcomings and make clear that other resources are available to Canadians to mitigate the spread of COVID-19. But the governments of Canada and Ontario can go much further to address these limitations, as well as the potential harms linked with the COVID Alert app.
First, Canada’s governments can pass legislation that bars public officials, as well as private individuals or organizations, from demanding that individuals install the application or compelling individuals to disclose any information from their COVID-19 app. This legislation could make it a criminal offence to issue such a request, in order to prevent police, social workers, landlords, retail staff or others from conducting “exposure checks” that can be used to discriminate against minority populations or less advantaged members of society. Not only would such legislation bar bad behaviour by punishing individuals who inappropriately access information on smartphones, but it might increase trust in the application by firmly giving individuals genuine control over the information held in the app.
Second, the federal and provincial governments can rapidly explain how they will ensure that there is equity in the kinds of health responses that are provided to all Canadians, including those who are less affluent or privileged. Given that governments are unlikely to supply less-advantaged residents of Canada with smartphones that can run the COVID-19 app or subsidize their purchase, the government could explain what other policies will be implemented to ensure that all Canadians enjoy health monitoring; this might, as an example, include increased availability of testing in less affluent communities, focused public outreach conducted through local health authorities and community groups, or broader efforts to meaningfully invest in the social determinants of health that are known to increase health resiliency.
Third, and relatedly, the governments should rapidly release information about how, specifically, the federal and provincial departments of health will assess the success or efficacy of the COVID Alert app. Canadians deserve to know how the government is modelling success and failure, and how the government is accounting for the fact that many less affluent and older residents of Canada lack smartphones capable of installing the COVID Alert app. Without clear success or failure criteria, the COVID Alert app risks becoming a prop in “pandemic theatre” as opposed to a demonstrably effective tool to mitigate the spread of the disease. Given that public and private groups had time to assess the app’s privacy and security properties, it is shocking that health officials have yet to explain how the app’s utility should be measured.
In summary, the technical teams that developed the application, the bodies responsible for assessing the app’s security, and the privacy commissioners’ offices have all performed admirably. The overlapping accountability regimes surrounding the app should provide confidence to Canadians that the app itself will not be used to nefariously collect data, and the app will be decommissioned once shown to be ineffective or no longer needed. But more is needed. Governments that have committed to inclusive policy design must go beyond making the design of the technology accessible, to making it accessible for all people to either safely access and use, or to have access to equivalent public health protections. Governments in Canada must focus on building up trust and proving that public health efforts are being designed to protect all residents of Canada, and especially those most detrimentally affected by the pandemic. The time for action is now.
(This article was first published by First Policy Response.)
The Information Security Cultures of Journalism
I’ve had the pleasure of working with a series of colleagues over the past few years to assess and better understand the nature of the security practices adopted by journalists around the world. Past outputs from this work have included a number of talks, an academic article by one of my co-authors, Lokman Tsui, as well as a Columbia Journalism Review article by Joshua Oliver. Most recently, several of us have published an article entitled “The Information Security Cultures of Journalism” with Digital Journalism.
This article is an exploratory study of the influence of beat and employment status on the information security culture of journalism (security-related values, mental models, and practices that are shared across the profession). The study is based on semi-structured interviews with 16 journalists based in Canada in staff or freelance positions working on investigative or non-investigative beats. We find that journalism has a multitude of security cultures that are influenced by beat and employment status. The perceived need for information security is tied to perceptions of sensitivity for a particular story or source. Beat affects how journalists perceive and experience information security threats. Investigative journalists are concerned with surveillance and legal threats from state actors including law enforcement and intelligence agencies. Non-investigative journalists are more concerned with surveillance, harassment, and legal actions from companies or individuals. Employment status influences the perceived ability of journalists to effectively implement information security. Based on these results we discuss how journalists and news organisations can develop effective security cultures and raise information security standards.
We Chat, They Watch: How International Users Unwittingly Build up WeChat’s Chinese Censorship Apparatus
Over the past several months I’ve had the distinct honour to work with, and learn from, a number of close colleagues and friends on the topic of the surveillance and censorship that takes place on WeChat. We have published a report with the Citizen Lab entitled “We Chat, They Watch: How International Users Unwittingly Build up WeChat’s Chinese Censorship Apparatus.” The report took a mixed-methods approach to understanding how non-China-registered WeChat accounts were subjected to surveillance that was then used to develop a censorship list applied to users who have registered their accounts in China. Specifically, the report:
- Presents results from technical experiments which reveal that WeChat communications conducted entirely among non-China-registered accounts are subject to pervasive content surveillance that was previously thought to be exclusively reserved for China-registered accounts.
- Finds that documents and images transmitted entirely among non-China-registered accounts undergo content surveillance wherein these files are analyzed for content that is politically sensitive in China.
- Shows that, upon analysis, files deemed politically sensitive are used to invisibly train and build up WeChat’s Chinese political censorship system.
- Notes that, from public information, it is unclear how Tencent uses non-China-registered users’ data to enable content blocking, or which policy rationale permits the sharing of data used for blocking between the international and China regions of WeChat.
- Reports that Tencent’s responses to data access requests failed to clarify how data from international users is used to enable political censorship of the platform in China.
You can download the report as a PDF, or read it in its entirety on the Citizen Lab’s website. There is also a corresponding FAQ to quickly answer questions that you may have about the report.