I raised a number of questions about the nature of the Agency’s incident, and its implications, in a post I published earlier in 2021. In this post, I provide an update as well as some further analysis of the incident based on the information that NSIRA revealed in August 2021.
I begin by outlining the additional details that NSIRA has provided about the incident and juxtapose that information with what the Canadian Centre for Cyber Security (CCCS) has published about the Microsoft Exchange vulnerability that led to NSIRA’s incident. I note that NSIRA (or the team(s) responsible for securing its networks) seems either to have failed to patch NSIRA’s on-premises Exchange server when the vulnerability was first announced, or to have been unable to successfully implement the mitigation measures intended to prevent exploitation of the server. The result was that employee information was obtained by an unauthorized party.
Next, I note the extent to which NSIRA’s update responds to the initial questions I raised when writing about this incident in April 2021. On the whole, most of the questions I raised have been answered to at least some extent.
I conclude by discussing the significance of the information that was exfiltrated from NSIRA, the likelihood that a nation-state actor either conducted the operation or now has access to the exfiltrated data, and what this incident may suggest about NSIRA’s IT security, and by raising questions about NSIRA’s decommissioning of its Protected networks.
On August 5, 2021, Apple announced that it would soon begin conducting pervasive surveillance of the devices that it sells, with the stated intent of expanding protections for children. The company announced three new features. The first will monitor for children sending or receiving sexually explicit images using the Messages application. The second will monitor for the presence of Child Sexual Abuse Material (CSAM) in iCloud Photos. The third will monitor for searches pertaining to CSAM. These features are planned to be activated in the United States in the next versions of Apple’s operating systems, which will ship to end-users in the fall of 2021.
In this post I focus exclusively on the surveillance of iCloud Photos for CSAM content. I begin with a background of Apple’s efforts to monitor for CSAM content on their services before providing a description of the newly announced CSAM surveillance system. I then turn to outline some problems, complications, and concerns with this new child safety feature. In particular, I discuss the challenges facing Apple in finding reputable child safety organizations with whom to partner, the potential ability to region-shift to avoid the surveillance, the prospect of the surveillance system leading to ongoing harms towards CSAM survivors, the likelihood that Apple will expand the content which is subject to the company’s surveillance infrastructure, and the weaponization of the CSAM surveillance infrastructure against journalists, human rights defenders, lawyers, opposition politicians, and political dissidents. I conclude with a broader discussion of the problems associated with Apple’s new CSAM surveillance infrastructure.
On August 5, 2021, Apple announced that it would soon begin conducting pervasive surveillance of the devices that it sells, with the stated intent of expanding protections for children. The company announced three new features. The first will monitor for children sending or receiving sexually explicit images over the Messages application, the second will monitor for the reception or collection of Child Sexual Abuse Material (CSAM), and the third will monitor for searches pertaining to CSAM. These features are planned to be activated in the next versions of Apple’s mobile and desktop operating systems, which will ship to end-users in the fall of 2021.
In this post I focus exclusively on the surveillance of children’s messages to detect whether they are receiving or sending sexually explicit images. I begin with a short discussion of how Apple has described this system and spell out the rationales for it, and then proceed to outline some early concerns with how this feature might negatively affect children and adults alike. Future posts will address the second and third child safety features that Apple has announced, as well as broader problems associated with Apple’s unilateral decision to expand surveillance on its devices.
Sexually Explicit Image Surveillance in Messages
Apple currently lets families share access to Apple services and cloud storage using Family Sharing. The organizer of a Family Sharing plan can use a number of parental controls to restrict the activities of children who are included in the plan. Apple defines children as individuals who are under 18 years of age.
Upon the installation of Apple’s forthcoming mobile and desktop operating systems, and where this analysis feature is enabled in Family Sharing, children’s communications over Apple’s Messages application can be analyzed to assess whether they include sexually explicit images. Apple’s analysis of images will occur on-device, and Apple will not be notified of whether an image is sexually explicit. Should an image be detected, it will initially be blurred out, and if a child wants to see the image they must proceed through either one or two prompts, depending on their age and on how their parents have configured the parental management settings.
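The prompt-and-blur flow described above can be modelled as a simple decision procedure. The sketch below is purely illustrative: the function names, the under-13 age threshold for the second (parental-notification) prompt, and the settings flag are my assumptions about how such a flow could be structured, not Apple’s actual implementation.

```python
# Hypothetical sketch of the Messages child-safety flow described above.
# All names, thresholds, and structures are illustrative assumptions.

def prompts_required(child_age: int, parental_notification_enabled: bool) -> int:
    """Return how many confirmation prompts a child must click through
    before a detected sexually explicit image is un-blurred."""
    if child_age < 13 and parental_notification_enabled:
        # Assumed: younger children face a second prompt warning that
        # their parents will be notified if they view the image.
        return 2
    return 1

def handle_incoming_image(is_explicit: bool, child_age: int,
                          parental_notification_enabled: bool) -> dict:
    """Model the on-device handling of an image: blur it if the local
    classifier flags it, and record how many prompts stand between the
    child and the un-blurred image. Nothing leaves the device."""
    if not is_explicit:
        return {"blurred": False, "prompts": 0}
    return {"blurred": True,
            "prompts": prompts_required(child_age, parental_notification_enabled)}
```

The key property the sketch captures is that the classification and the prompt logic run entirely on-device; the only party brought into the loop, and only under some configurations, is the parent.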
In this post I briefly discuss some of the highlights of the report and offer some productive criticism concerning who the report and its guidance are directed at, and the ability of individuals to act on the provided guidance. The report ultimately represents a valuable contribution to efforts to increase awareness of national security issues in Canada and, on that basis alone, I hope that CSIS and other members of Canada’s intelligence and security community continue to publish these kinds of reports.
The report generally outlines a series of foreign interference-related threats that face Canada and Canadians. Foreign interference includes, “attempts to covertly influence, intimidate, manipulate, interfere, corrupt or discredit individuals, organizations and governments to further the interests of a foreign country”; such attempts are, “carried out by both state and non-state actors” towards, “Canadian entities both inside and outside of Canada, and directly threaten national security” (Page 5). The report is divided into sections which explain why Canada and Canadians are targets of foreign interference, the goals that foreign states pursue, who might be targeted, the techniques that might be used to conduct foreign interference, and how to detect and avoid such interference. The report concludes by discussing some of the election-specific mechanisms that the Government of Canada has adopted to mitigate the effects and effectiveness of foreign interference operations.
On the whole this is a pretty good overview document. It makes a good academic teaching resource, insofar as it provides a high-level overview of what foreign interference can entail, and it would probably serve as a nice kick-off to discussing the topic of foreign interference more broadly.2
On Monday, the Canadian government imposed mandatory national security risk assessments on scholarly research. The new rules apply to projects that receive funding from the Natural Sciences and Engineering Research Council (NSERC) and involve foreign researchers or private-sector organizations. The stated intent of the assessments is to prevent intellectual property from being stolen and ensure that Canadian researchers do not share industrial, military or intelligence secrets with foreign governments or organizations to the detriment of Canadian interests. But they will chill research and scholarly training, accentuate anti-immigrant biases and may amplify national security problems.
In brief, these assessments add an analysis of national security issues into the process of funding partnerships by compelling researchers to evaluate whether their work is “sensitive.” Cutting-edge topics that are considered sensitive include artificial intelligence, biotechnology, medical technology, quantum science, robotics, autonomous systems and space technology. Amongst other criteria, researchers must also assess risks posed by partners, including whether they might disclose information to other groups that could negatively affect Canada’s national security, whether they could be subject to influence from foreign governments or militaries, or if they lack clear explanations for how or why they can supplement funding from NSERC.
If a researcher or their team cannot state that there are no risks, they must itemize prospective risks, even in cases where they must speculate. Mitigation processes must explain what security protocols will be established, how information might be restricted on a need-to-know basis, or how collaborators will be vetted. Government documents specifically warn researchers to take care when working with members of the university research community, such as contractors, employees or students.
Whenever research is assessed as raising national security concerns, it may be reviewed by NSERC and Canada’s national security agencies, and research programs may need to be modified or partners abandoned before funding will be released.
These assessments will chill Canadian research. Consider Canadian university professors who are working on artificial intelligence research, but who hold Chinese citizenship and thus could potentially be subject to compulsion under China’s national security legislation. Under the assessment criteria, it would seem that such researchers are now to be regarded as inherently riskier than colleagues who pursue similar topics, but who hold Canadian, American or European citizenship. The assessments will almost certainly reify biases against some Canadian researchers on the basis of their nationality, something that has become commonplace in the United States as Chinese researchers have increasingly been the focus of U.S. security investigations.
Students who could potentially be directly or indirectly compelled by their national governments may now be deemed a threat to Canada’s national security and interests. Consequently, international students or those who have families outside of Canada might be kept from fully participating on professors’ research projects out of national security concerns and lose out on important training opportunities. This stigma may encourage international students to obtain their education outside of Canada.
These assessments may create more problems than they solve. Some Canadian researchers with foreign citizenships might apply for foreign funding to avoid national security assessments altogether. But they may also be motivated to conceal this fact for fear of the suspicion that might otherwise accompany the funding, especially given how their American counterparts have been targeted in FBI-led investigations. Foreign intelligence services look for individuals who have something to hide, precisely so that such vulnerabilities can be exploited. In effect, these assessments may amplify the prospect that researchers will be targeted for recruitment by foreign spy agencies and exacerbate fears of foreign espionage and illicit acquisition of intellectual property.
What must be done? If the government insists on applying these assessments, then NSERC must commit to publishing annual reports explaining how regularly research is assessed, the nature of the assessed research, rationales for assessments and the outcomes. Canada’s national security review agencies will also have to review NSERC’s assessments to ensure that the results are based in fact, not suspicion or bias. Researchers can and should complain to the review agencies and the news media if they believe that any assessment is inappropriate.
Ultimately, Canadian university leaders must strongly oppose these assessments as they are currently written. The chill of national security threatens to deepen suspicions towards some of our world-leading researchers and exceptional international students, and those running universities must publicly stand up for their communities. Their universities’ status as being open and inclusive – and being independent, world-leading research bodies – depends on their advocacy.
CSE potentially violated the Privacy Act, which governs how federal government institutions handle personal information.
The CSE’s assistance to the Canadian Security Intelligence Service (CSIS) was concealed from the Federal Court. The Court was responsible for authorizing warrants for CSIS operations that the CSE was assisting with.
CSE officials may have misled Parliament in explaining how the assistance element of its mandate was operationalized in the course of debates meant to extend CSE’s capabilities and mandate.
In this post I describe the elements of the review and a few key parts of CSE’s response to it, and conclude with a series of issues that the review and response raise.
Under the National Defence Act, CSE would incidentally collect CII in the course of conducting foreign signals intelligence, cybersecurity and information assurance, and assistance operations. From all of those operations, it would produce reports that were sent to clients within the Government of Canada. By default, Canadians’ information is expected to be suppressed but agencies can subsequently request CSE to re-identify suppressed information.
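The suppress-by-default, re-identify-on-request workflow described above can be sketched in code. This is a purely illustrative model, not CSE’s actual system: the marker format, the vault structure, and the single boolean authorization check are all my assumptions, standing in for what is in reality a policy-governed disclosure process.

```python
# Illustrative sketch of a CII suppression and re-identification
# workflow. All names and structures are assumptions, not CSE's system.

def suppress_cii(report_text: str, cii_terms: list) -> tuple:
    """Replace each piece of Canadian identity information (CII) with a
    numbered marker, keeping a vault that maps markers back to the
    original values so an authorized request can recover them later."""
    vault = {}
    for i, term in enumerate(cii_terms, start=1):
        marker = f"[CII-{i} suppressed]"
        vault[marker] = term
        report_text = report_text.replace(term, marker)
    return report_text, vault

def request_reidentification(vault: dict, marker: str, authorized: bool) -> str:
    """A recipient agency asks for a suppressed identity; the identity
    is disclosed only if the request is authorized."""
    if not authorized:
        raise PermissionError("re-identification request denied")
    return vault[marker]
```

The point of the sketch is the asymmetry it captures: suppression is automatic at report-production time, whereas disclosure of the underlying identity requires a separate, later authorization step.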
NSIRA examined disclosures of CII from CSE to all recipient government departments which took place between July 1, 2015 and July 31, 2019; this meant that all the disclosures took place when CSE was guided by the National Defence Act and the Privacy Act.1 In conducting their review NSIRA looked at, “electronic records, correspondence, intelligence reports, legal opinions, policies, procedures, documents pertaining to judicial proceedings, Ministerial Authorizations, and Ministerial Directives of relevance to CSE’s CII disclosure regime” (p. 2). Over the course of its review, NSIRA engaged a range of government agencies that requested disclosures of CII, such as the Royal Canadian Mounted Police (RCMP) and Innovation, Science and Economic Development Canada (ISED). NSIRA also assessed the disclosures of CII to CSIS and the relevant CSIS affidavits to the Federal Court.