
On August 5, 2021, Apple announced that it would soon begin conducting pervasive surveillance of the devices that it sells, with the stated intent of expanding protections for children. The company announced three new features. The first will monitor for children sending or receiving sexually explicit images over the Messages application. The second will monitor for the presence of Child Sexual Abuse Material (CSAM) in iCloud Photos. The third will monitor for searches pertaining to CSAM. These features are slated to be activated in the United States in the next versions of Apple’s operating systems, which will ship to end-users in the fall of 2021.
In this post I focus exclusively on the surveillance of iCloud Photos for CSAM content. I begin with background on Apple’s efforts to monitor for CSAM content on its services before describing the newly announced CSAM surveillance system. I then outline some problems, complications, and concerns with this new child safety feature. In particular, I discuss the challenges facing Apple in finding reputable child safety organizations with whom to partner, the potential to region-shift devices to avoid the surveillance, the prospect of the surveillance system causing ongoing harm to CSAM survivors, the likelihood that Apple will expand the categories of content subject to the company’s surveillance infrastructure, and the weaponization of the CSAM surveillance infrastructure against journalists, human rights defenders, lawyers, opposition politicians, and political dissidents. I conclude with a broader discussion of the problems associated with Apple’s new CSAM surveillance infrastructure.
A previous post focused on the surveillance of children’s messages to monitor for sexually explicit photos. Future posts will address the third child safety feature that Apple has announced, as well as the broader implications of Apple’s child safety initiatives.
Background to Apple Monitoring for CSAM
Apple has previously worked with law enforcement agencies to combat CSAM, though the full contours of that assistance are largely hidden from the public. In May 2019, Mac Observer noted that the company had modified its privacy policy to read, “[w]e may also use your personal information for account and network security purposes, including in order to protect our services for the benefit of all our users, and pre-screening or scanning uploaded content for potentially illegal content, including child sexual exploitation material” (emphasis not in original). Per Forbes, Apple places email messages under surveillance when they are routed through its systems. Mail is scanned, and if CSAM content is detected, Apple automatically prevents the email from reaching its recipient and assigns an employee to confirm the CSAM content of the message. If the employee confirms the existence of CSAM content, the company subsequently provides subscriber information to the National Center for Missing and Exploited Children (NCMEC) or a relevant government agency.1
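To make that reported workflow concrete, the following sketch models the general shape of such a screening pipeline: an automated match against a database of known material, a hold on delivery, a human confirmation step, and a report only on confirmation. It is purely illustrative; the names (ScreeningOutcome, screen, humanReviewConfirmsCSAM) and the hash-matching approach are my assumptions, not details Apple has disclosed about its actual system.

```swift
import Foundation

// Illustrative sketch only: all names and logic are assumptions about how a
// pipeline of this general shape could be structured, not Apple's implementation.

enum ScreeningOutcome {
    case delivered                              // no match; mail passes through
    case heldForReview                          // matched, not yet confirmed by a reviewer
    case reportedToNCMEC(subscriberID: String)  // confirmed and reported
}

struct InboundEmail {
    let subscriberID: String
    let attachmentHashes: [String]              // e.g. hashes of image attachments
}

// Hypothetical database of hashes of known material supplied by a child safety
// organization; empty here because this is only a sketch.
let knownCSAMHashes: Set<String> = []

// Placeholder for the manual confirmation step described in the post.
func humanReviewConfirmsCSAM(_ email: InboundEmail) -> Bool {
    return false
}

func screen(_ email: InboundEmail) -> ScreeningOutcome {
    // 1. Automated scan: compare attachment hashes against the known-material set.
    let matched = email.attachmentHashes.contains { knownCSAMHashes.contains($0) }
    guard matched else { return .delivered }

    // 2. Delivery is blocked and the message is queued for human review.
    // 3. Subscriber information is reported only if a reviewer confirms the match.
    return humanReviewConfirmsCSAM(email)
        ? .reportedToNCMEC(subscriberID: email.subscriberID)
        : .heldForReview
}
```

The point the sketch tries to capture is that, as reported, the automated match alone does not trigger a report: a human confirmation step sits between detection and the disclosure of subscriber information.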