Apple’s Monitoring of Children’s Communications Content Puts Children and Adults at Risk

Photo by Torsten Dettlaff on Pexels.com

On August 5, 2021, Apple announced that it would soon begin conducting pervasive surveillance of the devices it sells, with the stated intent of expanding protections for children. The company announced three new features: the first will monitor for children sending or receiving sexually explicit images over the Messages application, the second will monitor for the receipt or collection of Child Sexual Abuse Material (CSAM), and the third will monitor for searches pertaining to CSAM. These features are slated to be activated in the next versions of Apple’s mobile and desktop operating systems, which will ship to end users in the fall of 2021.

In this post I focus exclusively on the surveillance of children’s messages to detect whether they are receiving or sending sexually explicit images. I begin with a short discussion of how Apple has described this system and the rationales for it, and then outline some early concerns about how this feature might negatively affect children and adults alike. Future posts will address the second and third child safety features that Apple has announced, as well as broader problems associated with Apple’s unilateral decision to expand surveillance on its devices.

Sexually Explicit Image Surveillance in Messages

Apple currently lets families share access to Apple services and cloud storage using Family Sharing. The organizer of a Family Sharing plan can use a number of parental controls to restrict the activities of children included in the plan. For Apple, “children” are individuals under 18 years of age.

Once Apple’s forthcoming mobile and desktop operating systems are installed, and if this analysis feature is enabled in Family Sharing, children’s communications over Apple’s Messages application can be analyzed to assess whether they include sexually explicit images. Apple’s analysis of images will occur on-device, and Apple will not be notified of whether an image is sexually explicit. Should an image be detected, it will initially be blurred out; if a child wants to see the image, they must proceed through either one or two prompts, depending on their age and how their parents have configured the parental management settings.
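To make that flow concrete, here is a minimal sketch, in Swift, of the decision logic as Apple has described it. All type and function names here are hypothetical (Apple has not published an API for this feature), and the age-12 cutoff for parental notification reflects my reading of Apple’s announcement rather than anything in shipping code:

```swift
import Foundation

// Hypothetical model of the intervention flow Apple has described.
// Every name below is illustrative; Apple has not published this API.

enum InterventionOutcome {
    case deliverNormally                 // classifier found nothing explicit
    case blurWithWarning                 // child must tap through a warning to view
    case blurWithWarningAndParentNotice  // viewing the image also notifies a parent
}

struct ChildAccount {
    let age: Int                          // Apple treats anyone under 18 as a child
    let parentNotificationsEnabled: Bool  // opt-in, set via Family Sharing controls
}

/// Decide what happens when the on-device classifier flags an image as
/// sexually explicit. Per Apple's description, the analysis happens entirely
/// on-device and Apple itself is never notified of the result.
func outcome(forImageSentTo child: ChildAccount,
             classifierFlaggedExplicit: Bool) -> InterventionOutcome {
    guard classifierFlaggedExplicit else { return .deliverNormally }
    // Assumption from Apple's announcement: parental notification applies
    // only to younger children (12 and under) whose parents have opted in.
    if child.age <= 12 && child.parentNotificationsEnabled {
        return .blurWithWarningAndParentNotice
    }
    return .blurWithWarning
}

// Example: a 10-year-old whose parent enabled notifications.
let child = ChildAccount(age: 10, parentNotificationsEnabled: true)
print(outcome(forImageSentTo: child, classifierFlaggedExplicit: true))
// -> blurWithWarningAndParentNotice
```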


A Predator in Your Pocket: A Multidisciplinary Assessment of the Stalkerware Application Industry

With an incredible group of co-authors at the Citizen Lab, I’ve co-authored a report that extensively investigates the stalkerware ecosystem. Stalkerware refers to spyware that is either deliberately designed, or repurposed, to facilitate intimate partner violence, abuse, or harassment. “A Predator in Your Pocket” is accompanied by a companion legal report, also released by the Citizen Lab. Entitled “Installing Fear: A Canadian Legal and Policy Analysis of Using, Developing, and Selling Smartphone Spyware and Stalkerware Applications,” it conducts a comprehensive criminal, civil, regulatory, and international law assessment of the legality of developing, selling, and using stalkerware.
