In his recent discussion with Ann Cavoukian, Jesse Brown seems to have touched a nerve. In the interview, the Commissioner discusses the use of self-encrypting/decrypting security systems that are meant to meet her ‘PET Plus’ criteria: she wants to ensure that measures are embedded in surveillance technologies that secure individuals’ privacy while at the same time enabling police to perform their duties. In the case of cameras, this would mean that all bodies on screen are barely visible – not blurred, but almost erased from the non-decrypted view. Individuals are revealed on film only when a decryption algorithm is applied; until then, they hold a spectre-like existence.
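To make the idea concrete, here is a minimal sketch of how such an encrypt-until-authorized scheme could work in principle. This is my own illustration, not the Commissioner's actual design: a person's image region is split into a blanked-out public view and a sealed copy that only the holder of the decryption key (e.g. the police) can restore. The one-time-pad encryption and all function names are stand-ins.

```python
import secrets

def redact_and_seal(region: bytes):
    """Split a 'person' region of a frame into a public view and a sealed copy.

    The sealed copy is XORed with a random one-time pad; whoever holds the
    pad (the decryption key) can restore the original pixels. Everyone else
    sees only the blanked-out public version.
    """
    key = secrets.token_bytes(len(region))            # held by the authority
    sealed = bytes(b ^ k for b, k in zip(region, key))
    public = bytes(len(region))                        # all zeros: the 'erased' body
    return public, sealed, key

def unseal(sealed: bytes, key: bytes) -> bytes:
    """Apply the decryption key to reveal the original region."""
    return bytes(b ^ k for b, k in zip(sealed, key))

# Toy example: four bytes standing in for a person's image region.
pixels = b"\x10\x22\x33\x44"
public, sealed, key = redact_and_seal(pixels)
assert public != pixels                  # the public view shows nothing
assert unseal(sealed, key) == pixels     # the key holder sees the person
```

The point of the sketch is only the separation of roles: the camera owner stores the footage, but reversing the redaction requires a key they do not hold.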
Russell McOrmond has taken a strong stance against this, arguing that the Commissioner’s efforts would make first-party content owners subservient to third-party agents who hold the decryption keys. It is important to note that, as the Commissioner has presented her ideas, the police, or some other authority, would be the only group with access to these keys. This would limit employees’ ability to use CCTV to illegitimately surveil clients, patrons, and so on. Surprisingly quickly, Ken Anderson (Assistant Commissioner, Privacy, Ontario) jumped into the discussion.
Ken asserts that:
The Ontario Privacy Commissioner is not advocating for a “technology mandate that devices … be under the control of someone other than its owner,” — quite the opposite. Her wish is to empower users in a world of unlimited personal data creation, use, disclosure, etc. She’s challenging us to build — or rather, demand — technologies to make that possible. (Source)
Broadly, the Commissioner wants to apply access controls to security/privacy-invasive technologies. In some respects, this reminds me of the efforts of Viktor Mayer-Schoenberger, who in “Useful Void: The Art of Forgetting in the Age of Ubiquitous Computing” suggests that expiration dates should be attached to data articles. Similar to the Commissioner, he sees this as a way of overcoming the challenge of persistent data (i.e. data’s tendency to exist forever, even when it is no longer needed). As I see it, Russell is arguing that inserting XML or other code to modify the conditions of data longevity/legitimacy is the same as digital rights management, and in a sense I can see the merit in that claim. Effectively, the Commissioner is suggesting that some content collected in the course of private security should be embedded in a self-destruction system. I like this.
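Mayer-Schoenberger’s expiration-date idea can be sketched in a few lines. The record format and names below are my own invention, not anything from his paper: each record carries its own expiry, and a compliant reader simply refuses to return the payload once that date has passed.

```python
from datetime import datetime, timedelta, timezone

def stamp(payload: dict, ttl_days: int) -> dict:
    """Attach an expiration date to a data record at creation time."""
    expires = datetime.now(timezone.utc) + timedelta(days=ttl_days)
    return {"payload": payload, "expires": expires.isoformat()}

def read(record: dict):
    """Return the payload only while the record is still 'alive'."""
    if datetime.fromisoformat(record["expires"]) <= datetime.now(timezone.utc):
        return None  # the data has 'forgotten' itself
    return record["payload"]

# A hypothetical surveillance record that self-destructs after 30 days.
rec = stamp({"camera": "lobby-1", "note": "example clip"}, ttl_days=30)
assert read(rec) == {"camera": "lobby-1", "note": "example clip"}

# A record whose time-to-live has already elapsed yields nothing.
stale = stamp({"camera": "lobby-1"}, ttl_days=0)
assert read(stale) is None
```

Of course, nothing in the metadata itself forces deletion; the scheme only works if the software reading the data honours the expiry, which is exactly where the DRM comparison gets its traction.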
What I don’t think the Commissioner is suggesting is that private security groups will be required to adopt her measures – as I understand it, that would be the ideal situation, but presently she wants to address public cameras. Now, the ‘danger’ is that securing public data is only the first step in rolling out a ‘security rights management’ system (with the associated concerns about legitimate authority and willingness to allow individuals to decrypt surveillance recordings). Maybe I’m being naive, but I don’t see Ann as pushing that particular agenda. I do, however, recognize that she operates within an institution and, as a result, any decisions that she makes must be seen through the lens of a potential replacement when her term is up.
Maybe what makes me feel most uneasy about all of this is the way that the discussion is being framed – I don’t think that associating the Commissioner’s proposals with DRM is necessarily the right tactic, and I worry that in casting it as such, resentment develops towards a technology that is meant to address many of the privacy-associated concerns with CCTV. As she notes in her interview, she doesn’t think that she can stop CCTV; the best she can do is limit its effects. I think that this is the point of conflict that we want to take up – we want our public officials to fight the hard fights, and not ‘cop out’. I have nothing but respect for what Ann is trying to accomplish, but I have to ask: “Is it enough?”