[Note – I preface this with the following: I am not a lawyer, and what follows is a non-lawyer’s ruminations on how the Supreme Court’s thoughts on reasonable expectations of privacy intersect with what deep packet inspection (DPI) can potentially do. This is not meant to be a detailed examination of particular network appliances with particular characteristics, but much, much more general in nature.]
Whereas Kyllo v. United States saw the US Supreme Court assert that thermal-imaging devices, when directed towards citizens’ homes, did constitute an invasion of citizens’ privacy, the corresponding Canadian case (R. v. Tessling) saw the Supreme Court assert that RCMP thermal imaging devices did not violate Canadians’ Section 8 Charter rights (“Everyone has the right to be secure against unreasonable search or seizure”). The Court’s conclusions emphasized informational privacy interests at the expense of normative expectations – thermal information, on its own, was practically ‘meaningless’ – which has led Ian Kerr and Jena McGill to worry that informational understandings of privacy invoke:
. . . a mesmerizing sleight of hand through which our minds are misdirected away from police choppers slashing through the night and patrol dogs perambulating corridors – these things no longer qualifying as searches – towards an extremely impersonal, non-social and merely informational scientific account of heat emanating from a building or odours emanating from luggage (Kerr and McGill 2007: 407).
In an era where data mining is an incredibly profitable market, and where individual data fragments on their own are practically meaningless (Daniel Solove has suggested that they are like the points in a Seurat painting), what does adopting an informational account of privacy mean for securing packets from police surveillance practices? Do we have a reasonable expectation that the packets that stream to and from our computers – including those packets that applications generate and receive, without our knowing, in the background of typical computer use – should be considered private?
To begin, it’s important to note that the Court was responding to police searches in Tessling, and not to corporate examinations of data that individuals agree to when they sign their End User Licensing Agreements (EULAs). Recognizing this, I want to think through what it would mean if police were to use corporate DPI appliances to evaluate and examine packet transmissions – would such examinations constitute a violation of Canadians’ reasonable expectation of privacy?
Generally, DPI appliances have not been configured by Canadian ISPs to engage in what (for this posting) I term ‘total dataveillance’; ISPs are not actually interested in retaining perfect records of where packets are traveling to and arriving from, along with detailed inventories of the contents of those packets. Various ISPs have noted that their devices are not configured to engage in such a massive surveillance operation, and that any substantial reconfiguration of their devices would take months of labour (references to this are found in CRTC filings that I’ve summarized). At the same time, we do know that analysis of data traffic occurs; Canadian ISPs have rough numbers for traffic averages, and Bell Canada has implied that they classify consumers’ online actions by positing why wholesale customers consume a disproportionate amount of bandwidth:
- These consumers use HTTP for content sharing to greater extents (on average) than non-wholesale consumers.
- This consumer group is behaviourally disposed to consuming greater amounts of bandwidth, and is actively courted by retail wholesalers of Bell’s broadband network.
Bell’s classification of users is only possible because of a substantial amalgamation of what would otherwise be meaningless data. They have drawn composite packet data together, applied inferences, and come to particular conclusions. This process is similar to the inferences used by American police in Kyllo, where they measured a house’s heat emanations and combined them with other evidence to demonstrate cause to receive a search warrant.
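To make the point concrete, here is a minimal sketch of that amalgamation step. Everything in it – the flow records, the protocol labels, the threshold, and the function names – is invented for illustration; it is not how any ISP’s appliance actually works. The point is simply that no single record says much, while the aggregate supports a behavioural classification of the sort Bell posits:

```python
from collections import defaultdict

# Hypothetical flow records: (user_id, protocol, bytes_transferred).
# Any single record reveals almost nothing on its own; a classification
# only emerges once the records are amalgamated per user.
flow_records = [
    ("user_a", "HTTP", 50_000_000),
    ("user_a", "HTTP", 120_000_000),
    ("user_a", "DNS", 2_000),
    ("user_b", "HTTP", 1_000_000),
    ("user_b", "SMTP", 500_000),
]

def classify_users(records, heavy_threshold=100_000_000):
    """Aggregate per-user, per-protocol byte counts, then label users
    whose total HTTP traffic crosses an (arbitrary) threshold."""
    totals = defaultdict(lambda: defaultdict(int))
    for user, proto, nbytes in records:
        totals[user][proto] += nbytes
    labels = {}
    for user, by_proto in totals.items():
        http_bytes = by_proto.get("HTTP", 0)
        labels[user] = "heavy-http" if http_bytes >= heavy_threshold else "typical"
    return labels

print(classify_users(flow_records))
# → {'user_a': 'heavy-http', 'user_b': 'typical'}
```

The privacy-relevant move happens in the aggregation loop: meaning is manufactured by combination, not read off any individual packet – the same structure as combining heat emanations with other evidence in Kyllo.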
In the case of DPI appliances, even where consumers use encryption to mask their packet transfers, some appliances can be configured to infer likely application-specific traffic from distinctive patterns of packet exchange. Suppose that we say that encrypted packets are, in fact, ‘private’ – they would be the equivalent of the home. Would the inferences derived from how packets are being exchanged be ‘private’? Such inferences could be seen as analogous to heat – data exchanges alone do not reveal the contents of the packet, and sophisticated computer users cannot reasonably expect such information to be private, given that it is impossible to ‘lock up’ data exchange information. Such exchange information is like digital radiation.
Under this metric, we might equate an unencrypted packet as (effectively) being in public (similar to a postcard going through the mail), an encrypted packet as constituting a private enclosure (such as a letter), and the variation of packet transmission as being as public as heat emanations (such as the rate that mail is delivered to a home). On their own, variations in packet exchanges are effectively meaningless; the information derived from those exchanges only becomes meaningful when associated with public data such as what applications tend to exchange packets in a particular manner, with packets of a particular size, and so forth.
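A brief sketch of how such an inference might work, with everything – the ‘fingerprints’, the tolerances, the observed flow – invented for illustration rather than taken from any real appliance. The payload of each packet is treated as opaque (encrypted); only the sizes of the exchanged packets are compared against publicly knowable application patterns:

```python
# Illustrative application 'fingerprints': typical packet-size sequences
# that remain observable even when payloads are encrypted. Both the
# fingerprints and the observed flow below are made up for this sketch.
FINGERPRINTS = {
    "bittorrent-like": [68, 68, 1400, 1400, 1400],
    "voip-like":       [160, 160, 160, 160, 160],
}

def guess_application(observed_sizes, tolerance=0.15):
    """Return the fingerprint whose packet sizes best match the observed
    flow, comparing sizes position by position within a relative tolerance."""
    best, best_score = "unknown", 0.0
    for name, pattern in FINGERPRINTS.items():
        n = min(len(pattern), len(observed_sizes))
        matches = sum(
            1 for p, o in zip(pattern[:n], observed_sizes[:n])
            if abs(p - o) <= tolerance * p
        )
        score = matches / n
        if score > best_score:
            best, best_score = name, score
    return best if best_score >= 0.8 else "unknown"

# An encrypted flow whose contents are unreadable but whose sizes are visible:
print(guess_application([70, 66, 1380, 1420, 1390]))
# → bittorrent-like
```

Nothing here opens the ‘letter’; the inference runs entirely on the ‘emanations’ – which is precisely why the heat analogy is so tempting, and so worth scrutinizing.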
If we adopt what Kerr and McGill call the ‘predictive’ approach to privacy, which is concerned “primarily with current standards of police practice and the technological state-of-the-art,” we enter space where “all one needs to do to alter the reasonable expectation of privacy standard is to engineer a change in people’s expectations” (Kerr and McGill 2007: 421-22). The normative understanding of privacy expectations, on the other hand, rests on subjective and objective expectations:
- Subjective: Individuals are expected to demonstrate that they have an expectation of privacy.
- Objective: The following contextual factors are considered in evaluating objective expectations: (i) the place where the alleged search occurred; (ii) whether the subject matter of the search was in public view; (iii) whether the subject matter had been abandoned or was already in the possession of third parties; (iv) the intrusiveness of the police technique utilized in the alleged search; (v) whether the information obtained by police exposed core biographical or intimate details of an individual’s life. (Kerr and McGill 2007: 409)
Let’s set aside the subjective expectation, and instead narrowly focus on the objective expectation of privacy as it pertains to using DPI appliances for police surveillance practices. The search will have occurred in a private installation that is owned by a third party, whom the consumer has permitted to examine and manage data traffic. While the search might be taking place on private property, the individual knows that pervasive examination of data traffic is possible and that results might be disclosed to authorities. The packets were knowingly placed in the hands of third parties, though they were not abandoned. The technique is unobtrusive, and bears resemblance to a wiretap insofar as the search can occur without the individual in question ever knowing. Further, the information garnered from analysis of particular encrypted packets does not, in and of itself, expose core biographical or intimate details.
On this narrow reading, it would appear as though individuals should not reasonably expect that they have an objective expectation of privacy. In light of this, it is helpful to turn with Kerr and McGill and argue that Tessling is incredibly specific – it applies to a particular technology being used in a particular manner – and thus does not establish a widespread precedent that applies to all ‘emanation-based’ data. Paperny J. offered a reading of Tessling that corresponds with this one: “the Supreme Court’s statements . . . are consistently confined to the factual situation and the type of technology before it” (Kang-Brown in Kerr and McGill 2007: 428). On this reading, the emanations from packet transfers should be subject to a new reasonable expectations test, one that goes beyond a simple analogy between heat emanations and encrypted packet characteristics.
It will be interesting to see how Canadian courts deal with these kinds of questions in the future; while I would hope that we would see rulings that require police to receive warrants to collect information like data transfer particularities, I worry that we’ll see something like the American situation, where historical privacy protections are thrown out as soon as things go digital. At the same time, I guess that given the impetus born of Tessling it might not be such a bad thing for a ‘whole new’ set of privacy laws surrounding packet exchanges…