[Note – I preface this with the following: I am not a lawyer, and what follows is a non-lawyer’s ruminations on how the Supreme Court’s thoughts on reasonable expectations of privacy intersect with what deep packet inspection (DPI) can potentially do. This is not meant to be a detailed examination of particular network appliances with particular characteristics, but much, much more general in nature.]
Whereas Kyllo v. United States saw the US Supreme Court assert that thermal-imaging devices, when directed towards citizens’ homes, did constitute an invasion of citizens’ privacy, the corresponding Canadian case (R. v. Tessling) saw the Supreme Court assert that RCMP thermal imaging devices did not violate Canadians’ Section 8 Charter rights (“Everyone has the right to be secure against unreasonable search or seizure”). The Court’s conclusions emphasized information privacy interests at the expense of normative expectations – thermal information, on its own, was practically ‘meaningless’ – which has led Ian Kerr and Jena McGill to worry that informational understandings of privacy invoke:
. . . a mesmerizing sleight of hand through which our minds are misdirected away from police choppers slashing through the night and patrol dogs perambulating corridors – these things no longer qualifying as searches – towards an extremely impersonal, non-social and merely informational scientific account of heat emanating from a building or odours emanating from luggage (Kerr and McGill 2007: 407).
In an era where datamining is an incredibly profitable market, and where individual data fragments on their own are practically meaningless (Daniel Solove has suggested that they are like the points in a Seurat painting), what does adopting an informational account towards privacy mean for securing packets from police surveillance practices? Do we have a reasonable expectation that the packets that stream to and from our computers – including those packets that applications generate and receive without our knowing in the background of typical computer uses – should be considered private?
To begin, it’s important to note that the Court was responding to police searches in Tessling, and not to corporate examinations of data that individuals agree to when they sign their End User Licensing Agreements (EULAs). Recognizing this, I want to think through what it would mean if police were to use corporate DPI appliances to evaluate and examine packet transmissions – would such examinations constitute a violation of Canadians’ reasonable expectation of privacy?
Generally, DPI appliances have not been configured by Canadian ISPs to engage in what (for this posting) I term ‘total dataveillance’; ISPs are not actually interested in retaining perfect records of where packets are traveling to and arriving from along with detailed inventories of the contents of those packets. Various ISPs have noted that their devices are not configured to engage in such a massive surveillance operation, and that any substantial reconfiguration of their devices would take months of labour (references to this are found in CRTC filings that I’ve summarized). At the same time, we do know that analysis of data traffic occurs; Canadian ISPs have rough numbers for traffic averages, and Bell Canada has implied that they classify consumers’ online actions by positing why wholesale customers consume a disproportionate amount of bandwidth:
- These consumers use HTTP for content sharing to greater extents (on average) than non-wholesale consumers.
- This consumer group is behaviourally disposed to consuming greater amounts of bandwidth, and is actively courted by retail wholesalers of Bell’s broadband network.
Bell’s classification of users is only possible because of a substantial amalgamation of what would otherwise be meaningless data. They have drawn composite packet data together, applied inferences, and come to particular conclusions. This process is similar to the inferences that were used by American police in Kyllo, where they measured a house’s heat emanations and combined them with other inferences to demonstrate cause to receive a search warrant.
In the case of DPI appliances, even where consumers use encryption to mask their packet transfers, some appliances can be configured to evaluate likely application-specific traffic based on unique packet transmission exchanges. Suppose that we say that encrypted packets are, in fact, ‘private’ – they would be the equivalent of the home. Would the inferences derived from how packets are being exchanged be ‘private’? Such inferences could be seen as analogous to heat – data exchanges alone do not reveal the contents of the packet, and sophisticated computer users cannot reasonably expect such information to be private, given that it is impossible to ‘lock up’ data exchange information. Such exchange information is like digital radiation.
Under this metric, we might equate an unencrypted packet as (effectively) being in public (similar to a postcard going through the mail), an encrypted packet as constituting a private enclosure (such as a letter), and the variation of packet transmission as being as public as heat emanations (such as the rate that mail is delivered to a home). On their own, variations in packet exchanges are effectively meaningless; the information derived from those exchanges only becomes meaningful when associated with public data such as what applications tend to exchange packets in a particular manner, with packets of a particular size, and so forth.
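To make the aggregation point concrete, here is a minimal, hypothetical sketch of how packet-exchange characteristics alone (packet sizes and the balance of upstream versus downstream traffic) could be matched against publicly known application behaviour to infer what an encrypted flow likely carries. The signature values and application labels below are invented for illustration; real DPI appliances use far richer statistical models.

```python
# Hypothetical sketch: inferring the application behind an encrypted flow
# from packet-exchange metadata alone. No payload is ever read.
# The "signatures" are invented for illustration, not real fingerprints.

from statistics import mean

# Invented signatures: (typical mean packet size in bytes,
# typical upstream/downstream packet ratio) per application class.
SIGNATURES = {
    "bittorrent-like": (1400, 0.9),   # large packets, symmetric exchange
    "voip-like":       (160, 1.0),    # small packets, symmetric exchange
    "web-browsing":    (900, 0.2),    # mixed sizes, mostly downstream
}

def classify_flow(packet_sizes, upstream_count, downstream_count):
    """Guess an application class from exchange metadata only."""
    size = mean(packet_sizes)
    ratio = upstream_count / max(downstream_count, 1)

    # Pick the signature with the smallest normalized distance.
    def distance(sig):
        sig_size, sig_ratio = sig
        return abs(size - sig_size) / sig_size + abs(ratio - sig_ratio)

    return min(SIGNATURES, key=lambda app: distance(SIGNATURES[app]))

# An encrypted flow of large, roughly symmetric packets is classified
# without decrypting anything:
print(classify_flow([1380, 1420, 1400, 1390],
                    upstream_count=50, downstream_count=55))
# → bittorrent-like
```

Each individual observation here (a packet size, a direction count) is as ‘meaningless’ as a single heat reading; the inference only emerges when the observations are aggregated and compared against public knowledge of how applications behave.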
If we adopt what Kerr and McGill call the ‘predictive’ approach to privacy, which is concerned “primarily with current standards of police practice and the technological state-of-the-art,” we enter space where “all one needs to do to alter the reasonable expectation of privacy standard is to engineer a change in people’s expectations” (Kerr and McGill 2007: 421-22). The normative understanding of privacy expectations, on the other hand, rests on subjective and objective expectations:
- Subjective: Individuals are expected to demonstrate that they have an expectation of privacy.
- Objective: The following contextual factors are considered in evaluating objective expectations: (i) the place where the alleged search occurred; (ii) whether the subject matter of the search was in public view; (iii) whether the subject matter had been abandoned or was already in the possession of third parties; (iv) the intrusiveness of the police technique utilized in the alleged search; (v) whether the information obtained by police exposed core biographical or intimate details of an individual’s life. (Kerr and McGill 2007: 409)
Let’s set aside the subjective expectation, and instead narrowly focus on the objective expectation of privacy as it pertains to using DPI appliances for police surveillance practices. The search will have occurred in a private installation that is owned by a third party, whom the consumer has permitted to examine or manage data traffic. While the search might be taking place on private property, the individual knows that pervasive examination of data traffic is possible and that results might be disclosed to authorities. The packets were knowingly placed in the hands of third parties, though they were not abandoned. The technique is unobtrusive, and bears resemblance to a wiretap insofar as the search can occur without the individual in question ever knowing. Further, the information garnered from analysis of particular encrypted packets does not, in and of itself, expose core biographical or intimate details.
On this narrow reading, it would appear as though individuals should not reasonably expect that they have an objective expectation of privacy. In light of this, it is helpful to turn with Kerr and McGill and argue that Tessling is incredibly specific – it applies to a particular technology being used in a particular manner – and thus does not establish a widespread precedent that applies to all ’emanation-based’ data. Paperny J. wrote in a majority reading of Tessling that corresponds with this reading: “the Supreme Court’s statements . . . are consistently confined to the factual situation and the type of technology before it” (Kant Brown in Kerr and McGill 2007: 428). On this reading, the emanations from packet transfers should be subject to a new reasonable expectations test, and one that goes beyond a simple analogy between heat emanations and encrypted packet characteristics.
A merely informational reading of privacy, or a narrow interpretation of the objective expectation of privacy, threatens to see the dots of our life’s Seurat painting disclosed without recognizing that the dots can and will be connected by inference. Alternately, we might engage in a broader understanding of the objective expectations of privacy test and look beyond particular dots in the painting to the image that they paint when seen in relation to one another. Under such an account the subject matter (packet streams) may not be public, insofar as there is a private contract between the individual and ISP and some subjective expectation that data will be kept private (e.g. surveys regularly reveal that if a website has a privacy policy, consumers expect that their data will not be shared). Further, the data is in transit with third parties, but has not been abandoned or fully turned over to them; ISPs are expected to act like postal officers and “merely” carry the mail. Finally, while individual data packets do not reveal core biographical information, it is possible that analyzing packet rates and transfers, in aggregate, does reveal such information. Effectively, when we move beyond the process that a particular packet goes through to reach its destination, and reflect on the broader scope of the transfer, we reach a different expectation of privacy that avoids the reductionism of informational privacy accounts and narrow interpretations of objective expectations of privacy.
It will be interesting to see how Canadian courts deal with these kinds of questions in the future; while I would hope that we would see rulings that require police to receive warrants to collect information like data transfer particularities, I worry that we’ll see something like the American situation, where historical privacy protections are thrown out once things go digital. At the same time, I guess that given the impetus born of Tessling it might not be such a bad thing for a ‘whole new’ set of privacy laws surrounding packet exchanges…
Question:
Why should the privacy expectations for electronic data on public networks be different than the privacy expectations of appearing in public?
As far as I am aware, anyone (public or police) can conduct surveillance of your home and your travel. They can note when you leave/arrive, determine where you travel, how, even how you’re dressed. They can even record what lights are on, at what times, how often you look out a window, etc. etc.
That’s all legal information gathering (and doesn’t require any particular technology). So why would the same information derived from data packets on public networks be different?
As far as I’m aware, you are correct in what the police can and cannot do re: surveillance. I’m trying to think about whether or not packet transfer characteristics would be a kind of ’emanation’ from the packet, and thus fall into a discussion of Tessling – if they do, then do we want to read Tessling in a fairly analogous way, or try to suggest that ’emanation’ tests unfairly capture information? The issue, as you point out, is that if we read ISP networks as public networks, then there is a question about whether or not we can expect any privacy when moving data across them.
I think that there needs to be a right to privacy across communications networks for the same reasons that we expect privacy along telephone lines: courts have insisted that wiretaps are detrimental to the democratic environment that we live in. If we accept that there is no/little privacy along ‘public networks’, then the communicative privacy that is needed to maintain democracies is challenged. In effect, privacy and freedom of speech are tightly aligned. This way of looking at privacy in public networks would likely be encompassed in my ‘broad’ understanding of normative expectations of privacy, or perhaps would be situated in the ‘subjective’ category (going to talk with Dr. Kerr about this difference later today).
If we adopt an informational approach to privacy, however, I worry that we get into a situation where we’re ‘just’ talking about particular data packets, and miss the fact that understanding communications data away from their social context threatens to ignore the broader social impacts.