The Western world is pervaded by digital information, to the point where we might argue that most Western citizens operate in a bio-digital field constituted by the conditions of life and life’s (now intrinsic) relationships to digital code. While historically (if 30 years or so can withstand the definitional intonations of ‘historically’) such notions of code would dominantly pertain to government databanks and massive corporate uses of code and data, with the advent of the ‘social web’ and the ease of mashups we are forced to engage with questions of how information, code, and privacy norms and regulations pertain to individuals’ use of data sources. While in some instances we see penalties handed down to individuals who publicly release sensitive information (such as Sweden’s Bodil Lindqvist, who was fined for posting personal data about fellow church parishioners without consent), what is the penalty when public information is taken out of its original format and mashed up with other data sources? What happens when we correlate data to ‘map’ it?
Let’s get into some ‘concrete’ examples to engage with this matter. First, I want to point to geo-locating traceroute data – the information that identifies the origin of website visitors’ data traffic – to start thinking about mashups and privacy infringements. Second, I’ll briefly point to some of the challenges arising with the meta-coding of the world using Augmented Reality (AR) technologies. The overall aim is not to ‘resolve’ any privacy questions, but to reflect on differences in the ‘specificity’ of geolocation technologies, the implications of that specificity, and the potential need to establish a new set of privacy norms given the bio-digital fields in which we find ourselves immersed.
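To make the first example concrete, consider what ‘mashing up’ visitor traffic with a geolocation source looks like in practice. The sketch below is purely illustrative: the prefix-to-city table is invented (using reserved documentation IP ranges), whereas real services resolve addresses against commercial GeoIP databases.

```python
# Illustrative only: a toy prefix-to-place table standing in for a real
# GeoIP database. The IP ranges below are reserved documentation ranges.
GEO_TABLE = {
    "192.0.2.": ("Victoria, BC", 48.43, -123.37),
    "198.51.100.": ("Toronto, ON", 43.65, -79.38),
}

def locate(ip):
    """Return a coarse (city, lat, lon) guess for an IP address, or None."""
    for prefix, place in GEO_TABLE.items():
        if ip.startswith(prefix):
            return place
    return None

# Correlating a single log entry with a place is all it takes to 'map' a visitor.
print(locate("192.0.2.44"))  # ('Victoria, BC', 48.43, -123.37)
```

The point of the sketch is how little is required: a server log (public-ish data in one format) joined against a location table (data in another format) yields a map of visitors that neither source provided on its own.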
The above image was taken by a Google Streetcar. As is evident, all of the faces in the picture have been blurred in accordance with Google’s anonymization policy. I think that the image works nicely as a lightning rod to capture some of the criticisms and questions that have arisen around Streetview:
Does the Streetview image-taking process itself, generally, constitute a privacy violation of some sort?
Is individuals’ privacy secured just by blurring faces?
Is this woman’s privacy being violated/infringed upon in some way as a result of having her photo taken?
Google’s response is, no doubt, that individuals who feel an image is inappropriate can contact the company, which will then take the image offline. The problem is that this puts the onus on individuals, though we might grant that Google recognizes photographic privacy as a social value insofar as any member of society who sees this as a privacy infringement/violation can also ask Google to remove the image. Still, even in the latter case this ‘outsources’ privacy to the community and is a reactive, rather than proactive, way of limiting privacy invasions (if, in fact, the image above constitutes an ‘invasion’). Regardless of whether we want to see privacy as an individual or a social value (or, better, as valuable both for individuals and society), we can perhaps more simply ponder whether blurring the face alone is enough to secure individuals’ privacy. Is anonymization the same as securing privacy?
…users will soon be seeing an ActionScript Fire Eagle library and a Mozilla Firefox geo-plugin that locates users via WiFi MAC addresses. Also coming up are new XMPP libraries. (Source)
It’s the Firefox geo-plugin that I think will be most interesting to watch. Given that Mozilla is currently developing its Fennec browser for mobile environments, the Fire Eagle plugin could come to phones and other mobile devices that have WiFi Internet access but lack GPS or a data plan. Using a browser plugin, it should be possible to identify your location on a map simply by being in the vicinity of wireless access points (APs), regardless of whether you can actually authenticate to them (similar to how users of iPod Touches can currently roughly locate themselves on Google Maps via WiFi MAC address detection). Below is an image of Mozilla’s beta version of Fennec.
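The core of WiFi positioning is simpler than it sounds: a device reports the MAC addresses of whatever APs it can see, and those are looked up in a database built by prior wardriving-style surveys. The sketch below assumes a tiny hand-made survey table with fabricated MAC addresses and coordinates; real services (Skyhook, Google) work at vastly larger scale and weight by signal strength.

```python
# Fabricated survey data: AP MAC address -> (latitude, longitude).
# Real positioning services build such tables from street-level surveys.
AP_SURVEY = {
    "00:11:22:33:44:55": (48.4284, -123.3656),
    "66:77:88:99:aa:bb": (48.4290, -123.3650),
}

def estimate_position(visible_macs):
    """Average the known coordinates of whichever visible APs appear in the survey.

    No authentication to any AP is required -- merely seeing its beacon
    frames is enough, which is what makes the technique so frictionless.
    """
    hits = [AP_SURVEY[m] for m in visible_macs if m in AP_SURVEY]
    if not hits:
        return None
    lat = sum(p[0] for p in hits) / len(hits)
    lon = sum(p[1] for p in hits) / len(hits)
    return (lat, lon)

# Two known APs plus one unknown AP still yields a usable fix.
print(estimate_position(["00:11:22:33:44:55", "66:77:88:99:aa:bb", "de:ad:be:ef:00:01"]))
```

Note the privacy asymmetry this illustrates: the APs’ owners never consented to anchoring anyone’s positioning system, yet their hardware identifiers have become public infrastructure for locating passers-by.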
In the past week or so, Google has received an enormous amount of attention because of its Latitude program. Once installed and enabled, Latitude will alert specified friends to your geographic location either very specifically (i.e. street address) or more broadly (i.e. city). Google has designed the system so that users can turn it off and can adjust how precisely it locates them, and has (really) just caught up to technologies that its competitors have already been playing with (I wrote a little about Yahoo!’s Fire Eagle service, which is similar to Latitude, a few months ago).
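The ‘precision dial’ that Latitude-style services offer can be understood as deliberately degrading coordinates before sharing them. A minimal sketch, assuming simple decimal rounding (real services may snap to named places instead); the mapping of decimal places to ‘street’ versus ‘city’ granularity is approximate:

```python
# Coarsening a coordinate pair by rounding: each decimal place dropped
# multiplies the ambiguity by roughly a factor of ten (~11 m at 4 places,
# ~11 km at 1 place, at these latitudes).
def coarsen(lat, lon, decimals):
    """Round a coordinate pair; fewer decimals -> larger area of ambiguity."""
    return (round(lat, decimals), round(lon, decimals))

precise = (48.42842, -123.36564)   # roughly street-level
print(coarsen(*precise, 4))        # block-level sharing
print(coarsen(*precise, 1))        # city-level sharing
```

The design choice worth noticing is that precision, not disclosure, becomes the privacy control: you are always locatable to your friends, and the only question is how tightly.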
While many people have already written and spoken about Latitude, I’ve found myself on the fence. On the one hand, I think that some of the criticisms of the program’s ‘privacy’ features have been inane – at least one privacy advocate’s core ‘contribution’ has been the worry that individuals might be given a phone with Latitude installed and active without knowing about its presence or activation. As a result, they would be tracked without having consented to the program, or to the geo-surveillance.