There have been lots of good critiques and comments concerning Facebook’s recently announced “Graph Search” product. Graph Search lets individuals semantically query large datasets that are associated with data shared by their friends, friends-of-friends, and the public more generally. Greg Satell tries to put the product in context – Graph Search is really a way for corporations to peer into our lives – and a series of articles have tried to unpack the privacy implications of Facebook’s newest product.
I want to talk less directly about privacy, and more about how Graph Search threatens to further limit discourse on the network. While privacy is clearly implicated throughout the post, we can think of privacy not just as a loss for the individual but in terms of the broader social impacts of its loss. Specifically, I want to briefly reflect on how Graph Search (further?) transforms Facebook into a hostile discursive domain, and what this might mean for Facebook users.
Don Reisinger’s posting, “Pro-privacy initiatives are getting out of hand,” is a good read, even if I don’t think that he ‘gets’ the reason why privacy advocates are (should be?) concerned about Google Streetview. If you’ve been under a rock, Google is in the process of sending out cars (like the one at the top of this post) to photograph neighborhoods and cities. The aim? To let people actually see where they are going – when you get directions, you can see the streets and the buildings that you’ll be passing by. It also lets you evaluate how ‘safe’ a neighborhood is (ignoring the social biases that will be involved in any such estimation), and it has been talked about as a privacy violation because some people have been caught on camera doing things that they didn’t want to be caught doing.
Don: Privacy Wimps Stand Up, Sit Down, and Shut Up
Don’s general position is this: American law doesn’t protect your privacy in such a way that no one can look at or take a photo of your property. What’s more, even if you were doing something that you didn’t want to be seen doing in your home, and that action was captured by a Google car, don’t worry – no one really cares about you. In the new digital era, privacy by obscurity relies on poor search, poor image recognition, and even less interest in what you’re doing. Effectively, Streetview will be used for watching streets, and little else.
This is just a really quick thought that I wanted to toss out.
I perceive a problem associated with the digitization of public records: such digitization allows business interests to gather aggregate data on large collections of people while retaining identifiable characteristics. This allows for a phenomenal sorting potential. At the same time, we might ask, “is there anything we can, or really want to, do about this?”
I hear this a lot – ‘Chris, you have to understand that things are different now. The paradigm is shifting towards transparency, and there’s nothing wrong with that, and you’re being a pain in the ass suggesting that there is anything wrong with transparency. Do you have something to hide, or something like that?’ This particular line bothers the hell out of me, because I shouldn’t have to expose myself without giving my consent, especially when I previously enjoyed a greater degree of privacy as a consequence of obscurity and/or the costs involved with copying, sorting, and analyzing analogue records. I fail to see why I have to give up past nascent rights and expectations just because we can mine data more effectively (hell, that would have been a meaningless statement around the time that I was born…). Efficiency is not the same as superiority, and efficient is not the same as better or (necessarily) wanted.
The Canadian SIGINT Summaries includes downloadable copies, along with summary, publication, and original source information, of leaked CSE documents.
Parsons, Christopher; and Molnar, Adam. (2021). “Horizontal Accountability and Signals Intelligence: Lesson Drawing from Annual Electronic Surveillance Reports,” David Murakami Wood and David Lyon (Eds.), Big Data Surveillance and Security Intelligence: The Canadian Case.
Parsons, Christopher. (2015). “Stuck on the Agenda: Drawing lessons from the stagnation of ‘lawful access’ legislation in Canada,” Michael Geist (Ed.), Law, Privacy and Surveillance in Canada in the Post-Snowden Era (University of Ottawa Press).
Parsons, Christopher. (2015). “The Governance of Telecommunications Surveillance: How Opaque and Unaccountable Practices and Policies Threaten Canadians,” Telecom Transparency Project.
Parsons, Christopher. (2015). “Beyond the ATIP: New methods for interrogating state surveillance,” in Jamie Brownlee and Kevin Walby (Eds.), Access to Information and Social Justice (Arbeiter Ring Publishing).
Bennett, Colin; Parsons, Christopher; Molnar, Adam. (2014). “Forgetting and the right to be forgotten” in Serge Gutwirth et al. (Eds.), Reloading Data Protection: Multidisciplinary Insights and Contemporary Challenges.
Bennett, Colin; and Parsons, Christopher. (2013). “Privacy and Surveillance: The Multi-Disciplinary Literature on the Capture, Use, and Disclosure of Personal Information in Cyberspace,” in W. Dutton (Ed.), Oxford Handbook of Internet Studies.
McPhail, Brenda; Parsons, Christopher; Ferenbok, Joseph; Smith, Karen; and Clement, Andrew. (2013). “Identifying Canadians at the Border: ePassports and the 9/11 legacy,” in Canadian Journal of Law and Society 27(3).
Parsons, Christopher; Savirimuthu, Joseph; Wipond, Rob; McArthur, Kevin. (2012). “ANPR: Code and Rhetorics of Compliance,” in European Journal of Law and Technology 3(3).