The abstract for my presentation, as well as references, have already been made available. I wasn’t aware (or had forgotten) that all the presentations from Social Media Camp Victoria were going to be recorded and put on the web, but thought that others visiting this space might be interested in my talk. The camera is zoomed in on me, which means you miss some of the context provided by slides and references to people in the audience as I was talking. (Having quickly looked at and listened to some of what I say, I feel as though I’m adopting a presentation style similar to a few people I watch a lot. Not sure how I feel about that… The inability to actually walk around – being tethered to the mic and laptop – was particularly uncomfortable, which comes across in my body language, I think.)
Immediately after my presentation, Kris Constable of PrivaSecTech gives a privacy talk on social media that focuses on the inability to control personal information dissemination. Following his presentation, the two of us take questions from the audience for twenty or thirty minutes.
Apple entered the mobile advertising marketplace with its announcement of iAd. Alongside iAd comes persistent locational surveillance of Apple’s customers for the benefit of advertisers and Apple. The company’s advertising platform is controversial because Apple gives it a privileged position in their operating system, iOS4, and because the platform can draw on an iPhone’s locational awareness (using the phone’s GPS functionality) to deliver targeted ads.
In this post I’m going to first give a brief background on iAd and some of the broader issues surrounding Apple’s deployment of their advertising platform. From there, I want to recap what Steve Jobs stated in a recent interview at the All Things Digital 8 conference concerning how Apple approaches locational surveillance through their mobile devices, and then launch into an analysis of Apple’s recently changed terms of service for iOS4 devices as they relate to collecting, sharing, and retaining records of an iPhone’s geographic location. I’ll finish by noting that Apple may have inadvertently gotten itself into serious trouble as a result of its heavy-handed control of the iAd environment combined with modifying the privacy-related elements of their terms of service: Apple seems to have awoken the German data protection authorities. Hopefully the Germans can bring some transparency to a company regularly cloaked in secrecy.
Apple launched the iAd beta earlier this year and has integrated the advertising platform into their mobile environment such that ads appear within applications, and clicking on an ad does not take individuals out of the particular application they are using. iAds can access core iOS4 functionality, including locational information, and can be coded using HTML 5 to provide rich advertising experiences. iAd was only made possible by Apple’s January acquisition of Quattro, a mobile advertising company; Apple purchased Quattro after Google foiled its earlier attempt to acquire AdMob last year (with the FTC recently citing iAd as a contributing reason why the Google transaction was permitted to go through). Ostensibly, the rich advertising from iAds is intended to help developers produce cheap and free applications for Apple’s mobile devices while retaining a long-term, ad-based revenue stream. Arguably, with Apple taking a 40% cut of all advertising revenue and limiting access to the largest rich-media mobile platform in the world, advertising makes sense for Apple’s own bottom line, and it’s just nice that they can ‘help’ developers along the way…
The Western world is pervaded by digital information, to the point where we might argue that most Western citizens operate in a bio-digital field that is constituted by the conditions of life and life’s (now intrinsic) relationships to digital code. While historically (if 30 years or so can withstand the definitional intonations of ‘historically’) such notions of code would dominantly pertain to government databanks and massive corporate uses of code and data, with the advent of the ‘social web’ and the ease of mashups we are forced to engage with questions of how information, code, and privacy norms and regulations pertain to individuals’ usage of data sources. While in some instances we see penalties being handed down to individuals who publicly release sensitive information (such as Sweden’s Bodil Lindqvist, who was fined for posting personal data about fellow church parishioners without consent), what is the penalty when public information is situated outside of its original format and mashed up with other data sources? What happens when we correlate data to ‘map’ it?
Let’s get into some ‘concrete’ examples to engage with this matter. First, I want to point to geo-locating traceroute data – the information that identifies the origin of website visitors’ data traffic – to start thinking about mashups and privacy infringements. Second, I’ll briefly point to some of the challenges arising with the meta-coding of the world using Augmented Reality (AR) technologies. The overall aim is not to ‘resolve’ any privacy questions, but to try and reflect on differences in the ‘specificity’ of geolocation technology, the implications of that specificity, and the potential need to establish a new set of privacy norms given the bio-digital fields that we find ourselves immersed in.
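To make the first example a little more tangible: the ‘mashup’ in question is essentially a join between ostensibly anonymous traffic records and a geolocation database. The sketch below illustrates the mechanics only; the prefix-to-city table is invented for illustration, and real geolocation databases resolve locations far more precisely.

```python
# A minimal sketch of the mashup described above: joining visitor IP
# addresses from a server log against a geolocation table to 'map' them.
# GEO_DB is a hypothetical, hand-made table; commercial databases map
# prefixes to cities (or finer) at much higher resolution.
from ipaddress import ip_network, ip_address

# Hypothetical geolocation database: network prefix -> (place, precision in km)
GEO_DB = {
    ip_network("142.104.0.0/16"): ("Victoria, BC", 25),
    ip_network("128.100.0.0/16"): ("Toronto, ON", 25),
}

def locate(ip: str):
    """Return (place, precision_km) for an IP, or None if unknown."""
    addr = ip_address(ip)
    for net, place in GEO_DB.items():
        if addr in net:
            return place
    return None

# Correlating a traffic log with the table turns anonymous-looking
# records into a rough map of where visitors are.
visitor_log = ["142.104.6.1", "128.100.3.30", "8.8.8.8"]
located = {ip: locate(ip) for ip in visitor_log}
```

The privacy question the post raises is visible even in this toy: neither the log nor the table is sensitive on its own, but the correlation produces something new.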
In a conversation with Prof. Andrew Clement this summer we got talking about the ever-increasing deployment of CCTV cameras throughout Canada. The conversation was, at least in part, motivated by the massive number of cameras being deployed throughout Vancouver in the lead-up to the 2010 Olympic games; these cameras were a key focus of the 10th Annual Security and Privacy Conference, where the BC Privacy Commissioner said that he might resign if the surveillance infrastructure is not taken down following the games.
I don’t want to delve into what, in particular, Prof. Clement is thinking of doing surrounding CCTV, given that I don’t think he’s publicly announced his intentions. What I will do, however, is outline my own two-pronged approach to rendering CCTV a little more transparent. At the outset, I’ll note that:
My method will rely on technology (augmented reality) that is presently only in the hands of a small minority of the population;
My method is meant to be more and more useful as the years continue (and as the technology becomes increasingly accessible to consumers).
The broad goal is the following: develop a set of norms and processes to categorize different CCTV installations. Having accomplished this task, a framework would be developed for an augmented reality program (here’s a great blog on AR) that could ‘label’ where CCTV installations are and ‘grade’ them based on the already established norms and processes.
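To give a sense of what the second prong might look like in practice, here is a sketch of how a categorized, graded CCTV installation could be represented as data that an AR client consumes. Everything here is a placeholder: the fields, the grading rule, and the letter grades are invented stand-ins for the norms and processes the project would first need to establish.

```python
# A sketch of a record an AR program might 'label' a camera with.
# The attributes and the toy grading rule are illustrative assumptions,
# not the actual norms/processes proposed above.
from dataclasses import dataclass

@dataclass
class CCTVInstallation:
    lat: float
    lon: float
    operator: str        # e.g. "municipal", "private"
    signage: bool        # is the camera disclosed on-site?
    retention_days: int  # how long footage is kept

    def grade(self) -> str:
        """Toy rule: disclosed cameras with short retention score best."""
        score = (1 if self.signage else 0) + (1 if self.retention_days <= 30 else 0)
        return {2: "A", 1: "B", 0: "C"}[score]

# A camera in downtown Victoria, disclosed, with two-week retention.
cam = CCTVInstallation(48.4284, -123.3656, "municipal", True, 14)
```

An AR overlay could then render `cam.grade()` at the camera’s coordinates, making the installation’s privacy posture visible to passers-by.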
The Canadian SIGINT Summaries includes downloadable copies, along with summary, publication, and original source information, of leaked CSE documents.
Molnar, Adam; Parsons, Christopher; Zouave, Erik. (2017). “Computer network operations and ‘rule-with-law’ in Australia,” Internet Policy Review 6(1).
Parsons, Christopher; Israel, Tamir. (2016). “Gone Opaque? An Analysis of Hypothetical IMSI Catcher Overuse in Canada,” Citizen Lab – Telecom Transparency Project // CIPPIC.
Parsons, Christopher. (2015). “Beyond Privacy: Articulating the Broader Harms of Pervasive Mass Surveillance,” Media and Communication 3(3).
Parsons, Christopher. (2015). “Stuck on the Agenda: Drawing lessons from the stagnation of ‘lawful access’ legislation in Canada,” Michael Geist (ed.), Law, Privacy and Surveillance in Canada in the Post-Snowden Era (Ottawa University Press).
Parsons, Christopher. (2015). “The Governance of Telecommunications Surveillance: How Opaque and Unaccountable Practices and Policies Threaten Canadians,” Telecom Transparency Project.
Parsons, Christopher. (2015). “Beyond the ATIP: New methods for interrogating state surveillance,” in Jamie Brownlee and Kevin Walby (Eds.), Access to Information and Social Justice (Arbeiter Ring Publishing).
Parsons, Christopher; Molnar, Adam. (2014). “Watching Below: Dimensions of Surveillance-by-UAVs in Canada” for the Surveillance Studies Centre and British Columbia Civil Liberties Association.
Bennett, Colin; Parsons, Christopher; Molnar, Adam. (2014). “Forgetting and the right to be forgotten” in Serge Gutwirth et al. (Eds.), Reloading Data Protection: Multidisciplinary Insights and Contemporary Challenges.
Bennett, Colin; Parsons, Christopher. (2013). “Privacy and Surveillance: The Multi-Disciplinary Literature on the Capture, Use, and Disclosure of Personal Information in Cyberspace” in W. Dutton (Ed.), Oxford Handbook of Internet Studies.
McPhail, Brenda; Parsons, Christopher; Ferenbok, Joseph; Smith, Karen; and Clement, Andrew. (2013). “Identifying Canadians at the Border: ePassports and the 9/11 legacy,” in Canadian Journal of Law and Society 27(3).
Parsons, Christopher; Savirimuthu, Joseph; Wipond, Rob; McArthur, Kevin. (2012). “ANPR: Code and Rhetorics of Compliance,” in European Journal of Law and Technology 3(3).