Touring the digital through type

Tag: privacy policies (Page 1 of 2)

References for ‘Putting the Meaningful into Meaningful Consent’

By Stephanie Booth

During my presentation last week at Social Media Club Vancouver – abstract available! – I drew on a large set of sources, the majority of which differed from those in my earlier talk at Social Media Camp Victoria. As I noted earlier, it's almost impossible to give full citations in the middle of a talk, but I want to make them available post-talk for interested parties.

Below is my keynote presentation and list of references. Unfortunately academic paywalls prevent me from linking to all of the items used, to say nothing of chapters in various books. Still, most of the articles should be accessible through Canadian university libraries, and most of the books are in print (if sometimes expensive).

I want to thank Lorraine Murphy and Cathy Browne for inviting me and doing a stellar job of publicizing my talk to the broader media. It was a delight speaking to the group at SMC Vancouver, as well as to reporters and their audiences across British Columbia and Alberta.

Keynote presentation [20.4MB; made in Keynote ’09]

References

Bennett, C. (1992). Regulating Privacy: Data Protection and Public Policy in Europe and the United States. Ithaca: Cornell University Press.

Bennett, C. (2008). The Privacy Advocates: Resisting the Spread of Surveillance. Cambridge, MA: The MIT Press.

Carey, R. and Burkell, J. (2009). ‘A Heuristics Approach to Understanding Privacy-Protecting Behaviors in Digital Social Environments’, in I. Kerr, V. Steeves, and C. Lucock (eds.). Lessons From the Identity Trail: Anonymity, Privacy and Identity in a Networked Society. Toronto: Oxford University Press. 65-82.

Chew, M., Balfanz, D., and Laurie, B. (2008). ‘(Under)mining Privacy in Social Networks’, Proceedings of W2SP: Web 2.0 Security and Privacy: 1-5.

Fischer-Hübner, S., Pettersson, J. S., and Bergmann, M. (2008). ‘HCI Designs for Privacy-Enhancing Identity Management’, in A. Acquisti and S. Gritzalis (eds.). Digital Privacy: Theory, Technologies, and Practices. New York: Auerbach Publications. 229-252.

Flaherty, D. (1972). Privacy in Colonial England. Charlottesville, VA: University Press of Virginia.

Hoofnagle, C., King, J., Li, S., and Turow, J. (2010). ‘How Different are Young Adults from Older Adults When it Comes to Information Privacy Attitudes and Policies?’ Available at: http://www.ftc.gov/os/comments/privacyroundtable/544506-00125.pdf

Karyda, M., and Kokolakis, S. (2008). ‘Privacy Perceptions among Members of Online Communities’, in A. Acquisti and S. Gritzalis (eds.). Digital Privacy: Theory, Technologies, and Practices. New York: Auerbach Publications. 253-266.

Kerr, I., Barrigar, J., Burkell, J., and Black, K. (2009). ‘Soft Surveillance, Hard Consent: The Law and Psychology of Engineering Consent’, in I. Kerr, V. Steeves, and C. Lucock (eds.). Lessons From the Identity Trail: Anonymity, Privacy and Identity in a Networked Society. Toronto: Oxford University Press. 5-22.

Marwick, A. E., Murgia-Diaz, D., and Palfrey Jr., J. G. (2010). ‘Youth, Privacy and Reputation (Literature Review)’. Berkman Center Research Publication No. 2010-5; Harvard Law Working Paper No. 10-29. URL: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1588163

O’Reilly, T., and Battelle, J. (2009). ‘Web Squared: Web 2.0 Five Years On’. Presented at Web 2.0 Summit 2009. URL: http://www.web2summit.com/web2009/public/schedule/detail/10194

Steeves, V. (2009). ‘Reclaiming the Social Value of Privacy’, in I. Kerr, V. Steeves, and C. Lucock (eds.). Lessons From the Identity Trail: Anonymity, Privacy and Identity in a Networked Society. Toronto: Oxford University Press.

Steeves, V., and Kerr, I. (2005). ‘Virtual Playgrounds and Buddybots: A Data-Minefield for Tweens’, Canadian Journal of Law and Technology 4(2): 91-98.

Turow, J., King, J., Hoofnagle, C. J., Bleakley, A., and Hennessy, M. (2009). ‘Contrary to What Marketers Say, Americans Reject Tailored Advertising and Three Activities that Enable It’. Available at: http://graphics8.nytimes.com/packages/pdf/business/20090929-Tailored_Advertising.pdf

Turow, J. (2007). ‘Cracking the Consumer Code: Advertisers, Anxiety, and Surveillance in the Digital Age’, in The New Politics of Surveillance and Visibility. Toronto: University of Toronto Press.

Do You Know Who Your iPhone’s Been Calling?

An increasing percentage of Western society carries, every day, a computer enabled with geo-locative technology. We call these devices smartphones, and they are cherished pieces of technology. While people are (sub)consciously aware of their attachment to these devices, they are far less aware of how the devices compromise their privacy, and that is the topic of this post.

Recent reports on the state of the iPhone operating system show that the device's APIs permit incredibly intrusive surveillance of personal behaviour and actions. I'll walk through those reports and then write more broadly about why understanding how APIs function matters if scrutiny of phones, social networks, and so forth is to be meaningful. Further, I'll argue that privacy policies – while potentially useful for limiting companies' legal liability – do little to actually educate end-users about a corporate privacy ethos. These policies, as a result, need to be written in a more accessible format, which may include a statement of privacy ethics baked into a three-stage privacy statement.

iOS devices, such as the iPhone, iPad, Apple TV 2.0, and iPod touch, have Unique Device Identifiers (UDIDs) that can be used to discreetly track how customers use applications associated with the device. A recent technical report, written by Eric Smith of PSKL, has shed light on how developers can access a device's UDID and correlate it with personally identifiable information. UDIDs are, in effect, serial numbers that are accessible by software. Many of the issues surrounding the UDID are arguably similar to those around the Pentium III's serial codes, which raised the wrath of the privacy community and were quickly discontinued (a report on the PIII privacy concerns is available here).
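To make the correlation concern concrete, here is a minimal sketch (the app names, UDID value, and account data are all invented for illustration, not drawn from Smith's report): once any one application transmits the UDID alongside a login, a shared analytics backend can join that serial number both to a real identity and to activity reported by every other app on the device.

```python
# Invented example data: activity reported per-UDID by two different apps,
# plus one app's login records tying the same UDID to a named account.
device_logs = [
    {"udid": "2b6f0cc902a9", "app": "WeatherNow", "event": "launch"},
    {"udid": "2b6f0cc902a9", "app": "ShopFast", "event": "purchase"},
]
accounts = {"2b6f0cc902a9": {"name": "Alice", "email": "alice@example.com"}}

def profile_for(udid):
    """Correlate one device's cross-app activity with its known identity."""
    activity = [(e["app"], e["event"]) for e in device_logs if e["udid"] == udid]
    return {"identity": accounts.get(udid), "activity": activity}
```

The point of the sketch is that the join key is the hardware serial itself, so no single app needs to "leak" anything unusual for the aggregate profile to emerge.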


APIs, End-Users, and the Privacy Commons

Mozilla is throwing its hat into the ‘privacy commons’ ring. Inspired by Aza Raskin’s ‘Making Privacy Policies Not Suck’, Mozilla is trying to think through a series of icons intended to educate users about websites’ privacy policies. This is inspirational, insofar as a large corporation is actually taking up the challenge of the privacy commons, but at the same time we’ve heard promises of a uniform privacy-analysis system before: in 1998. A working draft of the Platform for Privacy Preferences (P3P) was released on May 19, 1998, during the still-heady times when people thought that Privacy Enhancing Technologies (PETs) could secure people’s online privacy or, at least, make them aware of privacy dangers. The P3P initiative failed.

Part of the reason behind P3P’s failure was the length of its documentation (over 150% of the length of Alice in Wonderland) and the general challenge of ‘properly’ checking for privacy compliance. Perhaps most importantly, when the P3P working group disbanded in 2007 it noted that a key reason behind the failure was “insufficient support from current Browser implementors”. Perhaps with Mozilla behind the project, with privacy increasingly seen as a space of product competition and differentiation, and with a fresh set of eyes that can learn from the successes of the creative commons and other privacy initiatives, something progressive will emerge from Mozilla’s effort.


Thinking About a ‘Privacy Commons’

In some privacy circles there is a vision of creating a simple method of decoding privacy policies. As it stands, privacy policies ‘exist’ in a nebulous domain of legalese. Few people read these policies, and fewer still understand what they do (and do not) say. The same has traditionally been true of many copyright agreements. To assuage this issue for copyright, the creative commons was created. Privacy groups are hoping to take some of the lessons from the creative commons and apply them to privacy policies.

I need to stress that this is a ‘thinking’ piece – I’ve been bothered by some of the models and diagrams used to express the ‘privacy commons’ because, while they’re great academic pieces, they’re nigh useless for the public at large. What I mean by ‘useless for the public at large’ is this: the creative commons works so well because it put together a VERY simple system that lets people quickly understand what copyright is being asserted over particular works. A privacy commons will live (or, very possibly, die) on its ease of access and use.

So, let’s think about the use-value of any mode of description. The key issue with many commons approaches is that they try to do far too much all at once. Is there necessarily a need for a single uniform commons statement, or is privacy sufficiently complicated that we should adopt a medical privacy commons, a banking privacy commons, a social networking privacy commons, and so forth? Perhaps, instead of cutting the privacy cake so granularly (i.e. by market segment), we should boil down key principles and then offer real-language explanations of each principle’s application in particular business environments. This division of the commons is a topic that researchers appreciate and struggle with.
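The principle-first alternative could be as simple as a lookup table: a handful of core principles, each paired with a plain-language explanation per sector. The principles and wording below are invented purely to illustrate the shape of the idea, not proposed as actual commons categories:

```python
# Invented example data: core principles mapped to per-sector plain language,
# rather than a separate commons for every market segment.
PRINCIPLES = {
    "retention": {
        "medical": "Your records are kept only as long as your care requires.",
        "social networking": "Deleted posts are purged within 30 days.",
    },
    "third-party sharing": {
        "medical": "Your data leaves the clinic only with your consent.",
        "social networking": "Advertisers see aggregates, never your name.",
    },
}

def explain(principle, sector):
    """Return the plain-language statement for a principle in a sector."""
    return PRINCIPLES.get(principle, {}).get(sector, "No statement provided.")
```

The design choice mirrors the creative commons: a small fixed vocabulary of principles carries across sectors, while only the explanatory sentence changes per business environment.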

