Google Analytics, Privacy, and Legalese

Google Analytics has become an almost ever-present part of the contemporary Internet. Large, medium-sized, and small sites alike track their website visitors using Google’s free tools to identify where visitors are coming from, what they’re looking at (and for how long), where they subsequently navigate, what keywords bring people to the site, and whether internal metrics are in line with advertising campaign goals. As of 2010, roughly 52% of all websites used Google’s analytics system, and it accounted for 81.4% of the traffic-analysis tools market. As of this writing, Google’s system is used by roughly 58% of the top 10,000 websites, 57% of the top 100,000 websites, and 41.5% of the top million sites. In short, Google provides analytics services to a considerable number of the world’s most frequently visited websites.

In this short post I want to discuss the terms of using Google Analytics. Based on conversations I’ve had over the past several months, many small and medium-sized business owners appear to be unaware of the conditions that Google places on using its tool. Further, independent bloggers are using analytics engines – either intentionally or by default of their website host/creator – without knowing what they must do to use them legitimately. After outlining the brief bits of legalese that Google requires – and suggesting what Google should do to ensure terms of service compliance – I’ll propose a business model/addition that could simultaneously assist in privacy compliance while netting an enterprising company/individual a few extra dollars in revenue.

Continue reading

References for ‘Putting the Meaningful into Meaningful Consent’

During my presentation last week at Social Media Club Vancouver – abstract available! – I drew from a large set of sources, the majority of which differed from my earlier talk at Social Media Camp Victoria. As noted earlier, it’s almost impossible to give full citations in the middle of a talk, so I want to make them available afterwards for interested parties.

Below are my keynote presentation and a list of references. Unfortunately, academic paywalls prevent me from linking to all of the articles used, to say nothing of chapters in various books. Still, most of the articles should be accessible through Canadian university libraries, and most of the books are in print (if sometimes expensive).

I want to thank Lorraine Murphy and Cathy Browne for inviting me and doing a stellar job of publicizing my talk to the broader media. It was a delight speaking to the group at SMC Vancouver, as well as to reporters and their audiences across British Columbia and Alberta.

Keynote presentation [20.4MB; made in Keynote ’09]

References

Bennett, C. (1992). Regulating Privacy: Data Protection and Public Policy in Europe and the United States. Ithaca: Cornell University Press.

Bennett, C. (2008). The Privacy Advocates: Resisting the Spread of Surveillance. Cambridge, MA: The MIT Press.

Carey, R. and Burkell, J. (2009). ‘A Heuristics Approach to Understanding Privacy-Protecting Behaviors in Digital Social Environments’, in I. Kerr, V. Steeves, and C. Lucock (eds.). Lessons From the Identity Trail: Anonymity, Privacy and Identity in a Networked Society. Toronto: Oxford University Press. 65-82.

Chew, M., Balfanz, D., and Laurie, B. (2008). ‘(Under)mining Privacy in Social Networks’, Proceedings of W2SP: Web 2.0 Security and Privacy: 1-5.

Fischer-Hübner, S., Sören Pettersson, J., and Bergmann, M. (2008). ‘HCI Designs for Privacy-Enhancing Identity Management’, in A. Acquisti and S. Gritzalis (eds.). Digital Privacy: Theory, Technologies, and Practices. New York: Auerbach Publications. 229-252.

Flaherty, D. (1972). Privacy in Colonial New England. Charlottesville, VA: University Press of Virginia.

Hoofnagle, C., King, J., Li, S., and Turow, J. (2010). ‘How Different are Young Adults from Older Adults When it Comes to Information Privacy Attitudes and Policies?’ Available at: http://www.ftc.gov/os/comments/privacyroundtable/544506-00125.pdf

Karyda, M., and Kokolakis, S. (2008). ‘Privacy Perceptions among Members of Online Communities’, in A. Acquisti and S. Gritzalis (eds.). Digital Privacy: Theory, Technologies, and Practices. New York: Auerbach Publications. 253-266.

Kerr, I., Barrigar, J., Burkell, J., and Black, K. (2009). ‘Soft Surveillance, Hard Consent: The Law and Psychology of Engineering Consent’, in I. Kerr, V. Steeves, and C. Lucock (eds.). Lessons From the Identity Trail: Anonymity, Privacy and Identity in a Networked Society. Toronto: Oxford University Press. 5-22.

Marwick, A. E., Murgia-Diaz, D., and Palfrey Jr., J. G. (2010). ‘Youth, Privacy and Reputation (Literature Review)’. Berkman Center Research Publication No. 2010-5; Harvard Law Working Paper No. 10-29. Available at: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1588163

O’Reilly, T., and Battelle, J. (2009). ‘Web Squared: Web 2.0 Five Years On’. Presented at the Web 2.0 Summit 2009. Available at: http://www.web2summit.com/web2009/public/schedule/detail/10194

Steeves, V. (2009). ‘Reclaiming the Social Value of Privacy’, in I. Kerr, V. Steeves, and C. Lucock (eds.). Lessons From the Identity Trail: Anonymity, Privacy and Identity in a Networked Society. Toronto: Oxford University Press.

Steeves, V., and Kerr, I. (2005). ‘Virtual Playgrounds and Buddybots: A Data-Minefield for Tweens’, Canadian Journal of Law and Technology 4(2): 91-98.

Turow, J., King, J., Hoofnagle, C. J., Bleakley, A., and Hennessy, M. (2009). ‘Contrary to What Marketers Say, Americans Reject Tailored Advertising and Three Activities That Enable It’. Available at: http://graphics8.nytimes.com/packages/pdf/business/20090929-Tailored_Advertising.pdf

Turow, J. (2007). ‘Cracking the Consumer Code: Advertisers, Anxiety, and Surveillance in the Digital Age’, in The New Politics of Surveillance and Visibility. Toronto: University of Toronto Press.

Do You Know Who Your iPhone’s Been Calling?

An increasing percentage of Western society carries a computer with them every day that is equipped with geo-locative technology. We call them smartphones, and they’re cherished pieces of technology. While people are (sub)consciously aware of this affection for their devices, they’re less aware of how those devices compromise their privacy, and that’s the topic of this post.

Recent reports on the state of the iPhone operating system show that the device’s APIs permit incredibly intrusive surveillance of personal behaviour and actions. I’ll walk through those reports and then write more broadly about the importance of understanding how APIs function if scrutiny of phones, social networks, and so forth is to be meaningful. Further, I’ll argue that privacy policies – while potentially useful for covering companies’ legal liabilities – do little to actually educate end-users about a corporate privacy ethos. These policies, as a result, need to be written in a more accessible format, which may include a statement of privacy ethics baked into a three-stage privacy statement.

iOS devices, such as the iPhone, iPad, Apple TV 2.0, and iPod touch, have Unique Device Identifiers (UDIDs) that can be used to discreetly track how customers use applications associated with the device. A recent technical report, written by Eric Smith of PSKL, sheds light on how developers can access a device’s UDID and correlate it with personally identifiable information. UDIDs are, in effect, serial numbers that are accessible by software. Many of the issues surrounding the UDID are arguably similar to those raised by the Pentium III’s serial codes (codes which drew the wrath of the privacy community and were quickly discontinued; a report on PIII privacy concerns is available here).
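To make the report’s point concrete, here is a minimal Swift sketch of how little code it takes to pair a device identifier with an account name. At the time, an app read the hardware UDID through UIDevice’s uniqueIdentifier property, which Apple has since removed; the sketch instead uses identifierForVendor, the closest modern analogue, and the username and collection endpoint in it are hypothetical.

```swift
import UIKit

// A minimal sketch (not Smith's actual test code) of how easily an app can
// tie a persistent device identifier to an identifiable account.
// In 2010 an app could read the true hardware UDID through UIDevice's
// uniqueIdentifier property (since removed); identifierForVendor is the
// closest modern analogue. The username and endpoint below are hypothetical.
func reportUsage(for username: String) {
    guard let deviceID = UIDevice.current.identifierForVendor?.uuidString,
          let url = URL(string: "https://analytics.example.com/collect") else { return }

    // One stable identifier plus a login name is enough to follow a person
    // across every application a developer ships.
    let payload = ["device_id": deviceID, "username": username]
    guard let body = try? JSONSerialization.data(withJSONObject: payload) else { return }

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = body

    // Fire-and-forget upload; the user never sees a prompt or a consent dialog.
    URLSession.shared.dataTask(with: request).resume()
}
```

Nothing in this flow is visible to the user, which is exactly the kind of quiet API behaviour the report documents.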

Continue reading

Education, Web 2.0, and Privacy

I have a lot that I could talk about here, but rather than working through philosophical arguments for the value of privacy in education, I want to constrain myself to establishing some key points that educators should be mindful of when using Web 2.0 applications in the classroom. I begin by listing a series of factors that organizations should consult to determine whether they are collecting personal information, and then follow by talking about the value and importance of privacy statements. I will conclude by providing a brief (and non-comprehensive) list of personal information that educators probably want to keep offline, unless their university can provide granular control over access to that information.

Is this information personal information?

Pretty well all Web 2.0 tools gather some kind of data from the individuals who use them, be it in the form of email addresses, Internet Protocol (IP) addresses, telephone numbers, instant messenger names, or social networking information. Before deploying any Web 2.0 technology, it is important for organizations to determine whether they are capturing what is identified as ‘personal’ data; they can do so by reflecting on the following factors:

Continue reading