Those who create and author technical systems can and do impose their politics, beliefs, and inclinations onto how technology is perceived, used, and understood. On the Internet, this unfortunately means that the technically savvy often recommend choices to users who are less knowledgeable. A number of these recommendations are tainted by existing biases, legal (mis)understandings, or stakeholder gamesmanship. In the case of website development firms such as Weebly, recommendations can lead users to violate terms of service and legal provisions, to those users’ detriment. In essence, bad advice from firms like Weebly can bring harm to their blissfully ignorant customers.
In this short post, I discuss how Weebly blatantly encourages its customers to conduct surveillance on websites without telling them of their obligation to notify website visitors that surveillance is being conducted. I also note how the company deceives those visiting Weebly’s own properties by obfuscating whether information is collected and who is involved in the collection of visitors’ data. I conclude by briefly arguing that Google ought to behave responsibly by publicly calling out, and leaning on, the company to ensure that Google’s Analytics product is used responsibly and in accordance with its terms of service.
In my presentation at Social Media Camp Victoria (abstract available!), I drew heavily from various academic literatures and public sources. Given the nature of talks, it’s nearly impossible to cite as you’re speaking without entirely disrupting the flow of the presentation. This post is an attempted end-run around, or compromise with, that problem: you get references and (what was, I hope) a presentation that flowed nicely!
There is a full list of references below, as well as a downloadable version of my Keynote presentation (sorry, PowerPoint users!). As you’ll see, some references are behind closed academic paywalls: this really, really, really sucks, and is an endemic problem plaguing academia. Believe me when I say that I’m as annoyed as you are that the academic publishing system locks up the research that the public is paying for (actually, I probably hate it even more than you do!), but unfortunately I can’t do much to make it more available without running afoul of copyright trolls myself. As for books that I’ve drawn from, there are links to chapter selections or book reviews where possible.
Danezis, G. and Clayton, R. (2008). ‘Introducing Traffic Analysis’, in A. Acquisti, S. Gritzalis, C. Lambrinoudakis, and S. D. C. di Vimercati (eds.). Digital Privacy: Theory, Technologies, and Practices. New York: Auerbach Publications. 95-116.
Elmer, G. (2004). Profiling Machines: Mapping the Personal Information Economy. Cambridge, Mass.: The MIT Press.
Friedman, L. M. (2007). Guarding Life’s Dark Secrets: Legal and Social Controls over Reputation, Propriety, and Privacy. Stanford: Stanford University Press. [Excellent book review of text]
Saco, D. (1999). ‘Colonizing Cyberspace: National Security and the Internet’, in J. Weldes, M. Laffey, H. Gusterson, and R. Duvall (eds). Cultures of Insecurity: States, Communities, and the Production of Danger. Minneapolis: University of Minnesota Press, 261-292. [Selection from Google Books]
This is a short post, but it gives three definitive examples of why we need to develop and instill norms in youth concerning how to use digital resources.
Let’s help this hottie find her camera!
Here’s the story (remember that…story).
In Britain a young woman (unfortunately) lost her camera. Some delightful chap decided that, rather than keeping the camera for himself, he’d try to get it back to her. Problem: he didn’t have her name, address, or anything that identified her beyond the pictures on the camera. Solution: post all of the pictures from the camera on Facebook and encourage tons of people to join the group in the hopes that someone would recognize her. Problem: the embarrassment of having adult and non-adult pictures of yourself posted on the net.
Now, it turns out that this whole thing was viral marketing – the woman is an adult model and the campaign was intended to promote a particular adult website. Nevertheless, based on the posts in the group that was set up, people saw this as a legitimate way to return missing property – many didn’t see anything wrong with deliberately posting pictures of a woman in various states of dress without first obtaining her consent.
In this post I want to consider privacy from a bit of a ‘weird’ point of view: What information do you want students to reveal to each other and to you, and what do you want to reveal to them? What ethical responsibilities do educators have to their students concerning the disclosure of information to one another?
In many classrooms, instructors and their students develop bonds by becoming vulnerable to one another through sharing personal stories. ‘Vulnerability’ should be understood as developing a rapport of trust that could be strategically or maliciously exploited, though there is no implicit suggestion that vulnerability will necessarily lead to exploitation. Some of the best teachers and professors that I have had ‘revealed’ themselves as human beings – once I saw that they were like me, I felt more comfortable participating in the classroom environment. With this comfort and increased participation, I developed more mature understandings of subject material and my personal stances regarding it. The rapports of trust that I developed with faculty led to the best learning environments I have ever experienced.
The Canadian SIGINT Summaries include downloadable copies of leaked CSE documents, along with summary, publication, and original source information for each.
Parsons, Christopher; and Molnar, Adam. (2021). “Horizontal Accountability and Signals Intelligence: Lesson Drawing from Annual Electronic Surveillance Reports,” David Murakami Wood and David Lyon (Eds.), Big Data Surveillance and Security Intelligence: The Canadian Case.
Parsons, Christopher. (2015). “Stuck on the Agenda: Drawing lessons from the stagnation of ‘lawful access’ legislation in Canada,” in Michael Geist (Ed.), Law, Privacy and Surveillance in Canada in the Post-Snowden Era (University of Ottawa Press).
Parsons, Christopher. (2015). “The Governance of Telecommunications Surveillance: How Opaque and Unaccountable Practices and Policies Threaten Canadians,” Telecom Transparency Project.
Parsons, Christopher. (2015). “Beyond the ATIP: New methods for interrogating state surveillance,” in Jamie Brownlee and Kevin Walby (Eds.), Access to Information and Social Justice (Arbeiter Ring Publishing).
Bennett, Colin; Parsons, Christopher; Molnar, Adam. (2014). “Forgetting and the right to be forgotten” in Serge Gutwirth et al. (Eds.), Reloading Data Protection: Multidisciplinary Insights and Contemporary Challenges.
Bennett, Colin, and Parsons, Christopher. (2013). “Privacy and Surveillance: The Multi-Disciplinary Literature on the Capture, Use, and Disclosure of Personal Information in Cyberspace” in W. Dutton (Ed.), Oxford Handbook of Internet Studies.
McPhail, Brenda; Parsons, Christopher; Ferenbok, Joseph; Smith, Karen; and Clement, Andrew. (2013). “Identifying Canadians at the Border: ePassports and the 9/11 legacy,” in Canadian Journal of Law and Society 27(3).
Parsons, Christopher; Savirimuthu, Joseph; Wipond, Rob; McArthur, Kevin. (2012). “ANPR: Code and Rhetorics of Compliance,” in European Journal of Law and Technology 3(3).