Steven Levy’s book, “In the Plex: How Google Thinks, Works, and Shapes Our Lives,” holistically explores the history and various products of Google Inc. The book’s significance stems from Levy’s ongoing access to Google employees, his attendance at company events and product discussions, and his exposure to other elements of Google’s culture and business since the company’s inception in 1998. In essence, Levy provides us with a superb – if sometimes favourably biased – account of Google’s growth and development.
The book covers Google’s successes, failures, and difficulties as it grew from a graduate project at Stanford University into the multi-billion dollar business it is today. Throughout, we see just how important algorithmic learning and automation are; core to Google’s business philosophy is that using humans to rank or evaluate things “was out of the question. First, it was inherently impractical. Further, humans were unreliable. Only algorithms – well drawn, efficiently executed, and based on sound data – could deliver unbiased results” (p. 16). This attitude of the ‘pure algorithm’ is pervasive; translation between languages is treated as just an information problem that suitable algorithms can solve, accurately and effectively conveying even the cultural uniqueness that is linked to languages. Moreover, when Google’s search algorithms routinely displayed anti-Semitic websites in response to searches for “Jew,” the founders refused to modify the search algorithms because the algorithms had ‘spoken,’ and “Brin’s ideals, no matter how heartfelt, could not justify intervention. “I feel like I shouldn’t impose my beliefs on the world,” he said. “It’s a bad technology practice”” (p. 275). This is an important statement: the founders see the product of human mathematical ingenuity as non-human, as though algorithms lack the biases of their human creators.
In the wake of a stunning data breach, the University of Victoria campus community could only hope that the institution would do everything it could to regain lost trust. One such opportunity arose this week, when controversial Google Streetview vehicles were scheduled to canvass the campus. Unfortunately, the opportunity was squandered: it is largely by accident that the campus community has learned – or will learn – that Google is capturing images and wireless access point information.
In this short post I want to discuss how seriously the University failed to disclose Google’s surveillance of the campus. I begin by providing a quick overview of Streetview’s privacy controversies. I then describe the serious data breach that UVic suffered earlier this year, which has left the institution with a significant trust deficit. A discussion of the institution’s failure to disclose Google’s presence to the community, and its attempts to chill speech around Google’s presence, follows. I conclude by suggesting how institutions can learn from UVic’s failures and disclose the presence of controversial, potentially privacy-invasive actors in order to rebuild flagging trust.
Google Streetview and Privacy
Streetview has been a controversial product since its inception. There were serious concerns as it captured images of people in sensitive places or engaged in indiscreet actions. Initially, the company offered only a non-trivial process for individuals to have images removed from the Google Streetview database; this process has since been replaced with an option to blur sensitive information. Various jurisdictions have challenged Google’s conceptual and legal argument that capturing images of public spaces with a Streetview vehicle is equivalent to a tourist taking pictures in a public space.
Mobile penetration is extremely high in Canada: 78% of Canadian households had a mobile phone in 2010, 50% of young households rely exclusively on mobile phones, and 33% of Canadians lack landlines altogether. Given that mobile phones hold considerably more information than ‘dumb’ landlines and are widely dispersed, it is important to consider their place in our civil communications landscape. More specifically, I think we must consider the privacy and security implications associated with contemporary mobile communications devices.
In this post I begin by outlining a series of smartphone-related privacy concerns, focusing specifically on location, association, and device storage issues. I then pivot to a recent – and widely reported – survey commissioned by Canada’s federal privacy commissioner’s office. I assert that the reporting inappropriately offloads security and privacy decisions onto consumers who are poorly situated to – and technically unable to – protect their privacy or secure their mobile devices. I support this by pointing to intentional exploitations of users’ ignorance about how mobile applications interact with their device environments and the data residing on those devices. While the federal survey may be a useful rhetorical tool, I argue that it has limited practical use.
I conclude by asserting that privacy commissioners, and government regulators more generally, must focus their attention upon the Application Programming Interfaces (APIs) of smartphones. Only by focusing on APIs will we redress the economics of ignorance that are presently relied upon to exploit Canadians and cheat them out of their personal information.
Those who create and author technical systems can and do impose their politics, beliefs, and inclinations onto how technology is perceived, used, and understood. On the Internet, this unfortunately means that the technically savvy often recommend choices to users who are less knowledgeable. A number of these recommendations are tainted by existing biases, legal (mis)understandings, or stakeholder gamesmanship. In the case of website development firms, such as Weebly, recommendations can lead users to violate terms of service and legal provisions, to those users’ detriment. In essence, bad advice from firms like Weebly can lead to harms befalling their blissfully ignorant users.
In this short post, I talk about how Weebly blatantly encourages its customers to conduct surveillance on websites without telling them of their obligations to notify website visitors that surveillance is being conducted. I also note how the company deceives those visiting Weebly’s own properties by obfuscating whether information is collected and who is involved in the collection of visitors’ data. I conclude by briefly arguing that Google ought to behave responsibly by publicly calling out, and leaning on, Weebly to ensure that Google’s Analytics product is used responsibly and in accordance with its terms of service.