Steven Levy’s book, “In the Plex: How Google Thinks, Works, and Shapes Our Lives,” holistically explores the history and various products of Google Inc. The book’s significance comes from Levy’s ongoing access to Google employees, his attendance at company events and product discussions, and his exposure to other Google-related cultural and business elements since the company’s inception in 1998. In essence, Levy provides us with a superb – if sometimes favourably biased – account of Google’s growth and development.
The book covers Google’s successes, failures, and difficulties as it grew from a graduate project at Stanford University to the multi-billion dollar business it is today. Throughout we see just how important algorithmic learning and automation are; core to Google’s business philosophy is that using humans to rank or evaluate things “was out of the question. First, it was inherently impractical. Further, humans were unreliable. Only algorithms – well drawn, efficiently executed, and based on sound data – could deliver unbiased results” (p. 16). This attitude of the ‘pure algorithm’ is pervasive; translation between languages is treated as just an information problem that suitable algorithms can solve accurately and effectively, even for the cultural uniqueness bound up with languages. Moreover, when Google’s search algorithms routinely displayed anti-Semitic websites in response to searches for “Jew,” the founders refused to modify them: the algorithms had “spoken,” and “Brin’s ideals, no matter how heartfelt, could not justify intervention. “I feel like I shouldn’t impose my beliefs on the world,” he said. “It’s a bad technology practice”” (p. 275). This is an important statement: the founders see the product of human mathematical ingenuity as non-human, free of any bias born of its human creation.
This conception of the ‘pure’ and ‘good’ algorithm devoid of human bias drove the aesthetic development of Google’s product line, insofar as “the message Google wanted to convey was that its products had no human bias” (p. 207). Calling their mobile operating system Android is clearly linked with this popular conception: the device is distinct from humanity and, beyond its creation, independent of it; creation itself (seemingly) is not seen as establishing particular narratives, biases, ethics, or other determining contingencies. Similar conceptions drove the naming of Google’s web browser, Chrome.
Though the algorithm might be seen as pure, actions surrounding those mathematical equations and code are often wrapped in ethical and political questions within the company. Ethical questions arose when the Gmail team discussed inserting ads based on data mining email. There were strong worries at Google that this behaviour was “just going to be creepy and weird”, though the company’s founders “were really entranced by it . . . We felt like, ‘Wow, something was mentioned in my email and I actually got an ad that was relevant!’ That was amazing. We thought that was a great thing.” As for the potential blowback, Brin says, “We didn’t give it a second thought. There were plenty of things to question, but I never batted an eyelash at that. It never occurred to me as a privacy thing” (pp. 170-1). This issue of privacy, highlighted in the middle of the book, returns with a fury towards the conclusion when Levy examines Google’s involvement with the Federal Trade Commission (FTC).
In Levy’s discussion of the DoubleClick acquisition we learn a great deal about Google’s corporate behaviours. The FTC investigated the potential implications of the DoubleClick acquisition and ultimately permitted Google’s purchase. After explaining, in quite some depth, the significance of Google’s potential acquisition of the company, Levy writes that the FTC investigation “failed to perceive the admittedly complicated privacy implications that were unique in this case. For its part, Google helped foment misunderstanding by not being clear about the unprecedented benefits it would gain in tracking consumer behaviour . . . the DoubleClick cookie provided a potentially voluminous amount of information about its users and their interests, virtually all of it compiled by stealth” (p. 333). While Google’s actions here are not terribly surprising – we somewhat cynically expect companies to strive to increase their revenues within the confines of the law – they are telling: the DoubleClick acquisition was key to Google’s continuing dominance of the online advertising market and, as such, if the regulator could not see on its own that it was reaching a flawed conclusion, Google was not going to help correct it.
Questions of ethics also arose around the problem of video-based copyright infringement. Key people at Google felt, and presumably still feel, that such infringement “when you come right down to it” is “evil” (p. 229). Similarly, there were serious questions about what degree of censorship would be appropriate to incorporate into the company’s Search product – questions raised prominently not just when Eric Schmidt tried (and failed) to remove some search results about himself but also when Google entered the Chinese search market.
Indeed, the many-page account of Google’s troubles in China is amongst the most important – and revealing – in Levy’s book. To begin, we discover that there were intense debates within Google about the appropriateness of even doing business in the country, with the company’s chief policy director Andrew McLaughlin strongly arguing against launching business in China. McLaughlin lost the argument but, upon entering the Chinese market, another problem arose for Google: a cult of personality developed around the Googler in charge of China, Kai-Fu Lee, which was seen as very ‘non-Googley’. There were also culturally-driven missteps, including gifts that were (mis)perceived as bribes and other cross-cultural frictions.
These cultural issues, however, paled in the face of determining what the company would censor in its Search products. Specifically, there was a problem in “determining what information should not be given to Chinese users. Though the government demanded censorship, it didn’t hand out a complete list of what wasn’t allowed. Following the law required self-censorship, with the implicit risk that if a company failed to block information that the Chinese government didn’t want its citizens to see, it could lose its license.” To deal with the problem Google examined and probed “the sites of competitors, such as China’s top search engine, Baidu, testing them with risky keywords” and seeing what they blocked (p. 280). This was seen as both technically and politically (and, perhaps, ethically) savvy: Google itself would not engage in censorship beyond the censorship that already existed in the Chinese market.
This process of navigating the Chinese business, cultural, and censorial environment forced Google to establish differing security permissions between its North American and Chinese operations. It wasn’t that the company’s executives didn’t trust their Chinese engineers but that “when you go to a place like China, there’s lots of examples of companies where intellectual property has gone out the door.” Moreover, the company’s engineering director was “concerned that employees in China who were Chinese nationals might be asked by government officials to disclose personal information, and all our access policies derived from that” (p. 301). So, in essence, Google was not necessarily worried about specific engineers but about the social situation and governmental pressures that could be applied to engineers, thereby affecting their behaviour and actions.
While Google worried about data exfiltration by employees, it established even more extensive security precautions after some of its Beijing employees succumbed to spearphishing. The consequence of this successful attack was the theft of confidential security code. Though Levy’s book offers only a brief account of the attack on Gaia, Google’s master password system, it draws together the body of public writing on the topic and distils it into terms that the layperson can both appreciate and understand. The stolen data was subsequently used to gather information about Chinese dissidents who used Google’s products: Google, by being compromised, became a boon to the Chinese state in its efforts to surveil and suppress dissenting members of its population.
In the end, this book is incredibly useful for anyone interested in corporate growth, privacy, or corporate-government relations. It’s also, obviously, of interest to “Google-watchers”. Levy has brought significant insight to how Google developed and why certain paths and decisions were chosen over others. He is routinely attentive to the ‘big picture’, or how Google’s services interact with one another and with the world. Though not covered in this review, there are interesting bits and pieces around the company’s relationship with telecommunications carriers and spectrum policy, hardware development, business secrecy, and other topics sure to interest policymakers, scholars, business readers, and generally interested members of the public. Without any doubt, this is one of the ‘must buy’ books about big data, big corporations, and one of the biggest information giants that collects and controls vast amounts of the world’s information.