Steven Levy’s book, “In the Plex: How Google Thinks, Works, and Shapes Our Lives,” holistically explores the history and various products of Google Inc. The book’s significance comes from Levy’s ongoing access to Google employees, his attendance at company events and product discussions, and his exposure to other Google-related cultural and business elements dating back to 1999, shortly after the company’s founding. In essence, Levy provides us with a superb – if sometimes favourably biased – account of Google’s growth and development.
The book covers Google’s successes, failures, and difficulties as it grew from a graduate project at Stanford University into the multi-billion dollar business it is today. Throughout, we see just how important algorithmic learning and automation are; core to Google’s business philosophy is the conviction that using humans to rank or evaluate things “was out of the question. First, it was inherently impractical. Further, humans were unreliable. Only algorithms – well drawn, efficiently executed, and based on sound data – could deliver unbiased results” (p. 16). This attitude of the ‘pure algorithm’ is pervasive: translation between languages is treated as just an information problem, one that suitable algorithms can solve accurately and effectively, even capturing the cultural uniqueness bound up with each language. Moreover, when Google’s search algorithms routinely displayed anti-Semitic websites in response to searches for “Jew,” the founders refused to modify the algorithms because the algorithms had “spoken” and “Brin’s ideals, no matter how heartfelt, could not justify intervention. ‘I feel like I shouldn’t impose my beliefs on the world,’ he said. ‘It’s a bad technology practice’” (p. 275). This is an important statement: the founders see the product of human mathematical ingenuity as non-human, free of any bias born of its human creation.
There are ongoing concerns in Canada about the CRTC’s capacity to gauge and evaluate the quality of Internet service that Canadians receive. These concerns were most recently brought to the fore when the CRTC announced that Canada ranked second only to Japan in broadband access speeds. Such a stance is PR spin and, as noted by Peter Nowak, “[o]nly in the halcyon world of the CRTC, where the sky is purple and pigs can fly, could that claim possibly be true.” This head-in-the-sand approach to understanding the Canadian broadband environment is, unfortunately, similarly reflected in the lack of a federal digital strategy and in the wholly inadequate funding for even the most basic governmental cyber-security.
To retrieve the CRTC from the halcyon world in which it is presently stuck, and to establish firm empirical data to guide a digital economic strategy, the Government of Canada should establish a framework for auditing ISPs’ infrastructure and network practices. Ideally this would result in an independent body that could examine the quality and speed of broadband throughout Canada. Its methodology and results would be published openly and could assure all parties – businesses, citizens, and consumers – that they could trust and rely upon ISPs’ infrastructure. Importantly, having an independent body research and publish data concerning Canadian broadband would relieve companies and consumers of having to assume this role, freeing them to use the Internet for productive (rather than watchdog-related) purposes.
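To make the proposal a little more concrete: at its simplest, auditing broadband speed means repeatedly timing transfers against known measurement servers and publishing the results. The sketch below is a minimal, hypothetical illustration of one such measurement – the test URL is a placeholder of my own, and a real audit body would use dedicated servers, repeated runs at different times of day, and upload and latency tests as well:

```python
# Minimal, hypothetical sketch of the kind of measurement an independent
# auditing body might run: download a test file and report effective throughput.
# TEST_URL is a placeholder, not a real measurement endpoint.
import time
import urllib.request

TEST_URL = "http://example.com/testfile.bin"  # placeholder measurement target
CHUNK_SIZE = 64 * 1024  # read the response in 64 KiB chunks


def measure_throughput(url: str) -> float:
    """Download `url` and return effective throughput in megabits per second."""
    start = time.monotonic()
    total_bytes = 0
    with urllib.request.urlopen(url) as response:
        while True:
            chunk = response.read(CHUNK_SIZE)
            if not chunk:
                break
            total_bytes += len(chunk)
    elapsed = time.monotonic() - start
    return (total_bytes * 8) / elapsed / 1_000_000  # bytes -> bits -> Mbit/s


if __name__ == "__main__":
    print(f"Effective download throughput: {measure_throughput(TEST_URL):.2f} Mbit/s")
```

A single run like this says little on its own; the value of an independent auditor would lie in aggregating thousands of such measurements across providers, regions, and times of day, and publishing the methodology alongside the numbers.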
Google Analytics has become an almost ever-present part of the contemporary Internet. Large, medium, and small sites alike track their visitors using Google’s free tools to identify where visitors come from, what they look at (and for how long), where they subsequently navigate, what keywords bring people to the site, and whether internal metrics are in line with advertising campaign goals. As of 2010, roughly 52% of all websites used Google’s analytics system, and it accounted for 81.4% of the traffic-analysis tools market. As of this writing, Google’s system is used by roughly 58% of the top 10,000 websites, 57% of the top 100,000 websites, and 41.5% of the top million sites. In short, Google provides analytics services to a considerable number of the world’s most commonly frequented websites.
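Adoption figures like these are typically estimated by crawling popular sites and checking their pages for the Analytics loader script. The sketch below is a minimal illustration of that approach – it is not the methodology behind the figures cited above, and the domain-list file and detection strings are assumptions made for the example:

```python
# Hypothetical sketch: estimate Google Analytics adoption across a list of sites
# by fetching each homepage and looking for the Analytics loader. Real surveys
# use full crawlers and far more robust detection than simple string matching.
import urllib.request

# Markers for the classic loaders (urchin.js, ga.js) and the async snippet's queue.
GA_MARKERS = (
    "google-analytics.com/urchin.js",
    "google-analytics.com/ga.js",
    "_gaq.push",
)


def uses_analytics(domain: str) -> bool:
    """Fetch a site's homepage and report whether an Analytics marker appears."""
    try:
        with urllib.request.urlopen(f"http://{domain}/", timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="replace")
    except OSError:
        return False  # unreachable sites are simply counted as non-users here
    return any(marker in html for marker in GA_MARKERS)


if __name__ == "__main__":
    # "domains.txt" is a placeholder: one domain per line, e.g. from a top-sites list.
    with open("domains.txt") as f:
        domains = [line.strip() for line in f if line.strip()]
    hits = sum(uses_analytics(d) for d in domains)
    print(f"{hits}/{len(domains)} sites appear to use Google Analytics")
```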
In this short post I want to discuss the terms of using Google Analytics. Based on conversations I’ve had over the past several months, it seems that many medium and small business owners are unaware of the conditions that Google places on the use of its tool. Further, independent bloggers are using analytics engines – either intentionally or by default of their website host/creator – and are ignorant of what they must do to use them legitimately. After outlining the brief bits of legalese that Google requires – and suggesting what Google should do to ensure terms-of-service compliance – I’ll suggest a business model/addition that could assist in privacy compliance while netting an enterprising company or individual a few extra dollars in revenue.
I owe this more nuanced reflection on yesterday’s note about the role of ‘professional’ versus ‘amateur’ news, again, to my colleague Tim Smith. After reading my post yesterday, he replied:
Nice piece, Chris! I have a follow-up question.
Is investigative journalism on the net happening in the spaces Simon characterized as amateur? I am thinking of reports like Bob Woodward’s breaking of Watergate, or Seymour Hersh’s breaking of Abu Ghraib – this type of investigative reporting.
Do you see this type of investigative journalism (on political matters) coming from blogs and Internet media? If not, could it come from there? It certainly requires a system of professional training (gathering and putting together information not necessarily available on the Internet), resources, and social capital (contacts).
Re-reading what I’d posted, I can see that these questions needed to be asked and answered. Below is my response to Tim.