Review: In the Plex

Steven Levy’s book, “In the Plex: How Google Thinks, Works, and Shapes Our Lives,” holistically explores the history and various products of Google Inc. The book’s significance comes from Levy’s ongoing access to Google employees, his attendance at company events and product discussions, and his exposure to other elements of Google’s culture and business since shortly after the company’s founding in 1998. In essence, Levy provides us with a superb – if sometimes favourably biased – account of Google’s growth and development.

The book covers Google’s successes, failures, and difficulties as it grew from a graduate project at Stanford University to the multi-billion dollar business it is today. Throughout, we see just how important algorithmic learning and automation are: core to Google’s business philosophy is the belief that using humans to rank or evaluate things “was out of the question. First, it was inherently impractical. Further, humans were unreliable. Only algorithms – well drawn, efficiently executed, and based on sound data – could deliver unbiased results” (p. 16). This attitude of the ‘pure algorithm’ is pervasive; translation between languages is treated as just an information problem, one where suitable algorithms can accurately and effectively carry across even the cultural uniqueness bound up in each language. Moreover, when Google’s search algorithms routinely displayed anti-Semitic websites in response to searches for “Jew,” the founders refused to modify the search algorithms because the algorithms had “spoken” and “Brin’s ideals, no matter how heartfelt, could not justify intervention. “I feel like I shouldn’t impose my beliefs on the world,” he said. “It’s a bad technology practice”” (p. 275). This is an important statement: the founders see the products of human mathematical ingenuity as non-human, lacking the biases born of their human creation.
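To make the ‘pure algorithm’ idea concrete, here is a minimal sketch of the power-iteration PageRank computation that sat at the heart of Google’s early search engine. The variable names and the toy link graph are mine, not Google’s, and real PageRank involved many refinements beyond this:

```python
# Minimal PageRank sketch: rank pages purely from link structure,
# with no human judgement anywhere in the loop.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}

    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Toy link graph with hypothetical pages:
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
print(pagerank(graph))
```

Nothing in that loop consults a human editor; the ranking falls out of the link structure alone, which is precisely the property the founders were defending.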


ISP Audits in Canada

There are ongoing concerns in Canada about the CRTC’s capacity to gauge and evaluate the quality of Internet service that Canadians receive. This was most recently brought to the fore when the CRTC announced that Canada ranked second only to Japan in broadband access speeds. Such a stance is PR spin and, as noted by Peter Nowak, “[o]nly in the halcyon world of the CRTC, where the sky is purple and pigs can fly, could that claim possibly be true.” This head-in-the-sand approach to understanding the Canadian broadband environment is, unfortunately, similarly reflected in the lack of a federal digital strategy and the absolutely inadequate funding for even the most basic governmental cyber-security.

To bring the CRTC back from the halcyon world in which it is presently stuck, and to establish firm empirical data to guide a digital economic strategy, the Government of Canada should establish a framework to audit ISPs’ infrastructure and network practices. Ideally this would result in an independent body that could examine the quality and speed of broadband throughout Canada. Its methodology and results would be published openly and could assure all parties – businesses, citizens, and consumers – that ISPs’ infrastructure can be trusted and relied upon. Importantly, having an independent body research and publish data concerning Canadian broadband would relieve companies and consumers from having to assume this role, freeing them to use the Internet for productive (rather than watchdog-related) purposes.
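As a rough illustration of the kind of thing such an audit body could measure, here is a toy probe that samples download throughput and connection latency. The test host and file are placeholders of my own, and a real audit would need a far more rigorous methodology (multiple servers, repeated runs, peak/off-peak sampling):

```python
# Toy broadband probe: measure download throughput and TCP connect latency.
# The test host and URL below are placeholders, not real audit endpoints.

import socket
import time
import urllib.request

TEST_URL = "http://example.com/testfile.bin"  # hypothetical test file
TEST_HOST = ("example.com", 80)               # hypothetical test server

def measure_latency(host, samples=5):
    """Average TCP connect time in milliseconds."""
    times = []
    for _ in range(samples):
        start = time.monotonic()
        with socket.create_connection(host, timeout=5):
            times.append((time.monotonic() - start) * 1000)
    return sum(times) / len(times)

def measure_throughput(url):
    """Download throughput in megabits per second."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=30) as resp:
        size = len(resp.read())
    elapsed = time.monotonic() - start
    return (size * 8 / 1_000_000) / elapsed

print(f"latency: {measure_latency(TEST_HOST):.1f} ms")
print(f"throughput: {measure_throughput(TEST_URL):.2f} Mbit/s")
```

The value of an independent body is less the measurement itself – which is simple, as the sketch suggests – than the standardized methodology and the public, comparable results.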


Google Analytics, Privacy, and Legalese

Google Analytics has become an almost ever-present part of the contemporary Internet. Large, medium, and small sites alike track their website visitors using Google’s free tools to identify where visitors come from, what they look at (and for how long), where they subsequently navigate, which keywords bring people to the site, and whether internal metrics are in line with advertising campaign goals. As of 2010, roughly 52% of all websites used Google’s analytics system, and it accounted for 81.4% of the traffic-analysis tools market. As of this writing, Google’s system is used by roughly 58% of the top 10,000 websites, 57% of the top 100,000 websites, and 41.5% of the top million sites. In short, Google provides analytics services to a considerable number of the world’s most frequently visited websites.
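For readers unfamiliar with what an analytics engine actually computes, here is a toy sketch that aggregates page views and referrers from a simple access log. The log format (path and referrer, tab-separated) is an assumption of mine for illustration; Google’s own pipeline is vastly more sophisticated and works from JavaScript beacons rather than server logs:

```python
# Toy analytics: count page views and referrers from a simple access log.
# The log format here is assumed (path \t referrer); it only illustrates
# the kind of aggregation that analytics tools perform at scale.

from collections import Counter

def summarize(log_lines):
    page_views = Counter()
    referrers = Counter()
    for line in log_lines:
        path, referrer = line.rstrip("\n").split("\t")
        page_views[path] += 1
        referrers[referrer] += 1
    return page_views, referrers

sample_log = [
    "/index.html\thttp://google.com/search?q=dpi",
    "/about.html\thttp://google.com/search?q=dpi",
    "/index.html\t(direct)",
]
views, refs = summarize(sample_log)
print(views.most_common())
print(refs.most_common())
```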

In this short post I want to discuss the terms of using Google Analytics. Based on conversations I’ve had over the past several months, it seems that many medium and small business owners are unaware of the conditions Google places on using the tool. Further, independent bloggers are using analytics engines – either intentionally or by default of their website host/creator – and are unaware of what they must do to use them legitimately. After outlining the brief bits of legalese that Google requires – and suggesting what Google should do to ensure terms-of-service compliance – I’ll suggest a business model/addition that could simultaneously assist in privacy compliance while netting an enterprising company or individual a few extra dollars in revenue.


Three-Strikes to Banish Europeans and Americans from the ‘net?

Throughout the Global North there are discussions on the table about introducing so-called ‘three-strikes’ rules designed to cut off, or hinder, people’s access to the Internet should they be caught infringing copyright. In the EU, the big content cartel is trying to get ISPs to inspect consumer data flows and, when copyrighted content is identified, ‘punish’ the individual in some fashion. Fortunately, it looks as though the EU Parliament, at least, is against imposing such rules, on the basis that disconnecting individuals from the Internet would infringe on EU citizens’ basic rights. In an era where we are increasingly digitizing our records and basic communications infrastructure, it’s delightful to see a body in a major world power recognize the incredibly detrimental and over-reactionary behavior that the copyright cartel is calling for. Copyright infringement does not trump basic civil liberties.

Now, I expect that many readers would say something along these lines: I don’t live in the EU, and the EU Parliament has incredibly limited powers. Who cares? This (a) doesn’t affect me, and (b) is unlikely to have a real impact on EU policy.


DPI, Employees, and Proper Inspection

In my last post I alluded to the fact that Deep Packet Inspection (DPI) technologies could be used by businesses to try to reduce ‘inappropriate’ employee use of bandwidth and the wrongful or accidental transmission of confidential IP. That post focused on IT security; this one continues to reflect on DPI technologies’ applications and benefits for corporate environments.

A Quick Refresher on DPI

From ArsTechnica:

The “deep” in deep packet inspection refers to the fact that these boxes don’t simply look at the header information as packets pass through them. Rather, they move beyond the IP and TCP header information to look at the payload of the packet. The goal is to identify the applications being used on the network, but some of these devices can go much further; those from a company like Narus, for instance, can look inside all traffic from a specific IP address, pick out the HTTP traffic, then drill even further down to capture only traffic headed to and from Gmail, and can even reassemble e-mails as they are typed out by the user. (Source)

For a slightly longer discussion/description of DPI I suggest that you look at the wiki page that I’m gradually putting together on the topic of Deep Packet Inspection.
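As a toy illustration of the difference between header-only inspection and the “deep” inspection described in the quotation above, the sketch below parses an IPv4/TCP packet and then peeks past the headers into the payload to classify the application. A real DPI appliance does this in hardware at line rate with far richer signatures; the two signatures here are simplistic assumptions of my own:

```python
# Toy "deep" inspection: parse IPv4 and TCP headers (the shallow part),
# then look past them into the payload to classify the application.

def classify(packet: bytes) -> str:
    # --- shallow: IPv4 header ---
    ihl = (packet[0] & 0x0F) * 4      # IPv4 header length in bytes
    protocol = packet[9]              # protocol field; 6 means TCP
    if protocol != 6:
        return "non-TCP"

    # --- shallow: TCP header ---
    tcp = packet[ihl:]
    data_offset = (tcp[12] >> 4) * 4  # TCP header length in bytes

    # --- the "deep" part: the payload itself ---
    payload = tcp[data_offset:]
    if payload.startswith((b"GET ", b"POST ", b"HTTP/")):
        return "HTTP"
    if payload[:3] == b"\x16\x03\x01":  # assumed TLS handshake signature
        return "TLS"
    return "unknown"

# Hand-built fragment: 20-byte IPv4 header, 20-byte TCP header, HTTP payload.
ip_header = bytes([0x45]) + bytes(8) + bytes([6]) + bytes(10)
tcp_header = bytes(12) + bytes([0x50]) + bytes(7)
print(classify(ip_header + tcp_header + b"GET / HTTP/1.1\r\n"))  # -> HTTP
```

The point of the example is the last few lines of `classify`: identifying the application depends on reading the payload itself, which is exactly what raises the privacy questions discussed in the last post.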
