Review of The Offensive Internet: Speech, Privacy, and Reputation

The Offensive Internet: Speech, Privacy, and Reputation is an essential addition to the academic, legal, and professional literatures on the prospective harms raised by Web 2.0 and, more specifically, social networking sites. Levmore and Nussbaum (eds.) have drawn together high-profile legal scholars, philosophers, and lawyers to trace the dimensions of how the Internet can cause harm, with a focus on the United States’ legal code, to understand what enables harm and how it might be mitigated in the future. The editors have divided the book into four sections – ‘The Internet and Its Problems’, ‘Reputation’, ‘Speech’, and ‘Privacy’ – and included a total of thirteen contributions. On the whole, the collection is strong (even if I happen to disagree with many of the policy and legal changes that many authors call for).

In this review I want to cover the particularly notable elements of the book and then move to a meta-critique. Specifically, I critique how some authors perceive the Internet as an ‘extra’ that lacks significant difference from earlier modes of disseminating information, as well as the position that the Internet is somehow a less real/authentic environment for people to work, play, and communicate within. If you read no further, leave with this: this is an excellent, well-crafted edited volume and I highly recommend it.


Review: Network Nation – Inventing American Telecommunications

Image courtesy of Harvard University Press

I spend an exorbitant amount of time reading about the histories behind today’s telecommunications networks. This serves to historically ground my analyses of today’s telecommunications ecosystem: why have certain laws, policies, and politics developed as they have, how do contemporary actions break from (or conform with) past events, and what cycles are detectable in telecommunications discussions? After reading a host of accounts detailing the telegraph and telephone, I’m certain that Richard John’s Network Nation: Inventing American Telecommunications is the most accessible and thorough discussion of these communications systems that I’ve come across to date.

Eschewing an anachronistic view of the telegraph and telephone – treating neither as simply a precursor to contemporary digital communications systems – John offers a granular account of how both technologies developed in the US. His analysis is decidedly neutral towards the technologies and technical developments themselves, instead attending to the role(s) of political economy in shaping how the telegraph and telephone grew as services, political objects, and zones of popular contention. He has carefully pored over original source documents and so can offer insights into the actual machinations of politicians, investors, municipal aldermen, and communications companies’ CEOs and engineers, weaving a comprehensive account of the telegraph and telephone industries. Importantly, John emphasizes the role of civic ideals and governmental institutions in shaping technical innovations; contrary to most popular understandings, which see government as ‘catching up’ to technicians only after World War I, the technicians have long locked horns with government.


Review: Surveillance or Security?

In Surveillance or Security? The Risks Posed by New Wiretapping Technologies, Susan Landau focuses on the impacts of integrating surveillance systems into communications networks. Her specific thesis is that integrating surveillance capacities into communications networks does not necessarily or inherently make us more secure, but may introduce security vulnerabilities and thus make us less secure. This continues threads that began to come together in the book she wrote with Whitfield Diffie, Privacy on the Line: The Politics of Wiretapping and Encryption (Updated and Expanded Edition).

Landau’s work is simultaneously technical and very easy to read quickly, the result of inspired prose and gifted editing. She doesn’t shy away from working through the intricacies of DNSSEC, how encryption keys are exchanged, or how mobile surveillance is conducted, and by the time readers finish the book they will have a good high-level understanding of how these technologies and systems (amongst many others!) work. On the policy side, she gracefully walks the reader through the encryption wars of the 1990s,[1] as well as the politics of wiretapping more generally in the US. You don’t need to be a nerd to get the tech side of the book, nor do you need to be a policy wonk to understand the politics of American wiretapping.

Given that her policy analyses are based on a deep technical understanding of the issues at hand, each of her recommendations carries considerable weight. As an example, after working through authentication systems and their deficits, she differentiates between three levels of online identification (machine-based, which relies on packets; human, which relies on application authentication; and digital, which depends on biometric identifiers). This differentiation lets her consider the kinds of threats and possibilities each type of identification presents. She rightly notes that the “real complication for attribution is that the type of attribution varies with the type of entity for which we are seeking attribution” (58). As such, totalizing identification systems are almost necessarily bound to fail, and will endanger our overall security profiles by expanding the surface that attackers can target.
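To make that three-way distinction concrete, here is a small sketch of my own – the layer names follow Landau’s distinction as summarized above, but the code and its phrasing are purely illustrative, not hers:

```python
# A toy illustration (mine, not Landau's) of the three identification layers
# and the different attribution question each one answers.
from enum import Enum


class IdentificationLayer(Enum):
    MACHINE = "machine-based"  # packet-level: points at a device or endpoint
    HUMAN = "human"            # application authentication: points at an account
    DIGITAL = "digital"        # biometric identifiers: points at a person


def attributes_to(layer: IdentificationLayer) -> str:
    # Each layer attributes behaviour to a different kind of entity, which is
    # why a single, totalizing identification scheme cannot serve all of them.
    return {
        IdentificationLayer.MACHINE: "a device or network endpoint, not necessarily its user",
        IdentificationLayer.HUMAN: "an account or credential, not necessarily the person holding it",
        IdentificationLayer.DIGITAL: "a biological person, not necessarily what they did online",
    }[layer]


if __name__ == "__main__":
    for layer in IdentificationLayer:
        print(f"{layer.value}: attributes to {attributes_to(layer)}")
```

The code is trivial, but it underlines her point: attribution at one layer says very little about attribution at another.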


Review of Telecommunications Policy in Transition

Image courtesy of the MIT Press

This first: the edited collection is a decade old. Given the rate at which communications technologies and information policies change, this means that several of the articles are…outmoded. Don’t turn here for the latest, greatest, and most powerful analyses of contemporary communications policy. A book published in 2001 is good for anchoring subsequent reading in telecom policy, but less helpful for guiding present-day policy analyses.

Having said that: there are some genuine gems in this book, including one of the most forward-thinking essays on network neutrality of the past decade, by Blumenthal and Clark. Before getting to their piece, I want to touch on O’Donnell’s contribution, “Broadband Architectures, ISP Business Plans, and Open Access”. He reviews architectures and ISP service portfolios to demonstrate that open access is both technically and economically feasible, though he acknowledges that implementation is not a trivial task. In the chapter he argues that the FCC should encourage deployment of open-access-ready networks to reduce the costs of future implementation; I think it’s pretty safe to say that that ship has sailed and open access is (largely) a dead issue in the US today. That said, he offers an excellent overview of the differences between ADSL and cable networks, and identifies the pain points of interconnection in each architecture.

Generally, O’Donnell sees interconnection as less of a hardware problem and more of a network management issue. In discussing the need for, and value of, open access, he does a good job of noting the dangers of throttling (at a time well ahead of ISPs’ contemporary throttling regimes), writing:

differential caching and routing need not be blatant to be effective in steering customers to preferred content. The subtle manipulation of the technical performance of the network can condition users unconsciously to avoid certain “slower” web sites. A few extra milliseconds’ delay strategically inserted here and there, for example, can effectively shepherd users from one web site to another (p. 53).
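O’Donnell’s point is easy to see in a toy simulation. The sketch below is entirely my own illustration – the site names and the 40 ms penalty are invented for the example, not drawn from the chapter – but it shows how a quiet, per-destination delay steers users without any visible blocking:

```python
# A toy model (mine, not O'Donnell's) of "strategic" latency: non-preferred
# sites get a small, hard-to-prove delay added on top of ordinary jitter.
import random

PREFERRED_SITES = {"affiliate-video.example"}  # hypothetical affiliate content
STRATEGIC_DELAY_MS = 40                        # hypothetical added penalty


def fetch_latency_ms(site: str) -> float:
    base = random.uniform(80, 120)  # ordinary network latency and jitter
    if site not in PREFERRED_SITES:
        base += STRATEGIC_DELAY_MS  # the subtle nudge O'Donnell warns about
    return base


if __name__ == "__main__":
    random.seed(1)
    for site in ("affiliate-video.example", "rival-video.example"):
        samples = [fetch_latency_ms(site) for _ in range(1_000)]
        print(f"{site}: mean latency {sum(samples) / len(samples):.0f} ms")
```

Neither site is ever blocked, yet over thousands of page loads the rival consistently feels slower – exactly the kind of unconscious conditioning the quotation describes.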


Review: Internet Architecture and Innovation

I want to very highly recommend Barbara van Schewick’s Internet Architecture and Innovation. Various authors, advocates, scholars, and businesses have spoken about the economic impacts of the Internet, but to date there hasn’t been a detailed economic accounting of what may happen if/when ISPs monitor and control the flow of data across their networks. van Schewick has filled this gap by examining “how changes in the Internet’s architecture (that is, its underlying technical structure) affect the economic environment for innovation” and evaluating “the impact of these changes from the perspective of public policy” (van Schewick 2010: 2).

Her book traces the economic consequences associated with changing the Internet’s structure from one enabling any innovator to design an application or share content online to one where ISPs must first authorize access to content and design key applications (e.g. P2P, email) in house. Barbara draws heavily from Internet history literatures and economic theory to buttress her position that a closed or highly controlled Internet not only constitutes a massive change to the original architecture of the ‘net, but that this change would be damaging to society’s economic, cultural, and political interests. She argues that an increasingly controlled Internet is the future that many ISPs prefer, and supports this conclusion with economic theory and the historical actions of American telecommunications corporations.

van Schewick begins by outlining two notions of the end-to-end principle undergirding the ‘net, a narrow and a broad conception, and argues (successfully, to my mind) that ISPs and their critics often rely on different understandings of end-to-end when making their respective arguments to the public, regulators, and each other.
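For readers unfamiliar with the end-to-end argument itself, here is a minimal sketch of the underlying intuition – my own illustration, not drawn from van Schewick’s text – in which a function such as integrity checking lives at the communicating endpoints while the network simply forwards opaque payloads:

```python
# A minimal sketch of the end-to-end intuition: the endpoints implement the
# application-level check, and the network forwards data without inspecting it.
import hashlib


def send(payload: bytes) -> dict:
    # Endpoint responsibility: attach an integrity check before handing the
    # message to the (deliberately dumb) network.
    return {"payload": payload, "digest": hashlib.sha256(payload).hexdigest()}


def forward(message: dict) -> dict:
    # Network responsibility under end-to-end: forward without inspecting or
    # modifying application data.
    return message


def receive(message: dict) -> bytes:
    # Endpoint responsibility: verify integrity on arrival and recover if needed.
    if hashlib.sha256(message["payload"]).hexdigest() != message["digest"]:
        raise ValueError("integrity check failed; the endpoints must handle recovery")
    return message["payload"]


if __name__ == "__main__":
    print(receive(forward(send(b"hello, end-to-end"))))
```

The sketch is deliberately simplistic; the point is only the division of labour between endpoints and network that both versions of the principle start from.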


Review: Delete – The Virtue of Forgetting in the Digital Age

Viktor Mayer-Schonberger’s new book Delete: The Virtue of Forgetting in the Digital Age (2009) is a powerful effort to rethink basic principles of computing that threaten humanity’s epistemological nature. In essence, he tries to impress upon us the importance of adding ‘forgetfulness’ to digital data collection processes. The book is masterfully presented. It draws what are arguably correct theoretical conclusions (we need to get a lot better at deleting data to avoid significant normative, political, and social harms) while proposing devastatingly incorrect technological solutions (chiefly, legislating ‘forgetting’ into all data formats and operating systems). In what follows, I sketch the aim of the book, some highlights, and why the proposed technological solutions are dead wrong.

The book is concerned with digital systems defaulting to store data ad infinitum (barring the ‘loss’ of data on account of shifting proprietary standards). The ‘demise of forgetting’ in the digital era carries significant consequences. Positively, externalizing memory to digital systems preserves information for future generations and facilitates ease of recall through search. Negatively, digital externalizations dramatically shift balances of power and collapse temporal distances. These latter points become the focus of the text, with Mayer-Schonberger arguing that defaulting computer systems to either delete or degrade data over time can redress the power imbalances and the collapse of temporal distance that presently accompany digitization processes.
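To make the intuition concrete, here is a minimal sketch – entirely my own illustration, not a reproduction of Mayer-Schonberger’s concrete proposal – of what ‘forgetting by default’ could look like in a simple key-value store: every write receives an expiry date, and expired records are purged rather than returned on read.

```python
# A toy "forgetful" store: storage that defaults to deletion. My own
# illustration of the idea, not Mayer-Schonberger's proposal verbatim.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Dict, Optional


@dataclass
class Record:
    value: str
    expires_at: datetime  # every record carries an expiry date by default


class ForgetfulStore:
    def __init__(self, default_ttl: timedelta = timedelta(days=365)):
        self.default_ttl = default_ttl
        self._data: Dict[str, Record] = {}

    def put(self, key: str, value: str, ttl: Optional[timedelta] = None) -> None:
        # Writes always get an expiry, whether or not the caller asks for one;
        # retention, not deletion, becomes the exception.
        self._data[key] = Record(value, datetime.now() + (ttl or self.default_ttl))

    def get(self, key: str) -> Optional[str]:
        record = self._data.get(key)
        if record is None:
            return None
        if datetime.now() >= record.expires_at:
            del self._data[key]  # forget on read once the record has expired
            return None
        return record.value


if __name__ == "__main__":
    store = ForgetfulStore()
    store.put("search-query", "an embarrassing question", ttl=timedelta(seconds=0))
    print(store.get("search-query"))  # prints None: already forgotten
```

Whether such defaults should be legislated into every data format and operating system is, as noted above, exactly where I part ways with the book.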