In the domain of telecom policy, a series of bad ideas seems to (re)arise alongside major innovations in communications systems and technologies. In this post, I want to turn to the telegraph to shed light on issues of communication bandwidth, security, and privacy that are being (re)addressed by regulators around the world as they grapple with the Internet. I’ll speak to the legacy of data retention in analogue and digital communicative infrastructures, congestion management, protocol development, and encryption policies to demonstrate how these issues have arisen in the past, and conclude by suggesting a few precautionary notes about the future of the Internet. Before getting into the meat of this post, I do want to acknowledge that while the telegraph can usefully be identified as a precursor to the digital Internet because of the strong analogies between the two technological systems, it rested on different technological scaffolding. The lessons drawn here are thus based on analogical similarities rather than technical homogeneity between the systems.
The Telegraph
The telegraph took years to develop. Standardization was a particular issue, perhaps best epitomized by the French having an early telegraph system of (effectively) high-tech signal towers, whereas other nations struggled to develop interoperable, cross-continental, electrically-based systems. Following the French communication innovation (which was largely used to coordinate military endeavours), inventors in other nations such as Britain and the United States spent considerable amounts of time learning how to send electrical pulses along various kinds of cables to communicate information at high speed across vast distances.
While the French had captured imaginations, inventors elsewhere pushed the technical envelope to increase the speed and efficiency of communication. The French, ultimately, were slow to adopt a new telegraphic infrastructure on the basis that, having already sunk costs into their existing communications infrastructure, they saw no need to reinvest. This is the first lesson from the telegraph: sunk costs often act as a barrier to infrastructure experimentation, and speak to the inertia that can overtake progressive first-movers.
Telegraphic systems bore some resemblance to contemporary Internet infrastructures, with rural stations in particular needing to transmit messages across a series of telegraphic ‘hubs’ to get them to their destination. In the process of moving information between parties, the initial message was converted to a telegraphic code (with Morse code perhaps being the most famous today) and transmitted to larger telegraphic hubs. At the hubs, the message would be re-transmitted (requiring an operator to manually re-input the message along the separate line), be transferred between locations via pneumatic tubes, or be physically taken from one location to another by runners. In effect, there was a series of physical media that the data was translated between without losing the actual content of the message. The end result was that by “the early 1870s, the Victorian Internet had taken shape: A patchwork of telegraphic networks, submarine cables, pneumatic tube systems, and messengers combined to deliver messages within hours over a vast area of the globe” (Standage 1998: 101). The second lesson of the telegraph: differentiating between the ‘layers’ of communications, such as transmission layer protocols and content communication protocols, enables experimentation in both transmission and content protocols without experiments disrupting the other layers involved in the communication.
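To make the layering point concrete, here is a minimal sketch (with an abbreviated Morse table and toy ‘transports’ of my own invention, not a model of any actual telegraph office) of how a content encoding can be kept separate from whatever carries the signal: either side can be swapped out without touching the other.

```python
# A minimal sketch of the layering idea: the content encoding (a tiny Morse
# table) is independent of the transport, so either can change without
# touching the other. The transports below are purely illustrative stand-ins.

MORSE = {"A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
         "S": "...", "O": "---", "T": "-"}

def encode(message: str) -> str:
    """Content layer: turn text into Morse, skipping characters not in the table."""
    return " ".join(MORSE[ch] for ch in message.upper() if ch in MORSE)

def decode(signal: str) -> str:
    """Content layer: reverse lookup from Morse back to text."""
    reverse = {v: k for k, v in MORSE.items()}
    return "".join(reverse[sym] for sym in signal.split())

# Transport layer: any carrier that moves an opaque string works --
# an electrical line, a pneumatic tube, or a runner between offices.
def send_by_wire(signal: str) -> str:
    return signal            # stand-in for an electrical retransmission

def send_by_runner(signal: str) -> str:
    return signal            # stand-in for a physical hand-off

if __name__ == "__main__":
    original = "SOS"
    # The message survives any chain of transports because the layers are separate.
    received = send_by_runner(send_by_wire(encode(original)))
    assert decode(received) == original
```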
Congestion became an increasingly significant issue as telegraphic systems transmitted more and more data; as noted by Standage, these systems’ own success threatened their long-term viability as instruments of rapid communication. Messages sometimes took days to reach their destination, even when the points of transmission and final reception were in close proximity. In the face of congestion, automation and innovations around bandwidth usage were developed and implemented. Steam-powered communication equipment created circuits across telegraphic exchanges, and bandwidth was vastly expanded with the introduction of duplex and quadruplex systems (doubling and quadrupling available bandwidth on the lines themselves, respectively). The third lesson of the telegraph: demand for bandwidth led to innovations in bandwidth compression and efficiency. Operators were driven to meet demand, rather than to focus exclusively on telegraphic bandwidth as just another scarce resource to be managed.
In addition to addressing data transmission and bandwidth issues, security and privacy were familiar concerns for telegraph users and operators. During the telegraph’s widespread usage in the 1860s, many European countries forbade the use of codes except by government or government agents. In Prussia, the government went as far as to establish the equivalent of a data retention law, under which copies of encrypted messages had to be retained by telegraph operators and passed to the government. There were also laws about the languages telegraphic messages could be transmitted in; common dictionaries were provided, and words outside of the dictionaries either could not be transmitted or were charged a higher rate and assumed to be in some kind of code. It was only in 1864-5 that European governments came together to form the International Telegraph Union (the precursor to the International Telecommunication Union) and scrapped the ban on code usage by non-governmental actors. Lesson four: retention is something that has been tried in the past, and ultimately discarded, on the basis that broad-based surveillance is inefficient for securing the state. Lesson five: establishing a permission-based culture around content (and content protocols) is often a fruitless endeavor.
Despite abandoning the ban on coded communications, the ITU developed a system to discourage code usage. Because it was genuinely hard for operators to transmit codes (coded communications used abnormal letter combinations), code words were limited to seven syllables or fewer. Messages that were in cipher (i.e. ‘gibberish’ words) were charged on the basis that every five letters constituted a word. Since most words are more than five letters long, there was a higher cost to encrypted communications. This can be seen as an early way of passing on the ‘operational costs’ of cryptography to end-users. Lesson six: externalizing computational costs to users can diminish communicative privacy and security, even while letting each node process a larger volume of traffic more efficiently.
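As a rough illustration of that tariff arithmetic, consider a message charged once as plain language (per dictionary word) and once as five-letter cipher groups of the same length (per five letters). The per-word rate is an arbitrary illustrative figure, not a historical tariff.

```python
# Worked example of the pricing rule described above: plaintext is charged per
# word, while cipher text is charged as one "word" per five letters.
import math

RATE_PER_WORD = 1.0  # illustrative unit cost per chargeable word

def plaintext_charge(message: str) -> float:
    return len(message.split()) * RATE_PER_WORD

def cipher_charge(message: str) -> float:
    letters = len(message.replace(" ", ""))
    return math.ceil(letters / 5) * RATE_PER_WORD

plain = "ARRIVING TOMORROW MORNING BY TRAIN"         # 5 words, 30 letters -> 5 units
cipher = "XQJRT LMWPA ZKVBN OHYDC EUFSG MRTQX"       # the same 30 letters -> 6 units
print(plaintext_charge(plain), cipher_charge(cipher))
```

Because ordinary words average more than five letters, the same volume of text always counted as more chargeable “words” once enciphered.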
Telegraph Lessons and the Internet
Chronocentricity is an issue that needs to be overcome by those working on digital issues today. We often possess an egoism that insists our generation is poised on the cusp of history itself. As noted in this post, many of the ‘hot’ issues about the Internet – bandwidth, security, privacy, layered communications standards – were addressed by the precursor to digital networks. What lessons can we draw from the telegraph?
Lesson one: sunk costs as a barrier to innovation
When an organization has significant capital investments in infrastructure it will typically do whatever is necessary to extract the maximum value from its investment. This process, however, threatens to undermine the cycle of innovation if organizations ‘just’ maintain the infrastructure and increase or maintain the cost of bandwidth to facilitate scarcity-driven business models. Limiting external innovation in infrastructure limits the novelties that often promote new uses of the technology itself: if users are legally and technically disallowed from tinkering with the network, or with the protocols used in accessing the network, then the communications infrastructure may become stagnant.
Further, attempts to maximize the return on investment of infrastructure may limit an organization’s willingness to ‘self-cannibalize’. Self-cannibalization can be incredibly valuable in ensuring that the organization retains the first-mover advantage across several technological cycles, but it demands a willingness to forego present revenues on the expectation that refusing to consume oneself ultimately leads to an even more significantly diminished revenue stream in the future. The diminished revenues from the cannibalized revenue stream, however, can be supplemented or expanded by entering other market areas that become available with the transition to new technical infrastructures. The danger, however, is that where infrastructure costs are incredibly high, incumbents can behave uncompetitively and suppress competitors while retaining legacy infrastructure. With the Internet, there is a resistance to deploying expensive technical upgrades that would massively reshape the possibilities of the Internet (e.g. fibre to the home) and instead a focus on having supply dictate the bandwidth available to meet consumer demand. Unlike with the telegraph, demand is not propelling the growth of last-mile bandwidth availability.
Lesson two: differentiated layers of communication are essential
The telegraph was largely successful because it divided the labours involved in communicating; what was written had minimal impact on delivery (save that encrypted messages were more ‘costly’ to input). Further, separation of these layers led to innovations in inputting content, preparing content, and transmitting/delivering content. Such separation is essential. Where there is significant blurring of the ‘layers’ of communication, the parties responsible for this blurring can exercise undue influence over the practice of communications, potentially to the detriment of those operating discretely at particular layers. This is especially problematic when those blurring boundaries do so outside of nationally or internationally approved communications standards, endangering the communicative possibilities and innovation enabled by the differentiation of communication layers. Where standards are made proprietary there is a shift in power away from the consumer and towards the industrial actor: such shifts should be resisted wherever and however possible.
Lesson three: demand for bandwidth drives bandwidth innovation
The telegraph’s problems with congestion drove innovations in bandwidth efficiency and transmission. Each hub in the telegraphic network bore some resemblance to a router: messages would back up and be transmitted as bandwidth became available. There were attempts to balance flows – messages were diverted from high-traffic stations to low-traffic stations when congestion was identified – but there wasn’t a broad attempt to prioritize messages based on presumed content, save along private telegraphic systems. This general ‘best effort’ approach is replicated on the ‘net today, though with the massive introduction of traffic analysis appliances we are potentially moving towards a priority-based approach to data routing. When traffic analysis is deployed as a temporary stop-gap to address limited periods of bandwidth congestion until additional bandwidth is provisioned, such systems seem appropriate. We should remain skeptical of bandwidth management and analysis technologies where telecommunications organizations providing a public service use them to manage scarcity without corresponding efforts to provision bandwidth to meet growing consumer demand. Reducing bandwidth without a commensurate reduction in prices, especially in Western nations, is indicative of managing for scarcity instead of building for abundance.
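A toy sketch of that ‘best effort’ behaviour follows, with made-up station names and capacities: messages queue in arrival order, leave as line capacity allows, and overflow is diverted to a quieter station rather than prioritized by content.

```python
# Illustrative only: a hub forwards queued messages first-in, first-out, as the
# line allows, and a simple router diverts overflow to a less congested station.
from collections import deque

class Hub:
    def __init__(self, name: str, capacity_per_tick: int):
        self.name = name
        self.capacity = capacity_per_tick   # messages the line can carry per interval
        self.queue = deque()                # FIFO; no inspection of message content

    def accept(self, message: str) -> None:
        self.queue.append(message)

    def tick(self) -> list:
        """Forward as many queued messages as the line allows this interval."""
        sent = []
        for _ in range(min(self.capacity, len(self.queue))):
            sent.append(self.queue.popleft())
        return sent

def route(message: str, primary: Hub, alternate: Hub, threshold: int = 10) -> None:
    """Divert to a quieter station when the primary's backlog grows too long."""
    target = primary if len(primary.queue) < threshold else alternate
    target.accept(message)

if __name__ == "__main__":
    central = Hub("Central Office", capacity_per_tick=3)
    branch = Hub("Branch Office", capacity_per_tick=1)
    for i in range(15):
        route(f"message {i}", central, branch)
    print(central.tick())   # the first three messages leave; the rest wait their turn
```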
Lesson four: broad-based communication surveillance is inefficient
Communications dragnets work best following an event rather than preceding it. Massive surveillance systems that “deprive people of liberty and invade their privacy” are never worth it, though the equation changes with dragnet-based surveillance. Dragnets can be effective in ensnaring attackers after a security event, but precautionary mass surveillance is problematic on the basis that “the costs are too high, both social and economic, and the benefits are negligible” (Schneier 2006: 249). Focused surveillance, or surveillance reliant on analyzing traffic patterns, tends to be more useful when trying to ‘secure’ the nation-state from perceived threats.
Prussia demanded that copies of encrypted communications be passed to the government, but today well-encrypted files are largely impenetrable if secured with efficient and effective (and often free) algorithms. Even in cases where encryption is ultimately defeated, such defeats often happen well past the data’s ‘best before’ date. Encryption doesn’t have to be totally impenetrable, just impenetrable enough that by the time it is broken whatever is protected is no longer important, or the leaking of the information is no longer considered a liability. Encryption buys time, rather than impenetrable security, in the face of a well-resourced and dedicated attacker.
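A back-of-envelope calculation makes the ‘buys time’ point: the expected effort of an exhaustive key search is half the key space divided by the attacker’s guess rate. The guess rate below is an assumption chosen for illustration, not a claim about any real adversary.

```python
# Back-of-envelope illustration: average brute-force time is half the key space
# divided by the attacker's guess rate. The rate of 10^12 guesses/second is an
# arbitrary assumption for the sake of the arithmetic.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def expected_years_to_break(key_bits: int, guesses_per_second: float) -> float:
    keyspace = 2 ** key_bits
    return (keyspace / 2) / guesses_per_second / SECONDS_PER_YEAR

for bits in (56, 128):
    years = expected_years_to_break(bits, guesses_per_second=1e12)
    print(f"{bits}-bit key: ~{years:.2e} years on average")
# Under these assumptions a 56-bit key falls within hours, while a 128-bit
# search lands far beyond any plausible 'best before' date.
```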
Of course, whereas data was once retained in analogue format (i.e. paper) it is now stored as bits. This decreases the costs of retaining data and transferring it from telecom operators to government bodies. These decreased costs, however, don’t assuage the cost to liberty, nor the challenges posed by modern encryption to code breakers.
Lesson five: stringent content censorship is hard to implement
Data transmission protocols are challenging enough to implement, but massively regulating content using content protocols constitutes a Herculean task if mass communication is the end-goal of the transmission network. Not only can and will individuals subvert content protocols and censors, but instituting censorship also narrows the possibilities of the network itself. In a (relatively) unregulated transmission network users can uncover its oft-hidden value, whereas in a stringently regulated network administrators tend to dictate use-conditions. The latter approach is often suitable for entirely private networks, where hidden values can be ignored, but in public or quasi-public networks in democratic nation-states the absence of content regulation facilitates deliberative discourse and content-protocol innovations. So long as the content/application layer isn’t interfering with the transmission of data, protocols should default to ‘best-effort’ transmission on the network. It was impossible to successfully implement a dictionary-based protocol for content transmissions with the telegraph, and it is equally difficult to do so in a digital network that secures and provides for free speech. Where we must investigate the transmission of content, it is likely best to do so either in highly granular ways (e.g. get a warrant, focus on a specific user or collection of users) or by accessing the data offline (e.g. taking control of the device receiving/transmitting the content in the first place).
Lesson six: externalizing computation costs undermines privacy
Telegraph operators imposed significant extra costs for sending coded messages, and today this is manifest in the default of sending communications in the clear: the individual is responsible for establishing, or renting, the infrastructure necessary to secure their traffic. Given how much data transmission now requires authentication prior to transmitting content, and that authentication data crosses the same networks as content data, it is sensible to at least require the encryption of authentication information to enhance security; it is perhaps less important to mandate the encryption of content flows themselves. In encrypting credentials, service providers would limit attackers’ chances of successfully impersonating another person using that person’s valid authentication credentials. Admittedly, encrypting only authentication information would still permit the surveillance of the individual’s content traffic. Ideally, both content and authentication streams would be encrypted, but online service providers should at least encrypt authentication data streams.
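One minimal expression of the ‘at least protect the credentials’ position is a client that simply refuses to send authentication data over an unencrypted channel. The endpoint URL and field names below are hypothetical, and the sketch uses the third-party requests library; it is an illustration of the idea, not a prescription.

```python
# A minimal sketch: transmit login data only over TLS (https), never in the clear.
# The endpoint and form field names are hypothetical.
from urllib.parse import urlparse
import requests

def submit_credentials(endpoint: str, username: str, password: str):
    """Send authentication data only if the transport itself is encrypted."""
    if urlparse(endpoint).scheme != "https":
        raise ValueError("refusing to send credentials over an unencrypted channel")
    # With TLS the credentials are protected in transit; content exchanged over
    # other, unencrypted connections may still be observable, as noted above.
    return requests.post(
        endpoint,
        data={"username": username, "password": password},
        timeout=10,
    )

# Example usage (hypothetical endpoint):
# submit_credentials("https://example.com/login", "alice", "correct horse battery staple")
```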
Book Sources
Schneier, Bruce. (2006). Beyond Fear.
Standage, Tom. (1998). The Victorian Internet: The Remarkable Story of the Telegraph and the Nineteenth Century’s On-Line Pioneers.