Transparent Commons

We live in a society where there is a strong desire to commoditize everything – water, energy, pollution, and each packet of data that passes along digital networks. This desire comes from a position that holds (at least in part) that by giving everything a value, by associating costs with the degradation or poor management of commodities, it becomes possible for society to operate more ‘efficiently’. This is the great myth of capitalistic societies: that the deregulation of social goods provides a means to maximally distribute wealth, opportunity, and power across society. In Canada it is NAFTA, and its associated market pressures, that have been largely responsible for the deregulation of social programs and Crown Corporations that were previously responsible for providing core services to Canadians. We now see the specter of similar ‘efficiencies’ mobilizing to ensure that bandwidth is distributed more efficiently, and that people pay proportionate fees for the bandwidth they use and the actions they use it for. In the process, private corporations will limit the possibilities of the Internet – they will stifle innovation by dictating how their networks can be used and, as a result, inhibit the development that can unexpectedly occur at each bend of these digital highways.

The Notion of Commons

It is possible that you’re not entirely familiar with the notion of the Commons, save for having heard news reports about the ‘Tragedy of the Commons’ without any real guidance as to what the catchphrase means. Put quickly, the Commons identifies all places, spaces, items, and products that belong to society at large rather than to any particular individual. This is better explained by turning to town squares and roads. Town squares operate as public space that is available to any and all members of the public. Because there is a greater advantage in having those spaces available to a large number of people than in restricting them, they continue to remain in public hands. By keeping squares public it is possible to hold town functions, rallies, readings, and other social events, whereas if they were privately owned these communal activities would have no space in which to grow, potentially stunting the growth of the community’s identity.

Roads are kept in common because, again, there is greater benefit in keeping them in common than not. Were roads sold off in pieces there would be a significant financial incentive for some owners to refuse to let their roads be used by everyone; owners could strike strategic deals with some companies to let their goods travel along the road while denying other companies the right to use their pavement. While the individual leasing that section of road might prosper in such a situation, the public at large would not: they might face increased costs for goods, experience nuisances, and be unable to participate in the actions that are available when roads are kept in common. In short, there are more advantages to holding roads in common than to placing them in private corporate hands.

The Internet As Commons

The notion of the Internet as Commons might initially strike you as a bit weird; you pay to get onto the ‘net, which would seem to indicate that the Internet is already privatized. By that reasoning, however, toll roads – where drivers pay a fee that maintains the roads – would be cases where roads had been taken out of the Commons. I would suggest that making some degree of profit while maintaining a Commons infrastructure is insufficient to show that the Commons has been lost. Instead, what demonstrates the loss of a Commons are cases where individuals are denied the ability to enjoy actions that are possible in the Commons on the basis of market regulatory ‘efficiencies’ that operate along a profit-maximization principle. I’ll try to unpack that a bit in the following:

Imagine, if you will, that you are driving down the road. You arrive at a toll booth and try to pay your fee, but are prevented from doing so because your vehicle “might increase upkeep expenses” (perhaps it is particularly heavy, or you have snow chains on your wheels during the winter). You would be denied access to the roadway on the basis of market calculations and, as a result, would be unable to engage in whatever action you had intended. Disappointed, you turn around and head home. You aren’t given the chance to visit your ailing mother in the hospital, give a lecture at a conference, or assist your grandmother with her leaky tap. Perhaps you were prevented from attending a human-rights rally, a government protest, or a similar public event. Regardless of why you were travelling along the road, because it was not held in common the public (potentially) faces a reduced degree of civic participation, innovation, or personal involvement. The reduction of potential actions exceeds whatever value the private owner gained.

When referring to the Internet as Commons, what I am trying to articulate is the following: the Internet operates by sending packets of data across a series of servers around the world, with those packets ultimately arriving at their destination and being reassembled into the information that you sent. When you talk using Skype, packets of data move quickly across the networks so that the other person can hear your voice; when you send messages using Instant Message clients (such as AIM, MSN, ICQ, and Google Talk), what you type is delivered to the other person(s) involved in the chat. Whether you are sending a message over Skype, AIM, or email, or torrenting a file, the data packets are treated equally on a ‘first come, first served’ basis.
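To make that principle concrete, here is a minimal sketch in Python of what ‘first come, first served’ forwarding amounts to. The router and its packet fields are invented for illustration (real routers are vastly more complicated), but the shape of the behaviour is the point: packets leave in the order they arrive, and nothing ever asks what application produced them.

```python
from collections import deque

class NeutralRouter:
    """A toy router that treats every packet identically."""

    def __init__(self):
        self.queue = deque()

    def receive(self, packet):
        # Every packet joins the same queue, whether it carries
        # Skype audio, an instant message, email, or torrent data.
        self.queue.append(packet)

    def forward(self):
        # Packets leave in exactly the order they arrived.
        while self.queue:
            yield self.queue.popleft()

router = NeutralRouter()
router.receive({"app": "skype", "payload": "voice frame"})
router.receive({"app": "bittorrent", "payload": "file chunk"})
router.receive({"app": "email", "payload": "message body"})

for packet in router.forward():
    print(packet["app"], "->", packet["payload"])
```

What matters here is what is absent: at no point does the router inspect the “app” field before forwarding. That indifference is what ‘treating packets equally’ means.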

Operating as a Commons, the Internet can be used to do things that are unexpected because of the extremely limited amount of regulation that actually occurs along the networks. VOIP wasn’t on the radar of the founders of the Internet (in the sense that it didn’t exist and wasn’t a core project of theirs – they were thinking about how to build a redundant communication system at a time when the dominant system was the analogue phone network), but because of the lack of ‘packet discrimination’ along the network someone could develop a way for voice to be broken into packets, transmitted, and reassembled across the Internet. Similarly, the efficient modes of distributing data using BitTorrent exist because of a lack of filtering technologies along the networks. Unfortunately, this notion of the ‘net as Commons is under threat and, along with it, the fires of innovation are at risk of being extinguished.

Commons Under Fire

Internet Service Providers are increasingly deploying technologies that let them inspect each packet of data transmitted along their networks; if a packet violates the ISP’s terms of network use, it may not be passed through the ISP’s network. Moreover, the computer that sent the packet, and where the packet was going, may be recorded for future disciplinary action. The possibility of this filtering is accompanied by two dangers.
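As a rough illustration of what that kind of inspection involves, here is a hypothetical sketch in Python. The policy, field names, and addresses are all invented for the example; real deep packet inspection equipment matches on protocol signatures and payload contents, but the shape of the decision is the same: look inside the packet, drop what the policy forbids, and record who sent it.

```python
# An assumed ISP policy, invented for this example.
BLOCKED_PROTOCOLS = {"bittorrent"}

def inspect(packet, log):
    """Return True if the packet may pass, False if it is dropped."""
    if packet["protocol"] in BLOCKED_PROTOCOLS:
        # The sender and destination are recorded for later
        # 'disciplinary action', as described above.
        log.append((packet["src"], packet["dst"], packet["protocol"]))
        return False
    return True

log = []
packets = [
    {"src": "10.0.0.5", "dst": "93.184.216.34", "protocol": "http"},
    {"src": "10.0.0.5", "dst": "198.51.100.7", "protocol": "bittorrent"},
]
passed = [p for p in packets if inspect(p, log)]
print(len(passed), "forwarded;", len(log), "dropped and logged")
```

Contrast this with the neutral router sketched earlier: here the contents and protocol of the packet determine whether it travels at all.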

Black Hole Syndrome

This syndrome involves the ‘disappearing’ of packets. Ordinarily, when a packet of data tries to reach a server that no longer exists, or that isn’t operational at the time you transmit it, the packet is redirected. It is routed around the black hole – this routing is where the phrase ‘the Internet routes around all damage’ comes from.

What may also happen in these situations, though it is less common, is the following: you send a packet of data to a server with the expectation that the server will forward it to the next server in the chain, but when the server gets your packet it fails to pass it along. It doesn’t announce that there is a problem with the packet, or note that it is refusing to take packets. This situation is especially problematic because, like a black hole, it sucks in packets and they never emerge from its maw. Were ISPs to inspect packets, refuse to pass them along their networks, AND refuse to provide redirects around their collapsed stars, then the speed at which packets move would suffer. The ‘solution’, of course, would be to write programs that sent packets in ways that didn’t result in their being sucked into ISPs’ maws. Unfortunately, this would mean operating in accordance with whatever regulations ISPs had established. ISPs would dictate what was permitted on their networks, and innovation that exceeded those boundaries would not spread to other users. Under such a system it is questionable whether many of the technologies that we currently enjoy, and that use a great deal of bandwidth, would ever have been feasible to create, let alone created at all.
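The difference between an announced failure and a silent black hole is easy to see in a toy model. The sketch below (Python; the topology and node names are invented) forwards a packet hop by hop: a node that is known to be dead gets routed around, while a black hole accepts the packet and simply never passes it on.

```python
# A tiny invented network: each node lists the nodes it can forward to.
links = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}

def next_hop(node, dead):
    # Routers only route around failures they have been told about.
    for neighbour in links[node]:
        if neighbour not in dead:
            return neighbour
    return None

def send(src, dst, dead=frozenset(), black_holes=frozenset()):
    """Forward a packet hop by hop; return the path, or None if lost."""
    path, node = [src], src
    while node != dst:
        if node in black_holes:
            return None            # packet silently vanishes mid-route
        node = next_hop(node, dead)
        if node is None:
            return None            # no usable route at all
        path.append(node)
    return path

print(send("A", "D", dead={"B"}))         # ['A', 'C', 'D'] - rerouted
print(send("A", "D", black_holes={"B"}))  # None - B swallowed the packet
```

When B announces its failure, A routes around it and the packet arrives. When B is a black hole, nothing upstream ever learns that the packet was lost, so there is nothing to route around.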

Persistent Guilt Syndrome

What if, when you drove your car along the road, there were always the possibility that you could be pulled over for absolutely no reason and asked what you were doing, why, and how long you expected to be doing it? What if it happened every couple of feet along the road? Not only would it get pretty tiring (and really slow down your trip) but you would probably be pretty angry about the persistent disruptions. Sure, you might not have anything to hide, but that doesn’t mean you feel the need to explain what you’re doing every bloody meter of the way down the road.

On the ‘net you won’t necessarily experience that persistent slowdown as the data you transmit is ‘inspected’ by ISP packet-sniffing technologies. You will, however, be in a situation where everything you do is potentially monitored and you are always answerable for what you type, say, and search. Given recent shifts in how US and British telecommunication giants are treating content (i.e. moving away from content neutrality toward militarizing their networks to detect ‘illegal’ materials), citizens around the world who pass data through those networks face the possibility of having their data similarly inspected. This means it is possible to receive a phone call when the ISP passes ‘suspicious’ traffic information to authorities or corporations – the police may want to know why you were reading about neurotoxins a few days before Ms. Jones next door turned up dead of a snake bite, or Microsoft may pay you a legal visit to ask why you were searching for ‘crackz’ for Office 2014. In either case, you are responsible for proving your innocence and can be treated as guilty without authorities having to move through historically common legal procedures (such as requiring some justification before getting a warrant to search your home). It’s a good thing that you remember exactly why you looked for everything you ever have online, right? You’re always innocent…as long as you have an eidetic memory.

The consequence of this is that citizens may reduce what they do online – they might be less likely to expose themselves to new ideas and individuals on the basis that any such exposure could endanger their well-being in the future. The possibility of forgetting why you were doing research on homicidal rages, and of exposing yourself to coercive actions should you forget the motivation for that particular search, might mean that you simply don’t investigate the topic online. Why bother, when it could be risky?

The Tragedy of the Commons

It may seem as though the possibilities I’m proposing are far-fetched. The Internet is a space of action, freedom, and exploration! This is true only so long as we retain the Internet as a Commons. If it genuinely becomes commercialized at the level of ISPs, and if those ISPs are either allowed or required to militarize their networks, then it is entirely possible that the aforementioned syndromes could become widespread. While there will of course be solutions if we reach that point in the game, why bother getting to that point in the first place? Network Neutrality is, in part, an attempt to avoid the tragedy of the commons in cyberspace. While we have almost certainly lost most of the free fresh water, grazing lands, and precious metals that once belonged to the public good, there is no terribly convincing reason why a similar tragedy must take place online.