I’ve recently been reading some of David Lyon’s work, and his idea of developing an ethic of voyeurism intrigues me. I don’t necessarily agree with his position in its entirety, but I find it an interesting one. This paper, entitled “Code-Bodies and Algorithmic Surveillance: Examining the impacts of encryption, rights of publicity, and code-specters,” is an effort to think through how voyeurism might be understood in the context of Deep Packet Inspection using the theoretical lenses of Kant and Derrida. This paper is certainly more ‘theoretical’ than the working paper that I’ve previously put together on DPI, but it builds on that paper’s technical discussion of DPI to think about surveillance, voyeurism, and privacy.
As always, I welcome positive, negative, and ambivalent comments on the draft. Elements of it will be adopted for a paper that I’ll be presenting at a Critical Digital Studies workshop in a month or two – this is your chance to get me to reform positions to align with your own! *grin*
The problem with walled gardens such as Facebook is that you can be searched whenever you pass through their blue gates. In the course of being searched, undesired data can be refused – data like links to ‘abusive’ sites that facilitate copyright infringement. As of today, Facebook has declared war on The Pirate Bay, maintaining that because links to the site often infringe on someone’s copyright, linking to it violates the terms of service that Facebook users agree to. Given that The Pirate Bay is just a particularly specialized search engine, it would seem that Facebook is now going to start applying (American?) ethical and moral judgements to the tools people use to search for data. Sharing data is great, but only so long as it’s the ‘right kind’ of data.
What constitutes ‘infringing’ use when talking about a search engine? Google, as an example, lets individuals quickly and easily find torrent files that can subsequently be used to download/upload infringing material. The specific case being made against the Pirate Bay is that:
“Facebook respects copyrights and our Terms of Service prohibits placement of ‘Share on Facebook’ links on sites that contain ‘any content that is infringing.’ Given the controversy surrounding The Pirate Bay and the pending lawsuit against them, we’ve reached out to The Pirate Bay and asked them to remove the ‘Share on Facebook’ links from their site. The Pirate Bay has not responded and so we have blocked their torrents from being shared on Facebook.” (Source)
In 2008, ipoque released a report titled “Bandwidth Management Solutions for Network Operators“. Using Deep Packet Inspection appliances, it is possible to establish a priority management system that privileges certain applications’ traffic over others; VoIP traffic can be dropped last, whereas P2P packets are given the lowest priority on the network. Two modes of management are proposed by ipoque:
- Advanced Priority Management: where multi-tiered priorities maintain Quality of Experience (rather than Service) by identifying some packet-types as more important than others (e.g. VoIP is more important than BitTorrent packets). Under this system, less important packets are only dropped as needed, rather than being dropped once a bandwidth cap is met.
- Tiered Service Model: This uses a volume-service system, where users can purchase a set amount of bandwidth for particular services. This is the ‘cell-phone’ model, where you sign up for packages that give you certain things, and if you exceed your package limitations extra charges may apply. Under this model you might pay for a file-sharing option, as well as a VoIP and/or streaming HTTP bundle.
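To make the first mode concrete, here is a minimal sketch of what “drop less important packets only as needed” might look like. This is entirely illustrative – the application names, priority tiers, and function are my own, not ipoque’s – but it captures the idea that low-priority traffic is shed only when demand actually exceeds capacity, rather than at a fixed cap:

```python
# Hypothetical sketch of "advanced priority management": packets are
# classified by application, and low-priority traffic is dropped only
# under actual congestion, not at a fixed bandwidth cap.

# Assumed priority tiers -- lower number means more important.
PRIORITIES = {"voip": 0, "http": 1, "streaming": 2, "bittorrent": 3}

def schedule(packets, capacity):
    """Return the packets to forward this interval, shedding the
    least important ones only if total demand exceeds capacity."""
    # Sort by priority; Python's sort is stable, so ties keep
    # their arrival order.
    ordered = sorted(packets, key=lambda p: PRIORITIES[p["app"]])
    sent, used = [], 0
    for pkt in ordered:
        if used + pkt["size"] <= capacity:
            sent.append(pkt)
            used += pkt["size"]
    return sent

demand = [
    {"app": "bittorrent", "size": 600},
    {"app": "voip", "size": 100},
    {"app": "http", "size": 400},
]
# Under congestion (capacity 500), VoIP and HTTP go through first
# and the BitTorrent packet is dropped.
print([p["app"] for p in schedule(demand, 500)])  # -> ['voip', 'http']
```

When capacity covers total demand, nothing is dropped at all – which is the advertised difference from a hard cap.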
The danger with filtering by application (from ipoque’s position) is that while local laws can be enforced, it opens the ISP to dissatisfaction if legitimate websites are blocked. Thus, while an ISP might block Mininova, they can’t block Fedora repositories as well – the first block might conform to local laws, whereas the second would infringe on consumers’ freedoms. In light of this challenge, ipoque suggests that ISPs could adopt Saudi Arabia-like white-lists, where consumers can send a message to their ISP when they find sites being illegitimately blocked. Once the ISP checks out the site, it can either remove the site from the black-list or inform the customer of why the site must remain listed.
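The review loop ipoque proposes can be sketched as a simple complaint-handling routine. Again, this is my own illustration – the domain, function, and status strings are hypothetical, and the manual review an ISP would actually perform is stood in for by a boolean parameter:

```python
# Illustrative sketch of the black-list review process described above:
# a customer reports a blocked site, the ISP reviews it, and either
# unblocks it or records why it stays listed.

# Hypothetical black-list mapping domain -> reason for the block.
blacklist = {"mininova.example": "court-ordered block"}

def report_blocked_site(domain, review_found_legitimate):
    """Handle a customer complaint about a blocked domain.

    `review_found_legitimate` stands in for the ISP's manual review,
    which this sketch cannot automate.
    """
    if domain not in blacklist:
        return "not blocked"
    if review_found_legitimate:
        del blacklist[domain]  # remove the site from the black-list
        return "unblocked"
    # Otherwise, tell the customer why the site must remain listed.
    return "stays blocked: " + blacklist[domain]
```

The asymmetry in the original proposal is visible even in this toy version: the burden of noticing and contesting an illegitimate block falls entirely on the customer.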
I worry that increasingly far-reaching and burdensome copyright laws, when combined with the analysis techniques of Deep Packet Inspection (DPI), will lead to pervasive chilling of speech. I see this as posing real problems both for the creation and development of contemporary culture, which depends on mixing the past into new creations (with ‘the past’ increasingly copyrighted), and for the opportunities to use rich media environments such as the Internet to create and distribute political statements. Copyright isn’t just an issue for musicians and artists; it’s an issue for anyone who is, or who wants to be, engaged in digital self-expression in media-creative ways.
Given that my earlier post about the relationship between DPI and freedom of expression may have seemed overly paranoid, I thought that I should substantiate it a bit by turning to a DPI vendor’s white paper on copyright. In one of its most recent white papers, ipoque discusses “Copyright Protection in the Internet“. One of the great things about this white paper is how the author(s) have divided their analysis: they identify different methods of limiting or stopping infringement theoretically (i.e. can a technology do this?), then provide a ‘reality check’ (i.e. can this practically be implemented without gross rights violations or technical nightmares?), and end each analysis with a conclusion that sums up ipoque’s official position on the method in question. I want to focus on detecting infringing files, rather than on preventing the transfer of those files, on the basis that it is the former that really depends on DPI to be effective.