Graph Search and ‘Risky’ Communicative Domains

Photo by Lynn Friedman

There have been lots of good critiques and comments concerning Facebook's recently announced "Graph Search" product. Graph Search lets individuals semantically query large datasets that are associated with data shared by their friends, friends-of-friends, and the public more generally. Greg Satell tries to put the product in context – Graph Search is really a way for corporations to peer into our lives – and a series of articles have tried to unpack the privacy implications of Facebook's newest product.

I want to talk less directly about privacy, and more about how Graph Search threatens to further limit discourse on the network. While privacy is clearly implicated throughout the post, we can think of privacy not just as a loss for the individual but in terms of the broader social impacts of its loss. Specifically, I want to briefly reflect on how Graph Search (further?) transforms Facebook into a hostile discursive domain, and what this might mean for Facebook users.

‘Right’ and ‘Wrong’ Speech Acts

Anthony Wing Kosner’s article, published mid-January, tried to capture the complexities of Graph Search and the various concerns/problems associated with it at technical and social levels. Towards the end, however, he writes the following:

And this is not just a matter of noisy results, but of potentially embarrassing disclosure, as well. In a recent story on Gizmodo, These People Are Now Sharing Horrible Things About Themselves Thanks to Facebook Search, Graph Search autocompletes some really nasty stuff and calls up profiles of people who have “liked” things that in retrospect they shouldn’t have.

The problem with comments like Kosner's is that, while there are certain things that individuals would ideally not 'like' or state online (e.g. racist statements or hate speech), the imposition of visibility serves to narrow the confines of what people will generally say and do. To be sure, some of the items in the Gizmodo story are socially repugnant (e.g. "Men who like sexism"), but shaming individuals is not how such social ills are corrected. Moreover, the meaning embedded behind those likes remains utterly unclear. Other examples included "Men who like shitting my pants" and "Women who like sucking dicks". These latter examples were used to shock readers and (presumably) embarrass anyone who had 'liked' things that the author perceived as gross, awkward, or inappropriate. This attempt to impose particular norms of 'correctness' on a discursive population of almost 1 billion people is, functionally, an attempt to limit speech and speech acts. Such efforts are incredibly problematic: trying to narrow certain discourses and actions risks detrimentally affecting other speech and association decisions that are legal, if not supported by the majority.

Free speech rights include the ability to speak, within reason, about topics that the vast majority of citizens may disagree with or find unpleasant. So long as that speech does not demonstrably cause harm – such as intentionally engaging in acts of libel, or yelling "fire" in a crowded theatre – the speech ought to be permitted. Not all speech is equally interesting, but it nevertheless ought to be protected. The same is true of associational rights: individuals should be able to speak and associate with other members of society who are perceived negatively by the general public. So long as the act of association or communication is not, in itself, criminal, the parties ought to be permitted their engagements, and public recrimination should not follow. In this sense, privacy can be seen as facilitating broader social goods – deliberative discourse and association – and thus as a kind of 'umbrella concept'.

At issue, in terms of Graph Search, is that when people engaged in such speech acts and associations on Facebook they likely could not have contemplated – let alone understood – that sometime in the future Facebook would release its newest Search product. They could never have known that the obscurity of what they had done – which was an assumed attribute of their communicative environment – would be stripped away and laid bare to hundreds of thousands of readers. The long-term consequences of their statements and actions were unknowable.

Discursive Obscurity and Contemporary Discursive Risk

Facebook's Graph Search threatens the obscurity that people had (perhaps falsely) expected. In stripping away that obscurity – which did function as a kind of social privacy, if not a technical or legal privacy protection – the Facebook network has become a little riskier. Are my privacy settings set appropriately? Will they keep my thoughts and interests as obscure/secretive/public as they presently are, should the company release a new product?

As a result of these privacy confusions, as more and more people adopt Facebook as the Internet – as Facebook becomes the AOL garden of yesteryear – the network's users may be encouraged to stifle their speech in a domain that is increasingly seen as a semi-hostile public sphere. On the one hand, Facebook is where individuals go to communicate and share with friends; an aspect of Facebook's mission is to encourage connections amongst close friends as well as people to whom one has only a passing connection. On the other hand, however, Facebook's 'innovations' are turning this public domain of connectedness into a deeply cynical and untrusting environment. The company's assurances concerning privacy are routinely doubted and shown to be shams. As a result of Facebook's (seeming) disregard for users' privacy norms, the space that their users speak in – a space that many people feel compelled to participate in – becomes semi-normalized: speech that isn't technically wrong but is about 'bad things' is left unspoken. Associations that are legal, and perhaps playful or innocent, are avoided because of the ways an employer might think about them in five or ten months' time, or an insurer or divorce attorney in a few years' time. Graph Search continues to make Facebook the antithesis of a domain of friendly discourse that encourages openness; the Facebook network is instead one that increasingly encourages careful tending and cultivation of appearance. Such cultivation is, in part, driven by wariness over Facebook's inevitable release of a new version of its garden.

In light of Facebook's ever-mutating environment, those who are at risk of being 'seen and shamed' are often faced with (at least!) two choices: flee and, in the process, lose access to a network of friends and colleagues and the potential to engage in constructive deliberations in one of the most populated public spheres of our day (a choice this author has made), or remain on Facebook and potentially suffer when the company introduces new product lines or modifies the technical controls over individuals' information flows. These are not choices that individuals should have to make. The imposition of such choices demonstrates an incredibly naive, ignorant, or uncaring corporate agent. Or perhaps an agent with all three of these characteristics.