Day 24/ Mon 17 Aug 09  Mobile penetration is extremely high in Canada: 78% of Canadian households had a mobile phone in 2010, 50% of young households rely exclusively on mobiles, and 33% of Canadians lack landlines altogether. Given that mobile phones hold considerably more information than ‘dumb’ landlines, and are widely dispersed, it is important to consider their place in our civil communications landscape. More specifically, I think we must consider the privacy and security implications associated with contemporary mobile communications devices.

In this post I begin by outlining a series of smartphone-related privacy concerns, focusing specifically on location, association, and device storage issues. I then pivot to a recent – and widely reported – survey commissioned by Canada’s federal privacy commissioner’s office. I assert that the reporting inappropriately offloads security and privacy decisions to consumers who are poorly situated to – and technically unable to – protect their privacy or secure their mobile devices. I support this by pointing to intentional exploitations of users’ ignorance about how mobile applications interact with their device environments and resident data. While the federal survey may be a useful rhetorical tool, I argue that it has limited practical use.

I conclude by asserting that privacy commissioners, and government regulators more generally, must focus their attention upon the Application Programming Interfaces (APIs) of smartphones. Only by focusing on APIs will we redress the economics of ignorance that are presently relied upon to exploit Canadians and cheat them out of their personal information.

Mobile Privacy

The latest smart devices often spur national headlines and consume hours of television reporting and advertising. Consumers are typically sold on the ‘cool’ features of devices, such as video chat, new intuitive gestures, better screens and speakers, superior access to third-party applications, music services, and so forth. Rarely do security improvements or enhancements to user privacy figure anywhere in the popular marketing material. This isn’t to say that innovations in security aren’t regular: every generation of Apple’s iDevices has been accompanied by more sophisticated hardware- and software-based security innovations, and the same can be said for Android, BlackBerry, and Nokia devices. Innovations in privacy are somewhat rarer. Some onetime proponents of smartphone privacy, such as Apple, have chosen to walk away from strong privacy settings in favour of more ‘engaging’ interfaces. Contemporary conveniences have come at the cost of consumer privacy protections.

There are (at least) three key areas where mobile privacy commonly comes to the fore. The integration of GPS- and wifi-based location tools with the core operating systems of contemporary phones has raised, and will continue to raise, serious concerns about locational privacy. Tying contact information to underlying APIs, combined with weak consumer privacy protections, threatens our expectations of privacy in who we associate with. Finally, poor management of third-party applications’ access to stored data has limited, and will likely continue to limit, consumers’ abilities to secure their data or to prevent borderline-malicious surveillance from taking place.

I will note that many of the examples I draw on refer to Apple’s iPhone, with far fewer drawn from other smartphones. This isn’t necessarily meant to single out Apple; it is the result of having conducted months of research on deficiencies associated with Apple products. Other devices – Android in particular! – have manifested, and will likely continue to manifest, security vulnerabilities that infringe upon their users’ expectations of privacy.

Location Privacy

Where a mobile device happens to be on a regular and not-so-regular basis can reveal considerable amounts of information about an individual, especially when data is collected over extended periods of time. Using basic data mining (and common sense) it is possible to identify routine movement patterns, where someone is likely to be at any time of the day, where they live and work, whether they suffer from medical conditions requiring (semi-)regular treatment, when an abnormal life event occurs, and so on. While these movement patterns are revealed regardless of whether someone has a smartphone, feature and dumb phones are less able to disclose this information to non-carrier partners. All three types of phone will disclose the following to a carrier (and anyone it’s partnered with): information such as cell identification, signal level, angle of arrival, time of arrival, and time difference of arrival can be used to calculate a phone’s position.[1] In the case of smartphones, third-party applications can typically access collected location information and transmit it back to their corporate servers. Further, on smart devices location information can be collected by identifying nearby wifi access points, by activating the GPS system, and/or by locating the phone relative to cellular towers.
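To make this concrete, here is a minimal sketch of the trilateration arithmetic that turns tower-to-phone distance measurements into a position. The tower coordinates and noiseless distances are invented for illustration; real carrier systems contend with noise and use more measurements, but the principle is the same:

```python
import math

def trilaterate(towers, distances):
    """Estimate a 2D position from three tower locations and
    measured distances (e.g. derived from signal timing)."""
    (x1, y1), (x2, y2), (x3, y3) = towers
    d1, d2, d3 = distances
    # Subtracting the first circle equation from the other two
    # yields a pair of linear equations in (x, y).
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Hypothetical towers at known coordinates; a phone at (3, 4).
towers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
phone = (3.0, 4.0)
dists = [math.dist(phone, t) for t in towers]
print(trilaterate(towers, dists))  # ≈ (3.0, 4.0)
```

The point is how little is required: three distance estimates and a few lines of algebra recover the handset’s location.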

Once location data is collected, other data can be overlaid upon it to gain deeper insight into who is using the phone. Imposing demographic, psychographic, and consumer information over geolocational data can establish nuanced profiles.[2] Such profiles are not just geolocationally sensitive but also vary over time. By integrating time as a variable the data miner can develop deeper insights about the device owner, integrating migratory patterns with behavioural and imputed racial characteristics (e.g. pinpointing a phone at gay pride parades, carnival routes, or other cultural events that have publicly disclosed geo-temporal characteristics).[3]
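As an illustrative sketch of that overlay step – the events, coordinates, and crude matching rule below are all invented for illustration, not drawn from any actual miner’s code – imputing traits from pings and public event footprints is trivially easy:

```python
from datetime import datetime

# Hypothetical public events with geo-temporal footprints.
events = [
    {"label": "pride parade", "lat": 49.28, "lon": -123.12,
     "start": datetime(2010, 8, 1, 12), "end": datetime(2010, 8, 1, 17)},
    {"label": "clinic", "lat": 49.26, "lon": -123.10,
     "start": datetime(2010, 8, 2, 9), "end": datetime(2010, 8, 2, 18)},
]

def near(a_lat, a_lon, b_lat, b_lon, eps=0.01):
    # Crude proximity test; a real miner would use proper geodesics.
    return abs(a_lat - b_lat) < eps and abs(a_lon - b_lon) < eps

def impute_traits(pings):
    """Tag a device with every event its pings place it at."""
    traits = set()
    for when, lat, lon in pings:
        for ev in events:
            if ev["start"] <= when <= ev["end"] and near(lat, lon, ev["lat"], ev["lon"]):
                traits.add(ev["label"])
    return traits

pings = [(datetime(2010, 8, 1, 13, 30), 49.281, -123.119)]
print(impute_traits(pings))  # {'pride parade'}
```

A single timestamped ping, joined against public event data, is enough to attach a behavioural label to a device owner.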

In the case of the iPhone, Apple initially required application developers to query the user every time before accessing the GPS sub-system or locating the phone using nearby wifi access points. This meant that a customer could sporadically disclose their location as they saw fit, trading their privacy for specific benefits. This capability, which was present in all versions of iOS prior to 3.2.1, has subsequently been replaced with a uniform opt-in/out mechanism. If a user selects “OK” once when an application asks to access the device’s location, they must do the following to modify their configuration:

  1. Open Settings;
  2. Select General;
  3. Open Location Services;
  4. Turn off the particular application’s sharing of the device location.
  5. Repeat steps 1-4 every time they want to opt out of location sharing again.

While this is an opt-in approach, it stands in stark contrast to Steve Jobs’ statements at the D8 conference. Specifically, Jobs stated that Apple has a “very different view of privacy than some of our colleagues in [Silicon] Valley. We take privacy extremely seriously. That’s one of the reasons we have the curated apps store. We have rejected a lot of apps that want to take a lot of your personal information and suck it up into the cloud. Privacy means that people know what they’re signing up for. In plain English, and repeatedly, that’s what it means. Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of you asking them. Let them know precisely what you’re going to do with their data.” Evidently, Apple no longer takes privacy as seriously as it had in previous iterations of its business strategy.

In the case of Windows Phone 7 devices, many applications request access to location information as a precondition of installation. This is true of RSS feed readers, calendaring programs, and video games. Some applications, such as the BC Ferries Sailing Information app, prominently display an option on the main screen so that users can opt out of location sharing at any time. Unlike Apple, however, Microsoft’s phone does not contain a settings page where users can opt out of location sharing on a per-app basis. Instead, users must enable or disable all location services entirely. Many apps will let you subsequently opt out of location sharing, but where to disable the feature varies from application to application.

Smartphones also have a habit of turning their users into ‘warphoners’: the phones detect, store, and subsequently transmit information about the wifi access points they pass (along with geolocation information) to their respective corporations. Microsoft, Apple, and Google have all been ‘caught’ capturing locational information and sending it home to their servers. While Google’s database does limit some of the information it discloses, we can intuit its capabilities from what was revealed about Microsoft’s own location database. Specifically, when researchers examined the Live.com database they found that some of its items moved from location to location. The Live.com database was tracking where mobile hotspots were and thus giving Microsoft, and those accessing the database, insight into the movements not just of mobile phone owners but also of non-Windows phone users who carried mobile wifi access points. On a contemporary smartphone there is no reason why a third-party application couldn’t implement similar sniffing services that operate while the app is running.
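The Live.com finding can be illustrated with a small sketch: given a log of (access point, observed location) records, flagging the entries that move is a few lines of work. The BSSIDs and coordinates below are invented:

```python
import math

# Hypothetical observation log: (BSSID, lat, lon) records harvested
# by passing handsets and uploaded to a vendor's location database.
observations = [
    ("aa:bb:cc:dd:ee:01", 49.2827, -123.1207),
    ("aa:bb:cc:dd:ee:01", 49.2828, -123.1206),  # stationary home router
    ("aa:bb:cc:dd:ee:02", 49.2827, -123.1207),
    ("aa:bb:cc:dd:ee:02", 49.8951, -97.1384),   # mobile hotspot: Vancouver to Winnipeg
]

def moving_access_points(obs, threshold_km=1.0):
    """Flag BSSIDs whose recorded positions move farther than the
    threshold: these trace a person, not a fixed router."""
    seen = {}
    movers = set()
    for bssid, lat, lon in obs:
        if bssid in seen:
            plat, plon = seen[bssid]
            # Equirectangular approximation; fine at this scale.
            km = 111.0 * math.hypot(lat - plat,
                                    (lon - plon) * math.cos(math.radians(lat)))
            if km > threshold_km:
                movers.add(bssid)
        seen[bssid] = (lat, lon)
    return movers

print(moving_access_points(observations))  # {'aa:bb:cc:dd:ee:02'}
```

Anyone holding such a database – the vendor, its partners, or an attacker – can run exactly this query to follow a hotspot owner around the country.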

Various privacy officials have stated that there is relatively little harm in access point information being captured. Unfortunately, few seem aware of how easy it is to collect a router’s MAC address. With this address it is possible to query publicly available databases that retain correlated MAC addresses and location information. Using this information, you can identify where an individual is physically situated.

Unfortunately, many data protection and privacy commissioners operate on complaints-based systems dependent on citizens identifying harms. Most citizens are poorly situated to trace the data flowing in and out of their phone, and have limited insight into what happens to data after it leaves their device. Those who do know may be bound by non-disclosure agreements, limiting their ability to contribute to the public sphere. In light of these limitations, commissioners and regulators must proactively engage with smartphone manufacturers. Government officials must ensure that APIs guarantee effective privacy controls over location information so that citizens can ‘control’ or be aware of the flow of their personal information.

Association Privacy

The fact that a considerable amount of personal information is held on mobile phones is nothing new. There have been worries for years about what happens if a person loses their phone, and such anxieties will likely continue as long as humans outsource memory retention to semi-animate objects. What has changed with the rise of data-enabled devices is the ease of unknowingly losing your contact list without ever having physically lost hold of your phone. The loss of this information not only compromises the contact details of associates and colleagues but also sheds light on who the device owner likely communicates with, has met, or generally has in their social network. Such revelations impact citizens’ association privacy, insofar as they cannot be sure that their communications device won’t indiscriminately disclose to unknown parties whom the owners associate with. Such revelations can have chilling consequences and can also lead to profiles being developed that negatively impact the device owners or others whose information is stored on the mobile device.

All smartphones have address books (or address book equivalents, in the case of Windows Phone). The iPhone, in particular, is well-known for letting third-party applications transmit copies of users’ address books. Apple installs their ‘Contacts’ app on all phones and it cannot be removed by the phone owner. In a report by the European Network and Information Security Agency (ENISA), it was noted that there was a serious privacy concern related to how third-party applications interact with the ‘Contacts’ application. The report’s authors write, “…in iOS, the address book is accessible to all apps. No special status is given to the user’s own contact details in the address book, meaning that, apart from the large amounts of personal data this exposes, the user’s own phone number is also accessible, which can be used for unsolicited marketing” (.pdf). Third-party application developers can access a considerable amount of personal information without first informing users of the access.

To be more specific, software engineer Nicholas Seriot writes that the following items are accessible through the Address Book database, which underlies the Contacts application:

  • Names of contacts;
  • User and contacts’ phone numbers;
  • User and contacts’ email addresses;
  • Notes field, “in which many Mac users store sensitive data such as door codes or bank accounts” (.pdf)

These concerns are not just academic or hypothetical. In 2008, Aurora Feint was caught looking through the Address Book database, sending it unencrypted to their servers, and subsequently matching the data against other users’ contact lists to inform users when their contacts/friends were also playing the game. In this case Apple did identify the problem and subsequently removed the application from their App Store. Importantly, however, the problem was detected only after the application had been approved for sale within their curated environment, and following considerable public outrage. Other companies have secretly collected data as well: MogoRoad collected Swiss phone numbers to subsequently call users (though not in contravention of Swiss law) (.pdf) and Storm8 collected users’ phone numbers and correlated them with users’ names, email addresses, and unique device identifiers.
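A sketch of the kind of server-side matching attributed to Aurora Feint – the player names and numbers are invented, and this illustrates the technique rather than the company’s actual code:

```python
# Hypothetical server-side matching in the style described above:
# each user's uploaded address book is intersected with other users'
# entries to surface "friends also playing".
uploaded_books = {
    "player_a": {"555-0101", "555-0102", "555-0199"},
    "player_b": {"555-0101", "555-0150"},
    "player_c": {"555-0199"},
}

def mutual_players(books):
    """Return, for each player, the other players whose uploaded
    contacts overlap with theirs."""
    return {me: {other for other, theirs in books.items()
                 if other != me and mine & theirs}
            for me, mine in books.items()}

print(sorted(mutual_players(uploaded_books)["player_a"]))  # ['player_b', 'player_c']
```

Note what makes the feature work: the entire address book must be uploaded and retained server-side, exposing the contact details of people who never installed the game.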

Apple does note in their iOS Reference Library that “the Address Book database is ultimately owned by the user, so applications must be careful not to make unexpected changes to it. Generally, changes should be initiated or confirmed by the user.” Despite this suggestion, it remains possible for application developers to access, transmit, and modify information in the Address Book database without first requesting the user’s permission.

Of some concern is Apple’s more recent response when contacted about applications that transmit contact information without user consent. In their paper, “PiOS: Detecting Privacy Leaks in iOS Applications” [.pdf] researchers M. Egele, C. Kruegel, E. Kirda, and G. Vigna found that popular social network application Gowalla transmitted a user’s contact book, in its entirety, without the owner’s consent. When the authors contacted Apple about this indiscriminate appropriation of contact information the company suggested that the researchers direct their concerns directly to the application developer.

There are several problems with how Apple has established the API for their mobile environment. To begin, their API enables access to contacts information without imposing code-based restrictions. This is a serious deficiency. Second, the information being shared is not exclusively owned or controlled by the phone owner. There is no ability for those listed in the ‘Contacts’ application to consent to the disclosure of their personal information to a third party. Moreover, given the lack of consent from, or notice to, the device owner, and given that we cannot reasonably expect those included in the contacts book to be notified of disclosures, it is dubious that individuals in a person’s contact book will ever know to contact the application developer and have their personal information removed. Ignorance permeates all stages of the disclosure process, and this ignorance fuels the monetization of personal information.
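By way of contrast, here is a hypothetical sketch of what a code-based restriction could look like: a consent gate that every read must pass, with per-app grants the user can revoke. Nothing like this existed in the iOS contacts API at the time; the class and method names are invented:

```python
class PermissionDenied(Exception):
    pass

class GatedAddressBook:
    """Hypothetical consent gate: every read is checked against a
    per-app grant that the user makes explicitly (and can revoke),
    instead of the open access the platform actually provided."""
    def __init__(self, entries):
        self._entries = entries
        self._grants = set()

    def grant(self, app_id):        # user taps "Allow" for this app
        self._grants.add(app_id)

    def revoke(self, app_id):       # user withdraws consent
        self._grants.discard(app_id)

    def read(self, app_id):
        if app_id not in self._grants:
            raise PermissionDenied(f"{app_id} has no address-book grant")
        return list(self._entries)

book = GatedAddressBook(["Alice", "Bob"])
try:
    book.read("com.example.game")          # no grant yet: refused
except PermissionDenied as e:
    print(e)
book.grant("com.example.game")
print(book.read("com.example.game"))       # ['Alice', 'Bob']
```

The design point is that denial is the default and revocation is one call away; this still doesn’t solve consent for the people listed in the book, but it at least puts the device owner in the loop.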

Device Storage Privacy

Of course, there is even more information stored on these devices. In the case of iDevices there is a unified keyboard cache that is accessible to third parties. The cache “contains all the words ever typed on the keyboard, except for the ones entered in the password field. This is supposed to help autocompletion but this mechanism effectively acts as a key-logger, storing potentially private and confidential names and numbers.” (source .pdf) As it stands, third parties that access this information – without the owner knowing about this caching feature, or consenting to third parties accessing it for non-cut/paste purposes – can uncover significant personal information about the owner. Have they recently been searching for medical products? Have they been visiting job search or infidelity websites? Have they input addresses, text messages, emails, or comments in web forums that could be sensitive? All this information is prospectively available.
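A short sketch of why the cache acts as a key-logger: given a dump of typed words, classifying the owner takes a few lines. The cache contents and category vocabularies below are invented for illustration:

```python
import re

# Hypothetical dump of a unified keyboard cache: every word ever
# typed outside password fields, as described above.
cache = ["meeting", "oncology", "clinic", "resume", "severance",
         "4051", "maple", "street", "antidepressant"]

SENSITIVE = {
    "medical": {"oncology", "clinic", "antidepressant"},
    "job search": {"resume", "severance"},
}

def profile_from_cache(words):
    """Tag the cache's owner with every sensitive category their
    typed words fall into; also pull out digit strings (codes, PINs)."""
    tags = {cat for cat, vocab in SENSITIVE.items() if vocab & set(words)}
    codes = [w for w in words if re.fullmatch(r"\d{4,}", w)]
    return tags, codes

tags, codes = profile_from_cache(cache)
print(sorted(tags))  # ['job search', 'medical']
print(codes)         # ['4051']
```

No exploit is needed here: the sensitive inferences fall straight out of data the platform retains by design and exposes to applications.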

Device storage is typically what people worry about when thinking of mobile security. Specifically, they establish passwords for their mobiles so that if the devices are lost then whoever finds the phone cannot immediately access its full contents. While physical access protection is important – and something that was specifically noted in the federal privacy commissioner’s recent survey – it is a very small part of a much larger device security and privacy framework. Simply setting a password protects you against only the most obvious, if not the most common, sources of data appropriation, privacy infringement, and security breaches.

Reporting on Perception-Based Studies

The purpose of walking through these security and privacy vulnerabilities isn’t to drive people away from smartphones or any other computing device. Rather, it is meant to underscore the current technical reality of owning and using the devices. Few people, even those who are technically savvy (myself included!), can limit the sharing of information if they are using certain smartphones. Privacy settings are not intended to maximize customer privacy but to facilitate perceptions that companies are meeting consumer privacy concerns. That these same companies enable the dissemination of personal information to third parties, often without consumers learning about the dissemination or the purposes of data collection, indicates the importance that Apple et al. actually place on consumer privacy. Even for the interested consumer, many apps lack a privacy policy, and neither Apple nor Google requires developers to create or make available such policies. Indeed, to ‘simply’ access Apple’s own privacy policy from an iDevice consumers must do the following:

  1. Select ‘Settings’
  2. Select ‘General’
  3. Select ‘About’
  4. Select ‘Legal’
  5. Press and hold the screen until the copy option is available and copy the URL of the privacy policy
  6. Click the ‘Home’ button
  7. Open Mobile Safari
  8. Select Address Bar and paste URL
  9. Select ‘Go’

Given the reality that customers cannot secure their personal information, or even be aware of when or where it is flowing, headlines concerning the Privacy Commissioner of Canada’s recent survey can be both misleading and harmful. CBC led their coverage of the report with an article entitled “Canadians lax about cellphone security” and the Vancouver Sun with “Do a better job protecting mobile privacy, Canadians told.” The articles pick up on the fact that only a minority of Canadians establish locking passwords or modify their privacy/sharing settings on their mobile devices. The actual study notes that those who store personal information on their devices are more likely to set a password (52% versus 33%), as are those who install applications beyond those installed on the phone by default (68% versus 27%). The report also notes that almost 60% of people with GPS-enabled phones don’t actually have the GPS enabled. The majority is somewhat concerned about privacy issues stemming from location information, but the survey fails to ask whether their GPS-enabled devices are smartphones that can (and do) leak and collect location information based on other data sources.

While it is admirable that many people claim to modify their mobile device settings to limit data disclosure, such modifications have varying degrees of effect. In the case of an iPhone, key bits of data are collected by third parties without customers having any option to prevent the collection and subsequent dissemination of personal information. The iOS API itself permits access to the address book, and similar public calls can discreetly be made to the wifi location system and the keyboard cache. The nature of iDevices makes these actions possible. Thus, even if an iPhone user has a password their data remains insecure from the companies invited onto the device. Further, establishing a password is insufficient to secure a mobile device: did iDevice users set more than the default 4-digit passcode – a longer passcode being required to initiate the full range of iDevice encryption? What did users of older devices, which no longer receive security updates, do with their devices? Keep using them? If so, did these same users identify themselves as taking actions to secure their privacy and believe those actions were effective?
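Some back-of-the-envelope arithmetic on the default 4-digit passcode; the guess rate is an assumption for illustration, not a measured figure for any particular device:

```python
# Brute-force space of the default 4-digit passcode versus an
# 8-character alphanumeric passphrase, at an assumed guess rate.
combinations = 10 ** 4            # 0000-9999
guesses_per_second = 10           # hypothetical hardware-assisted rate
worst_case_seconds = combinations / guesses_per_second
print(worst_case_seconds / 60)    # ~16.7 minutes to exhaust every PIN

alnum_8 = 62 ** 8                 # a-z, A-Z, 0-9, eight characters
print(alnum_8 / guesses_per_second / 3600 / 24 / 365)  # hundreds of thousands of years
```

Whatever the true guess rate, the ratio between the two key spaces (roughly 2 × 10^10) is what matters: a 4-digit PIN is a speed bump, not a lock.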

The problem with the study, and with the subsequent headlines, is that it fails to adequately identify who a data thief might be and suggests that owners can genuinely protect their privacy while using their devices. Generally, individuals will assume that it’s a bad third party, not Apple or their favourite video game manufacturer, who is going to abscond with their personal information and that of their family, friends, and business contacts. When the hostile party is the operating system itself, consumers can only save themselves by refusing to purchase or use the device, or by relying on government regulators to prevent the harm and force manufacturers to sell devices that comply with Canadian law.

Undermining the Economics of Ignorance

The problem with studies like the Privacy Commissioner’s – if only for how the media will report on them – is that consumers come to believe that they are primarily responsible for security failures. This offloads a considerable amount of responsibility from government officers to a relatively impotent citizenry. Further, the survey offers a sense that device owners can take actions to significantly limit the primary vectors of information leakage. While they have some control over a few vectors they rarely have control of the primary means of information collection and dissemination.

There is a high level of friction when a customer must disable systems-level processes to use an application without disclosing location information. Performing such actions adds considerable delay in accessing features of the phone and, as a result, most consumers simply will not disable location awareness on a regular basis. This is behaviour we will see even where device owners are uncomfortable with persistent disclosures. Such high levels of friction also indicate a near-absolute absence of any genuine privacy-by-design features. Privacy-by-design does not simply mean that citizens can proactively protect their privacy but that user interfaces are configured to best let citizens control how and when they disclose personal information. Not only is it incredibly hard to limit the sharing of personal information using the devices’ options (varying UIs within the same operating system, single opt-in prompts, having to burrow through layers of settings to opt out of features in ways that can negatively impact the rest of the device’s operation, etc.) but in many cases the dissemination of personal information cannot be blocked, no notice is given of disseminations, and data cannot subsequently be deleted from third parties’ repositories. For many smartphones, API should stand for ‘Advanced Privacy Intrusion’ instead of ‘Application Programming Interface’.

Unwanted collection and dissemination of personal information, to say nothing of the lack of notice or inability to delete disseminated data, exploits users’ ignorance and impotence for economic gain. The smartphone ecosystem is substantially predicated on an economics of ignorance which, if unveiled and addressed by parties with significant direct market power, is reversible.

To be forthright: companies do not collect large sums of data and pay to store it in their databases for no reason. Corporations are not in the habit of intentionally increasing the costs of doing business without some profit-based rationale. After selling an app for $0.99 or less, no company is interested in building an ever-larger server infrastructure to store collected personal information without anticipating a return on the investment. The issue, however, is that many apps lack discernible privacy policies and users – especially those in curated gardens – may ‘trust’ the applications they install on the basis that a ‘knowledgeable’ party is believed to have rooted out bad or malicious applications. While this may be true in some cases, Apple’s integration into their API of surreptitious data expropriation without consumer consent clearly reveals that gatekeepers who directly profit from application sales cannot be trusted. We cannot trust the fox to protect the henhouse from the other foxes!

Popular consumer surveys can be valuable. They are noticeably less helpful when delving deeper into technical matters, about which few members of the public can be expected to know much. Consumers may be cognizant of superficial ways to protect the personal information on their devices. Those same knowledgeable consumers are far less likely to know about the deeper vulnerabilities and intentionally designed weaknesses that pervade mobile devices. Consequently, privacy commissioners, and government regulators more generally, should take long, hard looks at how mobile operating systems are designed. They should ensure that the systems – and by extension the information environments they spawn – comply with Canadian law.

Commissioners should focus on the source of the worst privacy concerns which, in the case of smartphones, arguably originates in the design of operating system APIs that exploit citizens’ ignorance of how and when data is migrated off of their smartphones. While there is some value in evaluating how often people modify the sharing options on their mobile phones, it is just as important to know why they don’t modify these settings – are they using devices where they don’t know how to do so, or do they find it tiresome to manage their privacy? If yes to either, then there has been a serious failure in the design of the operating system’s graphical user interface. In the case of Apple and Microsoft, both of whom have almost entirely locked down basic facets of their operating systems while investing heavily in designing their mobile environments, these are intentional (if correctable) errors.

If operating system manufacturers will not restrict indiscriminate and non-consensual sharing of personal information on their own then the Canadian government should step in. Government, using its regulatory powers, can resolve market imbalances by investing in research to identify market problems and subsequently correcting information asymmetries that disrupt market processes and infringe upon Canadian law. Such corrections might entail issuing fines on a per-device-sold basis, publicly naming and shaming offending companies, or even using federal dollars to deliver public warning announcements about the harms associated with specific smartphone operating systems.

Regardless of the solution, it should be significant enough to either rebalance the information asymmetry between consumers and device manufacturers or disrupt the profitability of exploiting ignorance to extract personal information from mobile devices. Ultimately, commissioners and regulators must demand that device manufacturers either provide APIs that comply with Canadian law or change existing APIs in the face of prevalent privacy issues. Where neither of these conditions is met, OS vendors should be forced to suffer significant penalties. The only way to ensure devices’ security and citizens’ privacy is to erode the economics of ignorance that application vendors and device manufacturers alike depend on to cheat Canadians out of their personal information.

References

[1] C. A. Ardagna et al. (2008). “Privacy-Enhanced Location Services Information,” in A. Acquisti, S. Gritzalis, C. Lambrinoudakis, and S. De Capitani di Vimercati (eds.). Digital Privacy: Theory, Technologies, and Practices. New York: Auerbach Publications.

[2] G. Elmer. (2004). Profiling Machines: Mapping the Personal Information Economy. Cambridge, Mass.: The MIT Press.

[3] See D. Phillips and M. Curry, “Privacy and the phonetic urge: Geodemographics and the changing spatiality of local practice.”