Digital literacy is a topic that is regularly raised at Internet-related events across Canada. As Garth Graham has noted, “some people will remain marginalized even when everyone is online. It’s not enough to give those who are excluded basic access to the technologies. It requires different social skills as much as different technical skills to come in from the cold of digital exclusion” (29). Perhaps in light of Canadians’ relative digital illiteracy, key Canadian policy bodies and organizations have seemingly abandoned their obligations to protect Canadian interests in the face of domestic and foreign belligerence. Bodies such as Industry Canada, the Canadian Radio-television and Telecommunications Commission (CRTC), and the Canadian Internet Registration Authority (CIRA) are all refusing to take strong leadership roles on key digital issues that affect Canadians today.
In this post I want to first perform a quick inventory of a few ‘key issues’ that ought to be weighing upon Canadian policy bodies with authority over the Internet. I then transition to focus on what CIRA could do to take up and address some of them. I focus on this organization in particular because they are in the process of electing new members to their board; putting votes behind the right candidates might force CIRA to assume leadership over key policy issues and alleviate harms experienced by Canadians. I’ll conclude by suggesting one candidate who clearly understands these issues and has plans to resolve them, as well as how you can generally get involved in the CIRA elections.
Cornucopia of Concerns
Internet standards operate as highly visible examples of how technology has been shaped to interoperate in a transparent fashion. Common Internet protocols let networks connect with one another while simultaneously establishing common points of failure. A danger is that if these protocols are exploited then the Internet can be significantly damaged. In effect, where a central trusted node on the Internet is subject to onerous pressures the Internet – and by extension, entire regions that are serviced by these central nodes – is affected. The concerns I raise focus on three types of trust-holders: Internet service providers, DNS root authorities, and certificate authorities.
Internet service providers
Internet service providers, such as Rogers, Videotron, and Bell, receive a considerable amount of criticism from the public, advocacy organizations, industry, government, and the academy. In recent years, criticism has focused on ISPs’ imposition of usage-based billing systems, integration and use of deep packet inspection devices, and redirection of traffic to their own web portals. Billing issues arose most recently with large ISPs, such as Bell Canada, demanding changes to how wholesale ISPs were charged for bandwidth volume. Such demands were exacerbated by proposals to charge consumers vastly more for bandwidth usage and what seemed to be anti-competitive efforts to squeeze companies that were competing for complementary products (e.g. cable TV, telephone or voice services) out of the market. The campaign against CRTC-approved changes to how wholesale ISPs were billed for bandwidth initiated a firestorm right at the moment of the last federal election. This arguably opened the policy window for the Canadian government to reject the CRTC’s findings and force the Commission to re-examine the issue.
While public advocates were successful in pushing against changes to the billing regimes, they were less successful in pushing against ISPs’ use of deep packet inspection technologies. ISPs won the right to manage their networks in a non-discriminatory manner and consumers were left on the hook to determine whether discrimination was occurring. This requires citizens, who lack clear insight into the network, to do their own testing. As I’ve written previously,
The unjustified discrimination of data traffic may not be evident to all consumers, especially when they lack the skills associated with digital literacy to even register the occurrence of bandwidth or application discrimination. Without solid training, many people resort to subjective ‘smell tests’. This approach to identifying whether discrimination is occurring does not contribute to evidence-based, empirically sound, complaints systems or policy responses.
This is a particularly significant issue given that almost all of Canada’s dominant ISPs have violated the rules that the CRTC established concerning the use of deep packet inspection. A small handful of people – academics, advocates, and journalists – bring the public’s attention to the technology’s misuse, often showcasing the work of citizens who, fed up with trying to resolve their own complaints, have organized grassroots efforts to hold ISPs accountable.
The final point, the redirection of traffic to ISPs’ web portals, is a common practice in Canada that is incredibly aggravating. Quite often, when someone in Canada mistypes a URL, or requests a subpage of a domain that does not exist, they are redirected to a portal controlled by their ISP. This practice is formally known as ‘DNS hijacking’ and involves your ISP intentionally interfering with web queries. These hijacks violate the Internet standards that are supposed to guide how networks interconnect and what constitute ‘legitimate’ modes of directing web traffic. In other areas of the world this technique is used for censorship purposes. In Canada it’s used to interfere with Canadians’ web traffic so that ISPs can try to generate some advertising dollars while offering their own degraded search capabilities.
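A rough way to check whether your own connection is subject to this kind of hijacking is to resolve a hostname that cannot exist: a standards-compliant resolver returns an NXDOMAIN error, while a hijacking one returns the IP address of the ISP’s portal. A minimal sketch in Python (the probe domain is randomly generated, and the example ‘hijacking’ resolver and its 192.0.2.1 answer are hypothetical stand-ins):

```python
import socket
import uuid

def probe_nxdomain_hijack(resolve=socket.gethostbyname):
    """Resolve a random hostname that should not exist.

    A standards-compliant resolver raises an error (NXDOMAIN);
    a hijacking resolver returns the IP of the ISP's portal instead.
    """
    bogus = "probe-%s.nonexistent-%s.com" % (uuid.uuid4().hex[:8],
                                             uuid.uuid4().hex[:8])
    try:
        ip = resolve(bogus)
    except socket.gaierror:
        return None   # correct behaviour: the name does not resolve
    return ip         # suspicious: the ISP is likely hijacking lookups

# Illustrative fake resolver standing in for a hijacking ISP
# (192.0.2.1 is a reserved TEST-NET address used only for examples):
hijacking_resolver = lambda name: "192.0.2.1"
print(probe_nxdomain_hijack(hijacking_resolver))  # → 192.0.2.1
```

Running the probe with the default system resolver on a hijacking connection would likewise return an address rather than `None`.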
DNS root authorities
The Domain Name System (DNS) makes the Internet significantly easier for humans to navigate, but in creating that ease the DNS system generates choke points where control over communication and speech can be exerted. Paul Mockapetris developed DNS in 1983 to let names be translated to IP addresses and vice versa (for more, see RFCs 1034 and 1035). As a result, when you type a website’s IP address (e.g. 184.108.40.206) or its host name (e.g. UN.org) you are directed to the same location on the Internet – the United Nations’ homepage. The DNS system is, effectively, a massive database that lets individuals type human readable names into their web browsers and be directed to websites and services. A hierarchical network of nameservers facilitates this system.
At the top of the DNS hierarchy are root nameservers, which are authoritative for top-level domains (e.g. .com, .net, .org, .ca, .co.uk, etc). For a top-level domain to exist it must first be registered in the root zone served by the root nameservers. Below the root are authoritative DNS nameservers, which are responsible for the domains associated with distinct top-level domains. For example, the .com authoritative DNS nameservers translate the IP addresses and host names of all .com addresses, the .ca DNS nameservers translate the IP addresses and host names of all .ca addresses, and so forth. Below these two levels are domain resolvers. Resolvers have a cache that can quickly translate human readable host names (e.g. UN.org) to machine-friendly IP addresses (e.g. 220.127.116.11). Because they are physically located near the device making the request they are faster to respond than authoritative nameservers, which are often geographically distant and experience longer queues to return name/IP address translations. Where the resolver closest to the end-user (often run by the user’s ISP or business) hasn’t already cached the host name and IP address, it immediately contacts other nameservers to get that information and subsequently directs the user to the site/data they are requesting. (For a quick audio-visual walkthrough of how the DNS system works, see this short (2:08 minute) video.)
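The caching behaviour described above can be sketched as a toy stub resolver; the zone data and addresses below are hypothetical stand-ins for a real authoritative nameserver (203.0.113.10 is a reserved documentation address):

```python
class CachingResolver:
    """Toy stub resolver: answer from the local cache when possible,
    otherwise query an upstream (authoritative) source and cache the result."""

    def __init__(self, upstream):
        self.upstream = upstream      # callable: hostname -> IP address
        self.cache = {}
        self.upstream_queries = 0

    def resolve(self, hostname):
        if hostname not in self.cache:            # cache miss: go upstream
            self.cache[hostname] = self.upstream(hostname)
            self.upstream_queries += 1
        return self.cache[hostname]               # cache hit: fast local answer

# Hypothetical zone data standing in for an authoritative nameserver:
zone = {"un.org": "203.0.113.10"}
resolver = CachingResolver(zone.__getitem__)

resolver.resolve("un.org")        # first lookup goes upstream
resolver.resolve("un.org")        # second lookup is served from the cache
print(resolver.upstream_queries)  # → 1
```

The speed advantage of the resolver comes precisely from this cache, and, as the problems below show, the cache is also what an attacker tries to corrupt.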
There are a host of potential problems with the current DNS system:
- It is susceptible to DNS cache poisoning, where an attacker tricks a local resolver into mistranslating. This occurs when an attacker sends a translation request to a local resolver and then floods the resolver with faked resolution responses. If successful, this will cause the resolver to incorrectly direct all web traffic trying to access that host name to a non-legitimate IP address; while you might type ‘UN.org’ into your web browser you could be sent to a site hosting malware, or a site that looks like the UN’s but disseminates false information, rather than arriving at 18.104.22.168. (For a video presentation of how DNS cache poisoning occurs, see the YouTube video “DNS Cache Poisoning Attack“.)
- It operates as a single point of exploitable failure. A case in point: in 2008 a novel poisoning attack was developed by Dan Kaminsky that threatened “to take down vast swaths of the Internet”.
- It didn’t have security designed into it when first developed and deployed because DNS is a trusting system. Domain Name System Security Extensions (DNSSEC) are meant to guarantee that “DNS resolvers receive correct IP addresses for their queries” by providing source authentication (resolvers can guarantee that the IP address information correlated with a host name came from a DNS authoritative nameserver) and integrity verification (resolvers can be assured that the information received from the nameserver hasn’t been tampered with in transit to the local resolver) (Landau 2010: 60). DNSSEC, in effect, alleviates some of the dangers posed by cache poisoning by reasserting the importance of a trusted hierarchy though it still relies on trusting security certificate providers (more on why that’s a problem in a minute).
- It operates as a hierarchy, creating crises between “centralized, hierarchical powers and distributed, horizontal networks” (Galloway 2004: 204). Case in point: assuming DNSSEC were deployed, if the authoritative DNS nameservers were modified so that UN.org didn’t resolve to 22.214.171.124 then all local resolvers would trust the modification. Thus, a government could act on an authoritative nameserver, forcing its owner to modify where packets were routed to, and the change would have global consequences. Importantly, such subterfuge would pass DNSSEC’s source authentication and integrity validation.
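DNSSEC’s source authentication and integrity verification can be illustrated with a toy signed record. This sketch uses a shared-secret HMAC purely for brevity; real DNSSEC uses public-key signatures (RRSIG records validated against a chain of keys up to the root), and all names, addresses, and key material here are hypothetical:

```python
import hashlib
import hmac

# Hypothetical key material standing in for a zone's signing key.
ZONE_KEY = b"toy-zone-signing-key"

def sign_record(hostname, ip):
    """Attach a signature binding a hostname to an IP address."""
    msg = f"{hostname}={ip}".encode()
    sig = hmac.new(ZONE_KEY, msg, hashlib.sha256).hexdigest()
    return {"hostname": hostname, "ip": ip, "sig": sig}

def verify_record(record):
    """Check that the record came from the key holder and was not altered."""
    msg = f"{record['hostname']}={record['ip']}".encode()
    expected = hmac.new(ZONE_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["sig"])

record = sign_record("un.org", "203.0.113.10")  # documentation address
print(verify_record(record))     # → True

record["ip"] = "198.51.100.66"   # an attacker rewrites the answer in transit
print(verify_record(record))     # → False: tampering detected
```

Note that this only authenticates origin: as the final bullet above observes, a record (re)signed by the key holder itself would still verify, no matter where it pointed.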
Moreover, as a central point of control foreign governments can exert pressure on root nameservers to forcibly redirect the traffic to some websites. The United States’ Immigration and Customs Enforcement (ICE) has been seizing domain names and redirecting them on the basis of their violating American law since 2010. Such seizures have taken place regardless of whether the sites were legal in their country of operation. Such measures follow from President Bush’s “Enforcement of Intellectual Property Rights Act,” which asserts a need to combat copyright infringement on and off American soil. High-level political guarantees to ‘protect’ intellectual property have been made by the Obama administration as well, with Vice-President Biden asserting that the administration would aggressively use tactics to close websites that offered content illegally per American law.
The effect of ICE’s campaign has been that domain names are being redirected to servers owned by the United States government, even if the servers are located outside of the US. In effect, a foreign government is leveraging its influence and power over Verisign – which controls the authoritative registry for the .com, .net, and other top-level domains – to forcibly infringe upon website owners’ free speech rights on copyright grounds. Domain names themselves constitute speech acts (see: Chelsea and Westminster Hospital NHS Foundation Trust v. Frank Redmond, The Crown in the Right of the State of Tasmania trading as “Tourism Tasmania” v. Gordon James Craven, and Wal-Mart Stores, Inc. v. wallmartcanadasucks.com and Kenneth J. Harvey) and the seizure of these names without court proceedings has the effect of censoring particular speech (in the domain name) as well as muffling the speech contained at the website which the domain name points towards.
Importantly, because ICE is targeting authoritative name servers no person in the world can resolve the domain names after the seizure takes place. This limits the ability of commercial entities to conduct business both within the US and abroad, amounting to ICE-created and -enforced, site-specific embargoes. Further, the U.S. government’s actions threaten innovation by heightening the risks innovators assume by relying on a web presence to monetize/popularize their works. Finally, ICE’s actions supersede the decisions of foreign courts; where a supposedly ‘copyright infringing’ website is found legal outside of the US, ICE imposes American definitions of copyright upon all global Internet users. ICE is globalizing American copyright laws.
Certificate authorities
Certificate authorities are critical to the Internet’s current security infrastructure. They provide certificates to companies and websites that meet identity and financial requirements. When you visit an https website a series of transactions take place to ensure that the communications channel is encrypted. Encryption prevents third parties from listening in on the content of the communications. Specifically, when you visit an SSL-secured website the following occurs:
- The web server transmits its public key with its certificate;
- The web browser determines whether the certificate was issued by a trusted party – typically a certificate authority – and that the certificate remains valid and is related to the website in question;
- The browser uses the public key to encrypt a symmetric encryption key and sends it to the server, along with the encrypted URL and other encrypted http data;
- The web server decrypts the key using its private key and uses the key to decrypt the URL and http data;
- The server sends back the requested html document and data after encrypting it with the symmetric key;
- The browser decrypts the document and data using the symmetric key.
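The exchange above can be sketched end to end with deliberately toy cryptography. The textbook-RSA keypair and XOR ‘cipher’ below are insecure stand-ins chosen so the example is self-contained; real TLS uses full-size keys, authenticated ciphers, and the certificate validation step elided here:

```python
import hashlib

# Textbook-RSA toy keypair (insecure, illustration only).
# n = 61 * 53; (E, N) is the server's public key, (D, N) its private key.
N, E, D = 3233, 17, 2753

def xor_cipher(data: bytes, key_int: int) -> bytes:
    """Toy symmetric cipher: XOR with a keystream derived from the shared key.
    Applying it twice with the same key recovers the original bytes."""
    stream = hashlib.sha256(str(key_int).encode()).digest()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(data))

# Steps 1-2: the browser obtains the server's public key with its certificate
# (the trust checks are elided in this sketch).
# Step 3: the browser picks a symmetric key, wraps it with the public key,
# and encrypts its request under the symmetric key.
symmetric_key = 1234
wrapped_key = pow(symmetric_key, E, N)
request = xor_cipher(b"GET /secret HTTP/1.1", symmetric_key)

# Step 4: the server unwraps the symmetric key with its private key.
server_key = pow(wrapped_key, D, N)
print(xor_cipher(request, server_key))      # → b'GET /secret HTTP/1.1'

# Steps 5-6: the response travels back under the same symmetric key.
response = xor_cipher(b"<html>top secret</html>", server_key)
print(xor_cipher(response, symmetric_key))  # → b'<html>top secret</html>'
```

The pattern to notice is the hybrid design: slow public-key operations are used only to move a small symmetric key, and everything else travels under the fast symmetric cipher.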
To initiate the secure transmission process you need a trustworthy certificate authority. This effectively means that the authority must be ethical enough not to violate the trust put in it, be financially resolute enough to refuse bribes, and be willing to publicly fight against attempts by government to compel violations of trust. As Soghoian and Stamm have written, governments can theoretically compel certificate authorities to issue fraudulent certificates, thus enabling state-actors to conduct ‘man-in-the-middle’ attacks, in which a third party inserts itself between the web server and web browser. As noted by Stevens et al.,
Any website secured using TLS can be impersonated using a rogue certificate issued by a rogue CA. This is irrespective of which CA issued the website’s true certificate and of any property of that certificate….Combined with redirection attacks where http requests are redirected to rogue web servers, this leads to virtually undetectable phishing attacks (pp. 36; .pdf source).
In essence this means that if a government forces a major trusted certificate authority to issue a valid (i.e. working) fraudulent (i.e. not issued to the website owner) certificate it can potentially intercept, decrypt, and analyze communications without either the web browser or web server knowing. This fear was made real a few months back, and again last month, when fraudulent certificates were issued for domains belonging to major communications companies such as Microsoft, Google, Mozilla, and Skype.
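One partial defence that does not depend on trusting every certificate authority is pinning: recording the fingerprint of a site’s known-good certificate and flagging any later connection that presents a different one, even if the substitute chains to a trusted CA. A toy sketch (the certificate bytes here are placeholders, not real DER-encoded certificates):

```python
import hashlib

def fingerprint(cert_der: bytes) -> str:
    """Fingerprint of a certificate's raw bytes. A rogue certificate issued
    by a different (but still 'trusted') CA passes chain validation, yet its
    bytes differ, so its fingerprint will not match the pinned value."""
    return hashlib.sha256(cert_der).hexdigest()

# Recorded ("pinned") on a first, assumed-clean visit to the site:
pinned = fingerprint(b"genuine certificate bytes")

# Later visits compare what the server presents against the pin:
assert fingerprint(b"genuine certificate bytes") == pinned   # same cert: ok
assert fingerprint(b"rogue CA-issued certificate") != pinned # MITM detected
```

Pinning is exactly the kind of trust-agility experiment that next-generation systems like Convergence (discussed below) generalize.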
What can CIRA do?
To be clear from the outset: CIRA cannot resolve all of these issues, but they can assume a leadership role in addressing many of them. CIRA possesses a robust policy development framework (.pdf source) and in their recent survey found that Canadians were incredibly interested in – and concerned about – the safety, security, and resilience of the Internet, as well as privacy issues. CIRA has publicly argued that DNSSEC, a security extension to DNS that prevents domain poisoning and domain hijacking, should be adopted by the federal government. At present, however, DNSSEC cannot be implemented where Canadian carriers are involved in domain hijacking. CIRA notes that such practices strongly interfere with “the norms upon which the Internet was built” and that the “consensus from the international Internet community is that DNS redirection should be prohibited, with the exception of rare instances for purposes of law enforcement.” CIRA feels strongly enough about this issue to suggest that imposing legal liabilities on Canadian ISPs that persist in this practice may be appropriate. (pp. 14-5; .pdf source).
CIRA’s record on copyright is somewhat more nebulous and could interfere with their strong demands to prevent DNS redirections. In their 2010 Digital Economy filing, the organization notes that updated copyright laws are important to “protect Canadians from illegal activity on-line just as they are protected from illegal activity off-line” (pp. 12; .pdf source). This is a worrying statement, insofar as it is unclear what direct harm Canadians have experienced as a result of the present copyright legislation. Indeed, when compounded with CIRA’s grudging acceptance of DNS redirections for law enforcement purposes it may be that the organization is supportive of American efforts to impose US copyright law throughout the world to ‘protect’ American (and, presumably, some Canadian) rights holders at the expense of judicial decisions in nations where websites are operated.
CIRA could, and should, clarify its position and clarify when a redirect is appropriate for law enforcement purposes. As they are likely aware, redirects are not a significant impediment on serious online crimes such as child pornography (.pdf source), and so it is important for the organization’s directors to explain to CIRA members and Canadians more generally how a redirect – as opposed to taking down servers hosting truly illegal, as opposed to infringing, content – resolves serious legal issues instead of making them more convenient to ignore. Filtering access to particular websites also often runs the risk of being used increasingly expansively. As noted by Villeneuve, filtering is seen as an inexpensive technical solution to the challenges posed by the ease of access to information on the Internet, and regardless of the initial reason for implementing Internet filtering there is increasing pressure to expand its use once filtering is in place. Any advocacy for filtering or DNS redirections must thus be made with an awareness of its (in)effectiveness in stopping crimes and its likely misuse over time.
It is especially important to work against the unilateral imposition of foreign copyright law on the workings of the Internet, and to ensure that dot-ca and Canadian-held dot-com, dot-org, and other top-level domains are not subjected to inappropriate censorship. CIRA is in the unique position to strongly and loudly argue against unilateral censorship at the root level; should nation-states compel their ISPs to block particular records that is one matter, but to forcibly modify the root is another. While CIRA has been notified of these issues and concerns, they have yet to publicly address them (.pdf source). This inaction is something that must change.
Finally, CIRA can and should establish itself as a certificate authority. In various public documents the organization has noted the need to establish a safe and secure Internet. Acting as a trust-agent for Canadians is certainly one way to accomplish this goal. CIRA already has a reasonably robust verification system for its members to ensure that only Canadians who hold a dot-ca domain can claim membership. It could leverage existing policies to become a trusted certificate authority and, ideally, welcome the chance to trial next-generation trust systems (such as www.convergence.io) as part of its mission.
A Technically Savvy, Politically Engaged, Candidate
Only one of the candidates seeking election to the CIRA board of directors this year has both the background and the interest to push these particular issues to the forefront of CIRA’s agenda. Kevin McArthur is a developer, security researcher, and technical author who has been deeply invested in the network neutrality debate in Canada and at the forefront of examining recent violations of the certificate authority system. His aim is to get CIRA more involved in the issues and debates concerning the Canadian Internet while expanding the scope and role of the organization’s existing Internet Forums. As someone who has spent time working with technologies, such as Voice over IP, that are vulnerable to network neutrality abuses, and who is responsible for websites that would suffer badly were they censored using a DNS hijack/redirect, he understands these issues first-hand. His full portfolio is available at his CIRA election website and his publicly disclosed research efforts at his personal website.
CIRA and You
If you are a dot-ca domain name owner then you can take part in the upcoming CIRA elections. The final slate of candidates has been established and includes a number of interesting candidates. To take part in the election you must formally become a member; this involves more than just registering your domain. Specifically you must do the following:
- Membership is free for all dot-ca owners. Sign up for membership. It can take up to a week or so for a membership to be awarded so register as soon as possible.
- If you are already a member, verify that you can access your member account prior to the election itself. Your login can be tested at http://www.member.cira.ca.
- Vote between September 21 and September 28, 2011. Visit https://elections.cira.ca during this period to vote for your candidate.
The next handful of years promise to be incredibly important for the progression – or regression – of the Internet in Canada. Electing people to CIRA who are committed to advancing its mandate and ensuring the most secure, efficient, and trustworthy Internet ecosystem whilst understanding the full ramifications of their actions is essential. Take the time, sign up to become a member, and vote for the candidate you think will live up to these key principles.
A. R. Galloway. (2004). Protocol: How Control Exists After Decentralization. Cambridge, Mass.: The MIT Press.
G. Graham. (2011). “Towards a National Strategy for Digital Inclusion: Addressing Social and Economic Disadvantage in an Internet Economy” in M. Moll and L. R. Shade (eds.). The Internet Tree: The State of Telecom Policy in Canada 3.0. Ottawa: The Canadian Center for Policy Alternatives.
S. Landau. (2010). Surveillance or Security: The Risks Posed by New Wiretapping Technologies. Cambridge, Mass.: The MIT Press.