HTTPS and the clustered trusted third party
Rather than going into an in-depth rant about SSL/TLS/HTTPS and everything I hate about them, let me just discuss two problems I’ve run into recently.
As a bit of background, SSL/TLS addresses the issue that Internet connections are completely unauthenticated, and anyone who can intercept your request can forge a reply to it, all the more of a concern these days with prevalent wireless networking in public places. The solution, using public-key cryptography, involves the website presenting a certificate that lists the website’s name and proving, cryptographically, that it holds the corresponding private key. This certificate, in turn, is signed by one of several well-known certificate authorities, who verify that anyone asking them to sign a certificate legitimately owns the domain name in question. An attacker can come up with his own certificate, but should not be able to get that certificate signed by any authority, and if he tried to invent his own certificate authority, it would not be one of the well-known ones.
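The trust decision described above can be sketched as a toy model: the browser ships with a fixed list of root authorities, and a certificate is acceptable only if its name matches the site being visited and its signer is on that list. The authority names below are illustrative, not an actual root store.

```python
# Toy model of the browser's all-or-nothing trust decision.
# TRUSTED_ROOTS stands in for the root-certificate list shipped with a
# browser or OS; the entries are illustrative.

TRUSTED_ROOTS = {"Verisign", "Equifax", "Thawte"}

def browser_accepts(cert_subject: str, cert_issuer: str, requested_host: str) -> bool:
    # The name on the certificate must match the site being visited...
    if cert_subject != requested_host:
        return False
    # ...and the signer must be on the root list; there is no middle ground.
    return cert_issuer in TRUSTED_ROOTS

print(browser_accepts("mail.google.com", "Verisign", "mail.google.com"))        # True
print(browser_accepts("bugs.freedesktop.org", "CAcert", "bugs.freedesktop.org"))  # False
```

Note that this model has no notion of *which* root signed the certificate mattering to the user, which is exactly the property the rest of this post takes issue with.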
First, for MIT’s web hosting platform scripts.mit.edu, we give users hostnames of the form geofft.scripts.mit.edu. We have a single wildcard certificate for “*.scripts.mit.edu” that covers our users’ sites. This is all well and good, except that wildcard certificates for HTTPS, according to the spec, only cover one domain component. This is a problem for users who have dots in their username — most notably courses. If you visit, say, 6.001.scripts.mit.edu in your browser, you’ll notice that you’ll get an error about 6.001.scripts.mit.edu being an invalid hostname, as the certificate was “only” issued for *.scripts.mit.edu. I asked the RFC author what the rationale was for this restriction, and for the related one that a *.*.scripts.mit.edu certificate or similar is useless for HTTPS, and apart from “wait, someone wants multiple wildcards?”, it boils down to protecting against malicious certificates: tricking a certificate authority into issuing a *.com or (if we prevent first-level wildcards) *.co.uk certificate is much more damaging than tricking a CA into issuing a particular site’s wildcard-less certificate to you. It seems that the CA/Browser Forum is moving away from being interested in wildcards in general; for instance, extended-validation (EV) certificates, which verify real-world identity in addition to domain name ownership (and are a lot more expensive), simply cannot be issued for wildcards. So even if we could convince the folks who run MIT’s local certificate authority (who know us in real life) to issue an EV certificate for scripts.mit.edu or some other MIT site, there’s no way they could generate a wildcard one that covers our users’ sites.
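The single-label rule can be sketched in a few lines. This is a simplified model of the matching behavior described above, assuming the wildcard stands in for exactly one whole DNS label (real implementations follow RFC 2818/6125 and also handle partial-label patterns like f*.example.com):

```python
# Sketch of HTTPS wildcard matching: "*" in a certificate name covers
# exactly one DNS label, never a dotted sequence of labels.

def wildcard_matches(pattern: str, hostname: str) -> bool:
    """Return True if a certificate name like '*.scripts.mit.edu' covers hostname."""
    p_labels = pattern.lower().split(".")
    h_labels = hostname.lower().split(".")
    # Label counts must agree, since "*" stands in for a single label only.
    if len(p_labels) != len(h_labels):
        return False
    return all(p == "*" or p == h for p, h in zip(p_labels, h_labels))

print(wildcard_matches("*.scripts.mit.edu", "geofft.scripts.mit.edu"))  # True
print(wildcard_matches("*.scripts.mit.edu", "6.001.scripts.mit.edu"))   # False: two labels
```

The second check fails because 6.001.scripts.mit.edu has one label too many for the pattern, which is precisely the error browsers report for course sites.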
Second, I tried visiting https://bugs.freedesktop.org/ earlier tonight. If your browser is anything like mine (Firefox on Ubuntu), it will give another certificate error, because freedesktop got their certificate signed by CAcert.org, a “community driven certificate authority.” Unlike every other CA out there (Verisign, Equifax, etc.), CAcert is not a large corporation with both an equally large business stake in never being found untrustworthy and a bunch of business folks who can talk to the browser vendors and know how to run a respectable business — and, importantly, it gives out free certificates. (Certificates from large corporations can cost hundreds of dollars a year.) CAcert has had some legitimacy and audit troubles in the past that are too long to go into here, but it’s a bit of an uphill battle for them to get themselves listed in the root certificate list of most operating systems (Debian, notably and unsurprisingly, excepted) or browsers. Long story short, Firefox tells me that the signing authority isn’t trusted, and asks me if I want to accept the presented cert for bugs.freedesktop.org.
Now, this is a bit of a problem. I don’t really trust CAcert the way I trust Verisign and Equifax; if I tried to get to my bank’s website or Gmail or somesuch and saw a CAcert-signed certificate, I would be highly suspicious. But for a site like freedesktop.org, it is completely in character to use CAcert. Unfortunately, since my browser doesn’t trust CAcert, it’s not going to tell me whether validation passed — whether this certificate was actually signed by CAcert’s authority certificate, or even whether the authority being passed off as CAcert is the real CAcert at all. If I wanted to do this verification myself, I would have two choices: either add CAcert to the list of CAs that my browser trusts by default for everything, or learn how SSL certificates work and a tool like OpenSSL, and do the verification manually. The latter is a completely unreasonable burden even given that I do know how to use the OpenSSL command-line tools, and simply not an option for most users, including the majority of technical ones.
On the flip side, when I came to MIT and got my computer set up, I configured my browser to trust the MIT certificate authority. The purpose here is that MIT-internal services like online student records won’t need to pay an outside company for a certificate, if the target audience is just MIT folks who can do this configuration. However, if MIT decided one day that it wanted to snoop on my Gmail traffic, and I got an MIT-signed certificate for mail.google.com, I would assuredly never notice: MIT is on the list of trusted root certificates and is as good as any other. I certainly don’t check every HTTPS web site I visit to determine which CA signed it; in the normal case, I couldn’t care less whether it’s Verisign or Equifax or GoDaddy or Thawte or whomever. Of course, you might say MIT would never do that, but there’s an actual real-world concern that the government of China might be doing this. They have the same legitimate interest in not paying outside foreign companies for every site that only targets Chinese citizens, but the worry that CNNIC might sign a mail.google.com certificate and execute a man-in-the-middle attack against Gmail and its users is much more real in that context.
Ultimately, the design flaw here is that there is a single class of root certificate authorities, and either a CA is in or it’s out. I want to have the flexibility of saying that MIT’s CA can do whatever it wants, including signing EV certificates, for .mit.edu sites, but nothing else, and that CNNIC can generally do what it wants for .cn but definitely not for anything else. I want to have the flexibility to say that although no, I don’t trust CAcert in general, I would at least like to know if something is honestly signed by the real CAcert as opposed to just being signed by some fake authority calling itself CAcert, and I would like to change my browser’s default for some sites, as I see them, to permit CAcert-signed certificates. I want the error to say not that my browser doesn’t know who signed this certificate and would I like to accept this certificate on faith, but that my browser does know who CAcert is and doesn’t in general trust them but wants to know if it should for this website.
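The policy argued for above can be sketched as a small lookup: each known authority is scoped to the domains it may vouch for, and a certificate from a known-but-out-of-scope authority produces an informed prompt rather than a generic unknown-signer error. All the authority names and scope entries here are illustrative, not a proposal for actual defaults.

```python
# Sketch of per-CA scoped trust. The empty-string suffix "" is a convention
# meaning "any domain" (every hostname ends with ""), standing in for a
# conventional global CA. Entries are illustrative.

CA_SCOPES = {
    "MIT CA":   {".mit.edu"},   # trusted, but only for MIT sites
    "CNNIC":    {".cn"},        # trusted, but only for .cn sites
    "CAcert":   set(),          # identity known, no default trust anywhere
    "Verisign": {""},           # global: may vouch for any domain
}

def scoped_trust(issuer: str, hostname: str) -> str:
    """Classify a certificate as 'trusted', 'known-but-ask', or 'unknown'."""
    if issuer not in CA_SCOPES:
        return "unknown"        # the browser has never heard of this authority
    if any(hostname.endswith(suffix) for suffix in CA_SCOPES[issuer]):
        return "trusted"        # within the CA's delegated scope
    return "known-but-ask"      # a real, recognized CA, outside its scope

print(scoped_trust("MIT CA", "scripts.mit.edu"))       # trusted
print(scoped_trust("MIT CA", "mail.google.com"))       # known-but-ask
print(scoped_trust("CAcert", "bugs.freedesktop.org"))  # known-but-ask
print(scoped_trust("Fake CA", "bugs.freedesktop.org")) # unknown
```

The useful distinction is the last two lines: a certificate genuinely signed by CAcert and a certificate from an invented authority calling itself CAcert would today produce the same error, while under this scheme they are different answers.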
Can we restructure HTTPS to give us these capabilities? More importantly, are the real decision-makers here, the CA/Browser Forum, interested in this restructuring? This mostly only affects small, private CAs who aren’t part of the mostly-common list that all browsers and operating systems trust out of the box: the group of signing authorities that effectively operates as a single trusted third party distributed among multiple companies. So it’s not really in the interest of these commercial CAs to do something so invasive to the structure of HTTPS at this point in the game. Can we convince the browsers that there’s a way to pull this off, and that it’s worth it?