Friday, August 12, 2011

USENIX: The (Decentralized) SSL Observatory

Peter Eckersley, Senior Staff Technologist for the Electronic Frontier Foundation, and Jesse Burns, Founding Partner at iSEC Partners, started with the well-known crypto stipulation: your encryption is only as good as your trust anchor. With that in mind, they wanted to see how secure the X.509 certificates in the wild are, so they began scanning port 443 across the IPv4 address space to collect certificates.
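A minimal sketch of that kind of scan (not the researchers' actual tooling): connect to a host on port 443, complete an SSL handshake without any verification, and grab whatever certificate the server presents. The host name below is a placeholder.

```python
import socket
import ssl

socket.setdefaulttimeout(5)  # don't hang on unresponsive hosts

def grab_certificate(host, port=443):
    """Return the server's certificate in PEM form, or None on failure."""
    try:
        # No ca_certs are supplied, so no validation is performed --
        # the point is to collect the certificate, valid or not.
        return ssl.get_server_certificate((host, port))
    except (OSError, ssl.SSLError):
        return None

def summarize(results):
    """Count hosts probed vs. hosts that completed a handshake."""
    probed = len(results)
    responded = sum(1 for pem in results.values() if pem is not None)
    return probed, responded

if __name__ == "__main__":
    # In the study this ran over every listening IPv4 address; one placeholder host here.
    results = {"www.example.com": grab_certificate("www.example.com")}
    print(summarize(results))
```

The `summarize` tally mirrors the kind of numbers reported later in the talk (IPs listening vs. IPs that completed a handshake).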

They have created an Observatory browser extension for Firefox that collects the certificate chain, destination domain, an approximate timestamp, and (optionally) the ASN and server IP. Installing it helps the researchers gather more information, and can also help you identify whether you've got a bogus certificate in your browser.

Certificate Authorities have a hard job (verifying server identities) with strange incentives (they get paid for each certificate they issue). In 2009 there were three major vulnerabilities due to CA mistakes, and in 2010 EFF discovered evidence that governments were compelling CAs to put in back doors for them. On top of all that, there are a lot of certificate authorities out there. All of this was daunting to the researchers as they started their project.

The technology this is all based on, X.509, was designed in the 1980s, before TLS/SSL or even HTTP! In their research they discovered 10,320 kinds of X.509 certificates in the wild; of those, only about 1,300 were *valid* (according to SSL).

They found 16.2 million IPs listening on port 443, and 11 million of those responded to their SSL handshake.

Typical browsers trust about 1500 CAs. Can that really be a good thing?

These CAs are located in about 52 different countries. The researchers found many certificates that are valid but don't actually identify anyone in particular: localhost, exchange, Mail, and private IP addresses [RFC 1918]. What's the point of having a CA verify your identity if you aren't actually providing one?

They tried to use browsers to check certificate validity, but ran into trouble because Firefox and IE cache intermediate CAs. This means that some certificates are considered valid only sometimes, depending on where you've been before with your browser. Clearly that shouldn't be: a certificate should either be valid, or not.

Even when problems are found and the CAs are aware, revoking problematic certificates is difficult or impossible, as many browsers and other software don't consult revocation data. They found nearly 2 million revocations, including 4 dated in the future and 2 from the 1970s (before the technology existed).
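The future-dated and 1970s revocations above are easy to flag once you have the dates out of the CRLs. A toy sanity check, assuming you've already parsed the revocation dates (the 1988 cutoff, X.509's publication year, is my assumption for "implausibly old"):

```python
from datetime import datetime, timezone

# X.509 was first published in 1988; anything "revoked" before that is bogus.
X509_PUBLISHED = datetime(1988, 1, 1, tzinfo=timezone.utc)

def anomalous_revocations(revocation_dates, now=None):
    """Return entries revoked in the future or before X.509 existed."""
    now = now or datetime.now(timezone.utc)
    return [d for d in revocation_dates if d > now or d < X509_PUBLISHED]

dates = [
    datetime(1970, 1, 1, tzinfo=timezone.utc),   # before the technology existed
    datetime(2010, 6, 1, tzinfo=timezone.utc),   # plausible
    datetime(2037, 1, 1, tzinfo=timezone.utc),   # in the future
]
print(anomalous_revocations(dates))  # flags the 1970 and 2037 entries
```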

They found a few subordinate CAs that claim to be from the country "ww" (which doesn't exist), with the organization "global" and a bunch of other bogus information; these were irrevocable, and their CPS pointed to dead websites.

So, what can we do? Consensus measurement, more vigilant auditing, DNSSEC + DANE, or certificate pinning via HTTPS headers.

Consensus measurement is where many observers compare what they see and only accept a certificate that they all agree is valid, but false warnings can happen when sites swap certificates for testing purposes or for other unknown reasons. Users are already "trained" to ignore warnings if they get too many false positives, so this approach would be problematic.

Certificate pinning relies on the idea that whoever used to be domain.com should stay domain.com, which works great if it is implemented correctly. The right way to do this is to create a private CA just for the domain and use it in parallel to PKIX. Done correctly, this can protect you against compromise and malice, though users would still be vulnerable on their first connection.
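For the header-based variant mentioned above, a pin is typically a hash of the key the site promises to keep using; in the form that was later standardized as Public-Key-Pins, it's the base64 of the SHA-256 of the DER-encoded SubjectPublicKeyInfo. Extracting the SPKI from a real certificate needs an ASN.1 parser, so the bytes below are a placeholder:

```python
import base64
import hashlib

def spki_pin(spki_der):
    """Return the base64 SHA-256 pin for a DER-encoded SubjectPublicKeyInfo."""
    return base64.b64encode(hashlib.sha256(spki_der).digest()).decode("ascii")

# Placeholder bytes standing in for a real DER-encoded SPKI.
pin = spki_pin(b"placeholder SPKI bytes")
# The pin would then be delivered in a response header along the lines of:
#   Public-Key-Pins: pin-sha256="<pin>"; max-age=5184000
print(pin)
```

Pinning the public key (rather than the whole certificate) lets the site rotate certificates without breaking the pin, as long as the underlying key stays the same.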

This article is syndicated from Thoughts on security, beer, theater and biking!