There were many articles yesterday suggesting that a major new flaw in TLS (aka SSL) had been found ([1][2][3]). The last of those is a post by Ben Laurie, an expert in these matters, with a suitably hyperbolic title: “Another Protocol Bites The Dust”

Here's the issue: there's an extremely uncommon configuration of web servers where they're set up to require client-side certificates for some URLs and not others. If a user has an HTTPS connection open that didn't handshake with a client-side certificate and they try to access such a URL, the web server will perform another handshake on the same connection. As soon as that handshake completes with the correct certificate, the server will execute the request that it received before the connection was fully authenticated.
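The flawed server logic can be sketched in a few lines. This is a hypothetical simulation, not any real server's code: `Connection`, `renegotiate`, and the `"alice"` identity are all illustrative stand-ins.

```python
# Hypothetical sketch of the web-server bug: a request read over the
# unauthenticated session is later executed under the identity
# established by a renegotiation handshake.

REQUIRES_CLIENT_CERT = {"/secure"}   # paths configured to demand a client cert

class Connection:
    """Stands in for a TLS connection; tracks the current client certificate."""
    def __init__(self):
        self.client_cert = None      # the initial handshake had no client cert

    def renegotiate(self):
        # A second handshake on the same connection; the client now
        # presents a certificate.
        self.client_cert = "alice"

def handle(conn, request_path):
    # The request has already arrived over the unauthenticated session...
    if request_path in REQUIRES_CLIENT_CERT and conn.client_cert is None:
        conn.renegotiate()
    # ...but is executed with the identity established afterwards.
    # This retroactive attribution is the bug.
    return f"served {request_path} as {conn.client_cert}"

print(handle(Connection(), "/secure"))   # the pre-handshake request runs as "alice"
```

The fix at the application layer would be to discard (or refuse) any request buffered before the renegotiation completed, rather than attributing it to the new identity.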

It's a bug in the web server. There was a misunderstanding between what the folks writing the web server thought that TLS was providing and what it actually provides. One might also argue that it's a shortcoming in the HTTP protocol (there's no way for a server to ask a client to redo a request). One might also argue that TLS should provide the properties that the web servers expected.

But it's not a flaw in TLS. The TLS security properties are exactly what was intended.

Now, it appears that the fix will be to TLS. That's fine, but the place that gets ‘fixed’ isn't always the place that made the mistake.

I don't understand why knowledgeable folks like EKR and Laurie are so eager to attribute this problem to TLS.