What a fabulously badly written, sensationalistic article. The TLS problem published last week affects virtually none of the applications that use HTTPS on the web today: it's currently only relevant to applications in which the client is required to present a certificate to authenticate itself.
That was the initial report from one of the researchers, but there are claims that it can be demonstrated to work in cases not involving client-side certificates, too. See http://extendedsubset.com/?p=8 : "Cases not involving client certificates have been demonstrated as well."
Unfortunately, that's all the data I have on that: that one sentence, plus a few other places with the same basic info which may or may not simply be reflections of this one. I haven't seen any additional substantiation.
The flaw here is that TLS tunnels multiple sessions inside a single connection, and, particularly with HTTPS, the first of those sessions is both client-anonymous and (the bug) "live" --- in that any commands received in that session will still be processed once a subsequent session is client-authenticated. That's the "authentication gap" being talked about in the reports.
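To make the gap concrete, here's a hypothetical sketch of how it's been described playing out against HTTPS (all paths, hosts, and header names here are invented for illustration): the attacker speaks first in the anonymous session and deliberately leaves a header unterminated, so that when the victim's authenticated session is spliced on via renegotiation, the victim's own request line gets swallowed into that bogus header and the server executes the attacker's request with the victim's cookie attached.

```python
# Hypothetical illustration of the TLS renegotiation "authentication gap"
# as applied to HTTPS. All names/paths are invented for the example.

# 1. Attacker speaks first, inside the anonymous initial TLS session.
#    The final header line is deliberately left unterminated (no \r\n).
attacker_prefix = (
    b"GET /account/transfer?to=attacker HTTP/1.1\r\n"
    b"Host: bank.example\r\n"
    b"X-Swallow: "          # unterminated -- will absorb the victim's request line
)

# 2. The victim's request arrives in the renegotiated, authenticated session.
victim_request = (
    b"GET /account/balance HTTP/1.1\r\n"
    b"Host: bank.example\r\n"
    b"Cookie: session=secret\r\n"
    b"\r\n"
)

# 3. A server that treats the pre- and post-renegotiation bytes as one
#    continuous stream sees a single request: the attacker's request line,
#    with the victim's first line buried inside the bogus header and the
#    victim's cookie still attached.
as_seen_by_server = attacker_prefix + victim_request

request_line = as_seen_by_server.split(b"\r\n", 1)[0]
print(request_line.decode())  # the attacker's URL, now carrying the victim's cookie
```

The point of the sketch is just that nothing in the byte stream marks where the anonymous session ended and the authenticated one began --- which is exactly the gap the reports describe.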
What's also true is that there are lots of protocols, many of them very obscure, that either rely on client auth or that have their own app-layer notion of multiple sessions, which will conflict with TLS's session renegotiation.
I'm not going to eat my hat or anything if this happens, but I'd bet against someone finding a plausible MITM attack that beats server certificates using this flaw. I think it's reasonably safe shorthand right now to say this attack largely applies to apps that use SSL outside of its common case.
In my 5 years as a UNIX sysadmin, I've set up SSL/TLS hundreds of times to protect web, email, and database communications, and not once have I been compelled to use client-side certificates for authentication.
I strongly agree. Client cert auth is also harder to screw up than most of the alternatives, and, let's face it, the alternatives are almost always "shared passwords stored in configuration files".
From wiki: "Zero-day attacks occur when a vulnerability window exists between the time a threat is released and the time security vendors release patches."
I was surprised to read this. Wikipedia also has "Zero-day warez: copyrighted works that are pirated on the same day they are released", so the meaning of zero-day is different depending on the context.
I'm not quite convinced by the Wiki quote, though; more than once, I've heard 0-day attacks to mean a vulnerability found on the day a new version of a piece of software is released. Perhaps both meanings exist.
> I've heard 0-day attacks to mean a vulnerability found on the day a new version of a piece of software is released.
I've never heard that meaning. It doesn't really make sense - why would someone care whether the vulnerability was found the same day as the new version or not? (Except to show off.)
Zero day, to me, means it's the first day after the vulnerability was found. i.e. it's a brand-new vulnerability, and no one else knows about it or has a defense for it. (It being assumed that all vulnerabilities are fixed the day they are found.) Extending it to mean "until it's fixed" makes sense too when dealing with certain vendors who don't fix things very fast.
Exactly - warez groups love to show off. Just look at those 'old school' cracks with the tracked music and animation, all coded in raw assembly for the sake of speed and glory.
The term comes from #warez. Groups like Razor 1911 and Fairlight would brag about how recent their software was. 0-day was release day. First-run warez.
There are as many different definitions of zeroday in the security scene as there are people with strong opinions, but all the definitions boil down to, "it's zeroday if it's new". There's patched zeroday (when SUNW releases code that fixes a flaw and they don't tell anyone about it, the exploit for that flaw is zeroday). Unpatched flaws are usually by definition zeroday.
It's kind of a silly thing to argue about, since there's no consensus definition. This TLS flaw definitely qualifies by most people's definitions.