Is this sufficient to protect against a CSRF for an ajax-driven application?

I'm working on a completely ajax-driven application where all requests pass through what basically amounts to a main controller which, at its bare bones, looks something like this:

if (strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) == 'xmlhttprequest') {
    fetch($page);
}

Is this generally sufficient to protect against cross-site request forgeries?

It's rather inconvenient to have a rotating token when the entire page isn't refreshed with each request.

I suppose I could pass and update a unique token as a global JavaScript variable with every request -- but somehow that feels clumsy and seems inherently unsafe anyway.

EDIT - Perhaps a static token, like the user's UUID, would be better than nothing?
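A per-session token sits between those two extremes: it doesn't rotate on every request, but unlike a static identifier such as a UUID it expires with the session. A rough sketch of the idea (all names are illustrative, and the in-memory store and `Math.random` token source are simplifications -- a real implementation should use a cryptographically secure random generator):

```javascript
// Pretend server-side session store (assumption: a simple in-memory map).
var sessions = {};

function startSession(sessionId) {
  // Mint one random token per session rather than reusing a static user id.
  // Math.random is for illustration only; use a secure RNG in practice.
  var token = Math.random().toString(36).slice(2) +
              Math.random().toString(36).slice(2);
  sessions[sessionId] = token;
  return token;
}

function isValidToken(sessionId, token) {
  // A request is accepted only if it echoes this session's token.
  return sessions.hasOwnProperty(sessionId) && sessions[sessionId] === token;
}
```

The page embeds the token once at load time, and every ajax request sends it back; nothing has to rotate between requests.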

EDIT #2 - As The Rook pointed out, this might be a hair-splitting question. I've read speculation both ways and heard distant whispers about older versions of Flash being exploitable for this kind of shenanigans. Since I know nothing about that, I'm putting up a bounty for anyone who can explain how this is a CSRF risk. Otherwise, I'm giving it to Artefacto. Thanks.

Source: Tips4all

Comments

  1. I'd say it's enough. If cross-domain requests were permitted, you'd be doomed anyway because the attacker could use Javascript to fetch the CSRF token and use it in the forged request.

    A static token is not a great idea. The token should be generated at least once per session.

    EDIT2 Mike is not right after all, sorry. I hadn't read the page I linked to properly. It says:


    A simple cross-site request is one that: [...]
    Does not set custom headers with the HTTP Request (such as X-Modified, etc.)


    Therefore, if you set X-Requested-With, the request has to be preflighted, and unless you respond to the preflight OPTIONS request authorizing the cross-site request, it won't get through.
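A minimal sketch of that decision, as a server might apply it to a preflight OPTIONS request (the function and parameter names are made up for illustration): unless the server echoes back the appropriate Access-Control-Allow-* headers, the browser never sends the actual cross-site XHR.

```javascript
// Decide how to answer a preflight OPTIONS request (illustrative sketch).
function allowPreflight(requestOrigin, requestedHeaders, allowedOrigin) {
  if (requestOrigin !== allowedOrigin) {
    // Returning no Access-Control-Allow-* headers means the browser
    // blocks the actual cross-site request carrying the custom header.
    return null;
  }
  return {
    'Access-Control-Allow-Origin': allowedOrigin,
    'Access-Control-Allow-Headers': requestedHeaders.join(', ')
  };
}
```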

    EDIT Mike is right, as of Firefox 3.5, cross-site XMLHttpRequests are permitted. Consequently, you also have to check if the Origin header, when it exists, matches your site.

    if (array_key_exists('HTTP_ORIGIN', $_SERVER)) {
        if (preg_match('#^https?://myserver\.com$#', $_SERVER['HTTP_ORIGIN'])) {
            doStuff();
        }
    } elseif (array_key_exists('HTTP_X_REQUESTED_WITH', $_SERVER) &&
            (strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) == 'xmlhttprequest')) {
        doStuff();
    }

  2. Short answer: no. Any attacker could just use Ajax himself to attack your website.
    You should generate a random token with a short (but not too short) lifetime, and update it with each ajax request.

    You'd have to use an array of tokens in JavaScript, as you may have multiple ajax requests running at the same time.
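The bookkeeping that implies on the client could be sketched like this (a hypothetical `TokenPool`; names are illustrative): each outgoing request consumes a single-use token, and each response refills the pool with a fresh one, so concurrent requests never race for the same token.

```javascript
// Client-side pool of single-use CSRF tokens (illustrative sketch).
function TokenPool(initialTokens) {
  this.tokens = initialTokens.slice();
}

// Take a token to attach to an outgoing request; it can't be reused.
TokenPool.prototype.take = function () {
  if (this.tokens.length === 0) {
    throw new Error('no CSRF tokens left; wait for a response to refill');
  }
  return this.tokens.pop();
};

// Each server response carries a fresh replacement token.
TokenPool.prototype.refill = function (token) {
  this.tokens.push(token);
};
```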

  3. I do not believe that this is safe. The same-origin policy is designed to prevent a document from one domain from reading content returned by another domain. This is why XSRF problems exist in the first place. In general, XSRF doesn't care about the response; it is used to execute a specific type of request, such as a delete action.

    In its simplest form, this can be done with a properly formatted img tag. Your proposed solution would prevent that simplest form, but doesn't stop someone from using the XMLHttpRequest object to make the request.

    You need to use the standard prevention techniques for XSRF. I like to generate a random number in JavaScript and add it to both a cookie and a form variable. This verifies that the code making the request can also write cookies for that domain. If you want more information, please see this entry.
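The check that pattern reduces to on the server side is tiny -- a sketch of the comparison (the function name is hypothetical): a forged cross-site request causes the browser to send the cookie, but the attacking page can neither read it nor plant a matching form value.

```javascript
// Double-submit check (illustrative sketch): the cookie copy and the
// form copy of the token must both be present and identical.
function doubleSubmitOk(cookieToken, formToken) {
  return typeof cookieToken === 'string' &&
         cookieToken.length > 0 &&
         cookieToken === formToken;
}
```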

    Also, to pre-empt the comments about XMLHttpRequest not working cross-site: I used the following code in Firefox 3.5 to make a request to Google from HTML running in the localhost domain. The content will not be returned, but using Firebug you can see that the request is made.

    <script>
    var xmlhttp = false;

    if (!xmlhttp && typeof XMLHttpRequest != 'undefined') {
        try {
            xmlhttp = new XMLHttpRequest();
        } catch (e) {
            xmlhttp = false;
        }
    }
    if (!xmlhttp && window.createRequest) {
        try {
            xmlhttp = window.createRequest();
        } catch (e) {
            xmlhttp = false;
        }
    }

    xmlhttp.open("GET", "http://www.google.com", true);
    xmlhttp.onreadystatechange = function() {
        if (xmlhttp.readyState == 4) {
            alert("Got Response");
            alert(xmlhttp.responseText);
        }
    };

    xmlhttp.send(null);
    alert("test Complete");
    </script>

  4. What you are doing is secure because XMLHttpRequest is usually not vulnerable to cross-site request forgery.

    As this is a client-side problem, the safest way would be to check the security architecture of each browser :-)

    (This is a summary; I am adding this answer because this question is very confusing. Let's see what the votes say.)

