Is URL rewriting a mitigation for CSRF? Although I am almost sure it is not a foolproof solution, I put this query before all the security gurus out there on webappsec.
The application in question was replacing all URLs with long, randomized, unique strings in this format:
https://mysite.com/myportal/b1/04_SjzQ0NTYyNzS2MLTUj9CPykssy0xPLMnMz0vMAfGjzOJDfU19LNxMTQwsAoydDDyNXb0cnc2dDA2czfRzoxwVAVLe6h0!/
The URL was long enough and sufficiently randomized.
The argument in favor of a randomized URL as a CSRF mitigation is that even if an attacker is able to grab the URL, it won't be valid for the next session. So the attacker can't exploit it by sending it or embedding it in links, images, etc., as it would be invalid by then. Thus CSRF is mitigated.
But consider the scenario where an attacker gets to a logged-in victim's machine, applies some social-engineering tricks, notes down the URL, and convinces the user to click a forged link sent to them. If the user stays on the same page, they get exploited. It is an impractical scenario with only a remote chance of technical viability, but it is still a risk: the attack window is certainly small and short-lived, but why take chances? I have also found some static URLs in a page that are not rewritten each time, so they can be easily forged. Therefore the best mitigation is always to implement anti-CSRF tokens on the pages where critical actions are performed.
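To make the recommendation concrete, here is a minimal sketch of the synchronizer-token pattern in a hypothetical Flask application. The route name `/transfer` and the field name `csrf_token` are my own illustrative assumptions, not details of the application discussed above.

```python
import secrets
from flask import Flask, session, request, abort, render_template_string

app = Flask(__name__)
app.secret_key = "change-me"  # required for Flask's signed session cookie

def get_csrf_token():
    # One unpredictable token per session, reused for the whole session
    if "csrf_token" not in session:
        session["csrf_token"] = secrets.token_urlsafe(32)
    return session["csrf_token"]

@app.route("/transfer", methods=["GET", "POST"])
def transfer():
    if request.method == "POST":
        # Reject the request if the submitted token does not match the session copy
        submitted = request.form.get("csrf_token", "")
        if not secrets.compare_digest(submitted, session.get("csrf_token", "")):
            abort(403)
        # ... perform the critical action here ...
        return "Transfer done"
    # Embed the token in the form as a hidden field
    return render_template_string(
        '<form method="post">'
        '<input type="hidden" name="csrf_token" value="{{ token }}">'
        '<input type="submit" value="Transfer">'
        "</form>",
        token=get_csrf_token(),
    )
```

A forged cross-site request cannot read the hidden field, so it cannot supply a matching token, regardless of what the URL looks like.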
URL rewriting improves the anti-CSRF defense, but we need to be sure that the strings/tokens are unique and sufficiently randomized, so that even if they are cached they are unpredictable and unusable in the next session. Invalidating them once the user has logged out is always part of good session management, as sketched below.
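As a rough sketch of what "sufficiently randomized and invalidated on logout" could look like (the in-memory store and the function names here are made up purely for illustration):

```python
import secrets

# In-memory map of per-session URL tokens; a real application would keep this
# in its session store so the tokens disappear together with the session.
url_tokens = {}

def issue_url_token(session_id: str) -> str:
    # ~256 bits from the OS CSPRNG, long enough to be unguessable
    token = secrets.token_urlsafe(32)
    url_tokens.setdefault(session_id, set()).add(token)
    return token

def is_valid_url_token(session_id: str, token: str) -> bool:
    return token in url_tokens.get(session_id, set())

def invalidate_session_tokens(session_id: str) -> None:
    # Called on logout, so cached or bookmarked rewritten URLs stop working
    url_tokens.pop(session_id, None)
```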
However, the flip side of rewriting is that such URLs can't be bookmarked for later use, and it is a server-intensive task: generating random strings for each and every URL affects performance too.
So the bottom line is that although URL rewriting raises the bar, it is not a foolproof solution to CSRF. The safest approach is to use anti-CSRF tokens in the pages.
I thank all the people on the webappsec.org mailing list for such a nice discussion.
Comments
I want your favour.
I want to know: if I put the query ' 1 OR 1=1 ' in the search field, the record is not displayed but the editing field is displayed.
So is this page vulnerable to SQL injection?