Question

The search application I am working on allows a user to select results and export them in different formats such as Excel, XML, etc.

The unique IDs of the selected results are stored in a cookie, using JavaScript, so that this information can later be posted to export the search results.

The problem is that when the content of the cookie becomes fairly large, the Web Application Firewall blocks the requests. On analyzing, I see that the entire cookie content becomes part of the request header, pushing it over the size limit allowed by the firewall.

Is this behavior the same for all cookies? What is the suggested workaround in such scenarios?

The technology is ASP.NET 4.0 on an IIS 7.5 server.


Solution

Yes, cookies are included in every request to the server whose URL matches the cookie's domain and path.

Given that the user can select arbitrary items to be exported later, you need to store the IDs for those items somewhere.

Storing them as plain text in a cookie is evidently what triggers the firewall. One option to avoid this is to compact the IDs. For example, if the IDs are 123456, 123459 and 123463, then you only need to store 123456;3;4 (the lowest number, followed by the increments), but this form can be awkward to process and only helps in some cases.
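The increment scheme above can be sketched in a few lines of JavaScript. The function names (`encodeIds`, `decodeIds`) are hypothetical, not from the original post; the point is only to show the round trip between the full ID list and the compact `lowest;inc;inc` form:

```javascript
// Delta-encode a list of numeric IDs: store the lowest value,
// then each successive gap, joined with semicolons.
function encodeIds(ids) {
  const sorted = [...ids].sort((a, b) => a - b);
  const parts = [String(sorted[0])];
  for (let i = 1; i < sorted.length; i++) {
    parts.push(String(sorted[i] - sorted[i - 1])); // store the increment only
  }
  return parts.join(";");
}

// Reverse the encoding: accumulate the increments back into full IDs.
function decodeIds(encoded) {
  const parts = encoded.split(";").map(Number);
  const ids = [parts[0]];
  for (let i = 1; i < parts.length; i++) {
    ids.push(ids[ids.length - 1] + parts[i]);
  }
  return ids;
}
```

For the example IDs, `encodeIds([123456, 123459, 123463])` yields `"123456;3;4"`. Note that the savings depend entirely on the IDs being numeric and clustered; widely scattered IDs gain little.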

Another option would be to compress the cookie data, for example with gzip. This may not be straightforward when the cookie is written with JavaScript, although you might get away with writing only the items on the current page in plain text and having the server compress them as the user moves between pages.

If the quantity of data in the cookie remains too large, you could look into server-side state, using a database for example: a table with ExportId and ItemId columns, where you add or remove rows according to the user's actions, and then only need to send the ExportId in the URL or cookie.
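The shape of that server-side approach can be sketched with an in-memory store standing in for the database table (a real implementation would back `exports` with the ExportId/ItemId table; the class and method names here are illustrative assumptions):

```javascript
// In-memory stand-in for a database table with ExportId and ItemId columns.
// Only the small numeric exportId ever needs to travel in the cookie or URL.
class ExportStore {
  constructor() {
    this.exports = new Map(); // ExportId -> Set of ItemIds
    this.nextId = 1;
  }
  createExport() {
    const exportId = this.nextId++;
    this.exports.set(exportId, new Set());
    return exportId; // this is all the client has to remember
  }
  addItem(exportId, itemId) {      // user ticks a result
    this.exports.get(exportId).add(itemId);
  }
  removeItem(exportId, itemId) {   // user unticks a result
    this.exports.get(exportId).delete(itemId);
  }
  getItems(exportId) {             // read back at export time
    return [...this.exports.get(exportId)];
  }
}
```

With this design the request header stays tiny no matter how many results the user selects, which sidesteps the firewall's header-size limit entirely; the trade-off is that selection state now lives on the server and needs a cleanup policy for abandoned exports.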

License: CC-BY-SA with attribution