Question

I'm developing an ASP.NET MVC web application and the client has requested that we try our best to make it as resilient as possible to Denial of Service attacks. They are worried that the site may receive malicious high-volume requests intended to slow down or take down the site.

I have discussed this with the product owner as really being outside the remit of the web application itself. I believe the responsibility falls to the hosting/network team to monitor traffic and respond to malicious requests.

However, they are adamant that the application should have some precautions built into it. They do not want to implement CAPTCHA, though.

It has been suggested that we restrict the number of requests that can be made per session within a given time frame. I was thinking of doing something along the lines of Best way to implement request throttling in ASP.NET MVC?, but using the session ID rather than the client IP, since IP-based throttling would cause problems for users coming from behind a corporate firewall: their IPs would all be the same.

They have also suggested adding the ability to turn off certain areas of the site, so that an admin user could disable database-intensive areas. However, this would be controlled through the UI, and surely if the site were under a DoS attack an admin user would not be able to reach it anyway.

My question is: is it really worth doing this? Surely a real DoS attack would be much more advanced?

Do you have any other suggestions?


Solution

A Denial of Service attack can be pretty much anything that would affect the stability of your service for other people. In this case you're talking about a network DoS and, as already stated, this generally wouldn't be handled at your application level.

Ideally, this kind of attack would be mitigated at the network level. There are dedicated firewalls built for this, such as the Cisco ASA 5500 series, which works its way up from basic protection through to high-throughput mitigation. They're pretty smart boxes and I can vouch for their effectiveness at blocking these types of attacks, so long as the correct model for the throughput you're getting is being used.

Of course, if you can't get access to a hardware firewall that does this for you, there are some stopgap measures you can put in place to assist in defending against these types of attacks. Please note that none of these will be even half as effective as a dedicated firewall.

One such example is the IIS module Dynamic IP Restrictions, which allows you to define a limit on concurrent requests per IP. In practice this has a downside: it may start blocking legitimate requests from browsers that open many concurrent connections to download scripts, images, and so on.
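To give a rough idea, on IIS 8 and later the module's settings live in the dynamicIpSecurity section of web.config; a minimal sketch (the threshold values here are arbitrary placeholders you would tune against your own legitimate traffic):

    <system.webServer>
      <security>
        <!-- Block an IP that holds too many concurrent connections,
             or that sends too many requests in a short interval. -->
        <dynamicIpSecurity>
          <denyByConcurrentRequests enabled="true" maxConcurrentRequests="20" />
          <denyByRequestRate enabled="true" maxRequests="50"
                             requestIntervalInMilliseconds="2000" />
        </dynamicIpSecurity>
      </security>
    </system.webServer>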

Finally, something you could do that is really crude, but also really effective, is something like what I had written previously: a small tool that monitors log files for repeated requests from the same IP. Say it detects 10 requests to /Home over 2 seconds from 1.2.3.4. A firewall rule (in Windows Firewall with Advanced Security, added using the netsh shell commands) would then be created to block requests from that IP, and the rule could be removed 30 minutes or so later.
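A minimal sketch of that idea (the threshold, window, rule naming, and the assumption that you are parsing IPs out of the IIS log lines yourself are all placeholders; a real tool would also tail the log and schedule the unblock):

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;

    // Deliberately crude: feed it one IP per parsed log entry and it blocks
    // any IP that exceeds the threshold within the sliding time window.
    class CrudeIpBlocker
    {
        const int Threshold = 10;                                   // e.g. 10 requests...
        static readonly TimeSpan Window = TimeSpan.FromSeconds(2);  // ...within 2 seconds

        // Recent request timestamps, per IP.
        static readonly Dictionary<string, Queue<DateTime>> hits =
            new Dictionary<string, Queue<DateTime>>();

        public static void Record(string ip)
        {
            Queue<DateTime> q;
            if (!hits.TryGetValue(ip, out q))
                hits[ip] = q = new Queue<DateTime>();

            DateTime now = DateTime.UtcNow;
            q.Enqueue(now);
            while (q.Count > 0 && now - q.Peek() > Window)
                q.Dequeue();

            if (q.Count >= Threshold)
                Block(ip);
        }

        static void Block(string ip)
        {
            // Inbound block rule in Windows Firewall with Advanced Security.
            var psi = new ProcessStartInfo("netsh",
                "advfirewall firewall add rule name=\"DoSBlock_" + ip +
                "\" dir=in action=block remoteip=" + ip)
            { UseShellExecute = false };
            Process.Start(psi).WaitForExit();
            // A timer or scheduled task would later remove the rule with:
            //   netsh advfirewall firewall delete rule name="DoSBlock_<ip>"
        }
    }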

Like I say, it's very crude, but if you have to do it at the server level you don't have many sensible options, since that's not where this should be done. You are exactly right that the responsibility largely lies with the hosting provider.

You're right about the CAPTCHA, too. If anything, it could assist a DoS: generating the images (which can be resource-intensive) over and over would starve your resources even more. Where a CAPTCHA would be effective is if your site were being spammed by automated registration bots, but I'm sure you knew that already.

If you really want to do something at application level just to please the powers that be, implementing some IP- or session-based request restriction in your app is doable, albeit 90% ineffective (since you still have to process each request before you can reject it).
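A sketch of such a filter, loosely in the spirit of the linked question (the cache key, limits, and choice of session ID versus client IP are all choices to tune; this is a deterrent, not production-hardened code):

    using System;
    using System.Net;
    using System.Web.Caching;
    using System.Web.Mvc;

    // Rejects a caller's requests once they exceed Limit requests per window.
    // Keyed on the session ID, falling back to the client IP when no session exists.
    public class ThrottleAttribute : ActionFilterAttribute
    {
        public int Limit { get; set; }          // max requests per window
        public int WindowSeconds { get; set; }  // length of the window

        public override void OnActionExecuting(ActionExecutingContext filterContext)
        {
            var ctx = filterContext.HttpContext;
            string key = "throttle_" + (ctx.Session != null
                ? ctx.Session.SessionID
                : ctx.Request.UserHostAddress);

            var counter = ctx.Cache[key] as int[];
            if (counter == null)
            {
                counter = new int[1];
                // The cache entry (and therefore the count) resets when the window ends.
                ctx.Cache.Add(key, counter, null,
                              DateTime.UtcNow.AddSeconds(WindowSeconds),
                              Cache.NoSlidingExpiration,
                              CacheItemPriority.Low, null);
            }

            // Not strictly thread-safe; acceptable for a rough throttle.
            if (++counter[0] > Limit)
            {
                filterContext.Result = new HttpStatusCodeResult(
                    (int)HttpStatusCode.ServiceUnavailable, "Request limit exceeded");
            }
        }
    }

You would decorate the expensive actions with something like [Throttle(Limit = 10, WindowSeconds = 2)]. Bear in mind that an attacker can simply discard the session cookie to get a fresh session ID each request, so session-keyed throttling only deters naive clients.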

OTHER TIPS

You could implement the solution in the cloud and scale servers if you absolutely had to stay up, but it could get expensive...

Another idea would be to log the IP addresses of registered users. In the event of a DoS, restrict all traffic to requests from known 'good' users, as sketched below.
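A rough sketch of how that could look as a global action filter (the lockdown flag, the IP store, and how they get populated are assumptions; you would record Request.UserHostAddress at login and flip the flag out-of-band, not through the UI that is under attack):

    using System.Collections.Generic;
    using System.Web.Mvc;

    // When lockdown mode is on, only previously-seen IPs of registered users get through.
    public class LockdownFilter : ActionFilterAttribute
    {
        // Placeholders: in reality these would be backed by config and the user database.
        public static volatile bool LockdownEnabled;
        public static readonly HashSet<string> KnownGoodIps = new HashSet<string>();

        public override void OnActionExecuting(ActionExecutingContext filterContext)
        {
            string ip = filterContext.HttpContext.Request.UserHostAddress;
            if (LockdownEnabled && !KnownGoodIps.Contains(ip))
            {
                filterContext.Result = new HttpStatusCodeResult(503);
            }
        }
    }

Registered once via GlobalFilters.Filters.Add(new LockdownFilter()), it applies to every action.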

Preventing a true DoS attack at the application level is not really doable: the requests will most probably kill your web server before they kill your application, because your application runs in an application pool that has a maximum number of concurrent requests defined by the server technology you are using.

This interesting article, http://www.asp.net/web-forms/tutorials/aspnet-45/using-asynchronous-methods-in-aspnet-45, states that Windows 7, Windows Vista and Windows 8 have a maximum of 10 concurrent requests. It goes on to state that "You will need a Windows server operating system to see the benefits of asynchronous methods under high load".

You can increase the HTTP.sys queue limit of the application pool associated with your application in order to increase the number of requests that can be queued (for later processing once threads become available). This delays the point at which the HTTP protocol stack (HTTP.sys) returns HTTP error 503 because the queue limit is exceeded and no worker process is available to handle further requests.
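For example, using the appcmd tool that ships with IIS (the pool name and value here are placeholders; the default queue length is 1000):

    %windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /queueLength:5000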

You mention that the customer requires you to "try [your] best to make it as resilient as possible to Denial of Service attacks". My suggestion might not be an applicable measure in your situation, but you could look into implementing the Task-based Asynchronous Pattern (TAP) mentioned in the article in order to accommodate the customer's requirement.

This pattern releases threads back to the pool while long-running operations are in flight, making them available for further requests (thus keeping your HTTP.sys queue shorter), while also giving your application the benefit of better overall throughput when it makes multiple requests to third-party services or performs multiple IO-intensive operations.
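As a minimal illustration of the pattern in an MVC controller (MVC 4 / .NET 4.5 or later; the downstream URL is a placeholder):

    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web.Mvc;

    public class ReportsController : Controller
    {
        private static readonly HttpClient client = new HttpClient();

        // The request thread is released while the downstream call is pending,
        // so it can serve other requests instead of sitting blocked.
        public async Task<ActionResult> Index()
        {
            string data = await client.GetStringAsync("http://example.org/slow-service");
            return Content(data);
        }
    }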

This measure will NOT make your application resilient to DoS attacks, but it will make it as responsive as possible on the hardware it is served from.
