Question

I'm getting several requests in my web apps that are malformed in ways my code shouldn't be generating... Mainly they're requests to .ashx handlers without any GET parameters specified.

The user agent is "Mozilla/4.0" (nothing more than that), and the IPs vary from day to day.

This is a bot, right?

Thanks!

Solution

This seems very odd to me. Any legitimate bot would identify itself in a way you can recognize. Any malicious bot would be able to do a much better job making the user agent look like a normal browser. This is somewhere in the middle. That, combined with the bad requests, leads me to believe you're dealing with plain old incompetence.

Either way, you probably want to 404 these requests rather than return a yellow screen error.
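
As a minimal sketch of one way to do that in ASP.NET (the ".ashx"/empty-query check mirrors the requests described in the question, but is an assumption about your app):

```csharp
using System;
using System.Web;

// Global.asax.cs: a minimal sketch of converting errors from the bogus
// .ashx requests into a plain 404 instead of ASP.NET's yellow-screen error.
// Adjust the filter below to match your own handlers.
public class Global : HttpApplication
{
    protected void Application_Error(object sender, EventArgs e)
    {
        // Only intercept the pattern from the question: a .ashx request
        // with no GET parameters at all.
        if (Request.Path.EndsWith(".ashx", StringComparison.OrdinalIgnoreCase)
            && string.IsNullOrEmpty(Request.QueryString.ToString()))
        {
            Server.ClearError();            // suppress the yellow screen
            Response.Clear();
            Response.StatusCode = 404;      // respond Not Found instead
            Response.SuppressContent = true;
        }
    }
}
```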

OTHER TIPS

Sorry to bump an old question, but I think this is the bot used by the Great Firewall of China. It crawls web content to do its censorship.

Check your logs and see if there is anything like 'GET /cert/bazs.cert'.

If that request is there, you can be 100% sure this is it.
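
One quick way to check, as a sketch: scan the IIS logs for that request. The log path below is the IIS default and an assumption about your setup.

```csharp
using System;
using System.IO;

// Scan every IIS log file for the telltale Great-Firewall request.
class BazsCertScan
{
    static void Main()
    {
        const string logDir = @"C:\inetpub\logs\LogFiles\W3SVC1";

        foreach (string file in Directory.EnumerateFiles(logDir, "*.log"))
            foreach (string line in File.ReadLines(file))
                if (line.Contains("/cert/bazs.cert"))
                    Console.WriteLine($"{Path.GetFileName(file)}: {line}");
    }
}
```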

According to http://www.user-agents.org, the 'Yahoo Mindset: Intent-driven Search' bot reports this user agent.

But yeah, it wouldn't be a browser reporting that.

Are these requests to existing pages you wrote yourself, or do they get a 404?

In the latter case, it could be some sort of scan attack, trying to detect vulnerable application instances before hitting them with an exploit.

I have implemented ASP.NET-side request tracking on several web sites, and from those records I can say that a bare "Mozilla/4.0" user agent can be produced by any of these reasons:

  • incompetence
  • search robots
  • attack bots

It's interesting that my first Android phone identified itself as "Safari 3.0", while the latest Android identifies itself as "Mozilla 0"! So it's hard to pin the incompetence on any specific generation of software.

Returning 404 on every such request may not be the best approach for search robots, especially if this is a public web site with frequently changing content.

On the other hand, you should be aware that requests to WebResource.axd with an invalid destination point to cross-site scripting attacks. In these situations, using a SanitizerProvider is recommended. You can read more about this type of attack on Cross-site_scripting.
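
The answer doesn't specify which SanitizerProvider it means, so as a hedged illustration of the underlying defense (encoding untrusted input before it is written back into a page), here is a minimal sketch using the AntiXssEncoder built into .NET Framework 4.5+:

```csharp
using System;
using System.Web.Security.AntiXss; // built into .NET Framework 4.5+

class XssEncodeSketch
{
    static void Main()
    {
        // Anything taken from the request (query string, form fields,
        // headers) is untrusted input.
        string untrusted = "<script>alert('xss')</script>";

        // Encode before writing the value into HTML so injected markup
        // is rendered as inert text instead of being executed.
        string safe = AntiXssEncoder.HtmlEncode(untrusted, useNamedEntities: false);
        Console.WriteLine(safe);
    }
}
```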

Another good way to identify attacks is to look at the IIS log files, which are commonly located at [system root]:\inetpub\logs\LogFiles\W3SVC1. Here is a snippet from my tool for parsing IIS log files:

[screenshot: output of the IIS log parsing tool]

In this case the user agent was not the giveaway; the bot attack was identified by requests for "/dbadmin/index.php" from 2 different IPs. There are a couple of files/pages that attack bots commonly look for.
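
As a rough sketch (not the actual tool from the answer), a scan of W3C-format IIS logs for such probe paths could look like this. The probe list and log location are assumptions; the "#Fields:" header that IIS writes at the top of each log file declares the column order, which is why the sketch reads it instead of hard-coding indices.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Flag requests for paths that attack bots commonly probe for,
// together with the client IP that made them.
class IisLogScan
{
    static readonly string[] ProbePaths =
        { "/dbadmin/index.php", "/phpmyadmin/index.php", "/cert/bazs.cert" };

    static void Main()
    {
        const string logDir = @"C:\inetpub\logs\LogFiles\W3SVC1"; // IIS default
        int ipIdx = -1, uriIdx = -1;

        foreach (string line in Directory.EnumerateFiles(logDir, "*.log")
                                         .SelectMany(File.ReadLines))
        {
            if (line.StartsWith("#Fields:"))
            {
                // The header tells us which column is which.
                var fields = line.Substring("#Fields:".Length).Trim().Split(' ');
                ipIdx = Array.IndexOf(fields, "c-ip");
                uriIdx = Array.IndexOf(fields, "cs-uri-stem");
                continue;
            }
            if (line.StartsWith("#") || ipIdx < 0 || uriIdx < 0) continue;

            var cols = line.Split(' ');
            if (cols.Length <= Math.Max(ipIdx, uriIdx)) continue;

            string uri = cols[uriIdx];
            if (ProbePaths.Any(p => uri.Equals(p, StringComparison.OrdinalIgnoreCase)))
                Console.WriteLine($"probe: {cols[ipIdx]} -> {uri}");
        }
    }
}
```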

Hope this helps and gives additional value to this question.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow