First of all, I advise against using system calls, especially when you have that many requests. Spawning external processes can cause serious performance problems, and since in your case the number of processes and the memory usage change rapidly (you mentioned 2000 requests at a time), you cannot rely on a cronjob to cache those values (even if you run a cron every second, you can't be sure the values are accurate). Instead, you can measure the memory usage of your script, approximate the number of processes you can handle at a time, and that should do it.
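To make that concrete, here is a minimal sketch of the capacity estimate in Python (the function name and parameters are hypothetical; it assumes Linux, where `ru_maxrss` is reported in kilobytes):

```python
import resource

def estimate_max_concurrent(available_mb, per_request_mb=None):
    """Rough estimate of how many requests can run at once, from memory alone."""
    if per_request_mb is None:
        # Peak resident memory of the current process, in MB
        # (ru_maxrss is kilobytes on Linux; bytes on macOS).
        per_request_mb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024
    # Never report less than one slot, even if memory is very tight.
    return max(1, int(available_mb // per_request_mb))
```

For example, with 1000 MB available and roughly 5 MB per request, this yields about 200 concurrent requests, which matches the batch size discussed below.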
Now, as far as I understand, you want to process requests in a certain order: requests 1-200 first, then 201-400, and so on. If that is the case, you need to keep track of the requests that have already been processed.
A simple way to achieve this would be to keep a request queue in a database - if you can use memcached or something similar, even better:
- every time you get a request, you check the queue and make sure you don't have more than 200 active requests;
- the next step would be to check that the request should run at all (this implies you can uniquely identify each request, e.g. by checking some value in GET/POST) - this lets you make sure that if request #200 was already processed, say, in the last minute, you ignore it and allow request #201 to run;
- if the request checks out, you add it to the queue as active, then mark it as completed / delete it from the queue once it's done.
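The three steps above can be sketched like this (an in-memory set and dict stand in for the database/memcached queue; all names and the 60-second dedupe window are assumptions for illustration):

```python
import time

MAX_ACTIVE = 200
DEDUPE_WINDOW = 60  # seconds: ignore a request id completed within the last minute

active = set()   # request ids currently being processed
completed = {}   # request id -> completion timestamp

def try_start(request_id, now=None):
    """Return True if the request may run, applying the checks above."""
    now = time.time() if now is None else now
    # 1. refuse if the queue already holds the maximum number of active requests
    if len(active) >= MAX_ACTIVE or request_id in active:
        return False
    # 2. refuse if the same request completed within the dedupe window
    done_at = completed.get(request_id)
    if done_at is not None and now - done_at < DEDUPE_WINDOW:
        return False
    # 3. mark the request as active
    active.add(request_id)
    return True

def finish(request_id, now=None):
    """Mark a request as completed and remove it from the active set."""
    active.discard(request_id)
    completed[request_id] = time.time() if now is None else now
```

In a real deployment the `active`/`completed` structures would live in the database or memcached so that all workers share them; the logic stays the same.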
However, if request order doesn't matter to you, then instead of a request queue you could simply keep a request count and make sure you never go above a certain limit.
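A counter-based limiter might look like the following sketch (the lock makes the check-and-increment atomic in-process; with memcached or Redis you would use their atomic increment operation instead; function names are hypothetical):

```python
import threading

MAX_CONCURRENT = 200
_count = 0
_lock = threading.Lock()

def acquire_slot():
    """Claim one of the MAX_CONCURRENT slots; return False if all are taken."""
    global _count
    with _lock:
        if _count >= MAX_CONCURRENT:
            return False
        _count += 1
        return True

def release_slot():
    """Free a slot once the request finishes (call this even on errors)."""
    global _count
    with _lock:
        _count = max(0, _count - 1)
```

Each incoming request calls `acquire_slot()`; if it returns False, the request is rejected or retried later, and `release_slot()` runs when processing ends.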