Question

Alright...I've given the site a fair search and have read over many posts about this topic. I found this question: Code for a simple thread pool in C# especially helpful.

However, as it always seems, what I need varies slightly.

I have looked over the MSDN example and adapted it to my needs somewhat. The example I refer to is here: http://msdn.microsoft.com/en-us/library/3dasc8as(VS.80,printer).aspx

My issue is this. I have a fairly simple set of code that loads a web page via the HttpWebRequest and WebResponse classes and reads the results via a Stream. I fire off this method in a thread, as it will need to be executed many times. The method itself is pretty short, but the number of times it needs to be fired (with varied data for each time) varies. It can be anywhere from 1 to 200.
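
Roughly speaking, the method boils down to something like this (a simplified sketch; FetchPage and the url parameter are placeholder names, not my exact code):

using System.IO;
using System.Net;

class PageFetcher
{
    // Load a page and return the response body as a string.
    public static string FetchPage(string url)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);

        using (WebResponse response = request.GetResponse())
        using (Stream stream = response.GetResponseStream())
        using (StreamReader reader = new StreamReader(stream))
        {
            return reader.ReadToEnd();
        }
    }
}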

Everything I've read seems to indicate the ThreadPool class being the prime candidate. Here is where things get tricky: I might need to fire off this thing say 100 times, but I can only have 3 threads at most running (for this particular task).

I've tried setting the MaxThreads on the ThreadPool via:

ThreadPool.SetMaxThreads(3, 3);

I'm not entirely convinced this approach is working. Furthermore, I don't want to clobber other web sites or other programs running on the same system. So, by limiting the number of threads on the ThreadPool, can I be certain that this pertains to my code and my threads only?

The MSDN example uses the event-driven approach and calls WaitHandle.WaitAll(doneEvents), which is how I'm doing this.
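
For reference, the shape of that MSDN pattern is roughly this (a trimmed-down sketch, not the exact MSDN code; DoWork stands in for my page-loading method):

using System.Threading;

class EventDrivenExample
{
    static void Main()
    {
        const int workItems = 10;
        ManualResetEvent[] doneEvents = new ManualResetEvent[workItems];

        for (int i = 0; i < workItems; i++)
        {
            doneEvents[i] = new ManualResetEvent(false);
            ThreadPool.QueueUserWorkItem(DoWork, doneEvents[i]);
        }

        // Block until every queued work item has signaled completion.
        WaitHandle.WaitAll(doneEvents);
    }

    static void DoWork(object state)
    {
        // ... load the page here ...

        // Signal that this work item is done.
        ((ManualResetEvent)state).Set();
    }
}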

So the heart of my question is: how does one ensure or specify a maximum number of threads that can run for their code, while having the code keep starting new threads as previous ones finish, up to some arbitrary total? Am I tackling this the right way?

Sincerely,

Jason


Okay, I've added a semaphore approach and completely removed the ThreadPool code. It seems simple enough. I got my info from: http://www.albahari.com/threading/part2.aspx

It's this example that showed me how:

[text below here is a copy/paste from the site]

A Semaphore with a capacity of one is similar to a Mutex or lock, except that the Semaphore has no "owner" – it's thread-agnostic. Any thread can call Release on a Semaphore, while with Mutex and lock, only the thread that obtained the resource can release it.

In this following example, ten threads execute a loop with a Sleep statement in the middle. A Semaphore ensures that not more than three threads can execute that Sleep statement at once:

using System.Threading;

class SemaphoreTest
{
    static Semaphore s = new Semaphore(3, 3);  // Available=3; Capacity=3

    static void Main()
    {
        for (int i = 0; i < 10; i++)
            new Thread(Go).Start();
    }

    static void Go()
    {
        while (true)
        {
            s.WaitOne();

            Thread.Sleep(100);   // Only 3 threads can get here at once

            s.Release();
        }
    }
}

Solution

Note: if you are limiting this to "3" just so you don't overwhelm the machine running your app, I'd make sure this is a problem first. The threadpool is supposed to manage this for you. On the other hand, if you don't want to overwhelm some other resource, then read on!


You can't really manage the thread pool on a per-task basis: SetMaxThreads applies to the whole process, and it won't accept a maximum lower than the number of processors, so it isn't a reliable way to cap just this one piece of work.

In this case, I'd use a semaphore to manage access to your resource; here, the "resource" is running the web scrape, calculating some report, etc.

To do this, in your static class, create a semaphore object:

static System.Threading.Semaphore S = new System.Threading.Semaphore(3, 3);

Then, in each thread, you do this:

try
{
    // wait your turn (decrement the count; blocks while it is already at 0)
    S.WaitOne();

    // do your thing
}
finally
{
    // release so others can go (increment the count)
    S.Release();
}

Each thread will block on S.WaitOne() until it is given the signal to proceed. Once the count has been decremented 3 times (down to zero), further callers will block until one of the running threads calls Release and increments it again.
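
Putting those two pieces together, an end-to-end version might look something like the sketch below. This is just an illustration (the thread count, DoOne, and the Sleep standing in for the web request are all placeholders):

using System.Threading;

class ScrapeRunner
{
    // At most 3 threads can be inside the guarded section at once.
    static Semaphore S = new Semaphore(3, 3);

    static void Main()
    {
        Thread[] threads = new Thread[100];

        for (int i = 0; i < threads.Length; i++)
        {
            threads[i] = new Thread(DoOne);
            threads[i].Start(i);   // pass the item index to the thread
        }

        // Wait for all of the work to finish.
        foreach (Thread t in threads)
            t.Join();
    }

    static void DoOne(object state)
    {
        int pageIndex = (int)state;

        S.WaitOne();               // wait your turn (only 3 get past this at a time)
        try
        {
            // do your thing, e.g. load page number 'pageIndex'
            Thread.Sleep(100);     // stand-in for the real web request
        }
        finally
        {
            S.Release();           // let the next waiting thread proceed
        }
    }
}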

This solution isn't perfect.


If you want something a little cleaner and more efficient, I'd recommend a blocking-queue approach, wherein you enqueue the work you want performed into a global blocking queue.

Meanwhile, you have three threads (which you created yourself, not thread-pool threads) popping work out of the queue to perform. This isn't that tricky to set up, and it is very fast and simple.

Examples:
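
For illustration only (this sketch is not from the original answer), here is one way the producer/consumer setup can look using BlockingCollection<T> from System.Collections.Concurrent; note that this type requires .NET 4 or later, and the URLs and counts are placeholders:

using System;
using System.Collections.Concurrent;
using System.Threading;

class QueueExample
{
    static BlockingCollection<string> queue = new BlockingCollection<string>();

    static void Main()
    {
        // Three consumer threads, created explicitly (not thread-pool threads).
        Thread[] workers = new Thread[3];
        for (int i = 0; i < workers.Length; i++)
        {
            workers[i] = new Thread(Consume);
            workers[i].Start();
        }

        // Enqueue however many work items you have (1 to 200 in the question).
        for (int i = 0; i < 100; i++)
            queue.Add("http://example.com/page/" + i);

        // Signal that no more work is coming, then wait for the workers to drain the queue.
        queue.CompleteAdding();
        foreach (Thread t in workers)
            t.Join();
    }

    static void Consume()
    {
        // GetConsumingEnumerable blocks until an item is available,
        // and ends cleanly once CompleteAdding has been called and the queue is empty.
        foreach (string url in queue.GetConsumingEnumerable())
        {
            // do the actual page load here
            Console.WriteLine("Processing " + url);
        }
    }
}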

OTHER TIPS

The ThreadPool is a static class like any other, which means that anything you do with it affects every other thread in the current process. It doesn't affect other processes.

I consider this one of the larger design flaws in .NET, however. Who came up with the brilliant idea of making the thread pool static? As your example shows, we often want a thread pool dedicated to our task, without having it interfere with unrelated tasks elsewhere in the system.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow