Question

Take the following naive implementation of a nested async loop using the ThreadPool:

ThreadPool.SetMaxThreads(10, 10);
CountdownEvent icnt = new CountdownEvent(1);
for (int i = 0; i < 50; i++)
{
    icnt.AddCount();
    ThreadPool.QueueUserWorkItem((inum) =>
    {
        Console.WriteLine("i" + inum + " scheduled...");
        Thread.Sleep(10000);  // simulated i/o
        CountdownEvent jcnt = new CountdownEvent(1);
        for (int j = 0; j < 50; j++)
        {
            jcnt.AddCount();
            ThreadPool.QueueUserWorkItem((jnum) =>
            {
                Console.WriteLine("j" + jnum + " scheduled...");
                Thread.Sleep(20000);  // simulated i/o
                jcnt.Signal();
                Console.WriteLine("j" + jnum + " complete.");
            }, j);
        }
        jcnt.Signal();
        jcnt.Wait();
        icnt.Signal();
        Console.WriteLine("i" + inum + " complete.");
    }, i);
}
icnt.Signal();
icnt.Wait();

Now, you'd never use this pattern (it will deadlock on start), but it does demonstrate a specific deadlock you can cause with the ThreadPool: blocking while waiting for nested work items to complete after the blocking threads have consumed the entire pool.

I'm wondering if there's any potential risk of generating similarly detrimental behavior using the nested Parallel.For version of this:

Parallel.For(1, 50, (i) =>
{
    Console.WriteLine("i" + i + " scheduled...");
    Thread.Sleep(10000);  // simulated i/o
    Parallel.For(1, 5, (j) =>
    {
        Thread.Sleep(20000);  // simulated i/o
        Console.WriteLine("j" + j + " complete.");
    });
    Console.WriteLine("i" + i + " complete.");
});

Obviously the scheduling mechanism is far more sophisticated (and I haven't seen this version deadlock at all), but the underlying risk seems like it may still be lurking there. Is it theoretically possible to dry up the pool that Parallel.For uses to the point of creating deadlock by having dependencies on nested threads? That is, is there a limit to the number of threads that Parallel.For keeps in its back pocket for jobs that are scheduled after a delay?


Solution

No, there is no risk of a deadlock like that in Parallel.For() (or Parallel.ForEach()).

There are some factors that lower the risk of deadlock (like the dynamic number of threads the pool uses), but there is also a reason why the deadlock is impossible: the iterations run on the calling thread too. That means that if the ThreadPool is completely busy, the computation will run entirely synchronously on the calling thread. In that case you won't get any speedup from Parallel.For(), but your code will still run, with no deadlocks.
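You can observe the calling thread participating directly. The sketch below (class name ParallelForInline is mine, not from the original post) records whether any iteration of a Parallel.For executes on the thread that invoked it; with the default scheduler the caller almost always picks up at least one chunk of iterations:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class ParallelForInline
{
    // Returns true if at least one iteration ran on the thread
    // that invoked Parallel.For. This is why pool starvation
    // degrades to synchronous execution instead of deadlock.
    public static bool CallerThreadParticipates()
    {
        int callerId = Thread.CurrentThread.ManagedThreadId;
        bool usedCallerThread = false;

        Parallel.For(0, 100, i =>
        {
            if (Thread.CurrentThread.ManagedThreadId == callerId)
                usedCallerThread = true;  // benign race: only ever set to true
            Thread.Sleep(5);              // simulated work
        });

        return usedCallerThread;
    }

    static void Main()
    {
        Console.WriteLine("Caller thread participated: " + CallerThreadParticipates());
    }
}
```

In the worst case, where no pool threads are available at all, every iteration runs on the caller this way.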

A similar situation with Tasks is also handled correctly: if you Wait() on a Task (or access its Result) that hasn't started running yet, it can be executed inline on the current thread. I think this is primarily a performance optimization, but it could also avoid deadlocks in some specific cases.
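A minimal sketch of that inlining behavior (class name TaskInlining is mine): the task is queued to the default ThreadPool scheduler, and when Wait() finds it not yet started, the scheduler is allowed to run it inline on the waiting thread rather than block:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class TaskInlining
{
    // Returns true once the task body has actually executed,
    // whether inline on the waiting thread or on a pool thread.
    public static bool WaitRunsTask()
    {
        int runId = 0;

        var t = new Task(() => runId = Thread.CurrentThread.ManagedThreadId);
        t.Start();  // queued to the default (ThreadPool) scheduler
        t.Wait();   // if the task hasn't started yet, the scheduler may
                    // execute it inline here instead of blocking this thread

        return runId != 0;
    }

    static void Main()
    {
        Console.WriteLine("Task completed by the time Wait() returned: " + WaitRunsTask());
    }
}
```

Whether a given run inlines or uses a pool thread is a scheduler decision and not deterministic; either way, Wait() cannot return before the task body has run.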

But I think the question is more theoretical than practical. The .NET 4 ThreadPool has a default maximum thread count of around a thousand, and if you have a thousand threads all blocked at the same moment, you're doing something very wrong.

Licensed under: CC-BY-SA with attribution