Question

I am using Parallel.ForEach to do my job, but I get an OutOfMemoryException.

Parallel.ForEach(flist, (item) =>
{
    string f1 = item.Split('|')[0];
    string f2 = item.Split('|')[1];
    a = File.ReadAllText(f1);
    b = File.ReadAllText(f2); 
    Consume(a, b);
});

flist has 351 items; a and b are strings, each about 20 KB. At a certain point, system memory blows up.

Consume returns a list of strings, usually around 1,000 strings per iteration.

How can I deal with this?


Solution

Try replacing:

  Parallel.ForEach(flist, (item) =>
    {
        string f1 = item.Split('|')[0];
        string f2 = item.Split('|')[1];
        a = File.ReadAllText(f1);
        b = File.ReadAllText(f2); 
        Consume(a, b);
    });

With:

    Parallel.ForEach(flist, 
    new ParallelOptions { MaxDegreeOfParallelism = 4 },
    (item) =>
    {
        string[] parts = item.Split('|');       // split once instead of twice
        string a = File.ReadAllText(parts[0]);  // locals, so iterations don't share state
        string b = File.ReadAllText(parts[1]);
        Consume(a, b);
    });

This caps how many iterations run concurrently, so you don't have many threads alive at once, each holding its own file contents and result list. You can then experiment with higher values and see whether performance improves. Also declare a and b as locals inside the lambda: assigning to shared fields from parallel iterations is a data race, and it keeps both strings reachable longer than necessary.
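If the hard-coded 4 feels arbitrary, a reasonable starting point is the machine's logical core count. A minimal sketch, assuming flist and Consume are the same as in the question:

    // Cap concurrency at the number of logical cores instead of a fixed 4.
    var options = new ParallelOptions
    {
        MaxDegreeOfParallelism = Environment.ProcessorCount
    };

    Parallel.ForEach(flist, options, item =>
    {
        string[] parts = item.Split('|');       // split once, reuse both halves
        string a = File.ReadAllText(parts[0]);  // locals: no shared state between iterations
        string b = File.ReadAllText(parts[1]);
        Consume(a, b);
    });

Higher values can pay off if the work is I/O-bound, so measure before settling on a number.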

OTHER TIPS

You're reading two entire files into strings on every iteration. If the files are large, that's very likely the source of your problem.
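If the files can grow beyond a few megabytes, streaming them line by line avoids holding each whole file in memory at once. A sketch, under the assumption that your processing can be adapted to consume line sequences (the ConsumeLines overload below is hypothetical):

    Parallel.ForEach(flist,
        new ParallelOptions { MaxDegreeOfParallelism = 4 },
        item =>
        {
            string[] parts = item.Split('|');
            // File.ReadLines is lazy: lines are read on demand,
            // unlike File.ReadAllText, which loads the whole file at once.
            IEnumerable<string> linesA = File.ReadLines(parts[0]);
            IEnumerable<string> linesB = File.ReadLines(parts[1]);
            ConsumeLines(linesA, linesB);  // hypothetical streaming variant of Consume
        });

The same idea applies to the output: if the roughly 1,000 strings per iteration are accumulated somewhere, writing them out (or yielding them) as they are produced keeps the working set small.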

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow