Question

I've written a program in C++ in MSVC++ 2010. In it, the program iterates over the elements of a 1-D array accessed through a pointer (double *). However, when I make the input (the size of the array) very large, for example 15000, and run the program, it stops working and Windows offers to close it because it is not responding. What's the problem?

Here is part of the code that builds the array I'm talking about:

map<int, double *> CF;
CoefficientMap(CF);

double *T = new double[I * J];
for (int r = 1; r <= I * J; ++r)
    T[r] = 100;
SOR(T, CF, 1.8);

and here is the iterator function:

void SOR(double *T, map<int, double *> &CF, double w)
{
    int iter = 0;
    cout << "Stage 2: Solving the linear system of equations using SOR method... ";

    const double tol = 0.00001;
    double error = tol + 1;

    double *TOld = new double[I * J];
    for (int i = 1; i <= I * J; ++i)
        TOld[i] = 100;

    while (abs(error) > tol)
    {
        ++iter;
        for (int i = 1; i <= I * J; ++i)
            T[i] = (CF[i][0] + CF[i][1] * T[i + 1] + CF[i][2] * T[i + J] + CF[i][3] * T[i - J] + CF[i][4] * T[i - 1]) * w + (1 - w) * T[i];

        error = errorCalc(TOld, T, I * J);

        for (int i = 1; i <= I * J; ++i)
            TOld[i] = T[i];

        if (iter % 100 == 0)
        {
            cout << endl << endl;
            cout << "100 iterations done, please wait..." << endl << "Total accumulative error till this point: " << error << endl;
        }

        if (iter > 10000)
            return;
    }

    cout << "Done!" << endl << endl;
    cout << "Converged after " << iter << " iterations!" << endl;
    cout << "Final accumulative error: " << error << endl << endl;
}

Now, when (I * J) gets large enough (15000 for example) the program stops working!


Solution

The most likely explanation is that you are running out of stack space. A simple fix is to make the array static or global. You could also allocate it on the heap with new. Either way, the array is moved off the stack.

The best option is probably to use a smart pointer and put the array on the heap:

std::unique_ptr<double[]> arrayOfDoubles(new double[size]);

That takes care of freeing the memory when the smart pointer goes out of scope, so there is no need for a manual delete.
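As a minimal sketch of that approach (size stands for however many doubles you need; the names are illustrative only, not your actual code):

#include <memory>

// Allocate the array on the heap; the unique_ptr frees it automatically.
std::unique_ptr<double[]> arrayOfDoubles(new double[size]);

// Valid indexes run from 0 to size - 1.
for (int i = 0; i < size; ++i)
    arrayOfDoubles[i] = 100.0;

// If a function expects a raw double*, pass arrayOfDoubles.get().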

For a better answer, edit the question to contain the code...


Your added code has at least one problem: array indexing. Indexes start at 0 and go up to the array size minus one. Corrected loop:

double *T = new double[I * J];
for (int r = 0; r < I * J; ++r)
    T[r] = 100;

You have the same error in the other loops too; the same fix applies.
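Applied to, for example, the TOld initialization inside SOR, the fix would look like this (TOld and the bounds are taken from the question's code):

double *TOld = new double[I * J];
for (int i = 0; i < I * J; ++i)   // was: for (int i = 1; i <= I * J; ++i)
    TOld[i] = 100;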

Alternative fix: if you want to start indexing from 1 (for example because you have the algorithm written as pseudocode that way and don't want to change the indexing), the simplest approach is to allocate an array one element bigger and not use index 0:

double *T = new double[I * J + 1];

With that allocation you can keep your current loops unchanged.
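For clarity, a sketch of that variant (element 0 is simply never used):

double *T = new double[I * J + 1];   // one extra element so index 0 can stay unused
for (int r = 1; r <= I * J; ++r)     // highest index used is I * J, which is now valid
    T[r] = 100;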


Such buffer overruns by one array element are nasty, because there is often unused space at the end of the allocated memory block, so the bug may go totally unnoticed until you change the array size and the unused space disappears. And even if the overrun does corrupt the heap, the corruption might go unnoticed until you change the code and the effect of the corruption changes. So, for example, adding debugging code may hide the problem if you are unlucky.

OTHER TIPS

It sounds like you're allocating a plain array on the stack, like this:

void f()
{
    double a[123456];
    ...
}

The stack is limited in size, so you should either allocate with new or (better) use a std::vector.
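For example, a minimal sketch of the vector approach (I, J, CF, and SOR are the names from the question; the call is only illustrative):

#include <vector>

// The vector's storage lives on the heap and is freed automatically.
std::vector<double> T(I * J, 100.0);

// Functions that expect a raw double* can take T.data().
SOR(T.data(), CF, 1.8);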

You've allocated too much space on the stack, so there is not enough memory to fulfill your request. As an alternative you can give the object static storage duration, or you can put it on the free store with new:

std::unique_ptr<int[]> ptr(new int[size]);
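For completeness, a sketch of the static storage alternative (MAX_SIZE and buffer are made-up names here; a static array needs a compile-time constant size):

const int MAX_SIZE = 20000;       // hypothetical upper bound, known at compile time
static double buffer[MAX_SIZE];   // static storage duration, so it does not live on the stack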