Question

I am implementing a genetic algorithm to minimize a function of 20 variables.

Each individual is stored as a vector, and the scores are stored as doubles.

// Roulette-wheel selection: pick an index with probability proportional to its score
double sum = sumOfScores();
double random = (rand() * sum) / RAND_MAX;
int selected = 0;
while (random >= 0) {
    random -= individuals_score[selected];
    selected++;
}
return selected - 1;

The problem is that when the number of generations grows very large (in the thousands), the individuals start to converge on the solution and all their scores cluster around the optimal value. Then a strange thing sometimes happens: even after iterating over every defined individual, random is still > 0 (though very small; the debugger tells me it is on the order of 10^-13). The loop therefore tries to keep going over individuals that don't even exist (since selected increments on every iteration), which gives a vector subscript out of range error.

This only happens once the number of generations is large enough and, naturally, when the drawn random number is close to the sum.

Theoretically this should never happen, but I suspect the cause is the limited precision of floating-point representation, truncation, or something along those lines.
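
This kind of residue is easy to reproduce in isolation. The following standalone sketch (with made-up scores, not my actual data) sums a few doubles and then subtracts each of them back out; on a typical IEEE-754 platform the leftover is about 1.1e-16, i.e. still positive, which is exactly the situation that pushes the loop past the last index:

#include <cstdio>
#include <vector>

int main() {
    std::vector<double> scores = {0.1, 0.2, 0.3};   // made-up scores, purely illustrative

    double sum = 0.0;
    for (double s : scores) sum += s;               // rounds to 0.6000000000000001, not 0.6

    double random = sum;                            // worst case: the drawn number equals the sum
    for (double s : scores) random -= s;

    // Prints roughly 1.1e-16 on a typical IEEE-754 system: the remainder is
    // still positive even though every score has been subtracted.
    std::printf("leftover = %.17g\n", random);
}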

Any ideas?


Solution

double sum = sumOfScores();
double random = (rand() * sum) / RAND_MAX;
int selected = 0;

// Determine the number of elements in individuals_score.
// (This sizeof trick only works for a C-style array; if individuals_score
// is a std::vector, use individuals_score.size() instead.)
const int arraySize = sizeof(individuals_score) / sizeof(individuals_score[0]);

while (random >= 0 && selected < arraySize) {
    random -= individuals_score[selected];
    selected++;
}
return selected - 1;
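
For completeness, here is a minimal sketch of the whole selection routine with that bound in place, assuming the scores live in a std::vector<double> (the function and parameter names are illustrative, not taken from the question):

#include <cstdlib>
#include <numeric>
#include <vector>

// Roulette-wheel selection with a bounded index: if rounding error leaves
// 'random' marginally above zero after the last score, we simply return the
// last individual instead of reading past the end of the vector.
int selectIndividual(const std::vector<double>& individuals_score) {
    double sum = std::accumulate(individuals_score.begin(),
                                 individuals_score.end(), 0.0);
    double random = (std::rand() * sum) / RAND_MAX;

    std::size_t selected = 0;
    while (random >= 0 && selected < individuals_score.size()) {
        random -= individuals_score[selected];
        ++selected;
    }
    return static_cast<int>(selected) - 1;
}

Clamping the index this way means the worst-case rounding residue simply selects the last individual, which is what proportional selection intends in that situation anyway.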
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow