Problem

I am implementing a genetic algorithm to minimize a function of 20 variables.

Each individual is stored as a vector, and the scores are stored as doubles.

// roulette-wheel selection: draw a value in [0, sum] and walk through the scores
double sum = sumOfScores();
double random = (rand() * sum) / RAND_MAX;
int selected = 0;
while (random >= 0) {
    random -= individuals_score[selected];  // subtract each score until the draw is used up
    selected++;
}
return selected - 1;

The problem is that when the number of generations grows very large (in the thousands), the individuals start to converge to the solution and all their scores cluster around the optimal value. Then something strange sometimes happens: even after iterating over all of the defined individuals, random is still > 0 (though very small; the debugger tells me it is on the order of 10^-13). The loop therefore keeps going over individuals that don't even exist (since selected increments on every iteration), which gives a vector subscript out of range error.

This happens when the number of generations is large enough and, logically, when the drawn random number is close to the sum.

Theoretically this should never happen, but I think the problem may be caused by the limited precision of floating-point representation, truncation, or something along those lines.
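
To illustrate (a minimal, self-contained sketch with made-up values, not my actual scoring code): summing a set of doubles and then subtracting them back one by one does not necessarily land exactly on zero, because each addition and subtraction is rounded. The leftover can be slightly positive or slightly negative; when it is positive, the loop above walks past the last index.

#include <cstdio>
#include <vector>

// Minimal sketch of the rounding effect, using hypothetical scores.
// The running sum is rounded at every step, so subtracting the same
// values back from it rarely lands exactly on zero.
int main() {
    std::vector<double> scores(1000, 0.1);      // hypothetical, near-equal scores
    double sum = 0.0;
    for (double s : scores) sum += s;           // rounded accumulation

    double random = sum;                        // worst case: the draw equals the sum
    for (double s : scores) random -= s;        // subtract in the same order

    std::printf("leftover = %.17g\n", random);  // tiny non-zero value, not exactly 0
    return 0;
}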

Any ideas?

Solution

double sum = sumOfScores();
double random = (rand() * sum) / RAND_MAX;
int selected = 0;

// determine the number of elements in individuals_score
// (it is a std::vector, so use size() rather than the sizeof trick,
//  which only works for built-in arrays)
const int arraySize = static_cast<int>(individuals_score.size());

// stop either when the draw is used up or when we run out of individuals
while (random >= 0 && selected < arraySize) {
    random -= individuals_score[selected];
    selected++;
}
return selected - 1;
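
As a side note, here is a sketch of an alternative that avoids the manual subtraction altogether (assuming C++11 and that individuals_score is a std::vector<double> of non-negative scores, as the question suggests): std::discrete_distribution implements exactly this kind of fitness-proportionate (roulette-wheel) selection and always returns a valid index.

#include <random>
#include <vector>

// Sketch: roulette-wheel selection via <random>. Index i is chosen with
// probability individuals_score[i] / sum, and the result is always in range.
int selectIndividual(const std::vector<double>& individuals_score, std::mt19937& gen) {
    std::discrete_distribution<int> pick(individuals_score.begin(),
                                         individuals_score.end());
    return pick(gen);
}

The distribution can be rebuilt once per generation after the scores are updated, and the generator (std::mt19937 here) should be created once and reused.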