Problem

I have implemented the first, pre-alpha version of my little genetic algorithm framework and it is working very well thus far. Now, I am in the process of writing documentation and finishing up some details. I just wanted to clarify something.

The term "mutation rate". Does it mean:

  • The likelihood of a given chromosome being mutated at all?

  • The likelihood of a given gene in a chromosome being modified?

  • Or the likelihood of a single allele in a gene being modified?

Depending on which of the above is the correct answer (or something entirely different for that matter), please also clarify whether I need to scale the mutation rate by some other value (the number of genes in a chromosome, for instance).
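To make the distinction between the first two interpretations concrete, here is a minimal illustrative sketch for a binary-encoded chromosome (the function names are invented for this example and are not taken from any particular framework; for a binary encoding a gene is a single bit, so the second and third interpretations coincide):

```python
import random

def mutate_whole_chromosome(chromosome, rate):
    # Interpretation 1: the chromosome as a whole mutates with probability `rate`;
    # if it does, one randomly chosen gene is flipped.
    if random.random() < rate:
        i = random.randrange(len(chromosome))
        chromosome[i] = 1 - chromosome[i]
    return chromosome

def mutate_each_gene(chromosome, rate):
    # Interpretation 2: every gene is tested independently and flips with probability `rate`.
    return [1 - gene if random.random() < rate else gene for gene in chromosome]
```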


Solution

I'm not sure how you implement a single allele, but I would say the mutation rate is the chance of a single bit mutating (e.g. if you have the DNA 0000 and a 25% mutation rate, each binary digit, the zeros in this case, has a 25% chance of "mutating" to a 1).

In the projects I have done myself I have not scaled the mutation rate.
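For concreteness, the example described above (the DNA 0000 with a 25% chance per binary digit) could be sketched roughly like this; the code is purely illustrative and not taken from the answer:

```python
import random

def mutate(dna, rate=0.25):
    # Test each binary digit independently; with probability `rate` it flips.
    return "".join(("1" if bit == "0" else "0") if random.random() < rate else bit
                   for bit in dna)

print(mutate("0000"))  # e.g. "0100" -- on average one of the four zeros mutates
```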


Other tips

Though there is no universally accepted definition of the term "mutation rate", in most academic contexts it refers to β, the probability that a single bit of an individual mutates in a binary-encoded GA.

Mutation really comes into play when dealing with solution spaces that encompass multiple local optima, where it serves as a corrective measure that helps the search escape them and approach the global optimum. It should, however, be kept in a low range, because the larger the mutation rate, the longer the convergence time becomes.
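On the scaling question from the original post: with a per-bit rate β and a chromosome of length L, the expected number of mutated bits per individual is already β × L, so the rate itself is usually not scaled. A common rule of thumb (an assumption added here, not something stated in this answer) is to pick β around 1/L, so that roughly one bit mutates per chromosome on average. A tiny sketch:

```python
chromosome_length = 100                  # L, hypothetical value for illustration
mutation_rate = 1 / chromosome_length    # common heuristic: beta = 1/L

# Expected number of mutated bits per individual:
expected_flips = mutation_rate * chromosome_length
print(expected_flips)  # 1.0 -- about one mutation per chromosome on average
```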

A more detailed discussion of this topic may be found here.

License: CC-BY-SA with attribution
Not affiliated with StackOverflow