Lecture 4 Flashcards
what is the success probability p_a+?
p_a+ = P{f(m(a)) > f(a)}
the probability that the mutated candidate solution has a higher fitness than the original candidate solution
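A minimal Monte Carlo sketch of this quantity, assuming OneMax fitness and standard bit-flip mutation with rate p (function names and example values are illustrative, not from the lecture):

```python
import random

def one_max(bits):
    """OneMax fitness: the number of 1-bits in the string."""
    return sum(bits)

def bit_flip(bits, p):
    """Bit-flip mutation: flip each bit independently with probability p."""
    return [b ^ 1 if random.random() < p else b for b in bits]

def estimate_p_plus(bits, p, trials=100_000):
    """Monte Carlo estimate of p_a+ = P{f(m(a)) > f(a)}."""
    f_a = one_max(bits)
    hits = sum(one_max(bit_flip(bits, p)) > f_a for _ in range(trials))
    return hits / trials

# Illustrative example: l = 20, f_a = 15, mutation rate p = 1/l.
a = [1] * 15 + [0] * 5
print(estimate_p_plus(a, p=1 / 20))
```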
what is the convergence velocity of a (1+1) GA?
phi(1+1) = sum from k=0 to k_max of (k * p_a+(k))
for Counting Ones, k_max = l - f_a
where f_a is the fitness of a, i.e., the number of bits that are already correct and do not have to change
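One way to read this sum: phi is the expected positive fitness gain of a single mutation step, E[max(0, f(m(a)) - f(a))]. A rough Monte Carlo sketch under that reading, assuming OneMax and bit-flip mutation (all names and defaults are illustrative):

```python
import random

def estimate_velocity(f_a, l, p, trials=200_000, seed=0):
    """Monte Carlo estimate of phi(1+1): the average positive fitness gain
    max(0, f(m(a)) - f(a)) of one bit-flip mutation on a OneMax string."""
    rng = random.Random(seed)
    a = [1] * f_a + [0] * (l - f_a)   # for OneMax only the number of ones matters
    gain = 0
    for _ in range(trials):
        child = [b ^ 1 if rng.random() < p else b for b in a]
        gain += max(0, sum(child) - f_a)
    return gain / trials

print(estimate_velocity(f_a=90, l=100, p=0.01))
```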
what is a (1+1) GA?
a genetic algorithm with a population of one: a single parent produces a single offspring by mutation, and the better of the two becomes the next parent
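A minimal sketch of such an algorithm on OneMax, assuming bit-flip mutation with rate p = 1/l and plus-selection that keeps the offspring on ties (defaults are illustrative):

```python
import random

def one_plus_one_ea(l=50, p=None, max_gens=100_000, seed=0):
    """A minimal (1+1) EA on OneMax: one parent, one offspring per generation,
    and the better of the two (ties go to the offspring) becomes the next parent."""
    rng = random.Random(seed)
    p = p if p is not None else 1 / l              # common default: p = 1/l
    parent = [rng.randint(0, 1) for _ in range(l)]
    for gen in range(max_gens):
        if sum(parent) == l:                       # all bits correct: stop
            return gen
        child = [b ^ 1 if rng.random() < p else b for b in parent]
        if sum(child) >= sum(parent):              # plus-selection over {parent, child}
            parent = child
    return max_gens

print(one_plus_one_ea())
```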
for the OneMax problem, what is the probability that we flip i 1s to 0s?
(f_a choose i) * p^i * (1 - p)^(f_a - i)
p is the mutation rate, f_a is the fitness, equal to the number of ones in the string
for the OneMax problem, what is the probability that we flip (i+k) 0s to 1s?
((l - f_a) choose (i + k)) * p^(i+k) * (1 - p)^(l - f_a - i - k)
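Both expressions are plain binomial probabilities; a short sketch using Python's math.comb, with illustrative values for l, f_a, and p:

```python
from math import comb

def prob_flip_ones(i, f_a, p):
    """Probability that exactly i of the f_a one-bits flip to zero."""
    return comb(f_a, i) * p**i * (1 - p)**(f_a - i)

def prob_flip_zeros(j, l, f_a, p):
    """Probability that exactly j of the (l - f_a) zero-bits flip to one."""
    return comb(l - f_a, j) * p**j * (1 - p)**(l - f_a - j)

# Illustrative values: l = 100, f_a = 90, mutation rate p = 0.01.
print(prob_flip_ones(1, f_a=90, p=0.01))            # one 1 flips to 0
print(prob_flip_zeros(2, l=100, f_a=90, p=0.01))    # two 0s flip to 1
```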
what is absorption in a Markov chain?
a state that, once entered, is never left; for OneMax this is state 0, where all l bits are correct
what is the time to absorption of OneMax with the (1+1) EA as a function of l?
on the order of O(l ln l)
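A rough empirical check of that growth rate, assuming the same (1+1) EA sketch as above with p = 1/l; the constant factor is not 1, so only the shape of the growth is comparable:

```python
import math
import random

def time_to_absorption(l, p, seed):
    """Generations until a (1+1) EA with bit-flip mutation reaches the all-ones string."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(l)]
    gen = 0
    while sum(parent) < l:
        child = [b ^ 1 if rng.random() < p else b for b in parent]
        if sum(child) >= sum(parent):
            parent = child
        gen += 1
    return gen

for l in (25, 50, 100, 200):
    runs = [time_to_absorption(l, p=1 / l, seed=s) for s in range(20)]
    mean = sum(runs) / len(runs)
    print(f"l={l:4d}  mean generations={mean:9.1f}  l*ln(l)={l * math.log(l):9.1f}")
```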
what is the approximated optimum mutation rate?
p* = 1 / (2(f_a + 1) - l)
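A small worked sketch of this approximation; note the formula is only meaningful while the denominator 2(f_a + 1) - l is positive, and it tends to 1/l as f_a approaches l - 1 (the values below are illustrative):

```python
def optimal_mutation_rate(f_a, l):
    """Approximate optimum p* = 1 / (2(f_a + 1) - l); only meaningful while
    the denominator 2(f_a + 1) - l is positive."""
    return 1 / (2 * (f_a + 1) - l)

l = 100
for f_a in (60, 75, 90, 99):
    print(f"f_a={f_a:3d}  p*={optimal_mutation_rate(f_a, l):.5f}")
# As f_a approaches l - 1, p* approaches 1/l (here 0.01).
```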
what is the effect of different mutation rate settings?
if p_m is too large, the time to absorption grows exponentially because the search degenerates into random search
if p_m is too small, the time to absorption is almost constant
what is a (1 + lambda) GA?
one parent is used to generate lambda offspring
the next generation parent is chosen as the best among offspring and the old parent: a_next = best(O_t U {a_t})
what is a (1, lambda) GA?
one parent is used to generate lambda offspring
the next generation parent is chosen as the best among the offspring only: a_next = best(O_t)
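A compact sketch showing both selection schemes side by side on OneMax; the plus flag switches between (1 + lambda) and (1, lambda) selection (function name and defaults are illustrative):

```python
import random

def evolve(l=30, lam=5, p=None, generations=200, plus=True, seed=1):
    """One parent, lambda offspring per generation on OneMax.
    plus=True  -> (1 + lambda): next parent = best of offspring and old parent.
    plus=False -> (1 , lambda): next parent = best of offspring only."""
    rng = random.Random(seed)
    p = p if p is not None else 1 / l
    parent = [rng.randint(0, 1) for _ in range(l)]
    for _ in range(generations):
        offspring = [[b ^ 1 if rng.random() < p else b for b in parent]
                     for _ in range(lam)]
        pool = offspring + [parent] if plus else offspring
        parent = max(pool, key=sum)        # OneMax fitness = number of ones
        if sum(parent) == l:
            break
    return sum(parent)

print("(1+5) best fitness:", evolve(plus=True))
print("(1,5) best fitness:", evolve(plus=False))
```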
what is the probability of having exactly k improvements in a (1+1)-GA?
p_a+(k) = the sum from i = 0 to min(f_a, l - f_a - k) of (the probability that i 1s flip to 0s) * (the probability that (i+k) 0s flip to 1s)
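A direct computation of this sum with math.comb, using the two flip probabilities from the earlier cards; as a cross-check it also recovers phi(1+1) = sum over k of k * p_a+(k) (example values are illustrative):

```python
from math import comb

def p_plus(k, f_a, l, p):
    """Exact p_a+(k): sum over i of P(i ones flip to zero) * P(i+k zeros flip to one)."""
    total = 0.0
    for i in range(min(f_a, l - f_a - k) + 1):
        total += (comb(f_a, i) * p**i * (1 - p)**(f_a - i)
                  * comb(l - f_a, i + k) * p**(i + k) * (1 - p)**(l - f_a - i - k))
    return total

# Illustrative values: l = 100, f_a = 90, p = 0.01.
l, f_a, p = 100, 90, 0.01
phi = sum(k * p_plus(k, f_a, l, p) for k in range(1, l - f_a + 1))  # recovers phi(1+1)
print(p_plus(1, f_a, l, p), phi)
```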