GA Case Studies Flashcards

1
Q

Roulette wheel pros/cons

A

+ Fittest individual can be selected multiple times
+ Simple
+ Chance for all individuals
- Premature convergence
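
A minimal sketch of roulette-wheel (fitness-proportionate) selection, assuming maximisation with non-negative fitness values; the function name is illustrative:

```python
import random

def roulette_select(population, fitnesses):
    """Spin the wheel once: each individual's slice is proportional
    to its fitness, so the fittest can be picked repeatedly while
    everyone keeps some chance of selection."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fitness in zip(population, fitnesses):
        running += fitness
        if running >= pick:
            return individual
    return population[-1]  # guard against floating-point rounding

# Illustrative usage: draw a full set of parents with repetition.
# parents = [roulette_select(pop, fits) for _ in range(len(pop))]
```

Because highly fit individuals dominate the wheel early on, repeated picks of the same few individuals are what drive the premature-convergence drawback.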

2
Q

Tournament

A

+ Diversity
+ Parallel processing
- Large tournament sizes raise selection pressure and reduce diversity
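
A sketch of tournament selection under the same assumptions (maximisation, illustrative names); each tournament is independent of the others, which is what makes the method easy to parallelise:

```python
import random

def tournament_select(population, fitnesses, k=3):
    """Sample k individuals uniformly at random and return the
    fittest of them. Larger k raises selection pressure, which is
    the downside of large tournaments noted above."""
    contenders = random.sample(range(len(population)), k)
    best = max(contenders, key=lambda i: fitnesses[i])
    return population[best]
```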

3
Q

Stochastic universal sampling

A

Similar to roulette, but gives less fit individuals more of a chance and makes it less likely that any one individual is chosen multiple times.
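
A sketch of stochastic universal sampling, again assuming non-negative fitnesses: one random offset, then n evenly spaced pointers over a single spin of the wheel, so each individual's expected number of copies matches its fitness share:

```python
import random

def sus_select(population, fitnesses, n):
    """Select n individuals with evenly spaced pointers over one
    spin, limiting how often any single individual is chosen while
    still giving less fit individuals a fair chance."""
    total = sum(fitnesses)
    step = total / n
    start = random.uniform(0, step)

    selected, running, i = [], fitnesses[0], 0
    for k in range(n):
        pointer = start + k * step
        while running < pointer:
            i += 1
            running += fitnesses[i]
        selected.append(population[i])
    return selected
```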

4
Q

Truncation

A

+ Built-in elitism, of a sort

  • Computationally intensive -> requires sorting
  • Low diversity
  • Rarely used in practice
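
A sketch of truncation selection; the sort is where the computational cost noted above comes from, and the hard cutoff is why diversity suffers:

```python
def truncation_select(population, fitnesses, proportion=0.5):
    """Sort by fitness and keep only the top fraction; the
    survivors are then bred to refill the population. Keeping the
    best unconditionally gives the 'built-in elitism' effect."""
    ranked = sorted(zip(fitnesses, population),
                    key=lambda pair: pair[0], reverse=True)
    cutoff = max(1, int(len(ranked) * proportion))
    return [individual for _, individual in ranked[:cutoff]]
```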
5
Q

Selection case study

A

2005: Tournament selection converged to a more optimal solution in less than half the time compared to roulette and other selection methods.

6
Q

Elitism case study

A

1994: Using Markov chain analysis, it was shown that GAs are not guaranteed to converge to the optimal solution unless elitism is built in.
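
A sketch of how elitism is typically built in; `breed` is a hypothetical placeholder for the usual selection / crossover / mutation pipeline:

```python
def next_generation(population, fitnesses, breed, elite_count=1):
    """Copy the best elite_count individuals into the next
    generation unchanged, then fill the rest by breeding. This
    guarantees the best fitness never decreases, the property the
    Markov-chain convergence result depends on."""
    ranked = sorted(zip(fitnesses, population),
                    key=lambda pair: pair[0], reverse=True)
    elites = [individual for _, individual in ranked[:elite_count]]
    children = [breed(population, fitnesses)
                for _ in range(len(population) - elite_count)]
    return elites + children
```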

7
Q

Crossover case study

A

2012: Ordered crossover and its variants almost halved the relative distance (from the optimum) for the TSP, using roulette selection.
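
A sketch of one common ordered-crossover (OX) variant for permutation encodings such as TSP tours; this simplified version fills the remaining slots left to right (classic OX starts the fill just after the copied slice):

```python
import random

def ordered_crossover(parent_a, parent_b):
    """Copy a random slice from parent A, then fill the remaining
    positions with parent B's cities in the order they appear,
    skipping duplicates, so the child is still a valid tour."""
    size = len(parent_a)
    lo, hi = sorted(random.sample(range(size), 2))
    segment = set(parent_a[lo:hi])

    child = [None] * size
    child[lo:hi] = parent_a[lo:hi]
    fill = iter(city for city in parent_b if city not in segment)
    for i in range(size):
        if child[i] is None:
            child[i] = next(fill)
    return child
```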

8
Q

Integrated Lin-Kernighan case study

A
1995:
- LK is the de facto algorithm for the TSP
- LK generates the initial population
- The GA improves it
- Within 0.75% of the Held-Karp lower bound
- Parthenogenetic (single-parent) operators with local search = high performance
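
A rough sketch of the hybrid scheme the study describes; `lk_improve`, `breed`, and `tour_length` are hypothetical placeholders for Lin-Kernighan local search, the GA's variation step, and the tour-cost function:

```python
import random

def hybrid_ga(cities, pop_size, generations,
              lk_improve, breed, tour_length):
    """Seed the population with locally optimal tours from the
    local-search improver, then let the GA recombine them and
    re-improve every child, returning the best tour found."""
    population = [lk_improve(random.sample(cities, len(cities)))
                  for _ in range(pop_size)]
    for _ in range(generations):
        children = [breed(population) for _ in range(pop_size)]
        population = [lk_improve(tour) for tour in children]
    return min(population, key=tour_length)
```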
9
Q

Darwinian natural selection

A
  1. Heredity
  2. Variation
  3. Selection
10
Q

GA vs others

A

1992: GAs performed much better than greedy, with better time efficiency.
2015: GAs performed much worse than greedy and nearest neighbour.

11
Q

Advantages of GA

A
  1. Work well with large fitness landscapes, i.e. large problem spaces. (From the Discussion section.)
  2. Their randomness helps avoid local optima, and therefore sub-optimal solutions. (From the Discussion section.)
  3. There is a natural balance between exploration and exploitation. (From the Discussion section.)
  4. They explore even further through novelty searches. (From the Discussion section.)
  5. Their concept is easy to understand.
  6. Can find an answer fairly quickly.
  7. Work on a wide variety of problems.
  8. Coding is relatively easy.
  9. They are inherently parallel.
12
Q

Disadvantages of GA

A
  1. Don’t find the definitive answer.

For all but the smallest sets, it will almost surely not find the definitive most optimal solution to the defined problem.

  2. Getting the design right is difficult.

Designing an objective function and getting the representation right can be difficult.
i.e. It’s really hard to come up with a good heuristic which actually reflects what we want the algorithm to do.

  3. Figuring out ideal parameters is difficult.

And even having decided on the design of a certain representation/algorithmic strategy, you really don’t know what the optimal parameters are, such as how big an initial population to use, the mutation factor, or even how many generations we should run it to get the expected result, etc.
(See the parameter list image below from the Genetic Algorithm program we are using.)

So it’s generally not possible to predict their effects before playing around with the parameters. (From the Discussion section.)

  4. And so it’s somewhat of an “art”, employing trial and error. (From the Discussion section.)

So one way to look at it in terms of the previous two points is that their implementation is still as much an art as a science.

  5. Can be a little computationally expensive.

Even though better than most alternatives, they can still be computationally expensive i.e. time-consuming.

13
Q

GA time complexity

A

n log n + linear + polynomial terms

14
Q

Lin kernighan complexity

A

n^2.2

15
Q

Simulated annealing complexity

A

n^2

16
Q

Nearest neighbour complexity

A

n^2

17
Q

greedy complexity

A

n^2 log n

18
Q

Define combinatorial optimisation

A

A discrete optimization problem seeks to determine the best possible solution from a finite set of possibilities.
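
For illustration, the TSP makes the definition concrete: the finite set is the (n-1)! possible tours (fixing the start city), and a brute-force solver, sketched hypothetically below, literally enumerates it:

```python
from itertools import permutations

def brute_force_tsp(distance):
    """Enumerate every tour over the finite solution set and keep
    the shortest: combinatorial optimisation at its most literal,
    and feasible only for tiny n."""
    cities = range(1, len(distance))  # fix city 0 as the start
    best_tour, best_length = None, float("inf")
    for perm in permutations(cities):
        tour = (0, *perm, 0)
        length = sum(distance[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_length:
            best_tour, best_length = tour, length
    return best_tour, best_length
```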

19
Q

Define computationally intractable

A

A problem for which no efficient solution exists, other than worst-case-complexity (e.g. exhaustive search) approaches.