Optimisation Problems Flashcards

(43 cards)

1
Q

What is unique about optimization problems?

A

The path to the solution is less relevant; the focus is on the state itself as the solution to the problem.

2
Q

Define Local Search. What is an advantage?

A

Iterative improvement algorithms. They keep track of a single current state and try to improve it, moving only to neighbouring states.

Advantage: uses very little memory.

3
Q

Define The State Neighbourhood.

A

The neighbouring states for a given state.

4
Q

Define Successor Function.

A

Chooses the neighbouring state to move to, based on some heuristic.

5
Q

What is the travelling salesman problem?

A

An NP-hard problem: a salesman must visit each city exactly once and then return home in the quickest time possible.

6
Q

What is a shoulder?

A

A plateau region which has an “uphill” ascent.

7
Q

What is the global maximum / minimum?

A

The best possible state in the state space landscape; it has the best value of the objective function.

8
Q

What is local maximum / minimum?

A

A state which is better than its neighbouring states, but worse than some other state elsewhere in the landscape.

9
Q

In the state space landscape graph what is the x-axis and y-axis?

A

The x-axis is the state space; the y-axis is the objective-function value of those states.

10
Q

Define current state.

A

The state in a landscape diagram where the agent is currently located.

11
Q

Define flat local maximum / minimum.

A

A portion of the state space landscape that is both a plateau and a local maximum / minimum.

12
Q

What is the general approach for the hill climbing algorithm?

A

Choose the successor that results in the most beneficial state change; if no neighbour has a better value, return the current state. If several are tied, choose the next state at random.

13
Q

Hill climbing pseudocode.

A

current := initial-state
repeat:
    neighbour := successor of current
    if neighbour.value <= current.value:
        return current
    else:
        current := neighbour
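The pseudocode above can be turned into a short runnable sketch (Python here; the `neighbours` and `value` callables and the toy objective are illustrative assumptions, not part of the card):

```python
def hill_climb(initial, neighbours, value):
    """Hill climbing: move to the best neighbour until none improves on current."""
    current = initial
    while True:
        best = max(neighbours(current), key=value)
        if value(best) <= value(current):
            return current  # no neighbour is better: a (possibly only local) maximum
        current = best

# Toy example: maximise f(x) = -(x - 3)^2 over the integers, starting at 0.
f = lambda x: -(x - 3) ** 2
result = hill_climb(0, lambda x: [x - 1, x + 1], f)  # -> 3
```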

14
Q

Name three problems associated with hill climbing.

A

The foothill problem.
The ridge problem.
The Plateau problem.

15
Q

Describe the foothill problem.

A

The search process may reach a local maximum, where every change degrades the quality value, and stop, believing the optimal state has been found when only a local optimum has been reached.

16
Q

How might we fix the foothill and ridge problems?

A

Non-determinism: Random steps can help escape local maxima.

Use a backtracking technique: maintain a list of visited states so the search is able to backtrack.

17
Q

Explain the ridge problem.

A

Ridges are local maxima that fall off steeply to low-quality states on either side. The algorithm struggles to move in the diagonal direction along the ridge.

18
Q

Explain the plateau problem. Distinguish between shoulder and flat local maximum.

A

A large flat region (a plateau) where local changes do not increase quality, so the algorithm does not know which way to go.

A shoulder is a plateau from whose edge uphill progress is possible.

A flat local maximum is a plateau from which no uphill exit exists.

19
Q

How can we solve the plateau problem?

A

Allow sideways moves to states of the same value, but limit the number allowed.

20
Q

Define the following:
1. Simple Hill Climbing
2. Steepest Ascent Hill Climbing
3. Stochastic Hill-Climbing
4. First-Choice Hill-Climbing
5. Random-Restart Hill-Climbing

A
  1. Considers only one neighbouring state and sets it as current if it is better.
  2. Considers all neighbouring states and selects the best (or the current state) as current.
  3. Selects at random from the uphill moves; the probability of selection varies with steepness (steeper is preferable).
  4. Randomly generates successors until a better one is found.
  5. Repeated hill climbing from random initial states until the goal is reached.
21
Q

What is the expected number of restarts for HC to reach a solution?

A

1 / probability of success.

22
Q

What is simulated annealing?

A

Escapes from local optima by also accepting, with a probability that decreases during the search, moves that are worse than the current solution.

23
Q

Describe SA procedure.

A

The process begins with a random viable solution and a temperature parameter which decreases according to a predefined annealing schedule. Loop until the temperature reaches 0, where freezing occurs.

During the loop, accept any uphill move, and accept downhill moves with a probability based on the temperature.

24
Q

SA maximizing pseudocode.

A

current <- initial
while T > 0:
    reduce T by annealing schedule
    next <- random successor of current
    E = next.value - current.value
    if E > 0:
        current <- next
    else if e^(E / T) > rand(0, 1):
        current <- next
return current
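As a runnable sketch of the maximising procedure (Python; the geometric cooling schedule, the `t_min` freezing cut-off, and the toy clamped-integer objective are assumptions, not from the card):

```python
import math
import random

def simulated_annealing(initial, successor, value, t0=10.0, cooling=0.95, t_min=1e-3):
    """Maximising SA; a geometric schedule stands in for the annealing schedule."""
    current = initial
    t = t0
    while t > t_min:                        # treat t_min as "temperature 0" (freezing)
        t *= cooling                        # reduce T by the annealing schedule
        nxt = successor(current)
        delta_e = value(nxt) - value(current)
        if delta_e > 0:
            current = nxt                   # always accept uphill moves
        elif math.exp(delta_e / t) > random.random():
            current = nxt                   # accept downhill moves with probability e^(E/T)
    return current

# Toy example: maximise f(x) = -(x - 3)^2 over integers clamped to [0, 6].
random.seed(0)
f = lambda x: -(x - 3) ** 2
step = lambda x: max(0, min(6, x + random.choice([-1, 1])))
best = simulated_annealing(0, step, f)
```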

25
Q

What changes are needed for the minimising version?

A

The conditions become E <= 0 and e^(-E / T) > rand(0, 1).
26
Q

What is local beam search?

A

A local search that keeps track of k states instead of one. Start with k randomly selected states, then determine all their successors; if any is a goal, finish. Otherwise select the best k from the successors and repeat. Information is shared among the parallel search threads.
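A minimal runnable sketch of local beam search (Python; the helper names `random_state`, `neighbours`, `is_goal` and the toy goal test are illustrative assumptions):

```python
import random

def local_beam_search(k, random_state, neighbours, value, is_goal, max_iters=100):
    """Keep the k best states each round instead of a single current state."""
    states = [random_state() for _ in range(k)]
    for _ in range(max_iters):
        for s in states:
            if is_goal(s):
                return s
        # Pool ALL successors of ALL k states (information shared across the beam)...
        pool = [n for s in states for n in neighbours(s)]
        # ...then keep only the k best of the pool.
        states = sorted(pool, key=value, reverse=True)[:k]
    return max(states, key=value)

# Toy example: find x with f(x) = -(x - 5)^2 == 0.
random.seed(1)
f = lambda x: -(x - 5) ** 2
found = local_beam_search(
    k=3,
    random_state=lambda: random.randint(0, 20),
    neighbours=lambda x: [x - 1, x + 1],
    value=f,
    is_goal=lambda x: f(x) == 0,
)
```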
27
Q

What is a downside of local beam search?

A

It can suffer from a lack of diversity.
28
Q

Define these terms in relation to genetic algorithms:
1. offspring
2. population
3. individual
4. chromosome
5. gene
6. fitness function

A

1. A successor state.
2. The k randomly generated start states.
3. A state.
4. A full candidate solution.
5. A part of a candidate solution.
6. The evaluation function.
29
Q

Explain genetic algorithms.

A

Offspring are produced from two parents. We start with a random population. An individual is represented as a chromosome: a string of genes, e.g. 1s and 0s. A fitness function determines which parents are used, so over time the population becomes fitter (closer to a solution).
30
Q

Name the stages in the process of genetic algorithms.

A

1. Population initialisation
LOOP:
2. Fitness function calculation
3. Selection
4. Crossover
5. Mutation
LOOP until termination criteria met.
6. Terminate and return the best individual.
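The staged loop above can be sketched on the toy OneMax problem (maximise the number of 1-genes in a bit string); tournament selection, one-point crossover, and all parameter values here are assumed choices for illustration:

```python
import random

def onemax_ga(n_bits=20, pop_size=30, generations=100, p_mut=0.01):
    fitness = lambda c: sum(c)              # fitness function: count of 1-genes
    # 1. Population initialisation
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # Termination criterion: an all-ones individual has been found
        if any(fitness(c) == n_bits for c in pop):
            break
        nxt = []
        while len(nxt) < pop_size:
            # 2./3. Fitness calculation + selection (tournament of 3)
            a = max(random.sample(pop, 3), key=fitness)
            b = max(random.sample(pop, 3), key=fitness)
            # 4. One-point crossover
            point = random.randrange(1, n_bits)
            child = a[:point] + b[point:]
            # 5. Mutation (bit flip with probability p_mut per gene)
            child = [g ^ 1 if random.random() < p_mut else g for g in child]
            nxt.append(child)
        pop = nxt
    # 6. Terminate and return the best individual
    return max(pop, key=fitness)

random.seed(2)
best = onemax_ga()
```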
31
Q

What is the term used to describe termination of genetic algorithms?

A

Convergence.
32
Q

What is Roulette Wheel Selection? What is stochastic universal sampling?

A

A random spin in which an individual's probability of being selected is proportional to its fitness. Roulette wheel selection chooses one parent at a time. Stochastic universal sampling picks both parents in a single spin, with fixed points on opposite sides of the wheel determining the parents (reducing the likelihood of both parents being the same individual).
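A sketch of a single roulette-wheel spin (Python; the example population and fitness scores are made up for illustration):

```python
import random

def roulette_select(population, fitness):
    """One spin: probability of selection is fitness / total fitness."""
    total = sum(fitness(c) for c in population)
    spin = random.uniform(0, total)
    running = 0.0
    for c in population:
        running += fitness(c)
        if running >= spin:
            return c
    return population[-1]  # guard against floating-point round-off

random.seed(3)
pop = ["weak", "mid", "strong"]
scores = {"weak": 1, "mid": 3, "strong": 6}
picks = [roulette_select(pop, scores.get) for _ in range(1000)]
# "strong" (probability 0.6) should be picked far more often than "weak" (0.1)
```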
33
Q

What is rank-based selection? When is it often used?

A

Chromosomes are ranked by their fitness value and parents are chosen by rank rather than by raw fitness. Often used when individuals have very similar fitness values, and hence similar probabilities of being spun.
34
Q

What is Tournament Selection?

A

Select k individuals at random from the population and take the best of these as a parent. Repeat for each parent.
35
Q

Explain one-point, multi-point and uniform crossover.

A

One-point: a random crossover point is selected and the tails are swapped.
Multi-point: multiple points are selected and alternating segments are swapped.
Uniform: each gene (bit) is treated separately and swapped or not swapped; this can be biased toward one parent or the other.
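One-point and uniform crossover can be sketched as follows (Python; the parent strings are illustrative):

```python
import random

def one_point_crossover(a, b):
    """Select a random crossover point and swap the tails."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:], b[:point] + a[point:]

def uniform_crossover(a, b, bias=0.5):
    """Treat each gene separately; bias > 0.5 favours parent a's genes."""
    return [x if random.random() < bias else y for x, y in zip(a, b)]

random.seed(4)
p1, p2 = [1, 1, 1, 1, 1, 1], [0, 0, 0, 0, 0, 0]
c1, c2 = one_point_crossover(p1, p2)  # tails swapped: genes conserved overall
c3 = uniform_crossover(p1, p2)
```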
36
Q

Why do we need a mutation operator?

A

To ensure that all parts of the search space are reachable, even with a bad initial population (e.g. one in which not all possible genes are present).
37
Q

What are the disadvantages of too little and too much mutation?

A

Too little: the number of generations needed increases; offspring barely differ from their parents. Too much: the convergence rate decreases, undermining fitness-based selection; offspring are completely different from their parents.
38
Q

What is a good mutation rate range?

A

[0.001, 0.01]
39
Q

Explain four different types of mutation algorithms.

A

Bit Flip: change one or more random bit(s).
Swap Mutation: swap two randomly selected bits.
Scramble Mutation: a subsection of genes is shuffled.
Inversion Mutation: a subsection of genes is reversed.
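The four operators can be sketched as (Python; the example chromosome is illustrative):

```python
import random

def bit_flip(chrom, n=1):
    """Flip n randomly chosen bits."""
    c = chrom[:]
    for i in random.sample(range(len(c)), n):
        c[i] ^= 1
    return c

def swap_mutation(chrom):
    """Swap two randomly selected positions."""
    c = chrom[:]
    i, j = random.sample(range(len(c)), 2)
    c[i], c[j] = c[j], c[i]
    return c

def scramble_mutation(chrom):
    """Shuffle a random subsection of genes."""
    c = chrom[:]
    i, j = sorted(random.sample(range(len(c) + 1), 2))
    mid = c[i:j]
    random.shuffle(mid)
    return c[:i] + mid + c[j:]

def inversion_mutation(chrom):
    """Reverse a random subsection of genes."""
    c = chrom[:]
    i, j = sorted(random.sample(range(len(c) + 1), 2))
    return c[:i] + c[i:j][::-1] + c[j:]

random.seed(5)
original = [0, 1, 1, 0, 1, 0, 0, 1]
flipped = bit_flip(original)
```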
40
Q

Termination criteria for GA?

A

- A point in the search space is reached.
- The maximum number of fitness computations has been executed.
- The maximum number of generations is reached.
- A number of generations has passed without improvement.
41
Q

What are the advantages and disadvantages of GAs?

A

Advantages:
- Understandable and widely applicable.
- Works in parallel.
- Provides a list of solutions, not just one.
- Works well for large or many-parameter search spaces.

Disadvantages:
- Fitness value computation is expensive.
- Not optimal.
- Hard to design a fitness function.
42
Q

How do I calculate the strength of the fitness function for a chromosome?

A

fitness(chromosome) / sum of the fitness function over every chromosome, taken as a percentage.
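As a worked sketch of that calculation (Python; the chromosomes and fitness values are made up):

```python
def fitness_percentage(chromosome, population, fitness):
    """Strength = fitness(chromosome) / total fitness of the population, as a percentage."""
    return 100.0 * fitness(chromosome) / sum(fitness(c) for c in population)

fit = {"a": 2, "b": 3, "c": 5}.get
share = fitness_percentage("c", ["a", "b", "c"], fit)  # 100 * 5 / (2 + 3 + 5) -> 50.0
```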
43
Q

When we have a new chromosome, how do we add it to the population?

A

We replace unfit chromosomes in the original population with the new chromosomes.