Final Iteration Flashcards

(70 cards)

1
Q

Definition of Decision Support

A

As the name suggests, support for decision making; the term is used in a variety of contexts related to decision making

2
Q

Definition of Closed Systems

A

Closed systems are totally independent i.e. ‘closed off’ from the environment

3
Q

Definition of Open Systems

A

Open Systems are dependent on their environment - ‘open to’ the environment

4
Q

Definition of System Effectiveness

A

System Effectiveness - the degree to which goals are achieved. In simple terms, how good the results produced by the system are

5
Q

Definition of System Efficiency

A

System Efficiency - A measure of the use of inputs (or resources) to achieve output. In simple terms, how little the computer uses in terms of resources to produce a solution, or how fast the system can produce a valid output.

6
Q

Definition of Global Optimisation

A

The task of finding the absolutely best set of admissible conditions to achieve the objective.
The fundamental problem of optimisation is to arrive at the best possible decision/solution in any given set of circumstances

7
Q

What is the difference between global and local optimum?

A

Global - better than all other solutions i.e. the absolute best
Local - better than all solutions within a certain neighbourhood (N)

8
Q

Difference between perturbative and constructive search paradigms?

A

Perturbative - operates on complete solutions
Constructive - builds solutions by extending partial solutions

9
Q

Define Heuristic

A

A problem-dependent search method which seeks good/near-optimal solutions at a reasonable cost, without being able to guarantee optimality.

10
Q

What are the drawbacks of heuristic search?

A

No guarantee of the optimality of the obtained solutions
Usually can only be used for the specific problem/situation for which they are designed
Heuristics have parameters - performance can be sensitive to the setting of those parameters
May give a poor solution

11
Q

Difference between deterministic and stochastic heuristic search

A

Deterministic - provides the same solution when run on a given problem instance, no matter how many times it is run
Stochastic - contains a random component and may return a different solution each time it is run on the same instance

12
Q

What is a pseudo-random number?

A

A number from a long sequence of numbers that is produced using a deterministic process but which appears to be random

13
Q

What are some problems with pseudo-random numbers?

A

Shorter than expected periods for some seed states - such seed states may be called ‘weak’
Lack of uniformity of distribution
Correlation of successive values
Distances between where certain values occur are distributed differently from those in a random sequence distribution

14
Q

What are the main components of a meta/hyper heuristic search optimisation method?

A

Representation
Neighbourhood relation
Evaluation function
Initialisation
Search process
Mechanism for escaping from local optima

15
Q

Characteristics of representation

A

Completeness - all solutions must be representable
Connexity - all solutions must be reachable
Efficiency - must be easy/fast to manipulate by the search operators

16
Q

Definition of neighbourhood

A

The neighbourhood of solution x is the set of solutions that can be reached from x by a simple move (move operator/heuristic)

17
Q

Definition of Hamming Distance

A

The number of positions at which the corresponding symbols differ i.e. how many differences there are between solutions
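A minimal sketch in Python (the function name and example strings are illustrative, not from the cards):

```python
def hamming_distance(a, b):
    """Number of positions at which the corresponding symbols differ."""
    assert len(a) == len(b), "Hamming distance is defined for equal-length strings"
    return sum(1 for x, y in zip(a, b) if x != y)

# hamming_distance("10110", "11100") == 2  (positions 1 and 3 differ)
```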

18
Q

Definition of Evaluation Function

A

Indicates the quality of a given solution, distinguishing between good and bad solutions

19
Q

Definition of Delta (Incremental) Evaluation

A

Calculate the effect of the differences between the current solution and a neighbour on the evaluation function value, rather than re-evaluating the neighbour from scratch.
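A small sketch of the idea, assuming a bitstring solution whose value is a weighted sum of its set bits (function and variable names are illustrative):

```python
def full_evaluation(solution, weights):
    """Evaluate from scratch: sum the weights of all set bits (O(n))."""
    return sum(w for bit, w in zip(solution, weights) if bit == 1)

def delta_evaluation(current_value, solution, weights, flip_index):
    """Update the value for a single bit flip without re-evaluating everything (O(1))."""
    if solution[flip_index] == 1:
        return current_value - weights[flip_index]   # bit flips 1 -> 0
    return current_value + weights[flip_index]       # bit flips 0 -> 1
```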

20
Q

Definition of mutational Heuristic/operator

A

Processes a candidate solution and generates a solution which is not guaranteed to be better than the input

21
Q

Definition of hill-climbing heuristic/operator

A

Processes a given candidate solution and generates a better or equal quality solution

22
Q

Definition of hill climbing algorithm

A

A local search algorithm which continually moves in the direction of decreasing objective value (for minimisation) or increasing objective value (for maximisation) to find the lowest or highest point of the landscape, i.e. the best/near-optimal solution to the problem.

23
Q

Give examples of simple hill climbing heuristics:

A

Steepest Descent/ascent hill climbing (SDHC)
Next Descent/ascent hill climbing (NDHC)
Random Walk/random mutation hill climbing (RWHC or RMHC)
Random-restart (shotgun) hill climbing (RRHC)

24
Q

Difference between Best Improvement (steepest descent) HC and First Improvement (Next Descent) HC?

A

Best improvement - doesn't 'accept' a bit flip (i.e. doesn't apply it to the candidate solution) until the main loop over the neighbourhood has finished; it remembers the bit flip that gave the best solution and applies that flip after the loop.
First improvement - upon detecting a neighbour with a better objective value, immediately applies that mutation to the candidate solution and continues the search from that solution.
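A sketch contrasting the two acceptance styles on a 0/1 list for maximisation; f stands for any objective function over such lists (all names are illustrative):

```python
def best_improvement_step(solution, f):
    """Scan all single bit flips, remember the best one, and apply it after the loop."""
    best_flip, best_value = None, f(solution)
    for i in range(len(solution)):
        neighbour = solution[:i] + [1 - solution[i]] + solution[i + 1:]
        value = f(neighbour)
        if value > best_value:
            best_flip, best_value = i, value
    if best_flip is not None:
        solution = solution[:best_flip] + [1 - solution[best_flip]] + solution[best_flip + 1:]
    return solution

def first_improvement_step(solution, f):
    """Apply the first bit flip that improves the objective value immediately."""
    current = f(solution)
    for i in range(len(solution)):
        neighbour = solution[:i] + [1 - solution[i]] + solution[i + 1:]
        if f(neighbour) > current:
            return neighbour          # accept immediately and continue from here
    return solution
```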

25
How does Davis's Bit Hill Climbing work?
Generates a random permutation with elements in the range [0, solution length - 1]. Using this permutation, it selects which bit to flip next. If the bit flip produces a better solution, it is accepted immediately and the search continues from that solution.
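A minimal sketch of one pass over a 0/1 list for maximisation, under the same illustrative naming as above:

```python
import random

def davis_bit_hill_climbing(solution, f):
    """Flip bits in a random permutation order, accepting each improving flip immediately."""
    order = list(range(len(solution)))    # positions [0, length of solution - 1]
    random.shuffle(order)                 # random permutation decides the flip order
    current = f(solution)
    for i in order:
        solution[i] = 1 - solution[i]     # tentative flip
        value = f(solution)
        if value > current:
            current = value               # better: accept and continue from here
        else:
            solution[i] = 1 - solution[i] # revert the flip
    return solution
```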
26
How does Random Mutation HC work?
Selects a random bit anywhere in the solution (the position is randomly generated), flips it, and if the resulting solution is better, immediately accepts it.
27
Strengths of Hill Climbing
Very easy to implement, requiring only three features - representation, evaluation function, and a measure that defines the neighbourhood around a point in the search space
28
Weaknesses of Hill Climbing
Local optimum - becomes stuck if all neighbouring states are worse or the same
Plateau - if all neighbouring states are the same as the current state, hill climbing effectively becomes a random walk
Ridge/valley - the search may 'bounce' from one side of the search space to the other, making no progress on either side
HC may not find the optimal solution
Usually no upper bound on computation time
Success/failure of each iteration depends on the starting point
29
Definition of metaheuristic
A high-level, problem-independent algorithmic framework that provides a set of guidelines or strategies to develop heuristic optimisation algorithms
30
What mechanisms are used to escape local optima?
Iterate with different solutions, or restart (initialisation could be costly)
Change the search landscape - change the objective function, or use different neighbourhoods
Use memory - tabu search
Accept non-improving moves - allow the search to use candidate solutions with equal or worse evaluation function value than the one in hand (could lead to long walks on plateaus during the search process)
Note - none of these mechanisms is guaranteed to always escape effectively
31
What types of termination criteria are there?
A fixed maximum number of iterations or moves, or a fixed amount of CPU time, is exceeded
The number of consecutive iterations since the last improvement in the best objective function value exceeds a specified number
Evidence can be given that an optimum solution has been obtained
No feasible solution can be obtained for a fixed number of steps/amount of time
32
How to deal with infeasible solutions
Use a problem-domain-specific repair operator
Penalise each constraint violation for the infeasible solutions
33
What is Iterated Local Search?
Based on visiting a sequence of locally optimal solutions by perturbing the current local optimum and applying local search/hill climbing starting from the modified solution
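A skeleton of the loop for minimisation, assuming local_search and perturb are supplied by the user (all names are illustrative):

```python
def iterated_local_search(initial, f, local_search, perturb, iterations=100):
    """Perturb the current local optimum, re-run local search from the modified
    solution, and keep the better of the two local optima."""
    best = local_search(initial, f)
    for _ in range(iterations):
        candidate = local_search(perturb(best), f)
        if f(candidate) <= f(best):       # simple acceptance criterion
            best = candidate
    return best
```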
34
What does a low perturbation, or a high perturbation cause in ILS?
Too low - may generate cycles
Too high - good properties of the local optima are lost
35
What is a Tabu Search?
Applies hill climbing/local search, but some elements/moves are regarded as tabu, i.e. forbidden.
Regarded as a stochastic local search algorithm which relies on the use of an explicit memory of the search process.
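A minimal sketch over 0/1 lists for minimisation, where moves are single bit flips and recently flipped positions are tabu for a fixed tenure (aspiration criteria omitted; all names are illustrative):

```python
from collections import deque

def tabu_search(solution, f, iterations=100, tenure=7):
    """Take the best non-tabu bit flip each iteration; remember recent flips as tabu."""
    best = list(solution)
    tabu = deque(maxlen=tenure)           # explicit memory of recently used moves
    for _ in range(iterations):
        candidates = []
        for i in range(len(solution)):
            if i in tabu:
                continue                  # this move is tabu, i.e. forbidden
            neighbour = solution[:i] + [1 - solution[i]] + solution[i + 1:]
            candidates.append((f(neighbour), i, neighbour))
        if not candidates:
            break
        value, i, solution = min(candidates)
        tabu.append(i)                    # the move just made becomes tabu
        if value < f(best):
            best = list(solution)
    return best
```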
36
Definition of scheduling
Deals with the allocation of resources to tasks over given time periods, and its goal is to optimise one or more objectives. It is a decision-making process.
37
What are the three categories for a scheduling problem?
alpha - machine characteristics
beta - processing/job characteristics
gamma - optimality criteria (the objective to be minimised)
38
What is Lj and how is it calculated?
Lateness of job j: Lj = Cj - dj, i.e. the time when job j exits the system (its completion time) minus its due date (the committed shipping or completion date of job j)
39
What is Tj and how is it calculated?
Tardiness of job j: Tj = max(Cj - dj, 0) = max(Lj, 0), i.e. the maximum of 0 and the lateness Cj - dj, so tardiness is never negative
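A tiny worked example under assumed numbers (the Cj and dj values are illustrative):

```python
def lateness(completion, due):
    """L_j = C_j - d_j (negative if the job finishes early)."""
    return completion - due

def tardiness(completion, due):
    """T_j = max(C_j - d_j, 0) = max(L_j, 0) (never negative)."""
    return max(completion - due, 0)

# C_j = 12, d_j = 10  ->  lateness = 2,  tardiness = 2
# C_j = 8,  d_j = 10  ->  lateness = -2, tardiness = 0
```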
40
What are the three types of parameter setting mechanisms?
Static - either no parameters, or they are set to a fixed value throughout the search
Dynamic - parameter values vary with respect to time/iteration count
Adaptive - parameter values (e.g. an acceptance threshold or acceptance probability) are adjusted during the search using feedback from the search process
41
What does a basic adaptive move acceptance method look like?
Late acceptance - compares the quality of the candidate solution with that of the solution accepted/visited L iterations previously, and accepts the move if and only if f(s') <= f(s_{t-L}) (for minimisation)
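A minimal sketch of the late acceptance rule for minimisation, assuming neighbour produces a random neighbouring solution (names and the default L are illustrative):

```python
def late_acceptance(initial, f, neighbour, L=50, iterations=10000):
    """Accept a candidate iff it is no worse than the solution visited L iterations ago."""
    current = initial
    history = [f(initial)] * L            # objective values of the last L visited solutions
    for t in range(iterations):
        candidate = neighbour(current)
        if f(candidate) <= history[t % L]:    # f(s') <= f(s_{t-L})
            current = candidate
        history[t % L] = f(current)
    return current
```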
42
What are some dynamic Threshold Move Acceptance methods?
Deluge, Great Deluge & Flex Deluge
43
How does Great Deluge work for minimisation?
Choose an initial configuration, the 'rain speed' (decay rate) and the initial water level, both of which must be greater than 0. Then generate a stochastic perturbation of the old configuration and compute its objective value. If the objective value is smaller than the water level, set the old configuration to the new configuration and lower the water level by the decay rate. Stop if the search has looped for a long time without an increase in quality, or if too many iterations have occurred; otherwise continue looping.
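A minimal sketch of the minimisation variant described above (parameter names are illustrative):

```python
def great_deluge(initial, f, neighbour, water_level, rain_speed, iterations=10000):
    """Accept a candidate if its objective value is below the water level,
    then lower the level by the rain speed (decay rate)."""
    current = initial
    for _ in range(iterations):
        candidate = neighbour(current)    # stochastic perturbation of the old configuration
        if f(candidate) < water_level:
            current = candidate           # old config := new config
            water_level -= rain_speed     # the water level recedes
        # otherwise reject and keep the current configuration
    return current
```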
44
What is the difference between Great Deluge and Extended Great Deluge?
In Extended Great Deluge, feedback is received during the search and the decay rate is updated/reset accordingly whenever there has been no improvement for a long time
45
What are examples of stochastic move acceptance?
Static - Naive Acceptance
Dynamic - Simulated Annealing
Adaptive - Simulated Annealing with reheating
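A sketch of the Simulated Annealing acceptance rule for minimisation (a standard form, not necessarily the exact one from the course):

```python
import math
import random

def sa_accept(delta, temperature):
    """Always accept improving moves; accept a worsening move with probability exp(-delta/T)."""
    if delta <= 0:                        # delta = f(candidate) - f(current)
        return True
    return random.random() < math.exp(-delta / temperature)

# Dynamic: the temperature is lowered each iteration, e.g. T *= 0.995 (geometric cooling).
# Adaptive: the temperature is 'reheated' (increased) when the search stagnates.
```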
46
Strengths of Simulated Annealing
Easy to implement
Achieves good performance given sufficient running time (though it requires a good parameter setting for improved performance)
47
Definition of Parameter Tuning
Finding the best initial settings for a set of parameters before the search process starts
48
Definition for Parameter Control
Managing the settings of parameters during the search process e.g. dynamic and adaptive
49
What are some parameter tuning methods?
Use of an arbitrary setting
Trial and error with settings based on intuition
Use of theoretical studies
50
Definition of Latin Hyper-cube sampling
Decide the number of sample points (M) for N variables; for each sample point, remember in which row and column it was taken, so that in every dimension each of the M equally probable intervals contains exactly one sample
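A small sketch that generates M points in N dimensions over the unit hypercube (function and variable names are illustrative):

```python
import random

def latin_hypercube_sample(m, n):
    """In each dimension, split [0, 1) into m equally probable intervals and use
    each interval exactly once, so every 'row' and 'column' holds one sample."""
    samples = [[0.0] * n for _ in range(m)]
    for dim in range(n):
        intervals = list(range(m))
        random.shuffle(intervals)         # which interval each sample point falls in
        for point, k in enumerate(intervals):
            samples[point][dim] = (k + random.random()) / m
    return samples
```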
51
Definition of Orthogonal sampling
The sample space is divided into equally probable subspaces. Sample points are chosen simultaneously, ensuring they form an ensemble of Latin hypercube samples.
52
Components of a GA
Genetic representation
Initialisation
Fitness function
Selecting parents
Crossover
Mutation
Replacement strategy
Termination criteria
53
Explain Roulette Wheel Selection
The fitness level is used to associate a probability of selection with each individual. Individuals are then selected at random from the pool, each with a chance proportional to its selection probability (fitter individuals are more likely to be chosen).
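A minimal sketch assuming non-negative fitness values and maximisation (names are illustrative):

```python
import random

def roulette_wheel_select(population, fitnesses):
    """Select one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)       # spin the wheel
    running = 0.0
    for individual, fitness in zip(population, fitnesses):
        running += fitness
        if running >= pick:
            return individual
    return population[-1]                 # guard against floating-point rounding
```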
54
Explain Tournament Selection
Running a number of 'tournaments' among randomly chosen individuals of a given tournament size, and selecting the one with the best fitness from each tournament.
55
What is 1PX?
1 Point Crossover: choose a single crossover point and split both parent chromosomes into two parts at that point (1 split, hence 1-point crossover). Swap the parts after the crossover point between the two parents to create the offspring.
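A minimal sketch for two equal-length parent chromosomes stored as lists (names are illustrative):

```python
import random

def one_point_crossover(parent1, parent2):
    """Split both parents at the same random point and swap the tails."""
    point = random.randint(1, len(parent1) - 1)   # 1 split -> 2 parts per parent
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2
```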
56
Difference between memetic algorithm and genetic algorithm?
MAs improve GAs by embedding local search; the local search is applied after the mutation step.
57
When does a genetic algorithm perform better than a memetic algorithm?
When there is no need for extra exploitation, a GA would perform better, as it would be faster without the hill climbing that is embedded in MAs.
58
Definition of Benchmark function
Benchmark functions serve as a testbed for performance comparison of meta-/hyper-heuristic optimisation algorithms
59
What is PMX?
Partially Mapped Crossover: choose two random cut points to serve as swapping boundaries, defining a subsequence of the tour in each parent. Swap the segments between the two cut points, then use the mapping implied by the swapped segments to repair any duplicate elements outside of them.
60
What is OX?
Order Crossover: choose two cut points, splitting each parent into three parts, and copy the middle subsequence of the tour from the first parent into the child. Then read the other parent starting after the 2nd cut point; skip any elements that are already in the child, and continue reading (wrapping around) until all remaining positions are filled.
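A sketch of OX for two permutations of the same elements (names are illustrative); it produces one child, and the second child is obtained by swapping the parents' roles:

```python
import random

def order_crossover(parent1, parent2):
    """Copy the segment between two cut points from parent1, then fill the remaining
    positions in the order the elements appear in parent2, starting after the 2nd cut."""
    n = len(parent1)
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b] = parent1[a:b]                                 # keep parent1's middle segment
    used = set(child[a:b])
    fill = [parent2[(b + k) % n] for k in range(n)]           # read parent2 after the 2nd cut
    values = [v for v in fill if v not in used]               # skip duplicates
    positions = [(b + k) % n for k in range(n) if child[(b + k) % n] is None]
    for pos, val in zip(positions, values):
        child[pos] = val
    return child
```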
61
What is CX?
Cycle Crossover: place the two parents one above the other. Go to the 1st position: add the 1st parent's element to the 1st child, then see which element sits in the same position in the 2nd parent. Find that element in the 1st parent and add it to the 1st child in that position. Keep repeating this until you arrive back at the starting element, which closes the cycle. Fill the rest of the child with the 2nd parent's values. For the 2nd child, do the reverse, i.e. start from the 2nd parent's 1st element and go from there.
62
What is an MMA?
Multimeme Memetic Algorithms: Introduces memetic material for each individual
63
What is a meme?
A meme encodes how to apply an operator (which one to apply, maximum number of iterations, acceptance criteria), as well as when, where, and how frequently to apply it, i.e. a set of rules and procedures to follow. Memes represent instructions for self-improvement - they specify a set of rules, programs, heuristics, strategies, behaviours etc. to follow/use
64
What is innovation rate?
The probability of mutating the memes; mutation randomly sets the meme option to one of the other options.
IR = 0 - no innovation: if a meme option is not present in the initial generation, it never will be
IR = 1 - all the different strategies implied by the M available memes might be used equally
65
Definition of Hyper-heuristic
A hyper-heuristic is a search method or learning mechanism for selecting or generating heuristics to solve computationally difficult problems
66
What different types of heuristic selection are there?
Greedy Selection - take the heuristic that produced the best objective value, and use that one
Reinforcement Learning - maintains a score for each heuristic; if a heuristic produces a better result, its score is increased, otherwise it is decreased. Typically picks the highest-scoring heuristic for use
Choice Function - maintains a record of performance for each heuristic, and chooses based on this information
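A minimal sketch of the reinforcement-learning-style selection for minimisation, assuming heuristics is a list of functions that each return a modified copy of the solution (all names are illustrative):

```python
import random

def rl_heuristic_selection(heuristics, f, solution, iterations=1000):
    """Keep a score per low-level heuristic: reward improvement, penalise worsening,
    and pick (one of) the highest-scoring heuristics each iteration."""
    scores = {h: 0 for h in heuristics}
    for _ in range(iterations):
        top = max(scores.values())
        chosen = random.choice([h for h in heuristics if scores[h] == top])
        candidate = chosen(solution)
        if f(candidate) < f(solution):
            scores[chosen] += 1           # the heuristic produced a better result
            solution = candidate          # improving moves only (simple acceptance)
        else:
            scores[chosen] -= 1           # no improvement: decrease the score
    return solution
```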
67
What is a graph-based heuristic?
An example is the largest degree heuristic - order the vertices from the largest degree to the smallest, and process them in that order.
68
Definition of Genetic Programming
A method for automatically creating a working computer program from a high-level statement of the problem. It iteratively transforms a population of computer programs into a new generation of programs via an evolutionary process.
69
Difference between GPs and EA?
GP contains the same algorithmic components, but also includes:
Random generation of the initial population of possible solutions (programs)
Genetic crossover of two promising solutions to create new possible solutions
Mutation of promising solutions to create new possible solutions
70
What is a bin-packing heuristic?
Best-fit - place each item into the feasible bin that leaves the least remaining space, i.e. the fullest bin that can still accommodate the item.
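A minimal sketch of the best-fit heuristic for one-dimensional bin packing (names and the example are illustrative):

```python
def best_fit(items, capacity):
    """Put each item into the feasible bin with the least remaining space,
    opening a new bin only when no existing bin can take the item."""
    bins = []                             # remaining capacity of each open bin
    for item in items:
        feasible = [(space, i) for i, space in enumerate(bins) if space >= item]
        if feasible:
            _, i = min(feasible)          # the tightest bin that still fits the item
            bins[i] -= item
        else:
            bins.append(capacity - item)  # open a new bin
    return len(bins)

# best_fit([4, 8, 1, 4, 2, 1], capacity=10) -> 2 bins
```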