Quizzes Flashcards

1
Q

A mathematical function always has a global minimum

A

No. Counterexample: f(x) = x has no global minimum.

2
Q

A local optimum only occurs when f'(x) = 0

A

No. It can also occur where f'(x) is not defined, e.g. f(x) = |x| at x = 0.

3
Q

What is a feasible set?

A

The set of points that satisfy the constraints of the optimization problem.

4
Q

Which of the following properties does the Hessian matrix H(f) have?
A. Symmetric
B. Diag(H) != 0
C. No elements are zero

A

A. Symmetric (provided the second partial derivatives are continuous, by Schwarz's theorem).

5
Q

Consider a 2x2 matrix of 1s. Is it positive semi-definite?

A

Yes, since all eigenvalues (0 and 2) are >= 0.
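A quick numerical check (a minimal sketch, assuming numpy is available):

```python
import numpy as np

# The 2x2 matrix of all ones
A = np.ones((2, 2))

# eigvalsh returns the eigenvalues of a symmetric matrix in ascending order
print(np.linalg.eigvalsh(A))  # approximately [0, 2]; both >= 0, so A is positive semi-definite
```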

6
Q

Once the search direction has been found in an iterative optimization method, the problem of finding the step size is one-dimensional.

A

Yes. The step size is a scalar, so once the direction is fixed only one variable remains.
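To illustrate, a minimal backtracking line-search sketch; the quadratic test function and the steepest-descent direction are only illustrative choices:

```python
import numpy as np

def backtracking_line_search(f, grad, x, d, alpha=1.0, beta=0.5, c=1e-4):
    """Shrink the scalar step size alpha until the Armijo sufficient-decrease condition holds."""
    fx, gx = f(x), grad(x)
    while f(x + alpha * d) > fx + c * alpha * gx.dot(d):
        alpha *= beta
    return alpha

# Example: f(x) = ||x||^2 with the steepest-descent direction d = -grad f(x)
f = lambda x: x.dot(x)
grad = lambda x: 2 * x
x = np.array([3.0, 4.0])
d = -grad(x)
print(backtracking_line_search(f, grad, x, d))  # a single scalar
```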

7
Q

The Newton-Raphson method always finds a minimum.

A

No. It finds a stationary point, i.e. a point where f'(x) = 0. The second-order derivative test (if the second derivative exists) can settle whether the point is a maximum, minimum, or saddle point.
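A minimal 1-D sketch of the iteration x <- x - f'(x)/f''(x); the example f(x) = -x^2 is chosen to show that the method happily converges to a maximum:

```python
def newton_raphson(df, d2f, x, iters=20):
    """Iterate towards a point where df(x) = 0 (a stationary point, not necessarily a minimum)."""
    for _ in range(iters):
        x = x - df(x) / d2f(x)
    return x

# f(x) = -x^2: stationary point at x = 0, which is a MAXIMUM
df = lambda x: -2 * x    # f'(x)
d2f = lambda x: -2.0     # f''(x) < 0, so the second-derivative test says maximum
print(newton_raphson(df, d2f, x=1.0))  # 0.0
```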

8
Q

An optimization problem is convex if f(x) is convex

A

No. It also requires the set of feasible solutions to be convex.

9
Q

At a local minimum of f(x) subject to h(x) = 0, which of the following conditions holds?
A. grad f(x) and grad h(x) are parallel
B. grad f(x) and grad h(x) are orthogonal
C. grad h(x) = 0

A

A. grad f(x) and grad h(x) are parallel (see Lagrange multipliers).
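In symbols, a sketch of the standard Lagrange condition at a constrained local minimum x*:

```latex
% Stationarity of the Lagrangian L(x, \lambda) = f(x) - \lambda h(x):
\nabla f(x^{*}) = \lambda \, \nabla h(x^{*}), \qquad h(x^{*}) = 0
% i.e. the two gradients are parallel (assuming \nabla h(x^{*}) \neq 0).
```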

10
Q

If two individuals are to have offspring, they must be of the same species

A

No. Counterexamples: mules (horse x donkey) and ligers (lion x tiger).

11
Q

Humans have more chromosomes than other species

A

No. Humans have 46, while carp have 100+.

12
Q

Why must codons consist of at least three letters?

A

Because there are 20 amino acids and four letters: 4^1 = 4 and 4^2 = 16 give too few combinations, while 4^3 = 64 is enough to encode all 20 different amino acids.

13
Q

Because of random initialization, stochastic algorithms converge at a slower rate.

A

False

14
Q

In tournament selection, the number of competing individuals is always two

A

No. The tournament size can also be larger than two.
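A minimal sketch of tournament selection with an arbitrary tournament size k (the population and fitness values are placeholders):

```python
import random

def tournament_select(population, fitness, k=3):
    """Pick k random competitors and return the fittest one; k does not have to be 2."""
    competitors = random.sample(range(len(population)), k)
    best = max(competitors, key=lambda i: fitness[i])
    return population[best]

population = ["a", "b", "c", "d", "e"]
fitness = [0.1, 0.9, 0.4, 0.7, 0.2]
print(tournament_select(population, fitness, k=3))
```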

15
Q

After selection of two individuals, crossover is always performed.

A

No. This step depends on the crossover probability.
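A sketch of how the crossover probability gates the operation (single-point crossover on list chromosomes; the names and the p_c value are illustrative):

```python
import random

def maybe_crossover(parent1, parent2, p_c=0.8):
    """With probability p_c perform single-point crossover, otherwise copy the parents."""
    if random.random() < p_c:
        point = random.randint(1, len(parent1) - 1)
        return (parent1[:point] + parent2[point:],
                parent2[:point] + parent1[point:])
    return parent1[:], parent2[:]

print(maybe_crossover([0, 0, 0, 0], [1, 1, 1, 1]))
```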

16
Q

How many mutations can possibly occur given m genes and mutation probability p_m?

A

At most m mutations, since each of the m genes can mutate (independently, with probability p_m).
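A sketch of standard bit-flip mutation: each of the m genes mutates independently with probability p_m, so anywhere from 0 to m genes can change (the chromosome is illustrative):

```python
import random

def mutate(chromosome, p_m):
    """Flip each gene independently with probability p_m."""
    return [1 - g if random.random() < p_m else g for g in chromosome]

chromosome = [0, 1, 1, 0, 1, 0, 0, 1]            # m = 8 genes
print(mutate(chromosome, p_m=1 / len(chromosome)))
```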

17
Q

In connection with the schema theorem and GAs, what is a building block?

A

A building block is a schema (a fixed pattern of genes) that has (see the example below):
A. A short defining length, i.e. the distance between the first and last non-wildcard symbols.
B. A low order, i.e. a small number of non-wildcard symbols.
C. High fitness.
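For example, with '*' as the wildcard symbol, a sketch of the two schema measures (the schema string is made up for illustration):

```python
def defining_length(schema, wildcard="*"):
    """Distance between the first and last non-wildcard positions."""
    fixed = [i for i, s in enumerate(schema) if s != wildcard]
    return fixed[-1] - fixed[0]

def order(schema, wildcard="*"):
    """Number of non-wildcard positions."""
    return sum(1 for s in schema if s != wildcard)

schema = "1***0***"
print(defining_length(schema), order(schema))  # 4 and 2: short defining length, low order
```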

18
Q

f(j) = j(m-j) is a function of unitation.

A

Yes. A function of unitation is one whose value depends only on the number of ones j in the chromosome. f(j) = j(m - j) depends only on j, since m (the chromosome length) is a constant.
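A sketch showing that f(j) = j(m - j) is evaluated purely from the number of ones j (the chromosome is illustrative):

```python
def unitation(chromosome):
    """Number of ones in the chromosome."""
    return sum(chromosome)

def f(chromosome, m):
    j = unitation(chromosome)   # the only thing f looks at
    return j * (m - j)          # m (the chromosome length) is a constant

chromosome = [1, 0, 1, 1, 0, 0, 1, 0]      # m = 8, j = 4
print(f(chromosome, m=len(chromosome)))    # 4 * (8 - 4) = 16
```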

19
Q

A good option to avoid premature convergence is to rerun the GA a couple of times.

A

Yes. This is comparable to the island model, where crossover only happens between individuals on the same island.

20
Q

Why use two-point crossover in LGP?

A

Don't know.

21
Q

In LGP, you have a limited number of constants that cannot be changed (TRUE/FALSE)

A

TRUE. Constants are set once and only once. New constants can be formed by using the others as building blocks.

22
Q

In Interactive Evolutionary Computation, as many pictures as possible should be presented to the evaluator.

A

FALSE. This will cause user fatigue.

23
Q

Biological neural networks are able to carry out complex tasks because neurons are fast.

A

FALSE. Neurons have a firing rate of at most about 1 kHz, which is slow compared to a modern computer's ~3 GHz clock. It is the combination of many computational units working in parallel that makes the animal brain so good at complex tasks.

24
Q

Gene regulation plays a role in long-term memory learning tasks.

A

TRUE. Gene regulation causes neurons to permanently change state.

25
Q

A feed-forward neural network requires backpropagation to train.

A

FALSE. Backpropagation is the standard training method, but a feed-forward network can also be trained in other ways, e.g. with an evolutionary algorithm.

26
Q

When training a neural network, one should do the training for as long as possible.

A

FALSE. A good rule of thumb (early stopping) is to train until the validation-set error starts increasing after having decreased; see the sketch below.
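A minimal sketch of the early-stopping rule on a pre-computed validation-error curve (the error values are made up for illustration):

```python
def early_stopping_epoch(validation_errors, patience=3):
    """Return the epoch to stop at: once the validation error has not improved
    for `patience` consecutive epochs, keep the best epoch seen so far."""
    best, best_epoch, waited = float("inf"), 0, 0
    for epoch, err in enumerate(validation_errors):
        if err < best:
            best, best_epoch, waited = err, epoch, 0
        else:
            waited += 1
            if waited >= patience:
                break
    return best_epoch

errors = [0.9, 0.6, 0.4, 0.3, 0.28, 0.31, 0.35, 0.40, 0.50]
print(early_stopping_epoch(errors))  # 4 -- the epoch with the lowest validation error
```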

27
Q

Ants have skilled leaders and are capable of launching nuclear missiles.

A

FALSE. Ants' apparently smart behaviour is an emergent property of communication using pheromones.

28
Q

What is stigmergy?
A. A mechanism for long-range communication using sound.
B. A mechanism for indirect communication by means of (local) modification of the environment.
C. A mechanism for long-range communication using vision.

A

B. A mechanism for indirect communication by means of local modification of the environment.