Multi-Layer Neural Network Flashcards

1
Q

What is the issue with a perceptron?

A

It cannot separate anything that is not linearly separable

2
Q

What is a multi-layer perceptron?

A

> Contains multiple neurons

> Contains multiple layers

  • Input layer
  • Hidden layer
  • Output layer
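The three-layer structure above can be sketched as a forward pass; this is an illustrative example, not a specific implementation from the deck (layer sizes, weights, and the sigmoid activation are assumptions):

```python
import numpy as np

def sigmoid(x, beta=1.0):
    # Logistic activation, as defined later in this deck
    return 1.0 / (1.0 + np.exp(-beta * x))

def mlp_forward(x, W_hidden, W_output):
    # Input layer -> hidden layer -> output layer
    h = sigmoid(W_hidden @ x)    # hidden layer activations
    y = sigmoid(W_output @ h)    # output layer activations
    return y

# Illustrative shapes: 2 inputs, 3 hidden neurons, 1 output
rng = np.random.default_rng(0)
W_hidden = rng.normal(size=(3, 2))
W_output = rng.normal(size=(1, 3))
print(mlp_forward(np.array([0.5, -0.2]), W_hidden, W_output))
```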
3
Q

What is the simplest example of something that is not linearly separable?

A

XOR gate
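A hidden layer makes XOR computable; a minimal sketch with hand-picked weights (the OR/NAND/AND decomposition shown here is one standard choice, not specified in the deck):

```python
def step(z):
    # Heaviside step, as used by the classic perceptron
    return 1 if z > 0 else 0

def xor(a, b):
    # Hand-picked weights: hidden unit 1 computes OR, hidden unit 2
    # computes NAND; the output unit takes their AND, giving XOR
    h1 = step(a + b - 0.5)       # OR
    h2 = step(-a - b + 1.5)      # NAND
    return step(h1 + h2 - 1.5)   # AND

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))
```

No single-layer perceptron can reproduce this truth table, since no straight line separates {(0,1), (1,0)} from {(0,0), (1,1)}.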

4
Q

Theoretically, how many layers are required for a universal approximator?

A

2 layers

5
Q

What is the issue with just 2 layers?

A

In theory, 2 layers are enough to approximate any function, but in practice the hidden layer would need to be so large that it is computationally infeasible. Instead, we add multiple smaller hidden layers.

6
Q

What is the equation for mean squared error?

A

E = 1/2 ∑ₙ (yₙ - tₙ)^2

where yₙ is the network output and tₙ is the target for example n
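A direct translation of this equation (variable names are illustrative):

```python
import numpy as np

def mean_squared_error(y, t):
    # E = 1/2 * sum_n (y_n - t_n)^2; the 1/2 cancels the factor of 2
    # that appears when differentiating the squared term
    return 0.5 * np.sum((np.asarray(y) - np.asarray(t)) ** 2)

print(mean_squared_error([0.9, 0.2], [1.0, 0.0]))
```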

7
Q

Why do we use a sigmoid function?

A

For multiple layers we cannot use step functions because they are not differentiable. A sigmoid function approximates a step function and is differentiable.

8
Q

What is the equation for the sigmoid?

A

f(x) = 1 / (1 + e^(-βx))

β controls the steepness of the sigmoid function
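The equation in code, with β exposed as a parameter (the default β = 1 is an assumption for illustration):

```python
import math

def sigmoid(x, beta=1.0):
    # f(x) = 1 / (1 + e^(-beta * x)); larger beta -> steeper transition
    return 1.0 / (1.0 + math.exp(-beta * x))

print(sigmoid(0.0))            # midpoint: exactly 0.5
print(sigmoid(2.0, beta=5.0))  # steep beta pushes the output near 1
```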

9
Q

What is the derivative of the sigmoid function?

A

σ'(x) = βe^(-βx) / (1 + e^(-βx))^2
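A quick sketch checking this formula against the equivalent form βσ(1 - σ), which is the shape usually used in backpropagation (the equivalence is a standard identity, not stated in the deck):

```python
import math

def sigmoid(x, beta=1.0):
    return 1.0 / (1.0 + math.exp(-beta * x))

def sigmoid_derivative(x, beta=1.0):
    # sigma'(x) = beta * e^(-beta*x) / (1 + e^(-beta*x))^2
    e = math.exp(-beta * x)
    return beta * e / (1.0 + e) ** 2

# Same value via beta * sigma * (1 - sigma)
x, beta = 0.7, 2.0
s = sigmoid(x, beta)
print(sigmoid_derivative(x, beta), beta * s * (1 - s))
```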

10
Q

What is the sigmoid activation function?

A

y(w^T x) = 1 / (1 + e^(-βw^T x))
