Intro to Neural Networks Flashcards

1
Q

What is backpropagation? What is its goal?

A

Backpropagation is a common method for training a neural network.

Our goal with backpropagation is to update each of the weights in the network so that they cause the actual output to be closer to the target output, thereby minimizing the error for each output neuron and the network as a whole.

2
Q

What is the equation for calculating total error?

What is it called?

A

Squared error function
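
The equation itself appears to have been an image that did not survive extraction. The standard squared error function that common backpropagation tutorials use (a sketch, not necessarily the exact form from the original card) is:

```latex
E_{total} = \sum \tfrac{1}{2}\,(\text{target} - \text{output})^2
```

The sum runs over all output neurons; the factor of 1/2 is there so it cancels when differentiating.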

3
Q

How do you successfully perform a backward pass?

A
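
The answer to this card appears to be missing (likely an image). As a hedged sketch of one backward pass, here is a tiny 2-2-2 network with logistic activations and squared error; all variable names, the example numbers, and the omission of bias terms are my own illustration, not the original card's:

```python
import math

def sigmoid(x):
    # Logistic activation used throughout the deck.
    return 1.0 / (1.0 + math.exp(-x))

def backward_pass(inputs, targets, w_hidden, w_output, lr=0.5):
    """One forward + backward pass; returns the updated weight matrices.
    Biases are omitted for brevity."""
    # Forward pass: hidden layer, then output layer.
    net_h = [sum(w * i for w, i in zip(row, inputs)) for row in w_hidden]
    out_h = [sigmoid(n) for n in net_h]
    net_o = [sum(w * h for w, h in zip(row, out_h)) for row in w_output]
    out_o = [sigmoid(n) for n in net_o]

    # Output deltas: dE/dout * dout/dnet = (out - target) * out * (1 - out).
    delta_o = [(o - t) * o * (1 - o) for o, t in zip(out_o, targets)]

    # Hidden deltas: propagate each output delta back through the
    # ORIGINAL output weights, then multiply by the local derivative.
    delta_h = [
        sum(d * w_output[k][j] for k, d in enumerate(delta_o)) * h * (1 - h)
        for j, h in enumerate(out_h)
    ]

    # Apply the updates only after all gradients have been computed.
    new_w_output = [
        [w - lr * delta_o[k] * out_h[j] for j, w in enumerate(row)]
        for k, row in enumerate(w_output)
    ]
    new_w_hidden = [
        [w - lr * delta_h[j] * inputs[i] for i, w in enumerate(row)]
        for j, row in enumerate(w_hidden)
    ]
    return new_w_hidden, new_w_output
```

Note that `delta_h` is computed from the original `w_output`, before any update is applied; that ordering is the key point of a correct backward pass.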
4
Q

What is the power rule and what do you need to remember about it?

A

When the inner function is more complicated, you have to remember to take the derivative of the inner function as well (the chain rule). In the simple case it isn’t noticed because the derivative of x is just 1, but the calculation was still performed regardless.
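
The rule itself is not reproduced on the card, so as a sketch, the standard power rule and its chain-rule form are:

```latex
\frac{d}{dx}\,x^n = n\,x^{n-1}
\qquad
\frac{d}{dx}\,[g(x)]^n = n\,[g(x)]^{n-1}\,g'(x)
```

For example, differentiating $(x)^2$ gives $2x \cdot 1$: the inner derivative is still multiplied in, it just happens to be 1.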

5
Q

What is the derivative of the logistic function?

A

Note the final form of f(x)(1-f(x))
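
The derivation appears to have been an image; the standard result for the logistic function, ending in the form the card highlights, is:

```latex
f(x) = \frac{1}{1 + e^{-x}}
\qquad
f'(x) = \frac{e^{-x}}{(1 + e^{-x})^2} = f(x)\,\bigl(1 - f(x)\bigr)
```

The $f(x)(1-f(x))$ form is why the deltas in backpropagation contain a factor of $\text{out}\,(1 - \text{out})$.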

6
Q

What are the parts of the node you need to remember when taking the derivative for new weight calculations?

A
7
Q

What are the steps to calculating a new weight with backpropagation?

What do you need to remember with this calculation?

A

This is just the calculation for one weight in one layer, so a few more calculations need to be performed.

And you don’t use the newly updated weights when calculating the hidden-layer updates; the original weights are used throughout the backward pass, and every weight is replaced only after all the updates have been computed.
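
The calculation itself is not shown on the card; as a sketch, the chain-rule decomposition and update step that standard backpropagation tutorials use (with $\eta$ as the learning rate) are:

```latex
\frac{\partial E_{total}}{\partial w}
  = \frac{\partial E_{total}}{\partial \text{out}}
    \cdot \frac{\partial \text{out}}{\partial \text{net}}
    \cdot \frac{\partial \text{net}}{\partial w}
\qquad
w^{+} = w - \eta \, \frac{\partial E_{total}}{\partial w}
```

Each factor corresponds to one part of the node: the error with respect to the output, the activation’s derivative, and the input feeding that weight.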
