14-backpropagation Flashcards

1
Q

Why can’t we use the perceptron rule for neural networks?

A

The perceptron rule requires knowing the true target output for every unit it updates, but in a multilayer network we only know the true targets at the output layer; hidden units have no known targets. As a result, we use backpropagation instead.

2
Q

What is backpropagation?

A

Backpropagation provides a way to update the weights in an MLP. It works by propagating the error from the output nodes backwards through the hidden-layer nodes, so that every weight receives an error signal it can be updated with, as in the sketch below.
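A minimal NumPy sketch of this idea follows. The 2-4-1 layer sizes, sigmoid units, squared-error loss, learning rate, and toy XOR data are illustrative assumptions, not taken from the card; the point is that the error is computed at the output layer and then propagated back through the output weights to give the hidden layer an error signal.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # true targets (output layer only)

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)    # input -> hidden
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)    # hidden -> output
lr = 0.5

for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: the error is known directly at the output layer...
    delta_out = (out - y) * out * (1 - out)
    # ...and is propagated back through W2 to assign an error signal to each
    # hidden unit, even though hidden units have no true target values.
    delta_hidden = (delta_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates
    W2 -= lr * h.T @ delta_out
    b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_hidden
    b1 -= lr * delta_hidden.sum(axis=0)

print(np.round(out, 2))  # outputs should move toward [[0], [1], [1], [0]]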
