wk4 Flashcards
(4 cards)
1
Q
What does w^L_{j,k} denote?
A
the weight of the connection from neuron k in layer L-1 to neuron j in layer L
2
Q
What is z^L_j in a neural network?
A
the weighted input to unit j in layer L: the weights into unit j multiplied by the previous layer's activations, summed, plus the bias. That is, z^L_j = sum_k w^L_{j,k} a^{L-1}_k + b^L_j
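A minimal NumPy sketch of this formula, with illustrative weight and bias values chosen here (not from the cards): the matrix product W @ a_prev computes sum_k w_{j,k} a_k for every j at once.

```python
import numpy as np

# W[j, k] is the weight from neuron k in layer L-1 to neuron j in layer L
W = np.array([[0.5, -0.2],
              [0.1,  0.4]])    # illustrative 2x2 weight matrix
a_prev = np.array([1.0, 2.0])  # previous-layer activations a^{L-1}
b = np.array([0.1, -0.1])      # biases b^L

# z^L_j = sum_k W[j, k] * a_prev[k] + b[j], for all j at once
z = W @ a_prev + b
print(z)  # [0.2 0.8]
```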
3
Q
What does m often mean in the context of a neural network?
A
the number of neurons in a layer (the layer width)
4
Q
In a fully connected multi-layer perceptron with layer width m and L layers, how many parameters are there to train?
A
(m^2 + m)(L-1): each of the L-1 layer-to-layer connections contributes an m x m weight matrix (m^2 weights) plus m biases
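A short sketch checking this count by summing per-layer shapes, assuming (as in the card) every one of the L layers has width m; the helper name mlp_param_count is made up for illustration.

```python
def mlp_param_count(m, L):
    """Parameters in a fully connected MLP with L layers of width m:
    L-1 weight matrices of shape (m, m) and L-1 bias vectors of length m."""
    weights = (L - 1) * m * m
    biases = (L - 1) * m
    return weights + biases

# Agrees with the closed form (m^2 + m)(L - 1)
print(mlp_param_count(3, 4))            # 36
print((3**2 + 3) * (4 - 1))             # 36
```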