Test 5 Flashcards

(8 cards)

1
Q

5-Step Training Loop

A
  1. Generate prediction - y_hat = model(x)
  2. Compute loss - loss(y_hat, y)
  3. Zero the parameter gradients - optimizer.zero_grad()
  4. Compute gradients - loss.backward()
  5. Update weights - optimizer.step()
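
A minimal PyTorch sketch of this loop; the toy model, criterion, and data below are illustrative, not part of the card:

  import torch
  import torch.nn as nn

  # Illustrative setup: a toy linear model with synthetic data
  model = nn.Linear(10, 1)
  criterion = nn.MSELoss()
  optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
  x, y = torch.randn(32, 10), torch.randn(32, 1)

  for epoch in range(100):
      y_hat = model(x)            # 1. generate prediction
      loss = criterion(y_hat, y)  # 2. compute loss
      optimizer.zero_grad()       # 3. zero the parameter gradients
      loss.backward()             # 4. compute gradients
      optimizer.step()            # 5. update weights
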
2
Q

What part of the feature vector's projection onto the weight space is kept?

A

Only the POSITIVE part of the projection of z onto the w-space; the ReLU zeroes out the negative components
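
In symbols (a standard reading of this card, with z the feature vector, W the weight matrix, and ReLU the assumed nonlinearity):

  a = ReLU(Wz) = max(0, Wz)

Each unit keeps only the positive part of its projection w_i · z onto the corresponding weight vector.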

3
Q

What are the 7 steps that a NN layer performs?

A

Rotation, Reflection, Scaling, Rotation, Reflection, Bias, Positive Part
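
One common way to read these steps (an interpretation assumed here, via the singular value decomposition of the weight matrix, not stated on the card): with W = U S V^T, V^T applies a rotation (and possible reflection), S a scaling, U another rotation (and possible reflection), then the bias shifts and the ReLU keeps the positive part. A minimal NumPy sketch:

  import numpy as np

  rng = np.random.default_rng(0)
  W = rng.standard_normal((4, 3))  # layer weight matrix
  b = rng.standard_normal(4)       # bias
  z = rng.standard_normal(3)       # input feature vector

  U, S, Vt = np.linalg.svd(W, full_matrices=False)

  h = Vt @ z            # steps 1-2: rotation and possible reflection
  h = S * h             # step 3: scaling by the singular values
  h = U @ h             # steps 4-5: rotation and possible reflection
  h = h + b             # step 6: bias shift
  a = np.maximum(h, 0)  # step 7: positive part (ReLU)

  # Matches the layer computed directly
  assert np.allclose(a, np.maximum(W @ z + b, 0))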

4
Q

1989

A

Yann LeCun introduces CNNs, which perform dimensional expansion without blowing up the number of weights, allowing us to compute the feature vector from an image

5
Q

Three Properties of a Wave/Signal

A

Stationarity - The fluctuations of the signal do not change much across it; similar patterns recur throughout

Locality - Things that are close in domain-space are more likely to be correlated

Compositionality - The overall information is transmitted through the integration of its parts

6
Q

How is stationarity represented in CNNs?

A

Parameter Sharing
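
A small PyTorch illustration of the idea (the layer sizes are arbitrary, chosen only for the comparison): a convolution reuses one small kernel at every position of the signal, while a fully connected layer would need separate weights per position.

  import torch.nn as nn

  conv = nn.Conv1d(in_channels=1, out_channels=4, kernel_size=3)
  fc = nn.Linear(100, 400)  # dense alternative over a length-100 signal

  n_conv = sum(p.numel() for p in conv.parameters())  # 4*1*3 + 4 = 16
  n_fc = sum(p.numel() for p in fc.parameters())      # 100*400 + 400 = 40400
  print(n_conv, n_fc)       # the shared kernel needs far fewer parameters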

7
Q

1997

A

Sepp Hochreiter and Jürgen Schmidhuber introduce the long short-term memory (LSTM) cell, a dynamical system (an RNN) that is able to model long-term dependencies

8
Q

If the same (shared) weights receive a partial derivative at each time step, how do we resolve this in an RNN?

A

We sum the partials
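
In symbols, the total gradient is the sum of the per-time-step contributions, ∂L/∂W = Σ_t ∂L_t/∂W, because the same W is reused at every step. A tiny PyTorch check with a toy scalar "RNN" (the numbers are illustrative, not from the card):

  import torch

  w = torch.tensor(2.0, requires_grad=True)  # shared "recurrent weight"
  h = torch.tensor(1.0)                      # initial state
  for t in range(3):                         # unroll 3 time steps
      h = w * h                              # same w reused each step
  h.backward()
  # Each of the 3 uses of w contributes w**2 = 4; the partials sum to 12
  print(w.grad)                              # tensor(12.)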
