Backpropagation Flashcards

(15 cards)

1
Q

Backpropagation

A

During training, the process of propagating the error backwards from the output layer toward the input layer, computing gradients and updating weights along the way

2
Q

Why must you start at the output layer when doing backpropagation?

A

Because the error (loss) is first computed at the output layer; the gradients for earlier layers depend on it through the chain rule

3
Q

Which part of training is feed-forward, and which is backpropagation?

A

Computing the error (loss) happens during the feed-forward prediction

Updating the weights and biases happens during backpropagation
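The two phases can be sketched with a single linear neuron (a minimal illustrative example, not part of the deck): the feed-forward pass computes the prediction and the loss, then backpropagation computes the gradients and updates the weight and bias.

```python
# One training step for a single linear neuron y_hat = w*x + b
# with squared-error loss (illustrative sketch).
def train_step(w, b, x, y, lr=0.1):
    # Feed-forward: compute the prediction and the error (loss).
    y_hat = w * x + b
    loss = (y_hat - y) ** 2

    # Backpropagation: the gradient flows backwards from the loss,
    # then the weight and bias are updated.
    grad = 2 * (y_hat - y)      # dLoss/dy_hat
    w -= lr * grad * x          # dLoss/dw = dLoss/dy_hat * x
    b -= lr * grad              # dLoss/db = dLoss/dy_hat
    return w, b, loss

w, b = 0.0, 0.0
for _ in range(50):
    w, b, loss = train_step(w, b, x=2.0, y=4.0)
```

After a few steps the loss shrinks toward zero and the neuron's prediction for x=2.0 approaches the target 4.0.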

4
Q

What is the goal of training a model?

A

Generalization: performing well on new, unseen data rather than only on the training data

5
Q

Why can’t you use the testing dataset as the training dataset?

A

The model may become familiar with the images it saw during training but be unsure when you provide a new image; testing on the same data would hide this

6
Q

How does testing vary from training?

A

There is no updating of weights based on the predictions

Training data must never be included in the test dataset
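The separation can be sketched as a simple hold-out split (an illustrative helper, not from the deck): every item ends up in exactly one of the two sets, so no training data leaks into the test set.

```python
import random

# A minimal hold-out split (illustrative names): shuffle once with a
# fixed seed, then carve off a test fraction disjoint from training.
def train_test_split(items, test_fraction=0.2, seed=0):
    items = list(items)
    random.Random(seed).shuffle(items)
    n_test = int(len(items) * test_fraction)
    return items[n_test:], items[:n_test]   # (train, test)

train, test = train_test_split(range(100))
```

Because the two slices never overlap, evaluating on `test` measures generalization rather than memorization.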

7
Q

Training data contamination and ChatGPT

A

-People speculate that the GPT model has been trained on the exact questions it is asked, so it gives back memorized answers

An experiment found that GPT solved 10/10 Codeforces problems from before 2021 but 0/10 recent ones
An indication that the older coding problems were part of its training data

The training cutoff date was the point after which the model could no longer solve new questions

8
Q

*How does the performance of models depend on the size of the dataset?

A

A large neural network excels when trained on a vast dataset

20 years ago datasets were very small and performance was poor

Dataset sizes started to increase with the internet

Increasing the data does not increase the model’s performance if the model itself is not scaled up – the performance will plateau

9
Q

What is the current state of data size and model performance?

A

Although we keep increasing the data fed to large neural networks, their performance has not plateaued

10
Q

What happens when you have a small amount of data?

A

When you have a small amount of data and a large neural network, the model will memorize the answers to the questions instead of generalizing (overfitting)
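The effect can be sketched with polynomial interpolation standing in for an over-parameterized model (the data below are made up for illustration): with as many parameters as data points, the training set is reproduced almost exactly, yet a new in-between point is predicted poorly.

```python
import numpy as np

# Polynomial interpolation as a stand-in for an over-parameterized model:
# 5 parameters for 5 noisy points that follow the trend y ≈ x.
x_train = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_train = x_train + np.array([0.5, -0.4, 0.3, -0.5, 0.4])  # noise around y = x

coeffs = np.polyfit(x_train, y_train, deg=4)   # as many parameters as points
train_pred = np.polyval(coeffs, x_train)       # memorizes the training data

y_new = np.polyval(coeffs, 3.5)                # true trend would be about 3.5
```

The fit passes through every training point (zero training error), but between the points it oscillates, so its prediction at a new input misses the underlying trend by more than the noise level.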

11
Q

What happens if you increase the network size and increase the dataset size?

A

The model will perform well

12
Q

What is the open question about dataset size?

A

Do we have a large enough dataset if we keep making the model larger and larger?

If we create so many parameters that we don’t have enough data, will the performance plateau?

If not, the model will simply memorize the answers
The number of parameters in models is only increasing

13
Q

Why is 2020 a critical period?

A

This is the time when we saw a huge increase in model size

People found that if you make the dataset larger, the model must also be large enough, with enough parameters

14
Q

With training being so expensive, how might small companies utilize large models?

A

They may use a pretrained model that can be downloaded (though it is not tailored to their specific tasks)

They may add another layer of neurons at the end and train only its weights

This is done by freezing all layers but the last

During backpropagation, the pretrained weights are not adjusted – only the newly added neuron layer is
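The freezing idea can be sketched numerically (all numbers below are made up for illustration): the pretrained layer produces features but receives no updates, while backpropagation adjusts only the newly added head.

```python
import numpy as np

# Transfer learning sketch: a frozen "pretrained" layer plus a new
# trainable head. Only the head is updated during backpropagation.
W_pre = np.array([[0.2, -0.1, 0.4],      # pretrained layer: frozen
                  [0.5,  0.3, -0.2],
                  [-0.3, 0.1,  0.6],
                  [0.1, -0.4,  0.2]])
w_head = np.zeros(3)                     # new final layer: trainable
b_head = 0.0

x = np.array([1.0, 0.5, -0.5, 2.0])     # one training input (made up)
y = 1.0                                  # its target
lr = 0.1

W_pre_before = W_pre.copy()
for _ in range(100):
    h = np.maximum(W_pre.T @ x, 0.0)    # frozen feed-forward features
    y_hat = w_head @ h + b_head         # head's prediction
    grad = 2 * (y_hat - y)              # dLoss/dy_hat for squared error
    # Backpropagation stops at the head: W_pre gets no update.
    w_head -= lr * grad * h
    b_head -= lr * grad
```

After training, the head fits the target while the pretrained weights are byte-for-byte unchanged — exactly what freezing means.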

15
Q

*How can you use few-shot prompts to improve performance?

A

Give the model an example question with its answer, then give it a new question and ask for the answer

A way of adapting your model that is less expensive than fine-tuning, since no weights are updated
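A few-shot prompt can be sketched as plain string construction (the format and examples below are made up, not tied to any specific model or API): worked examples with answers come first, then the new question the model should complete.

```python
# Building a few-shot prompt: example Q/A pairs followed by the new
# question, left open for the model to answer (illustrative format).
examples = [
    ("What is 2 + 3?", "5"),
    ("What is 7 + 4?", "11"),
]
question = "What is 6 + 9?"

prompt = "\n".join(f"Q: {q}\nA: {a}" for q, a in examples)
prompt += f"\nQ: {question}\nA:"
```

No weights are updated anywhere; the examples steer the model's behavior at inference time, which is why this is cheaper than fine-tuning.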
