IFN580 Week 9 Deep Learning Flashcards

(13 cards)

1
Q

Why is stacking non-linearities on non-linearities important in Deep Learning?

A

Because it allows the layers to learn more complex, higher-level functions.
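The key point is that stacking *linear* layers gains nothing: two linear maps compose into one linear map. A minimal pure-Python sketch (all weights here are made-up illustrative numbers):

```python
# Two 1-D "linear layers" with no non-linearity between them:
# layer1(x) = 2x + 1, layer2(x) = 3x - 4.
def layer1(x):
    return 2.0 * x + 1.0

def layer2(x):
    return 3.0 * x - 4.0

# Their composition is still linear: 3*(2x + 1) - 4 = 6x - 1,
# so stacking adds no expressive power without a non-linearity.
def stacked(x):
    return layer2(layer1(x))

def collapsed(x):
    return 6.0 * x - 1.0

# Inserting a non-linearity (ReLU) between the layers makes the
# composition piecewise linear - it can no longer be collapsed
# into a single linear layer.
def relu(x):
    return max(0.0, x)

def stacked_nonlinear(x):
    return layer2(relu(layer1(x)))
```

Stacking many such non-linear layers lets each layer build higher-level functions out of the previous layer's outputs.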

2
Q

The “deep” in “deep learning” refers to

A

the number of layers in the neural network

3
Q

Is feature engineering always necessary with deep learning models, and if not, why?

A

No; deep learning models are designed to learn relevant features automatically from the raw data

4
Q

How can overfitting be prevented in deep learning?

A

Regularisation (e.g. weight decay), dropout, and early stopping

5
Q

When designing a deep neural network, do all layers have to contain the same
number of parameters?

A

No, different layers can contain a different number of neurons and even be of
completely different types

6
Q

What is a loss function?

A

It measures how wrong the model’s predictions are; training adjusts the weights to minimise it
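A standard concrete example is mean squared error for regression, sketched in pure Python:

```python
def mse_loss(predictions, targets):
    """Mean squared error: the average of the squared differences
    between predictions and targets. 0 means a perfect fit; the
    more wrong the predictions, the larger the loss."""
    assert len(predictions) == len(targets)
    return sum((p - t) ** 2
               for p, t in zip(predictions, targets)) / len(predictions)
```

Other tasks use other losses (e.g. cross-entropy for classification), but they all play the same role: a single number the optimiser tries to drive down.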

7
Q

What is an autoencoder?

A

a type of neural network that compresses data into a lower-dimensional representation (the latent space) and then reconstructs it

8
Q

What is the latent space?

A

The compressed representation of the data; it captures the most important features needed to reconstruct the original input

9
Q

How does an autoencoder work?

A

The ENCODER takes the input and compresses it into the latent space;
the DECODER takes the compressed representation and reconstructs the original input
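The encoder/decoder pair can be sketched in pure Python with a deliberately toy compression scheme (real autoencoders *learn* these mappings as weights by minimising reconstruction error; averaging and repeating here are just stand-ins):

```python
def encode(x):
    """Toy encoder: compress a 4-value input into a 2-value latent
    code by averaging adjacent pairs (a lossy compression)."""
    return [(x[0] + x[1]) / 2.0, (x[2] + x[3]) / 2.0]

def decode(z):
    """Toy decoder: reconstruct 4 values from the 2-value latent
    code by repeating each latent value."""
    return [z[0], z[0], z[1], z[1]]
```

When neighbouring inputs are similar the reconstruction is close to the original, which mirrors the idea that the latent space keeps only the information needed to rebuild the input.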

10
Q

What is a Recurrent Neural Network (RNN)?

A

A neural network that processes data sequentially; each step’s output depends on the previous hidden state
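The recurrence can be sketched with a scalar hidden state in pure Python (the weights `w_x` and `w_h` are arbitrary illustrative values, not learned):

```python
import math

def rnn_step(x_t, h_prev, w_x=0.5, w_h=0.8, b=0.0):
    """One step of a minimal scalar RNN: the new hidden state mixes
    the current input with the previous hidden state through a
    tanh non-linearity."""
    return math.tanh(w_x * x_t + w_h * h_prev + b)

def run_rnn(sequence, h0=0.0):
    """Process a sequence one element at a time, carrying the
    hidden state forward between steps."""
    h = h0
    states = []
    for x_t in sequence:
        h = rnn_step(x_t, h)
        states.append(h)
    return states
```

Note that feeding the same input at two different steps gives different hidden states, because the state carries a memory of everything seen so far.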

11
Q

What is an LSTM (Long Short-Term Memory)?

A

A type of recurrent neural network designed to learn long-term dependencies. It also processes data sequentially, but with a more sophisticated gated memory.

12
Q

Why does an LSTM use activation functions?

A

It uses activation functions (sigmoid and tanh) as gates to control the flow of information through the cell
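The gating idea in isolation, sketched in pure Python (this shows only the sigmoid-as-valve mechanism, not a full LSTM cell):

```python
import math

def sigmoid(x):
    """Squashes any real value into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def gate(value, gate_input):
    """LSTM-style gating: the sigmoid output acts as a soft valve,
    multiplied onto the signal. Near 0 it blocks the information;
    near 1 it lets it pass almost unchanged."""
    return sigmoid(gate_input) * value
```

An LSTM cell applies several such gates (forget, input, output) so the network can *learn* what to remember, what to write, and what to emit at each step.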

13
Q

What is a Transformer model?

A

A type of neural network that uses self-attention mechanisms to process data in parallel
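Scaled dot-product self-attention, the core mechanism, can be sketched in pure Python. In a real Transformer the queries, keys, and values come from learned linear projections of the input; here they are passed in directly to keep the sketch minimal:

```python
import math

def softmax(xs):
    """Turn a list of scores into attention weights that sum to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def self_attention(queries, keys, values):
    """Scaled dot-product self-attention: every position attends to
    every position at once, so the whole sequence is processed in
    parallel rather than step by step as in an RNN."""
    d = len(keys[0])
    outputs = []
    for q in queries:
        scores = [dot(q, k) / math.sqrt(d) for k in keys]
        weights = softmax(scores)  # one weight per position
        outputs.append([
            sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))
        ])
    return outputs
```

Because every output position is computed independently from the full sequence, all positions can be computed at the same time, which is what makes Transformers so much faster to train than sequential RNNs.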
