Week 6 Flashcards

1
Q

Common uses of GANs

A

Produce new content (e.g. generating extra MNIST-style digits)
Text-to-image generation
Image-to-image translation
Increasing image resolution (super-resolution)
Predicting the next video frame

2
Q

Explicit model: tractable density

A
3
Q

Approximate density

A

Variational autoencoder (VAE)

4
Q

Training objective function for VAE

A

Reconstruction loss (e.g. L2 between the input and its reconstruction) plus a KL-divergence term pulling the encoder's latent distribution towards the prior; together these form the (negative) ELBO
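A minimal sketch of the per-example VAE objective, assuming a Gaussian encoder with mean `mu` and log-variance `log_var` (the names and values here are illustrative, not from the lecture):

```python
import numpy as np

def vae_loss(x, x_hat, mu, log_var):
    """Per-example VAE objective (negative ELBO): L2 reconstruction
    plus the closed-form KL divergence to a standard normal prior,
    assuming a Gaussian encoder q(z|x) = N(mu, diag(exp(log_var)))."""
    recon = np.sum((x - x_hat) ** 2)  # L2 reconstruction term
    kl = 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var)
    return recon + kl

x = np.array([0.5, -0.2])       # input
x_hat = np.array([0.4, -0.1])   # decoder reconstruction
mu = np.zeros(2)                # encoder mean
log_var = np.zeros(2)           # encoder log-variance
loss = vae_loss(x, x_hat, mu, log_var)
```

Note that with `mu = 0` and `log_var = 0` the KL term vanishes, so the loss reduces to pure L2.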

5
Q

NN structure for an autoencoder

A
6
Q

Show the VAE loss breakdown across the encoder and decoder

A
7
Q

Problem that motivates GANs, and how a GAN solves it

A

Problem: we want to sample from a complex, high-dimensional distribution (the training-data distribution), for which there is no direct way to sample.

Solution: sample from a simple distribution (e.g. random noise) and learn to transform those samples to the training distribution via a generator network.
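This solution can be sketched in a few lines. A fixed affine map stands in for the learned generator (in a real GAN this map is learned adversarially; the target mean and covariance below are made-up illustration values):

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. Sample from a simple distribution: standard normal noise.
z = rng.standard_normal((10_000, 2))

# 2. Transform the noise towards the (here hand-picked) target
#    distribution. A trained generator network would learn this map.
W = np.array([[1.0, 0.0],
              [0.8, 0.6]])
b = np.array([3.0, -1.0])
samples = z @ W.T + b  # approximately N(b, W W^T)
```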

8
Q

GAN structure

A
9
Q

Data distributions for GANs

A
10
Q

Notation for GAN

A
11
Q

Loss function for GAN

A
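A common textbook formulation of the GAN loss (the lecture's exact notation may differ) is the two-player minimax value function from Goodfellow et al.:

```latex
\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{\text{data}}(x)}\big[\log D(x)\big]
  + \mathbb{E}_{z \sim p_z(z)}\big[\log\big(1 - D(G(z))\big)\big]
```

The discriminator D maximises V; the generator G minimises it.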
12
Q

Formulate training for GAN

A
13
Q

Minibatch SGD for GAN

A
14
Q

GAN problem of non-convergence

A
15
Q

GAN problem of diminished gradient

A
16
Q

Other problems for GANs

A

Mode collapse

Imbalance between the generator and discriminator (e.g. the discriminator overfits or works too well)

Highly sensitive to hyperparameters

17
Q

Improving GANs with network design

18
Q

Improving GANs with cost functions

19
Q

Improving GANs with optimisation (experience replay)

20
Q

Improving GANs with optimisation - training with labels

21
Q

Improving GANs with optimisation - adding noise

22
Q

Improving GANs with optimisation - unrolling the GAN

23
Q

Predict the discriminator's moves by unrolling