Tut 5 Flashcards

1
Q

LReLU

A

f(x) = x if x >= 0; ax if x < 0
Where a is a small predetermined constant (e.g. 0.01)

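A minimal NumPy sketch of LReLU (the slope 0.01 is an assumed example value, not something the card fixes):

import numpy as np

def lrelu(x, a=0.01):
    # x where x >= 0, a*x where x < 0; a is fixed, not learned
    return np.where(x >= 0, x, a * x)

print(lrelu(np.array([-2.0, -0.5, 0.0, 3.0])))  # [-0.02 -0.005 0. 3.]
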
2
Q

PReLU

A

f(x) = x if x >= 0; ax if x < 0
Where a is a parameter learned during training

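A hedged PyTorch sketch showing that the PReLU slope is trainable (assumes torch is available; nn.PReLU initialises a to 0.25):

import torch
import torch.nn as nn

prelu = nn.PReLU()          # one learnable slope a, initialised to 0.25
x = torch.tensor([-2.0, 0.0, 3.0])
print(prelu(x))             # tensor([-0.5000, 0.0000, 3.0000], ...)
print(prelu.weight)         # a is a Parameter, so the optimiser updates it
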
3
Q

Def Feature map

A

The output produced by a single mask in a convolutional layer of a CNN (taken either before or after the activation function and/or pooling).

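A small NumPy sketch of one mask producing one feature map ('valid' cross-correlation with stride 1 is an assumed convention):

import numpy as np

def feature_map(image, mask):
    # slide one mask over the image; each output value is the
    # weighted sum of the patch under the mask
    mh, mw = mask.shape
    h, w = image.shape[0] - mh + 1, image.shape[1] - mw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i+mh, j:j+mw] * mask)
    return out

print(feature_map(np.ones((5, 5)), np.ones((3, 3))).shape)  # (3, 3)
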
4
Q

Def subsampling

A

Process of reducing the width and height of a feature map by pooling (also the process of reducing the resolution of an image)

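A minimal NumPy sketch of 2 x 2 max pooling, one common way a feature map is subsampled (the pool size and the choice of max pooling are assumptions):

import numpy as np

def max_pool_2x2(fmap):
    # halve width and height by keeping the max of each 2 x 2 block
    h, w = fmap.shape[0] // 2, fmap.shape[1] // 2
    return fmap[:2*h, :2*w].reshape(h, 2, w, 2).max(axis=(1, 3))

x = np.arange(16.0).reshape(4, 4)
print(max_pool_2x2(x))  # [[ 5.  7.] [13. 15.]]
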
5
Q

Def adversarial example

A

A sample that has been deliberately manipulated, often imperceptibly, so that it is classified differently from the original sample

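A sketch of one standard way to build such a sample, the fast gradient sign method (FGSM); the loss, model and eps here are illustrative assumptions, not part of the card:

import torch
import torch.nn.functional as F

def fgsm(model, x, label, eps=0.05):
    # nudge every input value by +/- eps in the direction that increases the loss
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), label)
    loss.backward()
    return (x + eps * x.grad.sign()).detach()
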
6
Q

Def regularisation

A

Any method intended to avoid overfitting and improve generalisation

7
Q

Def dropout

A

A method to reduce overfitting that works by setting the activations of a random sample of neurons in a layer to 0 at each iteration during training

This prevents specific neurons becoming associated with specific data samples

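A NumPy sketch of (inverted) dropout at training time; keeping probability p and rescaling by 1/p are the usual convention, though the card itself only mentions the zeroing:

import numpy as np

def dropout(activations, p=0.8, rng=None):
    # zero each neuron independently with probability 1 - p;
    # dividing by p keeps the expected activation unchanged
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(activations.shape) < p
    return activations * mask / p

print(dropout(np.ones(8), p=0.5))  # roughly half zeros, the rest 2.0
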
8
Q

Alternative way to calculate Var[X]

A

Var[X] = E[(X - E[X])^2]

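A quick NumPy check that this agrees with the more common form Var[X] = E[X^2] - E[X]^2:

import numpy as np

x = np.random.default_rng(0).normal(size=10_000)
var_def = np.mean((x - x.mean()) ** 2)      # E[(X - E[X])^2]
var_alt = np.mean(x ** 2) - x.mean() ** 2   # E[X^2] - E[X]^2
print(np.isclose(var_def, var_alt))         # True
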
9
Q

Formula for calculating output height/width of convolutional layer

A

Output_height = 1 + (input_height - mask_height + 2 x padding) / stride
(floor the division when it is not exact; output width is calculated the same way)

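The same formula as a small Python helper, with the division floored when the sizes do not divide evenly (the flooring convention is an assumption):

def conv_output_size(input_size, mask_size, padding=0, stride=1):
    # 1 + (input - mask + 2*padding) / stride, rounded down
    return 1 + (input_size - mask_size + 2 * padding) // stride

print(conv_output_size(32, 5, padding=2, stride=1))  # 32 ('same' padding)
print(conv_output_size(32, 5, padding=0, stride=2))  # 14
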
10
Q

What does 1 x 1 convolution do

A

Creates a weighted sum of the input channels at each spatial location
Each 1 x 1 filter collapses all input channels into a single output channel

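A NumPy sketch of a single 1 x 1 filter: at every spatial location it takes a weighted sum over the channel axis (the (channels, height, width) layout is an assumption):

import numpy as np

def conv_1x1(x, weights):
    # x: (channels, height, width), weights: (channels,)
    # -> (height, width): the channels are collapsed to one per filter
    return np.einsum('chw,c->hw', x, weights)

x = np.random.default_rng(0).normal(size=(64, 8, 8))
print(conv_1x1(x, np.ones(64) / 64).shape)  # (8, 8)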