Tut 5 Flashcards
LReLU
f(x) = x if x >= 0; ax if x < 0
Where a is some predetermined constant
PReLU
f(x) = x if x >= 0; ax if x < 0
Where a is some learned parameter
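A minimal NumPy sketch of both activations (the default a = 0.01 is an assumption for illustration, not taken from the cards):

import numpy as np

def lrelu(x, a=0.01):
    # Leaky ReLU: x for x >= 0, a*x otherwise; a is a fixed constant
    # (0.01 is an assumed common default)
    return np.where(x >= 0, x, a * x)

# PReLU computes the same function, but a would be a parameter learned
# by gradient descent rather than fixed in advance
x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(lrelu(x))  # [-0.02 -0.005 0. 1. 3.]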
Def Feature map
The output produced by a single mask in a convolutional layer of a CNN (before or after the activation function and/or pooling is applied).
Def subsampling
Process of reducing the width and height of a feature map by pooling (also the process of reducing the resolution of an image)
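A short NumPy sketch of subsampling a feature map by 2x2 max pooling (the pooling type and sizes are assumptions for illustration):

import numpy as np

def max_pool_2x2(feature_map):
    # Halve the width and height by taking the max over
    # non-overlapping 2x2 blocks
    h, w = feature_map.shape
    fm = feature_map[:h - h % 2, :w - w % 2]  # trim odd edges
    return fm.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

fm = np.arange(16, dtype=float).reshape(4, 4)
print(max_pool_2x2(fm))  # [[ 5.  7.] [13. 15.]]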
Def adversarial example
A sample that has been deliberately manipulated, often imperceptibly, so that it is classified differently from the original sample
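A toy NumPy illustration (an assumed FGSM-style example for a linear scorer, not from the card): perturbing each input slightly against the sign of the weights moves the score across the decision boundary at 0, so the manipulated sample is classified differently.

import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=50)  # fixed "model" weights, score s(x) = w . x
x = rng.normal(size=50)  # original sample

s = w @ x
eps = abs(s) / np.abs(w).sum() + 1e-3      # smallest uniform step crossing 0
x_adv = x - np.sign(s) * eps * np.sign(w)  # the adversarial example

print(w @ x, w @ x_adv)  # the two scores have opposite signs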
Def regularisation
Any method intended to avoid overfitting and improve generalisation
Def dropout
A method to reduce overfitting that works by setting the activations of a random sample of neurons in a layer to 0 at each iteration during training
This avoids specific neurons becoming associated with specific data samples
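A minimal NumPy sketch of inverted dropout (rescaling by 1/(1 - p) is a common convention assumed here, so expected activations match at test time):

import numpy as np

def dropout(activations, p=0.5, rng=None):
    # Zero each activation with probability p during training,
    # rescale the survivors by 1/(1 - p)
    if rng is None:
        rng = np.random.default_rng()
    mask = rng.random(activations.shape) >= p
    return activations * mask / (1.0 - p)

a = np.ones(8)
print(dropout(a))  # roughly half the entries are 0, the rest are 2.0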
Alternative way to calculate Var[X]
Var[X] = E[(X - E[X])²]
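A quick numerical check (illustrative, not from the card) that this definitional form agrees with the expanded form E[X²] - E[X]² on a sample:

import numpy as np

x = np.random.default_rng(0).normal(loc=3.0, scale=2.0, size=100_000)
lhs = np.mean((x - x.mean()) ** 2)     # E[(X - E[X])^2]
rhs = np.mean(x ** 2) - x.mean() ** 2  # E[X^2] - E[X]^2
print(lhs, rhs)  # agree up to rounding, both close to the true variance 4.0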
Formula for calculating output height/width of convolutional layer
Output_height = 1 + floor((input_height - mask_height + 2 x padding) / stride)
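A small Python helper for this formula (the example sizes below are assumptions for illustration):

def conv_output_size(input_size, mask_size, padding=0, stride=1):
    # Integer division applies the floor for strides that do not
    # divide the span evenly
    return 1 + (input_size - mask_size + 2 * padding) // stride

# 7x7 input, 3x3 mask, padding 1, stride 2 -> 4x4 output
print(conv_output_size(7, 3, padding=1, stride=2))  # 4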
What does 1 x 1 convolution do
Creates weighted sum of input channels at each location
A single mask flattens all input channels into one output channel (N masks give N output channels)
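A NumPy sketch (the shapes are assumptions for illustration): a 1 x 1 convolution is a per-location linear map over channels, so one mask collapses all input channels into one.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 5, 5))      # 3 input channels, 5x5 spatial
w = rng.normal(size=(1, 3))         # one 1x1 mask: a weight per input channel

y = np.einsum('oc,chw->ohw', w, x)  # weighted sum of channels per location
print(y.shape)                      # (1, 5, 5): channels flattened into 1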