Model Training, Tuning and Evaluation [up to RNNs] Flashcards
(9 cards)
What is an activation function?
The function inside a neuron that transforms the weighted sum of the neuron's inputs into its output.
What is a rectified linear unit activation function (ReLU)?
An activation function that outputs the input directly when it is positive, and zero when it is negative.
This is easy and fast to compute.
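A minimal Python sketch of the definition above (the function name and sample values are illustrative):

```python
# ReLU: pass positive inputs through unchanged, clamp negatives to zero.
def relu(x):
    return max(0.0, x)

print(relu(3.0))   # 3.0 (positive input passes through)
print(relu(-2.0))  # 0.0 (negative input is clamped)
```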
What is a parametric ReLU activation function?
The same as ReLU, except that instead of outputting zero for negative inputs, the slope of the negative side is learned through back-propagation.
More complicated and expensive to compute.
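A sketch of the idea, assuming a slope parameter `a` that in practice would be learned; here it is fixed to an illustrative 0.1:

```python
# Parametric ReLU: negative inputs are scaled by a learned slope `a`
# instead of being clamped to zero (a=0.1 here is just for illustration).
def prelu(x, a=0.1):
    return x if x > 0 else a * x

print(prelu(5.0))   # 5.0 (positive side behaves like ReLU)
print(prelu(-4.0))  # small negative value, a * x
```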
What is a Swish activation function?
Developed by Google; defined as x multiplied by the sigmoid of x. Works very well for deep neural networks.
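A sketch of Swish in its common beta=1 form, x * sigmoid(x) (also known as SiLU); the function name is illustrative:

```python
import math

# Swish (beta = 1): x * sigmoid(x). Smooth and non-monotonic,
# unlike ReLU, which is one reason it can help deep networks.
def swish(x):
    return x / (1.0 + math.exp(-x))

print(swish(0.0))  # 0.0
```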
What activation functions do RNNs tend to use?
Non-linear activation functions, typically TanH (hyperbolic tangent).
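One reason tanh suits recurrent networks is that it squashes any input into (-1, 1), which keeps the recurrent state bounded across time steps. A quick illustration using the standard library:

```python
import math

# tanh maps the real line into (-1, 1), keeping recurrent state bounded.
print(math.tanh(0.0))    # 0.0
print(math.tanh(100.0))  # saturates near 1.0
```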
What are convolutional neural networks mainly used for?
Image analysis
What does feature location invariant mean? And which type of neural network is it?
Means it doesn’t matter where within an image the key object is. Convolutional neural networks.
How does a convolutional neural network work?
Takes a source image and breaks it into overlapping chunks using convolutions (filters). Successive layers combine these convolutions, increasing the complexity of what is processed.
For example, early layers detect lines, later layers detect shapes, then the network recognises and assembles these shapes, then recognises whole objects, etc.
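The sliding-filter step above can be sketched in plain Python. This is a toy example, not a real CNN layer: the 3x3 "image", the 2x2 kernel values, and the function name are all illustrative.

```python
# Toy 2D convolution: slide a 2x2 filter over a 3x3 image and take the
# element-wise product sum at each position (no padding, stride 1).
image = [
    [1, 0, 1],
    [0, 1, 0],
    [1, 0, 1],
]
kernel = [[1, 0], [0, 1]]  # hypothetical filter values

def convolve(img, k):
    kh, kw = len(k), len(k[0])
    out = []
    for i in range(len(img) - kh + 1):
        row = []
        for j in range(len(img[0]) - kw + 1):
            row.append(sum(img[i + di][j + dj] * k[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

print(convolve(image, kernel))  # [[2, 0], [0, 2]]
```

Each output value is high where the image patch matches the filter's pattern; stacking many such filters in layers is what lets a CNN build lines into shapes into objects.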
What is something important to note about the source data for CNNs?
It must have the appropriate dimensions: width × height × colour channels.
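A quick sketch of what that dimension check means, assuming an illustrative 28×28 RGB image (the sizes are hypothetical, not from the source):

```python
# A CNN input must carry width * height * channels values,
# e.g. a 28x28 RGB image has 3 colour channels.
width, height, channels = 28, 28, 3
expected_size = width * height * channels
print(expected_size)  # 2352
```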