Model Training, Tuning and Evaluation [up to RNNs] Flashcards

(9 cards)

1
Q

What is an activation function?

A

The function inside a neuron that transforms the weighted sum of the neuron's inputs to determine its output.

2
Q

What is a rectified linear unit activation function (ReLU)?

A

An activation function that outputs the input directly when the input is above 0, and outputs 0 when the input is below 0.
This is easy and fast to compute.
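A minimal Python sketch of the definition above:

```python
def relu(x):
    # Identity for positive inputs, zero for everything else
    return x if x > 0 else 0.0

[relu(v) for v in (-2.0, 0.0, 1.5)]  # -> [0.0, 0.0, 1.5]
```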

3
Q

What is a parametric ReLU activation function?

A

The same as ReLU, except instead of outputting 0 for negative inputs, the slope on the negative side is learned through back-propagation.
More computationally complex.
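A sketch in Python, where the slope `a` stands in for the parameter that would be learned during training:

```python
def prelu(x, a):
    # 'a' is the learned slope applied to negative inputs
    return x if x > 0 else a * x

prelu(-4.0, 0.25)  # -> -1.0 (negative inputs are scaled, not zeroed)
```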

4
Q

What is a Swish activation function?

A

Developed by Google. Works very well for deep neural networks.
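Swish is commonly defined as swish(x) = x · sigmoid(x); a minimal Python sketch:

```python
import math

def swish(x):
    # x * sigmoid(x), written as a single expression
    return x / (1.0 + math.exp(-x))
```

Unlike ReLU, swish is smooth everywhere and lets small negative values pass through slightly rather than zeroing them.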

5
Q

What activation functions do RNNs tend to use?

A

Non-linear activation functions, typically tanh (hyperbolic tangent).

6
Q

What are convolutional neural networks mainly used for?

A

Image analysis

7
Q

What does feature location invariant mean? And which type of neural network is it?

A

It means it doesn't matter where within the image the key object is located. This is a property of convolutional neural networks.

8
Q

How does a convolutional neural network work?

A

It takes a source image and breaks it into small overlapping chunks processed by convolution filters. Layers of these convolutions are then stacked, with each layer processing increasingly complex features.
For example, it starts with lines and edges, then shapes, then recognising and assembling these shapes, then recognising whole objects, etc.
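The chunk-by-chunk idea can be illustrated with a single convolution pass in plain Python (the kernel values here are illustrative; in a real network they are learned):

```python
def conv2d(image, kernel):
    # Slide the kernel over the image; each output value summarises one chunk
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A vertical-edge kernel applied to a tiny 3x3 "image" with an edge on the left
conv2d([[0, 1, 1],
        [0, 1, 1],
        [0, 1, 1]],
       [[-1, 1],
        [-1, 1]])  # -> [[2, 0], [2, 0]]: strong response at the edge, none elsewhere
```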

9
Q

What is something important to note about the source data for CNNs?

A

It must be of the appropriate dimensions: width × height × colour channels.
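For example (the image size here is illustrative):

```python
# A 224x224 RGB image: width x height x colour channels
width, height, channels = 224, 224, 3
image_shape = (width, height, channels)

# Total number of values a single input image contains
input_size = width * height * channels  # 150528
```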
