L6 - Introduction to Deep Learning Flashcards

(8 cards)

1
Q

What is the major benefit of Deep Learning over Machine Learning?

A

Automatic Feature Learning

2
Q

What 3 advancements have made Deep Learning feasible/popular?

A
  1. Big Data - Easier collection, storage and availability.
  2. Hardware - better GPUs
  3. Software - Improved techniques
3
Q

Explain the Perceptron

A

The output is the result of an activation function applied to the weighted sum of the inputs (each input multiplied by its weight).
A bias term may be added as an extra 'constant' input.
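The card above can be sketched in a few lines of Python (a minimal illustration; the sigmoid activation and the example inputs/weights are assumptions, not from the cards):

```python
import math

def perceptron(inputs, weights, bias):
    # Weighted sum of inputs (each multiplied by its weight) plus bias,
    # passed through an activation function.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid activation

out = perceptron([1.0, 0.5], [0.4, -0.2], 0.1)  # z = 0.4, output ~0.599
```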

4
Q

What is the purpose of the activation function in a Perceptron?

A

The activation function introduces non-linearity.

The sigmoid function is commonly used.
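A minimal sketch of the sigmoid: it squashes any real input into (0, 1), which is what gives the network its non-linearity (without it, stacked layers would collapse into a single linear transformation):

```python
import math

def sigmoid(z):
    # Maps any real number into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

# sigmoid(0.0) is exactly 0.5; large positive inputs approach 1,
# large negative inputs approach 0.
```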

5
Q

What are the steps to optimise a Loss in a Neural Network?

A
  1. Initialise weights randomly
  2. Compute the Loss Gradient
  3. Update the weights to reduce the loss
  4. Repeat from step 2 until convergence.
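The four steps above can be sketched with plain gradient descent on a toy one-parameter loss (the quadratic loss and the learning rate are illustrative assumptions, not from the cards):

```python
import random

# Toy loss: L(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
random.seed(0)
w = random.uniform(-1, 1)        # 1. initialise the weight randomly
lr = 0.1                         # learning rate (step size)
for _ in range(100):
    grad = 2 * (w - 3)           # 2. compute the loss gradient
    w -= lr * grad               # 3. update the weight to reduce the loss
                                 # 4. loop repeats until convergence
# w has converged to the optimum at w = 3
```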
6
Q

What is the learning rate in loss optimisation?
What makes choosing it difficult?

A

The size of the steps taken when updating the weights during optimisation.
If it is too small, training converges very slowly or gets stuck in a local minimum; if it is too large, the updates overshoot and can diverge, so the true optimum is missed either way.

One solution is an adaptive learning rate that changes during training depending on various factors.
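A toy sketch of why the step size matters (the loss function and the specific rates are illustrative assumptions): minimising L(w) = w², whose optimum is at w = 0, a tiny rate barely moves, a moderate rate converges, and an overly large rate overshoots and diverges:

```python
def minimise(lr, steps=50, w0=5.0):
    # Gradient descent on L(w) = w**2, whose gradient is 2*w.
    w = w0
    for _ in range(steps):
        w -= lr * 2 * w
    return w

small = minimise(0.001)  # too small: barely moves from the start point
good  = minimise(0.1)    # moderate: converges close to the optimum at 0
large = minimise(1.5)    # too large: overshoots, |w| grows every step
```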

7
Q

What is overfitting ?

In the context of neural networks specifically

A

When a model learns the details and noise in the training dataset to the extent that it negatively impacts its performance on new data. The model memorises specific training examples rather than generalising from them, resulting in high accuracy on the training set but poor performance on the validation and test sets.

8
Q

What is Dropout Regularisation?

A

A technique that involves randomly setting a fraction of neurons to zero during each forward and backward pass, preventing the model from becoming overly reliant on any specific neurons.
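The idea can be sketched as "inverted dropout" (a common variant, assumed here rather than stated on the card): each activation is zeroed with probability p during training, and survivors are scaled by 1/(1-p) so that expected activations match at test time, when dropout is disabled:

```python
import random

def dropout(activations, p, training=True):
    # Zero each activation with probability p; scale the survivors by
    # 1/(1-p) ("inverted dropout") so no rescaling is needed at test time.
    if not training:
        return list(activations)
    keep = 1.0 - p
    return [a / keep if random.random() < keep else 0.0
            for a in activations]

random.seed(0)
out = dropout([1.0, 2.0, 3.0, 4.0], p=0.5)  # survivors are doubled
```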
