IFN580 Week 6 Neural Network (11%) Flashcards

(19 cards)

1
Q

When training a NN, what is the purpose of backward pass?

A

To compute gradients and update the model parameters by stepping in the direction opposite to the gradient (gradient descent)
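A minimal sketch of one such update step (the parameter, gradient, and learning-rate values below are illustrative assumptions, not from any real model):

```python
import numpy as np

# One gradient-descent update: step OPPOSITE to the gradient.
w = np.array([0.5, -0.3])      # current parameters (illustrative)
grad = np.array([0.2, -0.1])   # dLoss/dw produced by the backward pass
lr = 0.1                       # learning rate (illustrative)

w_new = w - lr * grad          # move against the gradient to reduce the loss
```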

2
Q

What is the general flow of a NN?

A

Feed input data into the INPUT LAYER, which passes the raw data to the HIDDEN LAYER.

HIDDEN LAYER transforms the data through weighted connections and activation functions.

The OUTPUT LAYER receives input from the final hidden layer and generates a final prediction.
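The flow above can be sketched as a toy fully connected network (3 inputs, 2 hidden nodes, 1 output; all weights are made-up illustrative values, not trained ones):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)          # activation applied in the hidden layer

x = np.array([1.0, 2.0, 3.0])          # INPUT LAYER: raw numeric data
W1 = np.array([[0.1, 0.2, 0.3],
               [0.0, -0.1, 0.1]])      # hidden-layer weights (illustrative)
W2 = np.array([[0.5, -0.5]])           # output-layer weights (illustrative)

h = relu(W1 @ x)                       # HIDDEN LAYER: weighted sums + activation
y = W2 @ h                             # OUTPUT LAYER: final prediction
```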

3
Q

What makes a NN a more powerful ML algorithm?

A

Non-linearity, which is introduced by activation functions.

4
Q

A feed-forward neural network model is said to be fully connected when:

A

all nodes at one layer are connected to all nodes in the next higher layer.

5
Q

The input values to a feed-forward neural network must be:

A

numeric

6
Q

When training a neural network, what does an epoch represent?

A

one complete pass of the entire training dataset through the network
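The idea can be sketched as a training loop skeleton (the dataset and epoch count are illustrative; the inner step just counts samples where a real one would update weights):

```python
# Each iteration of the outer loop is one epoch: a full pass over the data.
training_data = [([0.0], 0), ([1.0], 1), ([2.0], 1)]   # illustrative dataset
n_epochs = 3

samples_seen = 0
for epoch in range(n_epochs):
    for features, label in training_data:   # one pass over all samples = 1 epoch
        samples_seen += 1                   # a real step would update weights here
```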

7
Q

Which of the following supervised learning techniques can produce both numeric
and categorical outputs?

A

Neural networks

8
Q

Too many hidden nodes result in:

A

overfitting

9
Q

Too few hidden nodes result in:

A

underfitting

10
Q

Recall that last week we mentioned that the training process for a neural network
involves passing the training data through the network multiple times (an “epoch”).
During each of these training passes, which of the following statements is true?

A

Individual network weights are modified.

11
Q

What happens when a neural network is over-trained?

A

An over-trained neural network will fail to generalise to the trend of the data,
instead memorising specific values from the training set (a.k.a. “overfitting”).

12
Q

Why are neural networks referred to as a “universal approximator”?

A

A well-trained neural network can represent a wide variety of problems, including
both classification and regression. These models can also achieve good accuracy
even in the presence of noise, errors, missing values, etc.

13
Q

With decision trees and regression models, we may need to pre-process
features/attributes to enable the model to effectively learn patterns in the training set
(“feature engineering”). Do neural networks still require feature engineering, and
why/why not?

A

Different layers in neural networks can “create” their own features that are
learnt from the data. This eliminates the need for feature engineering to some
extent.

14
Q

Name three common activation functions typically used in neural networks. Which
function trains the fastest?

A

Sigmoid (logistic) function.
Hyperbolic tangent (tanh) function.
ReLU (Rectified Linear Unit).

ReLU typically trains the fastest: its gradient is cheap to compute and it does
not saturate for positive inputs.
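The three functions can be sketched directly with NumPy (the sample inputs are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))    # squashes values into (0, 1)

def tanh(x):
    return np.tanh(x)                  # squashes values into (-1, 1)

def relu(x):
    return np.maximum(0.0, x)          # zero for negatives, identity otherwise

x = np.array([-2.0, 0.0, 2.0])         # illustrative inputs
```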

15
Q

The ______ function is responsible for computing the difference between the
predictions and training data

A

loss
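As one common example, the mean squared error loss can be sketched as follows (the prediction and target values are illustrative assumptions):

```python
import numpy as np

# Mean squared error: average squared difference between predictions and targets.
predictions = np.array([0.9, 0.2, 0.8])   # illustrative model outputs
targets = np.array([1.0, 0.0, 1.0])       # illustrative training labels

mse = np.mean((predictions - targets) ** 2)
```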

16
Q

What is the goal of backpropagation?

A

To compute gradients that are propagated backwards from the output layer,
through the hidden layers, and used to update the weights to minimise the loss
function.
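A minimal worked example on a single linear neuron with squared loss (all values illustrative), showing the chain rule flowing backwards from the loss to the weight:

```python
# Forward pass: y_hat = w * x, loss = (y_hat - y)^2.
x, y, w = 2.0, 1.0, 0.3        # illustrative input, target, and weight

y_hat = w * x                  # forward pass: 0.6
loss = (y_hat - y) ** 2        # 0.16

# Backward pass: dL/dw = dL/dy_hat * dy_hat/dw (chain rule).
dL_dyhat = 2.0 * (y_hat - y)   # gradient at the output: -0.8
dL_dw = dL_dyhat * x           # gradient back at the weight: -1.6
```

A subsequent optimiser step would then move `w` against `dL_dw` to reduce the loss.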

17
Q

Name three common optimisers used for training neural networks

A
  • Adam (adaptive moment estimation)
  • Root Mean Square Propagation (RMSProp)
  • Stochastic Gradient Descent (SGD)
18
Q

In K-nearest neighbour classification, _______ values for 𝑘 may result in _______.

A

small, overfitting

19
Q

Which of the following are limitations of K-nearest neighbour classification?

A

Requires storing all of the training data in the model.
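A minimal 1-nearest-neighbour sketch makes this limitation concrete: the "model" is nothing more than the stored training set, searched at prediction time (the points and labels below are illustrative):

```python
import math

# The model IS the training data: every point must be kept in memory.
train = [((0.0, 0.0), "a"), ((1.0, 1.0), "b"), ((5.0, 5.0), "b")]

def predict(query):
    # Scan all stored training points and return the closest one's label.
    def dist(point):
        return math.dist(point[0], query)
    return min(train, key=dist)[1]
```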