Chapter 10 Flashcards

(27 cards)

1
Q

What inspired the development of artificial neural networks?

A

Biological principles of how neurons in animal brains function.

2
Q

What is the basic architecture of an artificial neural network?

A

The multilayer perceptron (MLP).

3
Q

Why did neural networks regain popularity in recent years?

A

Due to more data, greater computing power, better training algorithms, and increased funding.

4
Q

What is a perceptron?

A

A simple ANN model that uses threshold logic units for linear classification.

5
Q

What function does a threshold logic unit (TLU) perform?

A

Computes a weighted sum of inputs and applies a step function to determine the output.

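The TLU computation can be sketched in a few lines of NumPy; the weights and bias below are hypothetical values chosen for illustration:

```python
import numpy as np

def tlu(x, w, b):
    """Threshold logic unit: weighted sum of inputs, then a step function."""
    z = np.dot(w, x) + b          # weighted sum of the inputs
    return 1 if z >= 0 else 0     # Heaviside step function

# Hypothetical weights: the unit fires when the inputs sum to at least 1.
w = np.array([1.0, 1.0])
b = -1.0
print(tlu(np.array([0.0, 0.0]), w, b))  # 0
print(tlu(np.array([1.0, 0.5]), w, b))  # 1
```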
6
Q

What is the main limitation of a single-layer perceptron?

A

It cannot solve non-linearly separable problems like XOR.

7
Q

How does a multilayer perceptron (MLP) address the perceptron’s limitation?

A

By stacking multiple layers to model non-linear relationships.

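As a concrete illustration, a two-layer network of TLUs with hand-picked weights (not a trained network) can compute XOR, which no single-layer perceptron can:

```python
def step(z):
    """Heaviside step function returning 0 or 1."""
    return int(z >= 0)

def xor_mlp(x1, x2):
    """Tiny MLP computing XOR as OR(x) AND NOT AND(x)."""
    h_or  = step(x1 + x2 - 0.5)   # hidden unit 1: logical OR
    h_and = step(x1 + x2 - 1.5)   # hidden unit 2: logical AND
    return step(h_or - h_and - 0.5)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_mlp(a, b))  # 0, 1, 1, 0
```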
8
Q

What is the purpose of an activation function in a neural network?

A

To introduce non-linearity so the network can learn complex functions.

9
Q

What are common activation functions used in MLPs?

A

Sigmoid, hyperbolic tangent (tanh), and ReLU.

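These three activations are easy to sketch in NumPy (a minimal illustration, not the library implementations frameworks actually use):

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid: squashes z into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    """Hyperbolic tangent: squashes z into (-1, 1), centered at 0."""
    return np.tanh(z)

def relu(z):
    """Rectified linear unit: 0 for negative z, identity otherwise."""
    return np.maximum(0.0, z)

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # values in (0, 1)
print(tanh(z))     # values in (-1, 1)
print(relu(z))     # [0. 0. 2.]
```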
10
Q

Why are weights in a neural network initialized randomly?

A

To break symmetry and allow neurons to learn different features.

11
Q

What does backpropagation do in neural networks?

A

It computes gradients and updates weights using gradient descent to minimize error.

12
Q

What are the two phases of backpropagation?

A

Forward pass (compute outputs) and backward pass (compute gradients).

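The two phases can be illustrated on a single linear neuron y_hat = w * x fit to made-up data y = 2x; real backpropagation applies the same forward/backward idea layer by layer via the chain rule:

```python
import numpy as np

x, y = np.array([1.0, 2.0, 3.0]), np.array([2.0, 4.0, 6.0])  # toy data
w, lr = 0.0, 0.1                                             # weight, learning rate

for epoch in range(50):
    y_hat = w * x                        # forward pass: compute outputs
    grad = 2 * np.mean((y_hat - y) * x)  # backward pass: gradient of MSE w.r.t. w
    w -= lr * grad                       # gradient descent weight update

print(round(w, 3))  # converges toward 2.0
```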
13
Q

What is the role of the learning rate in training neural networks?

A

It controls how much the weights are adjusted during training.

14
Q

What are typical components of an MLP architecture?

A

Input layer, one or more hidden layers (with bias neurons), and an output layer.

15
Q

What type of output and activation is used in regression MLPs?

A

A single output neuron, typically with no activation function; ReLU or softplus can be used if the output must be positive.

16
Q

How is classification handled in MLPs?

A

One output neuron per class with softmax activation for multi-class classification; a single neuron with sigmoid activation for binary classification.

17
Q

What is the Keras Sequential API?

A

A simple way to build feedforward neural networks layer by layer.
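A minimal Sequential sketch, assuming the chapter's Fashion MNIST setup of 28x28 grayscale inputs and 10 classes; the hidden-layer sizes are illustrative:

```python
import tensorflow as tf

# Build the model layer by layer: flatten, two hidden layers, softmax output.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(300, activation="relu"),
    tf.keras.layers.Dense(100, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy",
              optimizer="sgd",
              metrics=["accuracy"])
model.summary()
```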

18
Q

What is the Functional API in Keras used for?

A

Creating complex models with non-sequential architectures like multi-input or multi-output networks.

19
Q

What does the softmax activation function do?

A

Converts logits into probabilities that sum to 1 across output classes.
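A NumPy sketch of softmax (with the usual max-subtraction trick for numerical stability; the logits are made-up numbers):

```python
import numpy as np

def softmax(logits):
    """Convert raw scores (logits) into probabilities that sum to 1."""
    z = logits - np.max(logits)   # subtract max for numerical stability
    exp_z = np.exp(z)
    return exp_z / exp_z.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))
print(probs, probs.sum())  # probabilities summing to 1.0
```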

20
Q

How are bias terms initialized in Keras models?

A

Biases are typically initialized to zero.

21
Q

What loss functions are used in Keras for classification?

A

sparse_categorical_crossentropy, categorical_crossentropy, and binary_crossentropy.

22
Q

What optimizer is commonly used in Keras examples?

A

Stochastic Gradient Descent (sgd).

23
Q

What is early stopping in model training?

A

A technique to stop training when validation performance stops improving.
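The logic behind early stopping can be sketched independently of Keras (which packages it as the EarlyStopping callback); the validation losses below are made-up numbers:

```python
def early_stopping_epoch(val_losses, patience=2):
    """Return the epoch at which to stop: when validation loss has not
    improved for `patience` consecutive epochs."""
    best, best_epoch = float("inf"), 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, best_epoch = loss, epoch      # new best: keep training
        elif epoch - best_epoch >= patience:
            return epoch                        # no improvement: stop here
    return len(val_losses) - 1

# Made-up losses: improvement stalls after epoch 2, so training stops at epoch 4.
print(early_stopping_epoch([0.9, 0.7, 0.6, 0.65, 0.64, 0.66]))  # 4
```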

24
Q

What does the predict() method do in Keras?

A

Returns the model's outputs for a batch of inputs; for a softmax classifier, the predicted probability of each class.

25
Q

What is TensorBoard used for?

A

Visualizing training metrics and model architecture in real time.

26
Q

Why is tuning hyperparameters important in neural networks?

A

Because hyperparameters such as the number of layers, neurons per layer, learning rate, and activation functions all affect performance, and neural networks are flexible enough to allow many configurations.

27
Q

What tools can be used for hyperparameter tuning in Keras?

A

Hyperopt, Hyperas, Keras Tuner, Scikit-Optimize, Spearmint, and Hyperband.