Week 7: Neural Networks & Deep Learning Flashcards

1
Q

What is Deep Learning (DL)?

A

A subfield of machine learning that uses deep neural networks to model complex patterns in data.

2
Q

Why has DL become prominent recently?

A

Due to better algorithms, computing power (GPUs), large labeled datasets, and open-source tools.

3
Q

What is an Artificial Neural Network (ANN)?

A

A network of interconnected artificial neurons that process information in layers.

4
Q

What is a perceptron?

A

The simplest neural network unit; it performs binary classification by applying a step function to a weighted sum of its inputs.
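A minimal NumPy sketch of this idea; the weights, bias, and example inputs below are illustrative choices, not values from the course material:

```python
import numpy as np

def perceptron(x, w, b):
    """Output 1 if the weighted sum w.x + b is positive, else 0."""
    z = np.dot(w, x) + b          # weighted sum of inputs plus bias
    return 1 if z > 0 else 0      # step activation

# Illustrative example: a perceptron wired to behave like a logical AND gate.
w = np.array([1.0, 1.0])
b = -1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, perceptron(np.array(x), w, b))  # prints 0, 0, 0, 1
```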

5
Q

What is the activation function in a perceptron?

A

A step function that outputs 1 if the weighted sum exceeds a threshold, else 0.

6
Q

What is a sigmoid neuron?

A

A neuron that outputs a smooth value between 0 and 1 using the sigmoid function, enabling gradient-based learning.

7
Q

Why use sigmoid neurons over perceptrons?

A

Because small changes in weights lead to small changes in output, allowing gradient-based learning.

8
Q

What is the sigmoid function formula?

A

σ(z) = 1 / (1 + e^(-z))
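The same formula written as a small NumPy sketch (a quick sanity check, not code from the course):

```python
import numpy as np

def sigmoid(z):
    """sigma(z) = 1 / (1 + e^(-z)); maps any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))   # 0.5
print(sigmoid(10.0))  # ~0.99995, saturates toward 1 for large positive z
```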

9
Q

What is the purpose of an activation function?

A

To introduce non-linearity into the network, enabling the learning of complex patterns.

10
Q

List common activation functions.

A

Sigmoid, ReLU, tanh, linear, step, and Gaussian.
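A sketch of several of these in NumPy, using their standard textbook definitions (not code from the course):

```python
import numpy as np

def step(z):     return np.where(z > 0, 1.0, 0.0)   # perceptron-style threshold
def sigmoid(z):  return 1.0 / (1.0 + np.exp(-z))    # squashes to (0, 1)
def tanh(z):     return np.tanh(z)                  # squashes to (-1, 1)
def relu(z):     return np.maximum(0.0, z)          # 0 for negative z, identity otherwise
def linear(z):   return z                           # identity (no non-linearity)

z = np.array([-2.0, 0.0, 2.0])
print(step(z), sigmoid(z), tanh(z), relu(z))
```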

11
Q

What is forward propagation?

A

The process of computing the network's outputs from its inputs by passing data forward through the layers.
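A minimal sketch of a forward pass through one hidden layer; the layer sizes and random parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative 3 -> 4 -> 1 network with randomly initialised parameters.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)

def forward(x):
    a1 = sigmoid(W1 @ x + b1)   # hidden layer: weighted sum, then activation
    a2 = sigmoid(W2 @ a1 + b2)  # output layer
    return a2

print(forward(np.array([0.5, -1.0, 2.0])))  # a value in (0, 1)
```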

12
Q

What is backpropagation?

A

A learning algorithm for neural networks that computes gradients to update weights and biases.
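Backpropagation in general applies the chain rule layer by layer; the sketch below shows the idea for the simplest case, a single sigmoid neuron trained with log loss, where the chain rule collapses to the gradient (a - y). The data and learning rate are made up:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative data: x has 2 features, y is a binary label.
x, y = np.array([1.0, 2.0]), 1.0
w, b, lr = np.zeros(2), 0.0, 0.1

for _ in range(100):
    a = sigmoid(np.dot(w, x) + b)   # forward pass
    dz = a - y                      # gradient of log loss w.r.t. the pre-activation
    w -= lr * dz * x                # dL/dw = dz * x
    b -= lr * dz                    # dL/db = dz

print(a)  # moves toward the target y = 1
```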

13
Q

What cost function is often used in binary classification?

A

Binary cross-entropy or log loss.
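A sketch of the log-loss computation in NumPy; the predictions and labels are made up, and the small epsilon is a common guard against log(0):

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean of -[y*log(p) + (1-y)*log(1-p)] over all examples."""
    p = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(p) + (1.0 - y_true) * np.log(1.0 - p))

y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.7])
print(binary_cross_entropy(y_true, y_pred))  # small loss for mostly-correct predictions
```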

14
Q

What is the softmax function used for?

A

To convert logits to probabilities in multiclass classification problems.

15
Q

What is the mathematical form of softmax?

A

softmax(z_i) = e^(z_i) / Σ_j e^(z_j), where the sum runs over all classes j.
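A direct translation into NumPy; subtracting the largest logit before exponentiating is a standard numerical-stability trick (not part of the formula itself) and does not change the result:

```python
import numpy as np

def softmax(z):
    """Exponentiate each logit and normalise so the outputs sum to 1."""
    e = np.exp(z - np.max(z))   # shift by max(z) for numerical stability
    return e / np.sum(e)

logits = np.array([2.0, 1.0, 0.1])
print(softmax(logits))          # ~[0.66, 0.24, 0.10], sums to 1
```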

16
Q

What is the purpose of a cost function?

A

To measure how well the neural network predictions match the true outputs.

17
Q

What is stochastic gradient descent (SGD)?

A

An optimization algorithm that updates weights using one or a few training examples at a time.
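A sketch of mini-batch SGD for a single sigmoid neuron; the dataset, batch size, and learning rate are made-up illustrations. Each pass of the outer loop over the full dataset is one epoch (see card 24):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                    # made-up dataset: 100 examples, 3 features
y = (X @ np.array([1.0, -2.0, 0.5]) > 0) * 1.0   # made-up binary labels
w, b, lr, batch = np.zeros(3), 0.0, 0.1, 10

for epoch in range(20):                          # one epoch = one full pass over the data
    order = rng.permutation(len(X))
    for i in range(0, len(X), batch):
        idx = order[i:i + batch]                 # a few examples at a time
        a = sigmoid(X[idx] @ w + b)
        dz = a - y[idx]                          # gradient of log loss w.r.t. the pre-activation
        w -= lr * X[idx].T @ dz / len(idx)       # average gradient over the mini-batch
        b -= lr * np.mean(dz)
```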

18
Q

What does a neuron compute?

A

A weighted sum of its inputs followed by an activation function.

19
Q

What is meant by deep in deep learning?

A

The use of multiple hidden layers in a neural network.

20
Q

What are input, hidden, and output layers?

A

Input: receives data; Hidden: performs computation; Output: provides final prediction.

21
Q

What is a bias term in a neuron?

A

An additional parameter that allows the activation function to shift left or right.

22
Q

What is overfitting in neural networks?

A

When a model fits the training data too closely, including its noise, and fails to generalize to new data.

23
Q

What is regularization in DL?

A

Techniques like L1/L2 penalties or dropout used to prevent overfitting.
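A sketch of these two ideas with made-up values (the penalty strength, keep probability, and weights below are illustrative, and the dropout scaling shown is the common "inverted dropout" variant):

```python
import numpy as np

rng = np.random.default_rng(0)

# L2 regularization: penalise large weights by adding a term to the cost.
w, lam = np.array([0.5, -1.2, 2.0]), 0.01
l2_penalty = lam * np.sum(w ** 2)        # added to the data loss during training

# Dropout: randomly zero out activations during training.
a = rng.normal(size=5)                   # some hidden-layer activations
keep_prob = 0.8
mask = (rng.random(a.shape) < keep_prob) / keep_prob
a_dropped = a * mask                     # dropped units are 0; kept units are rescaled
```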

24
Q

What is an epoch?

A

One complete pass through the entire training dataset during training.

25
Q

What is the main goal of training a neural network?

A

To minimize the cost function by adjusting weights and biases using gradient descent.