#12 - Machine Learning Flashcards

(22 cards)

1
Q

Dropout

A

A regularization technique where neurons are randomly turned off during training to reduce overfitting. Analogy: like in a classroom where random students are excused each class to ensure everyone learns equally.
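
A minimal PyTorch sketch (p=0.5 chosen for illustration):

```python
import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)   # each element zeroed with probability 0.5
x = torch.ones(1, 8)

drop.train()               # dropout is active only in training mode
print(drop(x))             # random zeros; survivors scaled by 1/(1-p) = 2
drop.eval()                # a no-op at inference time
print(drop(x))             # all ones
```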

2
Q

ReLU (Rectified Linear Unit)

A

An activation function that sets all negative values to zero and keeps positive values unchanged: ReLU(x) = max(0, x). It helps prevent the vanishing gradient problem.
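
A quick check in PyTorch:

```python
import torch

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])
print(torch.relu(x))  # tensor([0.0000, 0.0000, 0.0000, 1.5000])
```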

3
Q

Leaky ReLU

A

A variant of ReLU where negative values are scaled by a small factor a (e.g., 0.01) instead of being zeroed: LeakyReLU(x) = x for x > 0, a·x for x <= 0. This keeps a small gradient for negative inputs and helps prevent dead neurons.
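
The same inputs through Leaky ReLU (slope 0.01 for illustration):

```python
import torch
import torch.nn.functional as F

x = torch.tensor([-2.0, -0.5, 0.0, 1.5])
print(F.leaky_relu(x, negative_slope=0.01))
# tensor([-0.0200, -0.0050,  0.0000,  1.5000]) -- negatives shrink instead of dying
```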

4
Q

Batch Normalization

A

A technique that normalizes layer activations across each mini-batch to stabilize and speed up training. Analogy: like revising notes after each class so the material stays at a consistent level of difficulty.
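
A minimal PyTorch sketch (shapes chosen for illustration):

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(num_features=16)    # one mean/variance pair per channel
x = torch.randn(8, 16, 32, 32)          # a batch of 8 feature maps
y = bn(x)                               # normalized, then learnable scale and shift
print(y.mean().item(), y.std().item())  # roughly 0 and 1
```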

5
Q

Sigmoid

A

An activation function that squashes inputs into the range (0, 1), commonly used as the output layer in binary classification: sigmoid(x) = 1 / (1 + e^(-x)).
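
A quick check in PyTorch:

```python
import torch

x = torch.tensor([-4.0, 0.0, 4.0])
print(torch.sigmoid(x))  # tensor([0.0180, 0.5000, 0.9820])
```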

6
Q

Binary Cross-Entropy (BCE)

A

A loss function for binary classification that measures the difference between the predicted probability and the true label. Formula: L = -(y log(p) + (1 - y) log(1 - p)).
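
A sketch verifying the formula against PyTorch's built-in loss (toy values):

```python
import torch
import torch.nn as nn

p = torch.tensor([0.9, 0.2])   # predicted probabilities
y = torch.tensor([1.0, 0.0])   # true labels

manual = -(y * p.log() + (1 - y) * (1 - p).log()).mean()
print(manual.item(), nn.BCELoss()(p, y).item())  # both ~0.1643
```

In practice nn.BCEWithLogitsLoss is usually preferred: it takes raw logits and is more numerically stable.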

7
Q

Adam Optimizer

A

A popular optimizer combining the benefits of AdaGrad and RMSProp; it adapts the learning rate for each parameter using running estimates of the first and second moments of the gradients.
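
Typical usage (model and learning rate chosen for illustration):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x, y = torch.randn(4, 10), torch.randn(4, 1)
loss = nn.functional.mse_loss(model(x), y)
opt.zero_grad()
loss.backward()
opt.step()  # per-parameter step sizes from running gradient moment estimates
```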

8
Q

MaxPooling

A

A downsampling operation selecting the maximum value from each window (usually 2×2). It reduces resolution and highlights key features.
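
A small worked example in PyTorch:

```python
import torch
import torch.nn as nn

pool = nn.MaxPool2d(kernel_size=2)          # 2x2 windows, stride 2
x = torch.arange(16.0).reshape(1, 1, 4, 4)  # values 0..15
print(pool(x))
# tensor([[[[ 5.,  7.],
#           [13., 15.]]]]) -- the max of each 2x2 block; spatial size halves
```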

9
Q

Bilinear Upsampling

A

A method to increase spatial resolution via bilinear interpolation, computing each new pixel as a weighted average of its neighbors. Used in decoders, e.g. in U-Net.
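
A minimal PyTorch sketch (shapes chosen for illustration):

```python
import torch
import torch.nn as nn

up = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False)
x = torch.randn(1, 64, 16, 16)  # e.g. a decoder feature map
print(up(x).shape)              # torch.Size([1, 64, 32, 32])
```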

10
Q

Conv2D

A

A convolution operation in neural networks for 2D images, extracting features like edges and patterns. Parameters include number of filters, kernel size, padding, and stride.
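
A minimal PyTorch sketch (parameters chosen for illustration):

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(in_channels=3, out_channels=16,
                 kernel_size=3, stride=1, padding=1)  # padding=1 keeps H and W
x = torch.randn(1, 3, 32, 32)   # one RGB image
print(conv(x).shape)            # torch.Size([1, 16, 32, 32]) -- one map per filter
```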

11
Q

What is the encoder in U-Net?

A

The part of the model responsible for feature extraction; it progressively reduces spatial resolution, e.g. with max pooling.
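
A hypothetical single encoder stage (a real U-Net also saves the pre-pooling features for the skip connection):

```python
import torch.nn as nn

def encoder_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(2),  # halves H and W, e.g. 64x64 -> 32x32
    )
```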

12
Q

What is the decoder in U-Net?

A

Part of the model that restores resolution via upsampling, often using transposed convolution.
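
One hypothetical decoder step, doubling the resolution:

```python
import torch
import torch.nn as nn

up = nn.ConvTranspose2d(in_channels=128, out_channels=64,
                        kernel_size=2, stride=2)
x = torch.randn(1, 128, 32, 32)
print(up(x).shape)  # torch.Size([1, 64, 64, 64])
```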

13
Q

What are skip connections?

A

Links between encoder and decoder layers that help preserve spatial details in the image.
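
In U-Net they are typically implemented as channel-wise concatenation (shapes hypothetical):

```python
import torch

enc_feat = torch.randn(1, 64, 32, 32)  # saved encoder features
dec_feat = torch.randn(1, 64, 32, 32)  # upsampled decoder features
merged = torch.cat([enc_feat, dec_feat], dim=1)
print(merged.shape)  # torch.Size([1, 128, 32, 32])
```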

14
Q

Is max pooling the only way to downsample?

A

No. You can also use strided convolutions or attention-based pooling.
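
A strided convolution that halves the resolution like 2x2 pooling, but with learnable weights:

```python
import torch
import torch.nn as nn

down = nn.Conv2d(64, 64, kernel_size=3, stride=2, padding=1)
x = torch.randn(1, 64, 32, 32)
print(down(x).shape)  # torch.Size([1, 64, 16, 16])
```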

15
Q

What can replace transposed convolution in the decoder?

A

You can use upsample + convolution (e.g. nn.Upsample + Conv2D) or pixel shuffle.
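
Sketches of both options (channel counts chosen for illustration):

```python
import torch
import torch.nn as nn

# Option 1: upsample + convolution (avoids the checkerboard artifacts
# that transposed convolutions can produce)
up_conv = nn.Sequential(
    nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False),
    nn.Conv2d(128, 64, kernel_size=3, padding=1),
)
print(up_conv(torch.randn(1, 128, 16, 16)).shape)  # torch.Size([1, 64, 32, 32])

# Option 2: pixel shuffle, trading channels for resolution:
# (C*r^2, H, W) -> (C, H*r, W*r)
shuffle = nn.PixelShuffle(upscale_factor=2)
print(shuffle(torch.randn(1, 256, 16, 16)).shape)  # torch.Size([1, 64, 32, 32])
```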

16
Q

How can skip connections be improved?

A

By adding attention (e.g. an SE block or CBAM) to the skipped features, or by using residual connections instead of concatenation.
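
A minimal squeeze-and-excitation block as one way to reweight skip features (a sketch, not the exact SE-Net implementation):

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        w = x.mean(dim=(2, 3))                      # squeeze: global average pool
        w = self.fc(w).unsqueeze(-1).unsqueeze(-1)  # per-channel gates in (0, 1)
        return x * w                                # excite: rescale each channel

print(SEBlock(64)(torch.randn(1, 64, 32, 32)).shape)  # torch.Size([1, 64, 32, 32])
```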

17
Q

What are popular U-Net variants?

A

UNet++, Attention UNet, and ResUNet; related segmentation architectures include DeepLab, PSPNet, and HRNet.

18
Q

Why perform ablation studies?

A

To measure how much each component of the architecture contributes to performance, by removing or replacing components one at a time.

19
Q

What is UNet++?

A

A U-Net variant with nested, denser skip connections between encoder and decoder blocks.

20
Q

What is Attention UNet?

A

A U-Net that incorporates attention modules to highlight important regions in the input.

21
Q

What is the role of skip connections?

A

Their role is to combine detailed information from shallow layers (high resolution) with semantic information from deep layers (more context).

22
Q

When do you need to use if __name__ == "__main__":?

A

On Windows (and sometimes in Jupyter), the worker processes launched by DataLoader (with num_workers > 0) need the entry point guarded by __main__ to prevent multiprocessing from re-running the script recursively.
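
A minimal sketch of the guard (dataset and loader parameters are illustrative):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

def main():
    ds = TensorDataset(torch.randn(100, 3, 32, 32), torch.randint(0, 2, (100,)))
    loader = DataLoader(ds, batch_size=16, num_workers=2)  # spawns worker processes
    for x, y in loader:
        pass  # training step would go here

if __name__ == "__main__":  # workers re-import this module; the guard stops re-execution
    main()
```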