Transfer Learning Flashcards

(5 cards)

1
Q

What is transfer learning?

A

Starting from a pre-trained model and adapting its learned features to a new task.

2
Q

Why reuse early CNN layers in transfer learning?

A

They capture generic visual features (edges, textures) that generalize across different datasets.

3
Q

How do you adapt a pre-trained CNN for new classes?

A

Replace the original classifier head with a new dense layer sized to your target labels.

4
Q

When should you train only the head, train all layers, or use hybrid fine-tuning?

A

Train head only (backbone frozen): small datasets, for speed and less overfitting. Train all layers: large datasets or a very different domain. Hybrid: train the head first, then progressively unfreeze later layers for a balance of both.

5
Q

Why does transfer learning often outperform training from scratch?

A

Pre-trained features give the model a head start, so it reaches high accuracy in fewer epochs, even on modest datasets.
