Transfer Learning Flashcards
(5 cards)
What is transfer learning?
Starting from a pre-trained model and adapting its learned features to a new task.
Why reuse early CNN layers in transfer learning?
They capture generic visual features (edges, textures) that generalize across different datasets.
How do you adapt a pre-trained CNN for new classes?
Replace the original classifier head with a new dense layer whose output size matches the number of target classes; its weights start randomly initialized while the backbone keeps its pre-trained weights.
When should you train only the head, train all layers, or use hybrid fine-tuning?
Head only (backbone frozen): small datasets, for speed and less overfitting. Train all: large datasets or a domain far from the pre-training data. Hybrid: train the head first, then unfreeze the later layers at a low learning rate for a balance of the two.
Why does transfer learning often outperform training from scratch?
Pre-trained features provide a significant head start, achieving high accuracy in fewer epochs on modest datasets.