Online Learning Flashcards
(13 cards)
What are the two types of batch learning?
Full Batch and Mini-batch
How does full batch learning work?
Full batch learning uses all the data and computes the true gradient.
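A minimal sketch of what "the true gradient" means, using a toy linear-regression problem (the data, learning rate, and iteration count are illustrative assumptions, not from the cards):

```python
import numpy as np

# Toy data: y = 2x + 1 plus noise (hypothetical example).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 2 * X[:, 0] + 1 + 0.1 * rng.normal(size=100)

# Add a bias column so the model is y ≈ X_b @ w.
X_b = np.hstack([X, np.ones((100, 1))])
w = np.zeros(2)

# Full-batch gradient descent: every step uses ALL 100 samples,
# so the gradient of the mean squared error is exact.
for _ in range(500):
    grad = 2 / len(X_b) * X_b.T @ (X_b @ w - y)
    w -= 0.1 * grad

print(w)  # close to [2, 1]
```

Because the whole dataset enters each step, successive updates are deterministic given the data, which is what makes convergence smooth.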
What is the advantage of full batch learning?
It is simpler to reason about and has smoother, more consistent convergence.
What are the disadvantages of Full Batch Learning?
The data might not be static, resulting in concept drift. The model then needs to be retrained repeatedly, which takes a long time if every update uses all of the available data.
How does mini-batch learning work?
It uses part of the data and computes an estimate of the gradient.
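The same toy regression as a mini-batch sketch: each step samples a random subset and computes a noisy estimate of the gradient (the batch size of 32 and other numbers are illustrative assumptions):

```python
import numpy as np

# Toy data: y = 3x - 0.5 plus noise (hypothetical example).
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 1))
y = 3 * X[:, 0] - 0.5 + 0.1 * rng.normal(size=1000)
X_b = np.hstack([X, np.ones((1000, 1))])
w = np.zeros(2)

batch_size = 32
for _ in range(2000):
    # Each step sees only a random subset, so the gradient is an
    # estimate of the full-batch gradient, not the exact value.
    idx = rng.choice(len(X_b), size=batch_size, replace=False)
    grad = 2 / batch_size * X_b[idx].T @ (X_b[idx] @ w - y[idx])
    w -= 0.05 * grad

print(w)  # roughly [3, -0.5]
```

The per-step cost no longer depends on the full dataset size, at the price of gradient noise from the sampling.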
What are the advantages of Mini Batch Learning?
Only the more recent data is used, making each update less computationally expensive. Older data affects the model less, which enables incremental learning.
What are the disadvantages of Mini-batch learning?
More complexity is added in the choice of batch size. Convergence will be less stable depending on the data fluctuations in the batch.
How does online learning work?
Model parameters are updated whenever a new observation arrives. The model is always adjusting to changes in the data.
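A sketch of the update rule: one gradient step per incoming observation. To show the "always adjusting" behavior, the stream's true slope flips halfway through (a hypothetical drift); all numbers here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
w = np.zeros(2)
lr = 0.05  # the learning rate controls how hard each observation hits

# Simulated stream: the true relationship changes at t = 2000,
# and the online model tracks the change.
for t in range(4000):
    x = rng.normal()
    slope = 1.0 if t < 2000 else -1.0
    y = slope * x + 0.5 + 0.05 * rng.normal()
    x_b = np.array([x, 1.0])
    pred = x_b @ w
    w -= lr * 2 * (pred - y) * x_b  # single-observation gradient step

print(w)  # has adapted to the new slope, roughly [-1, 0.5]
```

The same mechanism that lets the model track drift is what makes a burst of bad observations immediately harmful: each one moves the parameters by an amount scaled by the learning rate.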
What are the disadvantages of Online Learning?
Bad data has an immediate impact on performance, moderated only by the learning rate. It requires close monitoring, and convergence is less stable.
What are the three types of data shift that cause model decay?
- Covariate Shifts
- Prior probability shifts
- Concept drifts
What are covariate shifts?
Changes in the distribution of the input variables between the training data and the test data, while the relationship to the target variables is unchanged.
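A crude illustration of detecting a covariate shift by comparing a feature's mean in training versus live data (the distributions, sample sizes, and the z > 3 threshold are illustrative assumptions; real monitoring would use a proper two-sample test):

```python
import numpy as np

rng = np.random.default_rng(3)
# Training inputs vs. later "live" inputs: the feature's
# distribution shifts while the input→target rule stays the same.
x_train = rng.normal(0.0, 1.0, 1000)
x_live = rng.normal(1.5, 1.0, 1000)

# Compare the difference in means against its standard error.
se = np.sqrt(x_train.var() / 1000 + x_live.var() / 1000)
z = abs(x_train.mean() - x_live.mean()) / se
print(z > 3)  # a large z flags a shifted input distribution
```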
What are prior probability shifts?
Changes in the distribution of the target variables while the input variables are unchanged.
What are concept drifts?
The relationship between input and output changes over time in unforeseen ways.