Optimization Considerations Flashcards
(11 cards)
What does optimization focus on?
Iteratively improving model performance by adjusting parameters and hyperparameters to minimize a loss function or maximize a desired objective, leading to more accurate and efficient models
What is hyperparameter tuning?
Adjusting parameters that control the learning process (learning rate, number of layers or trees, regularization strength)
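A minimal sketch of hyperparameter tuning as a grid search: the regularization strength of a ridge model (a hypothetical setup with synthetic data) is chosen by validation error.

```python
import numpy as np

# Synthetic regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
true_w = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_w + rng.normal(scale=0.5, size=100)

X_train, y_train = X[:70], y[:70]
X_val, y_val = X[70:], y[70:]

def fit_ridge(X, y, alpha):
    # Closed-form ridge solution: (X^T X + alpha I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

best_alpha, best_err = None, float("inf")
for alpha in [0.01, 0.1, 1.0, 10.0]:          # the hyperparameter grid
    w = fit_ridge(X_train, y_train, alpha)
    err = np.mean((X_val @ w - y_val) ** 2)   # validation MSE
    if err < best_err:
        best_alpha, best_err = alpha, err
```

The same loop generalizes to any hyperparameter (learning rate, tree depth, etc.): train with each candidate value, score on held-out data, keep the best.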
What is feature selection?
Identifying the most relevant features to improve model performance and reduce complexity
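A simple filter-style sketch of feature selection: score each feature by its absolute correlation with the target and keep the top k (synthetic data; function name is illustrative).

```python
import numpy as np

# Synthetic data where only features 0 and 4 actually matter.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))
y = 3.0 * X[:, 0] - 2.0 * X[:, 4] + rng.normal(scale=0.1, size=200)

def select_k_best(X, y, k):
    # Score each feature by absolute Pearson correlation with y.
    scores = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                       for j in range(X.shape[1])])
    return np.sort(np.argsort(scores)[-k:])  # indices of the top-k features

selected = select_k_best(X, y, k=2)
```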
What is model selection?
Choosing the appropriate machine learning algorithm for your problem and data
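In practice model selection often means comparing candidate models on held-out data. A minimal sketch, assuming synthetic linear data and two candidates (a constant baseline and a least-squares line):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(-3, 3, size=120)
y = 2.0 * x + 1.0 + rng.normal(scale=0.3, size=120)  # genuinely linear data

x_tr, y_tr, x_va, y_va = x[:80], y[:80], x[80:], y[80:]

def mse(pred, y):
    return np.mean((pred - y) ** 2)

# Candidate 1: constant (mean) predictor
const_err = mse(np.full_like(y_va, y_tr.mean()), y_va)

# Candidate 2: least-squares line
slope, intercept = np.polyfit(x_tr, y_tr, 1)
lin_err = mse(slope * x_va + intercept, y_va)

chosen = "linear" if lin_err < const_err else "constant"
```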
What is regularization?
Techniques that penalize model complexity (e.g., L1/L2 penalties, dropout, early stopping) to prevent overfitting
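A minimal sketch of L2 (ridge) regularization on synthetic data: adding the penalty alpha * ||w||^2 to the least-squares objective shrinks the fitted coefficients, trading a little bias for lower variance.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 10))   # few samples, many features -> overfitting risk
y = X[:, 0] + rng.normal(scale=0.5, size=30)

def fit(X, y, alpha):
    # alpha = 0 gives ordinary least squares; alpha > 0 adds the L2 penalty.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

w_plain = fit(X, y, alpha=0.0)
w_ridge = fit(X, y, alpha=10.0)
# The regularized weight vector has a smaller norm than the unregularized one.
```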
What are ensemble methods?
Combining multiple models to improve performance and robustness
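A minimal bagging sketch of an ensemble: fit the same flexible model on several bootstrap resamples and average the predictions. By convexity of squared error, the averaged prediction's error is never worse than the members' average error.

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(-2, 2, size=100)
y = x**3 + rng.normal(scale=0.5, size=100)

x_test = np.linspace(-2, 2, 50)
y_true = x_test**3

preds = []
for _ in range(25):
    idx = rng.integers(0, len(x), size=len(x))    # bootstrap sample
    coeffs = np.polyfit(x[idx], y[idx], deg=5)    # a deliberately flexible model
    preds.append(np.polyval(coeffs, x_test))

ensemble_pred = np.mean(preds, axis=0)            # averaging reduces variance
```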
What are common optimization algorithms?
Gradient Descent
Stochastic Gradient Descent
Adam
Bayesian Optimization
What is gradient descent?
A fundamental algorithm for minimizing loss functions by iteratively adjusting parameters in the direction of the negative gradient
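The gradient descent loop can be sketched on a simple one-dimensional objective, f(w) = (w - 3)^2, whose gradient is 2(w - 3):

```python
def grad(w):
    # Gradient of f(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

w = 0.0      # initial guess
lr = 0.1     # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)   # step in the direction of the negative gradient
# w converges toward the minimizer at 3.0
```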
What is Stochastic Gradient Descent?
A variant of gradient descent that updates parameters using one example (or a small mini-batch) at a time, making each step much cheaper and training faster on large datasets
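A mini-batch SGD sketch for linear regression on synthetic data: the data is shuffled each epoch and the MSE gradient is computed on one small batch per step.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(500, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=500)

w = np.zeros(3)
lr, batch_size = 0.1, 32
for epoch in range(50):
    perm = rng.permutation(len(X))            # shuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        g = 2.0 / len(idx) * Xb.T @ (Xb @ w - yb)  # MSE gradient on the batch
        w -= lr * g
```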
What is Adam?
An adaptive optimization algorithm that combines momentum (a running first-moment estimate of the gradient) with RMSprop-style per-parameter step sizes (a running second-moment estimate) for efficient training
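The Adam update can be sketched directly from its two moment estimates, here minimizing a simple quadratic (the target vector is illustrative):

```python
import numpy as np

target = np.array([1.0, -2.0, 3.0])

def grad(w):
    # Gradient of f(w) = sum((w - target)^2)
    return 2.0 * (w - target)

w = np.zeros(3)
m = np.zeros(3)   # first moment (momentum) estimate
v = np.zeros(3)   # second moment (RMSprop-style) estimate
lr, beta1, beta2, eps = 0.1, 0.9, 0.999, 1e-8

for t in range(1, 501):
    g = grad(w)
    m = beta1 * m + (1 - beta1) * g            # update biased first moment
    v = beta2 * v + (1 - beta2) * g**2         # update biased second moment
    m_hat = m / (1 - beta1**t)                 # bias correction
    v_hat = v / (1 - beta2**t)
    w -= lr * m_hat / (np.sqrt(v_hat) + eps)   # per-parameter adaptive step
```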
What is Bayesian Optimization?
A probabilistic approach to finding optimal hyperparameters by building a model of the objective function
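A highly simplified sketch of the surrogate-model idea behind Bayesian optimization: fit a cheap model of the objective (here a quadratic rather than the usual Gaussian process) to past evaluations, then evaluate the expensive objective at the surrogate's minimizer. Real Bayesian optimization also uses an acquisition function to balance exploration and exploitation; the objective here is hypothetical.

```python
import numpy as np

def objective(x):
    # Stand-in for an expensive black-box function, e.g. validation loss
    # as a function of a hyperparameter.
    return (x - 1.7) ** 2 + 0.5

xs = [-3.0, 0.0, 3.0]                 # initial design points
ys = [objective(x) for x in xs]

for _ in range(5):
    a, b, c = np.polyfit(xs, ys, 2)   # quadratic surrogate of the objective
    x_next = float(np.clip(-b / (2 * a), -3.0, 3.0))  # surrogate's minimizer
    xs.append(x_next)                 # evaluate the true objective there
    ys.append(objective(x_next))

best_x = xs[int(np.argmin(ys))]
```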