5) Convex optimisation Flashcards
(11 cards)
What is an objective function, and how are global and local minimisers defined?
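For reference on this card, the standard definitions, sketched in the usual notation (the symbol f for the objective is an assumption, since the card fixes none):

```latex
% An objective function is a map f : \mathbb{R}^n \to \mathbb{R} to be minimised.
% w^* is a global minimiser if
f(w^*) \le f(w) \quad \text{for all } w,
% and a local minimiser if this holds for all w with \|w - w^*\| < \delta
% for some \delta > 0.
```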
What is a necessary condition for a local minimiser of a continuously differentiable function?
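The condition usually meant here is the first-order condition, stated as a hedged sketch of the expected answer:

```latex
% If f is continuously differentiable and w^* is a local minimiser,
% then the gradient must vanish at w^*:
\nabla f(w^*) = 0 .
% This is necessary but not sufficient: saddle points also satisfy it.
```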
What are the key properties of minimisers for convex functions defined on convex sets?
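A sketch of the properties this card likely targets (standard facts of convex optimisation; the card's answer side is not shown):

```latex
\begin{itemize}
  \item Every local minimiser of a convex $f$ on a convex set $C$ is a global minimiser.
  \item The set of minimisers $\{\, w \in C : f(w) = \min_{v \in C} f(v) \,\}$ is itself convex.
  \item If $f$ is strictly convex, the minimiser, when one exists, is unique.
\end{itemize}
```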
For the objective function Φ(w) = ½∥Aw − b∥², what condition characterises its global minimisers?
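The characterisation in question is the normal equations: setting ∇Φ(w) = Aᵀ(Aw − b) = 0 gives AᵀAw = Aᵀb. A minimal numerical check (NumPy; the 20×5 problem size is an arbitrary assumption):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))   # assumed example problem, tall and full rank
b = rng.standard_normal(20)

# Global minimisers of Phi(w) = 0.5 * ||A w - b||^2 satisfy the normal
# equations A^T A w = A^T b; here A^T A is invertible, so w* is unique.
w_star = np.linalg.solve(A.T @ A, A.T @ b)

# The gradient A^T (A w* - b) should vanish up to floating-point error.
print(np.linalg.norm(A.T @ (A @ w_star - b)))  # ~1e-13 or smaller
```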
What is the basic idea of gradient descent for convex optimisation?
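A minimal sketch of the idea: repeat w_{k+1} = w_k − α∇f(w_k), stepping against the gradient until (approximate) convergence. The constant step size below is an illustrative assumption:

```python
import numpy as np

def gradient_descent(grad, w0, alpha=0.1, n_steps=100):
    """Plain gradient descent with a constant step size alpha."""
    w = np.asarray(w0, dtype=float)
    for _ in range(n_steps):
        w = w - alpha * grad(w)  # move against the gradient
    return w

# Example: minimise f(w) = 0.5 * ||w||^2, whose gradient is w itself.
print(gradient_descent(lambda w: w, w0=[3.0, -2.0]))  # close to [0, 0]
```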
In gradient descent, why is the direction of fastest descent given by the negative gradient?
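The usual one-line argument, sketched here since the card only poses the question: to first order, f(w + αd) ≈ f(w) + α⟨∇f(w), d⟩, so the steepest descent direction is the unit vector d minimising this inner product:

```latex
% By the Cauchy--Schwarz inequality, for any unit vector d,
\langle \nabla f(w), d \rangle \;\ge\; -\|\nabla f(w)\| ,
% with equality exactly at
d = -\,\frac{\nabla f(w)}{\|\nabla f(w)\|} ,
% so the negative gradient is the locally fastest descent direction.
```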
What is exact line search in gradient descent?
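Exact line search chooses the step size that minimises the objective along the current descent direction, α_k = argmin_{α ≥ 0} f(w_k − α∇f(w_k)). A sketch using SciPy's scalar minimiser; the search interval (0, 10) is an arbitrary assumption:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def gd_exact_line_search(f, grad, w0, n_steps=50):
    """Gradient descent with the step size chosen by exact line search."""
    w = np.asarray(w0, dtype=float)
    for _ in range(n_steps):
        d = -grad(w)  # descent direction
        # Minimise the 1-D function alpha -> f(w + alpha * d).
        res = minimize_scalar(lambda a: f(w + a * d),
                              bounds=(0.0, 10.0), method="bounded")
        w = w + res.x * d
    return w
```

For the quadratic Φ(w) = ½∥Aw − b∥² the 1-D problem even has a closed form, α = ∥d∥²/∥Ad∥² with d = −∇Φ(w), so no numerical search is needed.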
What does it mean for a function to be Lipschitz continuous?
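For reference, the definition the card is asking for, stated for a generic map g (a notational assumption):

```latex
% g is Lipschitz continuous with constant L \ge 0 if
\| g(u) - g(v) \| \;\le\; L \,\| u - v \| \quad \text{for all } u, v .
% For a Lipschitz continuous gradient, take g = \nabla f.
```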
What convergence guarantee does gradient descent have for convex, smooth functions with Lipschitz continuous gradients?
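The standard guarantee, sketched in its common textbook form for convex f with L-Lipschitz gradient and step size α = 1/L (constants vary slightly between references):

```latex
f(w_k) - f(w^*) \;\le\; \frac{L \,\| w_0 - w^* \|^2}{2k} ,
% i.e. the objective gap decays at rate O(1/k).
```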
What is stochastic gradient descent (SGD), and how does it address the computational cost of large datasets?
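A minimal sketch: for a finite-sum objective f(w) = (1/n) Σᵢ fᵢ(w), each SGD step uses the gradient of one randomly sampled term, so the per-step cost does not grow with the dataset size. The decaying step schedule below is an illustrative choice:

```python
import numpy as np

def sgd(grad_i, n_samples, w0, alpha0=0.5, n_steps=1000, seed=0):
    """SGD on f(w) = (1/n) * sum_i f_i(w); grad_i(w, i) returns grad f_i(w)."""
    rng = np.random.default_rng(seed)
    w = np.asarray(w0, dtype=float)
    for k in range(1, n_steps + 1):
        i = rng.integers(n_samples)      # sample one data point uniformly
        alpha = alpha0 / np.sqrt(k)      # O(1/sqrt(k)) decaying step size
        w = w - alpha * grad_i(w, i)     # cheap single-sample gradient step
    return w
```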
What convergence guarantee does stochastic gradient descent (SGD) have for convex, smooth functions?
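For contrast with the deterministic rate two cards earlier, the usual guarantee for convex, smooth objectives with suitably decaying step sizes is an expected O(1/√k) gap, often stated for the averaged iterate (exact constants depend on gradient-variance assumptions):

```latex
\mathbb{E}\!\left[ f(\bar{w}_k) \right] - f(w^*) \;=\; O\!\left( \tfrac{1}{\sqrt{k}} \right) ,
\qquad \bar{w}_k = \frac{1}{k} \sum_{j=1}^{k} w_j .
```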