5) Convex optimisation Flashcards

(11 cards)

1
Q

What is an objective function, and how are global and local minimisers defined?

A
An objective function is a function Φ : ℝⁿ → ℝ whose value we seek to minimise. A point w* is a global minimiser if Φ(w*) ≤ Φ(w) for every w in the domain, and a local minimiser if Φ(w*) ≤ Φ(w) for all w in some neighbourhood of w*. Every global minimiser is a local minimiser, but not conversely.
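A one-line example (my own illustration, not from the deck) of a local minimiser that is not global:

\Phi(w) = w^4 - 2w^2 + \tfrac{1}{2}w,
\qquad \Phi(-1) = -\tfrac{3}{2} \;<\; \Phi(1) = -\tfrac{1}{2}

The function has two wells, near w ≈ −1 and w ≈ +1; the left well is deeper, so only the minimiser near −1 is global, while the one near +1 is merely local.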
2
Q

What is a necessary condition for a local minimiser of a continuously differentiable function?

A
If Φ is continuously differentiable and w* is a local minimiser, then ∇Φ(w*) = 0 (the first-order necessary condition). The condition is necessary but not sufficient: stationary points can also be maximisers or saddle points.
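A standard counterexample showing the condition is not sufficient (added for illustration):

\Phi(w) = w^3, \qquad \Phi'(0) = 0,

yet w = 0 is neither a local minimiser nor a local maximiser; it is an inflection point.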
3
Q

What are key properties of minimisers for convex functions defined on convex sets?

A
For a convex function Φ on a convex set: (i) every local minimiser is a global minimiser; (ii) the set of minimisers is convex; (iii) if Φ is strictly convex, there is at most one minimiser; and (iv) if Φ is also differentiable, ∇Φ(w*) = 0 is both necessary and sufficient for w* to be a global minimiser.
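A two-line sketch of why local implies global for convex Φ (a standard argument, added for reference): if Φ(v) < Φ(w*) for some v, then for every λ ∈ (0, 1],

\Phi\bigl((1-\lambda)w^* + \lambda v\bigr)
\le (1-\lambda)\,\Phi(w^*) + \lambda\,\Phi(v)
< \Phi(w^*),

and letting λ → 0 produces points arbitrarily close to w* with strictly smaller value, contradicting local minimality.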
4
Q

For the objective function Φ(w) = ½∥Aw − b∥², what condition characterises global minimisers?

A
Since ∇Φ(w) = Aᵀ(Aw − b) and Φ is convex, w is a global minimiser if and only if it satisfies the normal equations AᵀAw = Aᵀb. If A has full column rank, AᵀA is invertible and the minimiser w* = (AᵀA)⁻¹Aᵀb is unique.
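A minimal numerical sketch (my own example data, assuming NumPy) that solves the normal equations and checks the stationarity condition:

import numpy as np

# Solve the least-squares problem Phi(w) = 0.5 * ||A w - b||^2
# via the normal equations A^T A w = A^T b.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))        # full column rank with probability 1
b = rng.standard_normal(20)

w = np.linalg.solve(A.T @ A, A.T @ b)   # unique minimiser here
grad = A.T @ (A @ w - b)                # gradient of Phi at w
print(np.linalg.norm(grad))             # ~1e-14: stationarity holds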
5
Q

What is the basic idea of gradient descent for convex optimisation?

A
Start from an initial guess w₀ and repeatedly step against the gradient: w_{k+1} = w_k − α_k ∇Φ(w_k), where α_k > 0 is the step size. Because −∇Φ(w_k) is the direction of steepest local decrease, each step reduces Φ for a small enough step size; for convex Φ the iterates approach a global minimiser.
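A compact sketch of the iteration on the least-squares objective from card 4 (illustrative; the step size 1/L is the standard safe choice justified by cards 8 and 9):

import numpy as np

# Gradient descent w_{k+1} = w_k - (1/L) * grad Phi(w_k)
# for Phi(w) = 0.5 * ||A w - b||^2, whose gradient is A^T (A w - b).
def gradient_descent(A, b, steps=500):
    w = np.zeros(A.shape[1])
    L = np.linalg.norm(A.T @ A, 2)      # Lipschitz constant of the gradient
    for _ in range(steps):
        w -= (1.0 / L) * (A.T @ (A @ w - b))
    return w

rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
print(np.linalg.norm(gradient_descent(A, b)
                     - np.linalg.solve(A.T @ A, A.T @ b)))  # ~0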
6
Q

In gradient descent, why is the direction of fastest descent given by the negative gradient?

A
By the first-order Taylor expansion, Φ(w + αd) ≈ Φ(w) + α∇Φ(w)ᵀd for small α > 0. Over all unit directions d, the Cauchy–Schwarz inequality gives ∇Φ(w)ᵀd ≥ −∥∇Φ(w)∥, with equality exactly when d = −∇Φ(w)/∥∇Φ(w)∥, so the negative gradient direction produces the fastest local decrease.
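The same argument written as a one-line optimisation (standard, added for reference):

\min_{\|d\| = 1} \nabla\Phi(w)^{\top} d \;=\; -\,\|\nabla\Phi(w)\|,
\qquad \text{attained at } d = -\frac{\nabla\Phi(w)}{\|\nabla\Phi(w)\|}.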
7
Q

What is exact line search in gradient descent?

A
Exact line search chooses, at each iteration, the step size that minimises the objective along the current descent direction: α_k = argmin_{α ≥ 0} Φ(w_k − α∇Φ(w_k)). It yields the greatest possible decrease along that direction, but the one-dimensional minimisation is usually only affordable when it has a closed form, e.g. for quadratic objectives.
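For a quadratic Φ(w) = ½wᵀQw − bᵀw the exact step has the closed form α = (gᵀg)/(gᵀQg) with g = ∇Φ(w) = Qw − b; a minimal sketch with my own example data:

import numpy as np

# Steepest descent with exact line search on Phi(w) = 0.5 w^T Q w - b^T w.
rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
Q = M @ M.T + np.eye(5)            # symmetric positive definite
b = rng.standard_normal(5)

w = np.zeros(5)
for _ in range(100):
    g = Q @ w - b                  # gradient at w
    if np.linalg.norm(g) < 1e-12:  # already stationary
        break
    alpha = (g @ g) / (g @ Q @ g)  # exact minimiser along -g
    w -= alpha * g
print(np.linalg.norm(Q @ w - b))   # ~0: w solves Q w = b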
8
Q

What does it mean for a function to be Lipschitz continuous?

A
A function f is Lipschitz continuous with constant L ≥ 0 if ∥f(x) − f(y)∥ ≤ L∥x − y∥ for all x, y in its domain, i.e. f cannot change faster than linearly in its argument. In smooth convex optimisation the assumption is usually placed on the gradient: ∥∇Φ(x) − ∇Φ(y)∥ ≤ L∥x − y∥ (Φ is L-smooth), which limits how quickly the gradient varies and thereby determines a safe step size (α ≤ 1/L).
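For the least-squares objective of card 4 the gradient is Lipschitz with L = ∥AᵀA∥₂ (a standard computation, added for illustration):

\|\nabla\Phi(x) - \nabla\Phi(y)\|
= \|A^{\top}A\,(x - y)\|
\le \|A^{\top}A\|_2 \, \|x - y\|.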
9
Q

What convergence guarantee does gradient descent have for convex, smooth functions with Lipschitz continuous gradients?

A
If Φ is convex with L-Lipschitz gradient and the step size is fixed at α = 1/L, gradient descent satisfies Φ(w_k) − Φ(w*) ≤ L∥w₀ − w*∥²/(2k). The objective gap therefore decays at rate O(1/k), so reaching accuracy ε takes O(1/ε) iterations.
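Rearranging the bound gives the usual iteration-complexity form (a direct consequence, added for reference):

\Phi(w_k) - \Phi(w^\star) \le \varepsilon
\quad \text{whenever} \quad
k \ge \frac{L\,\|w_0 - w^\star\|^{2}}{2\varepsilon}.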
10
Q

What is stochastic gradient descent (SGD), and how does it address the computational cost of large datasets?

A
For finite-sum objectives Φ(w) = (1/n) Σᵢ Φᵢ(w), a full gradient costs one pass over all n data points per iteration. SGD instead samples one index i_k (or a small mini-batch) uniformly at random and updates w_{k+1} = w_k − α_k ∇Φ_{i_k}(w_k). The sampled gradient is an unbiased estimate of ∇Φ(w_k), and each update costs O(1) rather than O(n), so far more (noisier) steps fit in the same compute budget.
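A minimal SGD sketch on a least-squares finite sum (my own example data; the step-size constant 0.1 and the 1/√k decay are illustrative choices, the decay matching card 11):

import numpy as np

# SGD for Phi(w) = (1/n) * sum_i 0.5 * (a_i^T w - b_i)^2: each step uses
# the gradient of one randomly sampled term, an unbiased estimate of the
# full gradient, at O(1) cost per step instead of O(n).
rng = np.random.default_rng(2)
n, d = 200, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

w = np.zeros(d)
for k in range(1, 5001):
    i = rng.integers(n)                   # sample one data point
    g_i = (A[i] @ w - b[i]) * A[i]        # stochastic gradient
    w -= (0.1 / np.sqrt(k)) * g_i         # decaying step size

w_star = np.linalg.solve(A.T @ A, A.T @ b)
print(np.linalg.norm(w - w_star))         # small: SGD hovers near w*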
11
Q

What convergence guarantee does stochastic gradient descent (SGD) have for convex, smooth functions?

A
For convex, smooth objectives with bounded gradient noise and a decaying step size α_k ∝ 1/√k, SGD guarantees E[Φ(w̄_k)] − Φ(w*) = O(1/√k), where w̄_k is an average of the iterates. Reaching accuracy ε therefore takes O(1/ε²) iterations, versus O(1/ε) for full gradient descent, but each SGD iteration is roughly n times cheaper.
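A back-of-envelope comparison of total per-sample gradient cost (standard, added for illustration):

\text{GD: } n \cdot O(1/\varepsilon)
\quad \text{vs.} \quad
\text{SGD: } O(1/\varepsilon^{2}),

so SGD is cheaper overall whenever n exceeds roughly 1/ε, i.e. for large datasets and moderate accuracy requirements.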