Constraint Optimisation - Equality Constraints Flashcards

1
Q

What is a constraint?

A

A condition on the input points for a point to be valid. We only accept points that adhere to the given constraints.

2
Q

How do we embed the constraint in the equation?

A

We add it to the objective via a Lagrange multiplier λ.
Example:
f(x) = x^2, subject to x = 2
min_x max_λ x^2 + λ(x - 2)
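A minimal sketch of this Lagrangian in code (function names `f`, `h`, `lagrangian` are illustrative, not from the card). It shows why maximising over λ enforces the constraint: on the constraint the λ term vanishes, off it λ can drive L arbitrarily high.

```python
# Lagrangian for f(x) = x^2 subject to h(x) = x - 2 = 0 (a sketch).
def f(x):
    return x ** 2

def h(x):
    return x - 2

def lagrangian(x, lam):
    # L(x, λ) = f(x) + λ h(x)
    return f(x) + lam * h(x)

# On the constraint (x = 2), the λ term vanishes and L equals f:
print(lagrangian(2.0, 100.0))  # 4.0, independent of λ
# Off the constraint, the inner max over λ can push L arbitrarily high:
print(lagrangian(3.0, 100.0))  # 9 + 100 * 1 = 109.0
```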

3
Q

We want to find the minimum with respect to x and the maximum with respect to λ. How do we do this?

A

Differentiate with respect to each variable and equate to zero.

L(x,λ) = x^2 + λ(x - 2)
∇_x L(x,λ) = 2x + λ = 0
∇_λ L(x,λ) = x - 2 = 0
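This 2-equation system can be solved by simple substitution; a sketch (solving by hand rather than with a linear-algebra library):

```python
# Stationarity conditions for the running example:
#   dL/dx = 2x + λ = 0
#   dL/dλ = x - 2 = 0
x = 2.0          # the second equation gives x directly
lam = -2.0 * x   # then the first equation gives λ = -2x
print(x, lam)    # 2.0 -4.0

# Both conditions hold at the stationary point:
assert 2 * x + lam == 0
assert x - 2 == 0
```

So the constrained minimum sits at x = 2 with multiplier λ = -4.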

4
Q

[Picture 12]

What are the properties of the red line?

A

> The red line is the constraint
As you move along the line, the gradient of the function (not of the line itself) at each point points down towards the centre.
At one point along the line the function attains its constrained (global) minimum, and there the gradient of the function is perpendicular to the direction of the line.

5
Q

What is the equation relating the gradients at the minimum? What is the equation for the gradient of the Lagrangian at this point? What can we therefore do?

A

Gradients:
-∇_x f(x) = λ∇_x h(x)
Lagrangian gradient:
∇_x L(x,λ) = 0

Combined:
∇_x L(x,λ) = ∇_x f(x) + λ∇_x h(x) = 0

6
Q

What do the two parts of the gradient of the Lagrangian, ∇_x L(x,λ) and ∇_λ L(x,λ), do?

A

∇_x L(x,λ) = 0: the gradient of the function and the gradient of the constraint must be parallel.
This is the derivative of the Lagrangian with respect to x.
∇_λ L(x,λ) = 0: this enforces the constraint h(x) = 0.
This is the definition of the red line. Without it, we could find any point where the gradient of the function is parallel to the gradient of the constraint.
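A 2-D illustration of why both parts are needed (this example, f(x, y) = x² + y² with h(x, y) = x + y - 1, is an assumption of mine, not from the card). Stationarity in x alone yields a whole family of parallel-gradient points; the λ equation selects the one on the constraint:

```python
# Stationarity alone: ∂L/∂x = 2x + λ = 0 and ∂L/∂y = 2y + λ = 0
# give x = y = -λ/2 — one candidate point for EVERY λ.
def stationary_point(lam):
    return (-lam / 2, -lam / 2)

# The ∂L/∂λ = 0 equation is the constraint itself:
def h(x, y):
    return x + y - 1

# Only λ = -1 places the candidate on the line x + y = 1:
lam = -1.0
x, y = stationary_point(lam)
print(x, y, h(x, y))  # 0.5 0.5 0.0
```

Without the constraint equation, any λ would give a "valid" stationary point; with it, the minimiser (1/2, 1/2) is pinned down uniquely.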
