Unconstrained Optimization Flashcards

1
Q

What is optimization?

A

Optimization is the process of finding the values of the design variables that minimize (or maximize) an objective function, possibly subject to constraints.

2
Q

Write out the general equation for the gradient of a function f.

A

For f: Rⁿ → R, the gradient is the column vector of first partial derivatives:

∇f(x) = [∂f/∂x_1, ∂f/∂x_2, …, ∂f/∂x_n]^T

3
Q

Write the general equation for the Hessian of a function f

A

The Hessian is the n×n symmetric matrix of second partial derivatives:

[∇²f(x)]_ij = ∂²f/∂x_i∂x_j

4
Q

What are the general steps to find the minimum (or maximum) of a function of one variable?

A

1. Set f′(x) = 0 and solve for the stationary (candidate) points.
2. Evaluate f″(x) at each candidate: f″ > 0 → local minimum, f″ < 0 → local maximum.
3. Compare the candidates (and any boundary behavior) to identify the minimum (or maximum).

5
Q

What does it mean for a problem to be multimodal?

A

The problem has multiple local minima.

6
Q

What are the general steps to find the minimum (or maximum) of a function with multiple variables?

A

1. Set ∇f(x) = 0 and solve for the stationary points.
2. Check the Hessian at each: positive definite → local minimum, negative definite → local maximum, indefinite → saddle point.
3. Compare the candidates to identify the minimum (or maximum).

7
Q

When is x* a global minimizer?

A

x* is a global minimizer if f(x*) ≤ f(x) for all x.

8
Q

When is x* a local minimizer?

A

x* is a local minimizer if f(x*) ≤ f(x) for all x in some neighborhood of x*.

9
Q

How can you put the problem (function) in standard form if you wish to maximize a function?

A

Minimize the negative of the function: maximizing f(x) is equivalent to minimizing −f(x).

10
Q

When is x* a (weak) local solution?

A

x* is a (weak) local solution if f(x*) ≤ f(x) for all x in a neighborhood of x*.

11
Q

When is x* a strong local solution?

A

x* is a strong (strict) local solution if f(x*) < f(x) for all x ≠ x* in a neighborhood of x*.

12
Q

For any two points x and y, a convex function satisfies…[what is the inequality?]

(Hint: this is Dr Kennedy’s notation)

A

f(αx + (1 − α)y) ≤ αf(x) + (1 − α)f(y) for all α ∈ [0, 1]

(standard convexity inequality; symbols may differ slightly from the course notes)

13
Q

For any two points x1 and x2, a convex function satisfies…[what is the inequality?]

(Hint: this is Dr German’s notation)

A

f(αx_1 + (1 − α)x_2) ≤ αf(x_1) + (1 − α)f(x_2) for all α ∈ [0, 1]

(standard convexity inequality; symbols may differ slightly from the course notes)

14
Q

There are two equivalent conditions to identify convex functions. What are they?

A

1. First-order condition: f(y) ≥ f(x) + ∇f(x)^T (y − x) for all x, y (the function lies above its tangent planes).
2. Second-order condition: the Hessian ∇²f(x) is positive semidefinite everywhere.

15
Q

For a convex function, _____ optimality implies ______ optimality.

A

For a convex function, local optimality implies global optimality.

16
Q

_____ _____ optimality implies _____ _____ optimality.

A

Strong local optimality implies weak local optimality.

20
Q

In Dr. Kennedy’s class, we put the function f(x) into a standard form. What is this form?

A
21
Q

Let f(x) be a convex function. If x* is a local solution to the unconstrained minimization problem, then x* is a _____ ______ to the unconstrained minimization problem.

A

x* is a global solution to the unconstrained minimization problem.

22
Q

What is an Optimization Algorithm?

A

An iterative procedure that starts from an initial guess x_0 and generates a sequence of iterates x_1, x_2, … intended to converge to a solution x*.

29
Q

What are direct search methods?

A

Methods that use only objective function values (no derivative information) to explore the design space.

30
Q

What are line search methods?

A

Methods that, at each iteration, choose a search direction p_k and then find a step size α_k along that direction, updating x_{k+1} = x_k + α_k p_k.

31
Q

What are trust region methods?

A

Methods that build a local model of f around the current iterate and minimize that model within a region where it is trusted to be accurate; the region's radius is grown or shrunk based on how well the model predicts the actual decrease in f.

33
Q

Give the equation for a line search according to Dr. German, and explain each of the terms.

A

In standard notation (Dr. German's symbols may differ): x_{k+1} = x_k + α_k p_k, where x_k is the current iterate, p_k is the search direction, and α_k is the step size found by the line search.

34
Q

Give the steps to performing a line search, according to Dr. Kennedy

A
35
Q

Give the steps to performing a line search, according to Dr. German

A
39
Q

List methods to find a search direction in a line search.

A

Steepest Descent
Conjugate Gradient (Conjugate Directions)
Newton’s Method (Newton Search Directions)
Quasi-Newton Method

40
Q

Give an example of a zeroth-order method

A

Golden Section Search (or any direct search method): it uses only function evaluations, no derivatives.

41
Q

Give an example of a first-order method.

A

Steepest Descent: it uses gradient (first-derivative) information.

42
Q

Give an example of a second-order method.

A

Newton's Method: it uses Hessian (second-derivative) information.

43
Q

Give the equation of a descent direction (either Kennedy’s or German’s notation)

A

p_k is a descent direction at x_k if ∇f(x_k)^T p_k < 0.

44
Q

Give examples of methods used to find the step size in a line search.

A

Polynomial Approximations
Golden Section Method

45
Q

What is the equation for the first Wolfe Condition (Armijo rule, sufficient decrease).

(either Dr Kennedy’s or Dr German’s notation)

A

f(x_k + α_k p_k) ≤ f(x_k) + c_1 α_k ∇f(x_k)^T p_k, with 0 < c_1 < 1

46
Q

What is the equation for the first Strong Wolfe Condition (Armijo rule) (sufficient decrease)

(either Dr Kennedy’s or Dr German’s notation)

A

The same as the ordinary first Wolfe condition: f(x_k + α_k p_k) ≤ f(x_k) + c_1 α_k ∇f(x_k)^T p_k. The “strong” variant changes only the curvature condition.

47
Q

What is the equation for the second Wolfe Condition (curvature condition)

(either Dr Kennedy’s or Dr German’s notation)

A

∇f(x_k + α_k p_k)^T p_k ≥ c_2 ∇f(x_k)^T p_k, with c_1 < c_2 < 1

48
Q

What is the equation for the second Strong Wolfe Condition (curvature condition)

(either Dr Kennedy’s or Dr German’s notation)

A

|∇f(x_k + α_k p_k)^T p_k| ≤ c_2 |∇f(x_k)^T p_k|

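
The two (weak) Wolfe conditions can be checked numerically. A minimal Python sketch in standard notation (the function name and the defaults c1 = 1e-4, c2 = 0.9 are illustrative assumptions, not course material):

```python
def wolfe_conditions(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    """Return (sufficient_decrease, curvature) for step alpha along p.

    Sufficient decrease (Armijo): f(x + a p) <= f(x) + c1 a grad(x)^T p
    Curvature:                    grad(x + a p)^T p >= c2 grad(x)^T p
    """
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x_new = [xi + alpha * pi for xi, pi in zip(x, p)]
    slope0 = dot(grad(x), p)  # directional derivative at x (< 0 for descent)
    sufficient = f(x_new) <= f(x) + c1 * alpha * slope0
    curvature = dot(grad(x_new), p) >= c2 * slope0
    return sufficient, curvature
```

On f(x) = x², a very small step satisfies sufficient decrease but fails the curvature condition, which is exactly the complaint in the next two cards.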
49
Q

What is the problem with the first Wolfe Condition (if we only enforce this condition)?

A

It is satisfied by arbitrarily small step sizes, so enforcing it alone does not guarantee that the algorithm makes reasonable progress.

50
Q

How does the curvature condition help prevent step sizes that are too small (Wolfe Conditions)?

Hint: what does it mean for the slope?

A

It requires the slope of f along the search direction at the new point to be greater (less negative) than c_2 times the initial slope. If the step is too small, the function is still decreasing steeply there, so the condition fails and a larger step is required.

51
Q

Give examples of direct search algorithms.

A

Grid Search
Random Walk
Random Search
Compass Search
Coordinate Pattern Search
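
One of the direct search algorithms above, compass search, can be sketched in a few lines of Python. This is an illustrative implementation (names and defaults are assumptions): poll ± step along each coordinate direction, move to an improving point, and shrink the step when no poll improves.

```python
def compass_search(f, x, step=0.5, shrink=0.5, tol=1e-6, max_iter=10_000):
    """Direct search using only function values: poll +/- step along each
    coordinate direction, move to an improving point, shrink otherwise."""
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(len(x)):
            for sign in (1.0, -1.0):
                y = list(x)
                y[i] += sign * step
                fy = f(y)
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= shrink  # no coordinate move helped: refine the mesh
    return x, fx
```

Note that no gradients appear anywhere, which is the defining property of a direct search (zeroth-order) method.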

52
Q

What is the problem with the second Wolfe Condition?

A

The second Wolfe Condition is satisfied by arbitrarily large steps

(The opposite problem of the first Wolfe Condition)

53
Q

What is backtracking?

A

Backtracking is the process of decreasing the step size by a constant percentage.

54
Q

What does backtracking do for us in terms of the Wolfe Conditions?

A

It can be shown that it is necessary to check only the first Wolfe Condition (Armijo rule, sufficient decrease) if we begin a line search with a “large” step size and then successively decrease it by a constant percentage, i.e., if we begin a line search with a “large” step size and then backtrack.

We do not need to check the second Wolfe Condition (curvature condition).
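
A minimal Python sketch of this idea: backtrack from a large initial step until the sufficient-decrease condition holds (the name and the defaults α = 1, ρ = 0.5, c_1 = 1e-4 are illustrative assumptions):

```python
def backtracking_line_search(f, grad, x, p, alpha=1.0, rho=0.5, c1=1e-4):
    """Shrink alpha by the factor rho until the sufficient-decrease
    (first Wolfe / Armijo) condition holds; per the argument above,
    the curvature condition is not checked."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    slope0 = dot(grad(x), p)  # must be negative for a descent direction
    fx = f(x)
    while f([xi + alpha * pi for xi, pi in zip(x, p)]) > fx + c1 * alpha * slope0:
        alpha *= rho
    return alpha
```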

55
Q

Graphically show the Wolfe Conditions (first, second, and strong) on an arbitrary function.

A
56
Q

Write out a generic algorithm to perform backtracking on a line search.

A

1. Choose an initial step size α > 0, a shrink factor ρ ∈ (0, 1), and a constant c_1.
2. While f(x_k + α p_k) > f(x_k) + c_1 α ∇f(x_k)^T p_k, set α ← ρα.
3. Return α.

57
Q

What are two disadvantages of using backtracking in a line search?

A

1. It can only return step sizes of the form ρ^j α_0, so the accepted step may be shorter than necessary.
2. A poor initial step size can make it inefficient, requiring many function evaluations before the condition is satisfied.

58
Q

What is the Golden Section Method (or Golden Section Search)?

A

A derivative-free, interval-reduction line search: maintain a bracket known to contain the minimum of a unimodal function and shrink it by a constant factor (related to the golden ratio) each iteration, placing the interior points so that one function evaluation can be reused.

59
Q

What are the steps in the Golden Section Method?

A

1. Bracket the minimum in [a, b].
2. Place two interior points x_1 = b − (b − a)/φ and x_2 = a + (b − a)/φ.
3. If f(x_1) < f(x_2), the minimum lies in [a, x_2]; otherwise it lies in [x_1, b]. Shrink the bracket accordingly, reusing the surviving interior point.
4. Repeat until the bracket is smaller than the tolerance.

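
The Golden Section Method can be sketched in Python (a minimal illustration; the name and tolerance default are assumptions):

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Shrink the bracket [a, b] around the minimum of a unimodal f,
    reusing one interior function evaluation per iteration."""
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    x1 = b - inv_phi * (b - a)
    x2 = a + inv_phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:            # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - inv_phi * (b - a)
            f1 = f(x1)
        else:                  # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + inv_phi * (b - a)
            f2 = f(x2)
    return (a + b) / 2
```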
62
Q

What do we use polynomial approximations for?

A

To estimate the step size in a line search.

63
Q

What is the main concern regarding the use of steepest descent?

A

Steepest descent performs very poorly: convergence is very slow.

We search in essentially the same direction at multiple iterations.

64
Q

Write out a generic algorithm to perform a line search

(Kennedy’s notation)

A

(Standard notation; Kennedy's symbols may differ.)
1. Choose a starting point x_0; set k = 0.
2. Compute a search direction p_k.
3. Find a step size α_k along p_k (e.g., satisfying the Wolfe conditions).
4. Set x_{k+1} = x_k + α_k p_k.
5. If not converged (e.g., ‖∇f(x_{k+1})‖ > tolerance), set k ← k + 1 and return to step 2.

65
Q

Write out a generic algorithm to perform steepest descent.

(Kennedy’s notation)

A

The generic line search algorithm with the search direction p_k = −∇f(x_k).

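
A minimal Python sketch of steepest descent with a backtracking step size (an illustration in standard notation, not necessarily Kennedy's; names and defaults are assumptions):

```python
import math

def steepest_descent(f, grad, x, tol=1e-6, max_iter=10_000):
    """Line search method with p_k = -grad f(x_k); the step size comes
    from simple backtracking on the sufficient-decrease condition."""
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    for _ in range(max_iter):
        g = grad(x)
        if math.sqrt(dot(g, g)) < tol:      # converged: gradient ~ 0
            break
        p = [-gi for gi in g]               # steepest descent direction
        alpha, fx, slope0 = 1.0, f(x), dot(g, p)
        while f([xi + alpha * pi for xi, pi in zip(x, p)]) > fx + 1e-4 * alpha * slope0:
            alpha *= 0.5                    # backtrack on sufficient decrease
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
    return x
```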
67
Q

Search directions are always _____ whenever an exact line search is performed.

A

Search directions are always perpendicular whenever an exact line search is performed.

68
Q

For an inexact line search, search directions are only _____ _____.

A

For an inexact line search, search directions are only approximately perpendicular.

69
Q

What is the main idea behind a polynomial approximation? What types of polynomials are the most common?

A

Evaluate f along the search direction at a few trial step sizes, fit a low-order polynomial through those points, and use the polynomial's minimizer as the estimate of the step size. Quadratic and cubic polynomials are the most common.

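
A minimal sketch of the quadratic case: fit a parabola through three samples of φ(α) = f(x + αp) and take its stationary point as the trial step (the function name is an illustrative assumption):

```python
def quadratic_step(phi, a0, a1, a2):
    """Fit a parabola through (a0, phi(a0)), (a1, phi(a1)), (a2, phi(a2))
    and return the parabola's stationary point as the trial step size."""
    f0, f1, f2 = phi(a0), phi(a1), phi(a2)
    # Vertex of the interpolating quadratic in Lagrange form
    num = f0 * (a1**2 - a2**2) + f1 * (a2**2 - a0**2) + f2 * (a0**2 - a1**2)
    den = 2.0 * (f0 * (a1 - a2) + f1 * (a2 - a0) + f2 * (a0 - a1))
    return num / den
```

If φ really is quadratic, the fit is exact and the returned step is the exact minimizer; otherwise it is only an estimate, typically refined or safeguarded.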
70
Q

If the Hessian is positive definite, give the equation for the Newton Search Direction

(Kennedy’s or German’s notation)

A

p_k = −[∇²f(x_k)]^(−1) ∇f(x_k), i.e., the solution of the linear system ∇²f(x_k) p_k = −∇f(x_k).

71
Q

When the Hessian is positive definite, the Newton Direction is a _______ ______.

A

When the Hessian is positive definite, the Newton Direction is a descent direction.
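
As an illustration, a sketch that computes the Newton direction for a two-variable function by solving the 2×2 system H p = −g directly (assumes H is positive definite; the names are illustrative assumptions):

```python
def newton_step(grad, hess, x):
    """Return the Newton direction p = -H^{-1} grad f for a 2-variable f,
    solving H p = -g with the explicit 2x2 inverse."""
    g = grad(x)
    (h11, h12), (h21, h22) = hess(x)
    det = h11 * h22 - h12 * h21        # nonzero when H is positive definite
    p1 = (-g[0] * h22 + g[1] * h12) / det
    p2 = (g[0] * h21 - g[1] * h11) / det
    return [p1, p2]
```

On a quadratic, x + p lands exactly on the minimizer in a single step, which is why Newton's method converges so quickly near a solution.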

72
Q

Concerning Newton Search Directions, name two problems that can arise if the Hessian is NOT positive definite.

A

1. The Newton direction may not be a descent direction (it can point uphill, or toward a saddle point or maximum).
2. The Hessian may be singular or ill-conditioned, so the Newton system may have no solution or a very poor one.

73
Q

Newton’s Method displays _____ convergence.

A

Newton’s Method displays quadratic convergence.

74
Q

What are Quasi-Newton Methods? Write an equation for Quasi-Newton Methods.

A

Methods that approximate the Hessian (or its inverse) using only gradient information gathered over the iterations. The search direction is p_k = −B_k^(−1) ∇f(x_k), where B_k ≈ ∇²f(x_k) is updated at every iteration (e.g., by the BFGS formula).

75
Q

What is the BFGS Method?

A

BFGS (Broyden–Fletcher–Goldfarb–Shanno) is the most widely used quasi-Newton method: it updates an approximation to the Hessian (or its inverse) using the changes in x and in the gradient between successive iterations.

76
Q

Write the equation for a Newton Direction.

Write the equation for a Quasi-Newton Direction.

(German’s notation)

A

(Standard notation; German's symbols may differ.)
Newton direction: p_k = −[∇²f(x_k)]^(−1) ∇f(x_k)
Quasi-Newton direction: p_k = −B_k^(−1) ∇f(x_k), with B_k ≈ ∇²f(x_k)

77
Q

What does it mean for a set of vectors to be conjugate with respect to a matrix?

A

A set of nonzero vectors {p_0, …, p_m} is conjugate with respect to a symmetric positive definite matrix A if p_i^T A p_j = 0 for all i ≠ j.

78
Q

How many steps will it take for the conjugate method to converge?

A

For an n-dimensional convex quadratic function, at most n steps (with exact line searches and exact arithmetic).

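
The n-step convergence property can be demonstrated with the linear conjugate gradient method, which minimizes the convex quadratic (1/2)x^T A x − b^T x (equivalently, solves Ax = b). A plain-Python sketch (illustrative, not course code):

```python
def conjugate_gradient(A, b, x=None):
    """Minimize 0.5 x^T A x - b^T x for symmetric positive definite A,
    using conjugate directions; converges in at most n steps in exact
    arithmetic."""
    n = len(b)
    x = [0.0] * n if x is None else list(x)
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    r = [bi - avi for bi, avi in zip(b, matvec(A, x))]  # residual = -gradient
    p = list(r)
    for _ in range(n):
        Ap = matvec(A, p)
        alpha = dot(r, r) / dot(p, Ap)                  # exact line search
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r_new = [ri - alpha * api for ri, api in zip(r, Ap)]
        if dot(r_new, r_new) < 1e-20:
            break
        beta = dot(r_new, r_new) / dot(r, r)            # make p_k A-conjugate
        p = [rn + beta * pi for rn, pi in zip(r_new, p)]
        r = r_new
    return x
```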
80
Q

Mutually perpendicular directions (such as the coordinate directions) are conjugate with respect to ____.

A

Mutually perpendicular directions (such as the coordinate directions) are conjugate with respect to the identity matrix.

81
Q

What is the condition number?

What is the condition number if A is symmetric positive definite?

A

κ(A) = ‖A‖ · ‖A^(−1)‖, which measures how sensitive the solution of Ax = b is to perturbations in the data.

If A is symmetric positive definite (2-norm): κ(A) = λ_max / λ_min.

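
A quick numerical check of the symmetric positive definite case, using NumPy (an assumption; any linear algebra library works):

```python
import numpy as np

# Condition number: kappa(A) = ||A|| * ||A^{-1}||.
# For symmetric positive definite A in the 2-norm, this reduces to
# the ratio lambda_max / lambda_min of the eigenvalues.
A = np.diag([100.0, 1.0])          # SPD with eigenvalues 100 and 1
kappa = np.linalg.cond(A)          # 2-norm condition number
eigs = np.linalg.eigvalsh(A)
ratio = eigs.max() / eigs.min()    # should equal kappa for SPD A
```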