Lecture 23 Flashcards

1
Q

Bisection method

A
  • Continuous function f on an interval [a, b] with sign(f(a)) != sign(f(b)), so [a, b] contains a root
  • Evaluate the midpoint m = (a + b)/2
  • Check signs: set a = m if sign(f(m)) == sign(f(a)), else set b = m
    (the subinterval that keeps the sign change survives).
    The interval is halved every iteration; see the sketch below
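A minimal Python sketch of this loop; the function name bisection, the assert, and the tolerance default are illustrative assumptions, not from the lecture:

def bisection(f, a, b, tol=1e-8):
    fa, fb = f(a), f(b)            # two evaluations before the loop
    assert fa * fb < 0, "need a sign change on [a, b]"
    while (b - a) / 2 > tol:
        m = (a + b) / 2
        fm = f(m)                  # one new evaluation per iteration
        if fm * fa > 0:            # same sign as f(a): root lies in [m, b]
            a, fa = m, fm
        else:                      # same sign as f(b), or f(m) == 0
            b, fb = m, fm
    return (a + b) / 2

For example, bisection(lambda x: x**3 - x - 1, 1, 2) brackets the same root as the opt.bisect call in card 12.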
2
Q

Convergence of an iterative method

A
e_k = |x_k - x*| (error at iteration k, where x* is the exact root). The method converges at rate r if
lim_{k -> inf} ||e_{k+1}|| / ||e_k||^r = C for some constant C > 0
r = 1: linear
r = 2: quadratic (see the sketch below for a numerical check)
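A quick numerical sanity check of r (a sketch; the error sequence is fabricated to mimic quadratic convergence):

import numpy as np

e = np.array([1e-1, 1e-2, 1e-4, 1e-8])   # made-up errors, each roughly the square of the last

# estimate r from consecutive errors: r ~ log(e_{k+1}/e_k) / log(e_k/e_{k-1})
r_est = np.log(e[2:] / e[1:-1]) / np.log(e[1:-1] / e[:-2])
print(r_est)   # [2. 2.] -- consistent with quadratic convergence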
3
Q

Bisection method convergence

A

Linear r=1

4
Q

Bisection method operations

A

Two function evaluations to start (f(a) and f(b)); after that only one per iteration is needed (f(m))

5
Q

Bisection: interval length and # iterations for a given tolerance

A

After k iterations: length = (b - a)/2^k

length <= tol requires k >= log2((b - a)/tol); see the sketch below
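A one-line check of the bound with made-up numbers (a = 1, b = 2, tol = 1e-8 are arbitrary):

import math

a, b, tol = 1.0, 2.0, 1e-8
k = math.ceil(math.log2((b - a) / tol))
print(k)   # 27: after 27 halvings the interval length is <= 1e-8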

6
Q

Newton’s method

A
Local convergence only (may fail to converge if the initial guess is not close enough to a root)
x_0 initial guess
x_{k+1} = x_k - f(x_k)/f'(x_k) (see the sketch below)
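A minimal sketch of the iteration; the stopping rule, tolerance, and maxiter are illustrative assumptions, not from the lecture:

def newton(f, fprime, x0, tol=1e-12, maxiter=50):
    x = x0
    for _ in range(maxiter):
        step = f(x) / fprime(x)    # undefined when fprime(x) == 0 (see card 15)
        x = x - step
        if abs(step) < tol:
            return x
    raise RuntimeError("no convergence: try a better x0")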
7
Q

Newton’s method convergence

A

Typically quadratic r=2

8
Q

Newton’s method operations

A

2 evaluations per iteration (function and first derivative)

9
Q

Secant method

A

Newton’s method with the first derivative replaced by a finite-difference approximation (still only local convergence).
Needs two starting guesses x_k, x_{k-1}.
f'(x_k) ≈ (f(x_k) - f(x_{k-1})) / (x_k - x_{k-1}) (see the sketch below)
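A minimal sketch under the same caveats (names, tolerance, and stopping rule are assumptions):

def secant(f, x0, x1, tol=1e-12, maxiter=50):
    f0, f1 = f(x0), f(x1)                     # two evaluations only at startup
    for _ in range(maxiter):
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)  # Newton step with approximated f'
        if abs(x2 - x1) < tol:
            return x2
        x0, f0 = x1, f1
        x1, f1 = x2, f(x2)                    # one new evaluation per iteration
    raise RuntimeError("no convergence: try different starting guesses")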

10
Q

Secant method operations

A

Only one function evaluation per iteration

11
Q

Secant method convergence

A

Superlinear: r = (1 + sqrt(5))/2 ≈ 1.618 (the golden ratio)

12
Q

Bisection method in Python

A
import scipy.optimize as opt

def f(x):
    return x**3 - x - 1

# find a root of f inside the bracketing interval [1, 2]
root = opt.bisect(f, a=1, b=2)

13
Q

Newton’s method in Python

A

import scipy.optimize as opt

def f(x):
    return x**3 - x - 1

def fprime(x):
    return 3 * x**2 - 1

# Newton's method from initial guess x0, with the analytic derivative supplied
root = opt.newton(f, x0=1, fprime=fprime)

14
Q

Secant method in Python

A

import scipy.optimize as opt

def f(x):
    return x**3 - x - 1

# same call as Newton's but without fprime; opt.newton then uses the secant method
root = opt.newton(f, x0=1)

15
Q

A step in Newton’s method can always be carried out for any smooth function and any value of the current guess xk.

A

False. The step x_{k+1} = x_k - f(x_k)/f'(x_k) is undefined whenever f'(x_k) = 0.

16
Q

Python derivative

A
from sympy import Symbol, diff, lambdify

x = Symbol('x')               # symbolic variable
y = x**2 + 1                  # symbolic expression
yp = diff(y, x)               # symbolic derivative: 2*x
f = lambdify(x, yp, 'numpy')  # compile into a numpy-callable function
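A quick usage check of the compiled derivative (a sketch; the test values are arbitrary):

import numpy as np
from sympy import Symbol, diff, lambdify

x = Symbol('x')
f = lambdify(x, diff(x**2 + 1, x), 'numpy')   # f(x) = 2*x

print(f(3.0))                   # 6.0
print(f(np.array([1.0, 2.0])))  # [2. 4.] -- the 'numpy' backend vectorizes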