Week 4 Flashcards

1
Q

Difference between exact function and interpolation polynomial

A
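The interpolation polynomial p_N agrees with the exact function f at the interpolation nodes x_0, …, x_N; away from the nodes they generally differ, and the difference is the interpolation error (remainder) r(x) = f(x) - p_N(x).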
2
Q

Intermediate Value Theorem

A

For any C^1 function which vanishes at a and b,

there is at least one point inside [a,b] where its derivative vanishes.

(Strictly, this statement is Rolle's Theorem rather than the IVT.)

3
Q

Theorem of error of polynomial interpolation

A
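A standard statement (assuming f is N+1 times continuously differentiable on [a,b] and p_N interpolates f at x_0, …, x_N):

f(x) - p_N(x) = (f^(N+1)(ζ))/(N+1)! · (x - x_0)…(x - x_N)

for some ζ = ζ(x) ∈ [a,b].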
4
Q

Error on a mesh

A
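Probably the bound for N + 1 equispaced nodes with spacing h (an assumption; the notes may state it differently):

|f(x) - p_N(x)| <= (h^(N+1))/(4(N+1)) · max over [a,b] of |f^(N+1)|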
5
Q

Prove the interpolation error theorem

A

Introduce q(y), which is equal to zero at the N + 2 points y = x_0, …, x_N and y = x.

Differentiate N + 1 times (each application of Rolle/IVT removes one zero), so q^(N+1)(y) vanishes at some ζ ∈ [a,b].

Then for y = ζ we get

r(x) = (f^(N+1)(ζ))/(N+1)!

See pg 39 LN
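A fuller sketch of the same argument, writing ω(y) = (y - x_0)…(y - x_N) (this symbol is an assumption; the notes may use different notation). For fixed x, set

q(y) = f(y) - p_N(y) - r(x) ω(y), with r(x) chosen so that q(x) = 0

Then q vanishes at the N + 2 points above, and since p_N^(N+1) ≡ 0 and ω^(N+1) ≡ (N+1)!,

0 = q^(N+1)(ζ) = f^(N+1)(ζ) - r(x) (N+1)! => r(x) = (f^(N+1)(ζ))/(N+1)!

and therefore f(x) - p_N(x) = r(x) ω(x) = (f^(N+1)(ζ))/(N+1)! · (x - x_0)…(x - x_N).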

6
Q

Interpolation error estimate (the error function)

A

r(x) = (f^(N+1)(ζ))/(N+1)! · (x - x_0)…(x - x_N), where ζ is some point in [a,b]

7
Q

Approximate 1/(1+x)

A

1/(1+x) ~ 1 - x + x^2 - x^3 + O(x^4), valid for |x| < 1 (geometric series)

8
Q

Break down components of approximation of f’(x)

A
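Presumably the forward-difference breakdown used in the following cards:

f'(x) ≈ (f(x+h) - f(x))/h

and the computed value differs from f'(x) by a method (truncation) error of O(h) plus a machine (rounding) error of O(ε/h).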
9
Q

Consequence of machine error for approximation of f’(x)

A

The machine error is O(ε/h); therefore choosing a small h causes the machine error to grow.

10
Q

Explain this graph

A

Decreasing h along the x-axis causes the method error to decrease on the y-axis.

Below a certain h, the machine error starts to increase the overall error; the line becomes jagged because machine error is effectively random.
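A minimal Python sketch that reproduces this kind of graph (the choice f(x) = e^x at x = 1 and the forward difference are assumptions for illustration):

```python
import numpy as np
import matplotlib.pyplot as plt

f, x, exact = np.exp, 1.0, np.exp(1.0)          # test function with known derivative
hs = np.logspace(-1, -14, 200)                  # h decreasing from 1e-1 to 1e-14
err = np.abs((f(x + hs) - f(x)) / hs - exact)   # overall error of the forward difference

plt.loglog(hs, err)
plt.gca().invert_xaxis()       # h decreases along the x-axis, as in the graph
plt.xlabel("h")
plt.ylabel("overall error")
plt.show()
```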

11
Q

How to choose h for numerical differentiation to minimise error? Why?

A

We want the machine error to (roughly) equal the method error:

O(h) ~ O(ε/h)
=> O(h^2) ~ O(ε)
=> O(h) ~ O(sqrt(ε))

Now ε is of order 10^-16 in Python (double precision),
=> h ≈ 10^-8
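A quick check in Python:

```python
import numpy as np

eps = np.finfo(float).eps     # machine epsilon for double precision, about 2.2e-16
print(eps, np.sqrt(eps))      # sqrt(eps) is about 1.5e-8, matching h ~ 10^-8
```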

12
Q

Central difference approximation

A
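The standard formula:

f'(x) ≈ (f(x+h) - f(x-h))/(2h)

with method error O(h^2) (see the next card).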
13
Q

How to find method error for central difference approximation

A
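Subtract the Taylor expansions of f(x+h) and f(x-h); the even-order terms cancel:

f(x+h) - f(x-h) = 2h f'(x) + (h^3/3) f'''(x) + O(h^5)

=> (f(x+h) - f(x-h))/(2h) = f'(x) + (h^2/6) f'''(x) + O(h^4)

so the method error is O(h^2).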
14
Q

General formula for the order of the overall error given the method error

A

If the method error is O(h^p), the overall error is O(h^p) + O(ε/h); balancing the two terms gives a best achievable error of O(ε^(p/(p+1))). As this is bounded below by a power of ε, you can't get rid of machine error.

15
Q

Importance of Chebyshev Points

A

Chebyshev points are the roots of Chebyshev polynomials and are used to minimize the problem of Runge's phenomenon in polynomial interpolation.

In polynomial interpolation, especially with high-degree polynomials, a phenomenon known as Runge's phenomenon can occur: large oscillations appear at the edges of the interpolation interval, leading to poor approximations away from the interpolation points.
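A short Python sketch of the effect (Runge's function 1/(1+25x^2) on [-1,1] is the usual example; numpy's Polynomial.fit is used here just as a convenient interpolator):

```python
import numpy as np

runge = lambda x: 1 / (1 + 25 * x**2)   # classic example exhibiting Runge's phenomenon
N = 12                                  # polynomial degree
xx = np.linspace(-1, 1, 1001)           # fine grid for measuring the max error

for name, nodes in [
    ("equispaced", np.linspace(-1, 1, N + 1)),
    ("Chebyshev ", np.cos((2 * np.arange(N + 1) + 1) * np.pi / (2 * (N + 1)))),
]:
    p = np.polynomial.Polynomial.fit(nodes, runge(nodes), N)  # degree-N interpolant
    print(name, np.max(np.abs(p(xx) - runge(xx))))            # max error on [-1, 1]
```

The equispaced max error is large (the oscillations at the edges), while the Chebyshev one is orders of magnitude smaller.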

16
Q

What are Chebyshev polynomials

A

Chebyshev polynomials are a sequence of orthogonal polynomials that arise in various approximation problems. They are defined on the interval [-1,1], e.g. via T_n(cos θ) = cos(nθ).

17
Q

What are Chebyshev nodes

A

Chebyshev nodes of the first kind for n interpolation points are typically defined as the roots of the Chebyshev polynomial of the first kind, T_n. These nodes are calculated using the formula:

x_k = cos((2k - 1)π/(2n)), k = 1, …, n

These points are spaced more densely near the ends of the interval and more sparsely near the middle.
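In Python (n = 8 is an arbitrary choice):

```python
import numpy as np

n = 8
nodes = np.cos((2 * np.arange(1, n + 1) - 1) * np.pi / (2 * n))  # roots of T_n
print(np.sort(nodes))   # clustered near ±1, sparser near 0
```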

18
Q

Which assumption is required for Taylor expansion

A

The function is sufficiently smooth (it has enough continuous derivatives for the order of expansion used).

19
Q

Taylor expansion of f(x+h), f(x-h), f(x+2h)

A
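Assuming f is sufficiently smooth:

f(x+h) = f(x) + h f'(x) + (h^2/2) f''(x) + (h^3/6) f'''(x) + O(h^4)

f(x-h) = f(x) - h f'(x) + (h^2/2) f''(x) - (h^3/6) f'''(x) + O(h^4)

f(x+2h) = f(x) + 2h f'(x) + 2h^2 f''(x) + (4h^3/3) f'''(x) + O(h^4)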
20
Q

Taylor expansion of the first derivative (central difference)

A

For small h (less than 1)
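Assuming f is smooth enough, the expansions give

f'(x) = (f(x+h) - f(x-h))/(2h) - (h^2/6) f'''(x) + O(h^4)

so for small h the central difference approximates f'(x) with method error O(h^2).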

21
Q

Numerical evaluation of approximation of first derivative

A
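A minimal sketch (f(x) = e^x at x = 1 is an assumed test case; the card's original example may differ):

```python
import numpy as np

f, x, exact = np.exp, 1.0, np.exp(1.0)     # exact derivative of e^x at x = 1 is e
for h in [1e-1, 1e-4, 1e-8, 1e-12]:
    fwd = (f(x + h) - f(x)) / h            # forward difference: O(h) + O(eps/h)
    cen = (f(x + h) - f(x - h)) / (2 * h)  # central difference: O(h^2) + O(eps/h)
    print(f"h={h:.0e}  forward err={abs(fwd - exact):.1e}  central err={abs(cen - exact):.1e}")
```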
22
Q

Find optimal h for O(ε/h)

A

Let E(h) = u h + v ε/h (method error plus machine error).

Find the minimum with respect to h,

which is obtained at:
u - v ε/h^2 = 0

=> h = sqrt((εv)/u) = O(sqrt(ε))

which is often of order 10^-8.
Which is often 10-8