4 Stochastic Differential Equations Flashcards

1
Q

stochastic differential equations:
defining Lipschitz continuity

A

f, σ : R → R are Lipschitz continuous, meaning there exist constants k_1, k_2 such that
|f(x)-f(y)| ≤ k_1|x-y|
and
|σ(x)-σ(y)| ≤ k_2|x-y|
for all x, y ∈ R

taking K = max(k_1, k_2), we can restate this as the existence of a single constant K with
|f(x)-f(y)| ≤ K|x-y|
and
|σ(x)-σ(y)| ≤ K|x-y|

roughly speaking, this is close to saying the functions are differentiable with bounded derivative

Lipschitz functions grow at most linearly, so they behave similarly to linear functions
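
A minimal numerical sanity check of the definition (my own sketch, not from the notes): sample random pairs of points and look at the largest difference quotient |f(x)-f(y)|/|x-y|. The test functions sin(x) (globally Lipschitz with K = 1) and x² (not globally Lipschitz) are hypothetical examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def lipschitz_ratio(f, n_pairs=100_000, low=-10.0, high=10.0):
    """Estimate sup |f(x)-f(y)| / |x-y| by random sampling on [low, high]."""
    x = rng.uniform(low, high, n_pairs)
    y = rng.uniform(low, high, n_pairs)
    mask = x != y                      # avoid division by zero
    return np.max(np.abs(f(x[mask]) - f(y[mask])) / np.abs(x[mask] - y[mask]))

# f(x) = sin(x) has bounded derivative cos(x), so K = 1 works
print(lipschitz_ratio(np.sin))         # stays at or below 1
# f(x) = x**2 is NOT globally Lipschitz: the ratio grows with the sampling range
print(lipschitz_ratio(np.square))      # grows as `high` increases
```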

2
Q

stochastic differential equation

A

On [0, T], we consider the equation

dX(t) = f(X(t)) dt + σ(X(t)) dW(t),

with initial condition
X(0) = x_0 ∈ R

3
Q

A solution of:

On [0, T], we consider the equation

dX(t) = f(X(t)) dt + σ(X(t)) dW(t),

with initial condition
X(0) = x_0 ∈ R

A

is a continuous-in-time,
(F_t)_{t ∈ [0,T]}-adapted stochastic process (X(t))_{t ∈ [0,T]}

that satisfies

X(t) = x_0 + ∫_0^t f(X(s)) ds + ∫_0^t σ(X(s)) dW(s)
for all t ∈ [0,T],
which is the same as saying X is an Itô process,

as it has the form

X(t) = X(0) + ∫_0^t A(s) ds + ∫_0^t B(s) dW(s)

with X(0) = x_0,
A(t) = f(X(t)),
B(t) = σ(X(t)),
so the equation defines X(t) in terms of itself

note: if B(t) = 0 for all t, then we recover the ordinary differential equation X'(t) = f(X(t))
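
The integral form above is also what the simplest numerical scheme discretises; a minimal Euler-Maruyama sketch (my own illustration, not from the notes), with hypothetical Lipschitz coefficients f(x) = -x and σ(x) = 0.5:

```python
import numpy as np

def euler_maruyama(f, sigma, x0, T=1.0, n_steps=1_000, rng=None):
    """Approximate X(t) = x0 + int f(X(s)) ds + int sigma(X(s)) dW(s) on [0, T]."""
    rng = rng or np.random.default_rng(0)
    dt = T / n_steps
    X = np.empty(n_steps + 1)
    X[0] = x0
    dW = rng.normal(0.0, np.sqrt(dt), n_steps)   # Wiener increments
    for i in range(n_steps):
        X[i + 1] = X[i] + f(X[i]) * dt + sigma(X[i]) * dW[i]
    return X

# hypothetical Lipschitz coefficients: f(x) = -x, sigma(x) = 0.5
path = euler_maruyama(lambda x: -x, lambda x: 0.5, x0=1.0)
print(path[-1])   # one sample of X(T)
```

Shrinking the step size dt brings the discrete path closer to the true solution.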

4
Q

EXISTENCE OF A SOLUTION

A

if f, σ : R → R are Lipschitz continuous,
THEN
there exists a
UNIQUE SOLUTION
for every FIXED initial condition

5
Q

A BOUND on the solution
Lemma 4.1.5 (A priori estimates)

A

if f, σ : R → R are Lipschitz continuous,
THEN,
for the solution X and for any p > 0, there exists
a constant N_0 depending only on p, T, K such that

E[sup_{t ∈ [0,T]} |X(t)|^p] ≤ N_0 (1 + |x_0|^p)

i.e. the p-th moment of the solution (uniformly in time) is
controlled by a constant times the p-th power of the initial condition
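
A rough Monte Carlo illustration of the a priori estimate (my own sketch, not from the notes): estimate E[sup_{t ≤ T} |X(t)|^p] for the hypothetical SDE dX(t) = -X(t) dt + 0.5 dW(t) and observe that, as the initial condition varies, the estimate stays below a constant multiple of 1 + |x_0|^p.

```python
import numpy as np

rng = np.random.default_rng(1)

def sup_moment(x0, p=2, T=1.0, n_steps=500, n_paths=2_000):
    """Monte Carlo estimate of E[ sup_{t<=T} |X(t)|^p ] via Euler-Maruyama."""
    dt = T / n_steps
    X = np.full(n_paths, float(x0))
    running_max = np.abs(X) ** p
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        X = X + (-X) * dt + 0.5 * dW           # f(x) = -x, sigma(x) = 0.5 (hypothetical)
        running_max = np.maximum(running_max, np.abs(X) ** p)
    return running_max.mean()

for x0 in [0.0, 1.0, 2.0, 4.0]:
    est = sup_moment(x0)
    print(x0, est, est / (1 + abs(x0) ** 2))   # the last ratio stays bounded
```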

6
Q

Exercise 4.1.4. Show that under Assumption 4.1.1 the integrals in (4.4) are well defined.

A

the dW stochastic integral is well defined because its integrand σ(X(s)) is adapted and continuous (so it lies in the admissible class S): X is a continuous, adapted Itô process and σ is continuous, so the composition σ(X(s)) inherits these properties

similarly, f(X(s)) is continuous in s, so the ds integral is well defined pathwise

7
Q

AN EXAMPLE FROM LECTURE: A LINEAR EQUATION

Exercise 4.1.7. Let G : [0, ∞) → R be a function of class C^1
such that G(0) = 0. Let a, b ∈ R.
Find a solution to the equation

dX(t) = aX(t) dt + bX(t) dG(t),
X(0) = x, t ∈ [0, T].

In the above, of course dG(t) = G'(t) dt, since G is continuously differentiable

A

we are solving
dX(t)/dt = aX(t) + bX(t)G'(t),
which is separable

assuming X exists and is positive, we have
(1/X(t)) (dX/dt) = a + bG'(t)
thus
d/dt (ln X(t)) = a + bG'(t)
∫_0^t d/ds (ln X(s)) ds = ∫_0^t (a + bG'(s)) ds

ln X(t) - ln X(0) = at + bG(t) - (a·0 + bG(0))
with X(0) = x
and G(0) = 0,
giving
ln X(t) = ln x + at + bG(t)

X(t) = x exp(at + bG(t)),
which we can check by differentiation is indeed a solution
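
A quick symbolic check of this solution (my own sketch, not from the notes), using sympy with the hypothetical choice G(t) = sin(t), which is C^1 and satisfies G(0) = 0:

```python
import sympy as sp

t, a, b, x = sp.symbols('t a b x', real=True)
G = sp.sin(t)                      # hypothetical C^1 function with G(0) = 0
X = x * sp.exp(a * t + b * G)      # candidate solution X(t) = x exp(at + bG(t))

# verify dX/dt = a X + b X G'(t) and the initial condition X(0) = x
residual = sp.diff(X, t) - (a * X + b * X * sp.diff(G, t))
print(sp.simplify(residual))       # 0
print(X.subs(t, 0))                # x
```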

8
Q

Exercise 4.1.8. Let W be a one-dimensional Wiener process. In the solution that you found in
the exercise above, replace G(t) by W(t), that is, set X(t) = x exp(at + bW(t)).
Write an equation for X

A

X(t) := x exp(at + bW(t))
Define
Y(t) := at + bW(t)
and let
u(y) := x exp(y)
(so all derivatives of u with respect to y are equal to u itself)

Then Y is an Itô process with differential
dY(t) = a dt + b dW(t),
and u ∈ C²

So by Itô's formula:
dX(t) = du(Y(t))
= u'(Y(t)) a dt + u'(Y(t)) b dW(t) + 0.5 u''(Y(t)) b² dt
= x exp(Y(t)) a dt + x exp(Y(t)) b dW(t) + 0.5 x exp(Y(t)) b² dt
= X(t) a dt + X(t) b dW(t) + 0.5 X(t) b² dt

The initial condition is X(0) = x exp(0 + bW(0)) = x, since W(0) = 0,

so the equation we are looking for is
dX(t) = (a + 0.5b²) X(t) dt + b X(t) dW(t),
X(0) = x

we can check it is a solution!!!
IN THE EXAM: IF GIVEN THIS PROCESS, YOU SHOULD BE ABLE TO JUSTIFY THAT IT SOLVES THE DIFFERENTIAL EQUATION

for the exam, "knowing by heart that this is a solution": X(t) = x exp((a - 0.5b²)t + bW(t)) solves dX(t) = aX(t) dt + bX(t) dW(t), X(0) = x

to justify it, define Y(t) = (a - 0.5b²)t + bW(t) and u(y) = x exp(y), so that X(t) = u(Y(t)); applying Itô's formula then shows that the differential of X is exactly the equation given
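
A numerical cross-check (my own sketch, not from the notes): simulate dX(t) = (a + 0.5b²)X(t) dt + bX(t) dW(t) with Euler-Maruyama and compare against the closed form X(t) = x exp(at + bW(t)) driven by the same Brownian path; the parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)
a, b, x, T, n = 0.1, 0.3, 1.0, 1.0, 100_000   # hypothetical parameters
dt = T / n
dW = rng.normal(0.0, np.sqrt(dt), n)
W = np.cumsum(dW)

# Euler-Maruyama for dX = (a + 0.5*b**2) X dt + b X dW
X = x
for i in range(n):
    X = X + (a + 0.5 * b**2) * X * dt + b * X * dW[i]

exact = x * np.exp(a * T + b * W[-1])          # closed-form solution at time T
print(X, exact)                                # agree up to discretisation error
```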

9
Q

this was an example of solving a linear stochastic differential equation

A

the rest of the content in the notes is for L5 students

but at L3 YOU SHOULD KNOW:

if given a linear stochastic differential equation, you should know
what the solution is

and be able to justify that it is the solution

in the exam: if given the equation and asked to show that a given process is a solution, you should be able to do it

by writing the process as a smooth function composed with an Itô process,
applying Itô's formula, and concluding that the differential of the process is exactly the given equation

i.e. if we know X, we can write the differential dX(t);
conversely, given the initial condition and dX(t), we can find X

10
Q

setup for the Feynman-Kac formula:
the equation

A

We look at the equation with
f, σ : R → R Lipschitz continuous, meaning there exists a constant K with
|f(x)-f(y)| ≤ K|x-y|
|σ(x)-σ(y)| ≤ K|x-y|

the equation
dX(t) = f(X(t)) dt + σ(X(t)) dW(t),
with initial condition
X(0) = x ∈ R,
THUS
ADMITS A UNIQUE SOLUTION
Xˣ(t)
(the superscript notation emphasises the dependence on the initial condition)

11
Q

Feynman-Kac formula:
define differential operator

A

we only look at the one-dimensional case:
Lφ(x) = 0.5 σ²(x) ∂²ₓₓφ(x) + f(x) ∂ₓφ(x)

where
φ : R → R is twice continuously differentiable

here f and σ are the coefficients of the equation
dX(t) = f(X(t)) dt + σ(X(t)) dW(t)
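
A numerical illustration of how the operator L relates to the process (my own sketch, not from the notes): for a C² test function φ and small h, (E[φ(Xˣ(h))] - φ(x)) / h should be close to Lφ(x). The choices f(x) = -x, σ(x) = 0.5 and φ(x) = x² are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

f = lambda x: -x            # hypothetical drift
sigma = lambda x: 0.5       # hypothetical diffusion coefficient
phi = lambda x: x**2        # test function, C^2
L_phi = lambda x: 0.5 * sigma(x)**2 * 2 + f(x) * 2 * x   # L phi = 0.5*sigma^2*phi'' + f*phi'

x0, h, n_steps, n_paths = 1.0, 0.01, 100, 200_000
dt = h / n_steps
X = np.full(n_paths, x0)
for _ in range(n_steps):
    X += f(X) * dt + sigma(X) * rng.normal(0.0, np.sqrt(dt), n_paths)

print((phi(X).mean() - phi(x0)) / h)   # Monte Carlo estimate of (E[phi(X(h))] - phi(x0)) / h
print(L_phi(x0))                       # 0.5*0.25*2 + (-1)*2 = -1.75; the two should be close
```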

12
Q

Feynman-Kac formula

A

Suppose u : [0,T] × R → R, u(t,x), is a solution of
∂ₜu(t,x) + Lu(t,x) = 0 with terminal condition u(T,x) = g(x)

so we take a function u which satisfies this equation on all of [0,T] × R (a space-time rectangle); the data is the terminal condition:
at the terminal-time boundary we impose u(T,x) = g(x)

the Feynman-Kac theorem says this problem
has a unique solution, and it is represented by
u(t,x) = E[g(Xˣ(T-t))],

which uses the solution Xˣ of the stochastic differential equation

checking the terminal condition: at t = T this gives u(T,x) = E[g(Xˣ(0))] = g(x), since Xˣ(0) = x almost surely
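
A minimal Monte Carlo sketch of the representation formula (my own illustration, not from the notes): estimate u(t,x) = E[g(Xˣ(T-t))] by simulating the SDE with an Euler-Maruyama scheme. The coefficients f(x) = ax, σ(x) = bx, the payoff g(x) = x, and all parameter values are hypothetical; for this choice E[Xˣ(s)] = x e^{as}, which gives a closed form to compare against.

```python
import numpy as np

rng = np.random.default_rng(4)
a, b = 0.05, 0.2                       # hypothetical coefficients: f(x)=a*x, sigma(x)=b*x
g = lambda x: x                        # hypothetical terminal data
T, t, x = 1.0, 0.25, 1.0
n_steps, n_paths = 400, 100_000
dt = (T - t) / n_steps                 # simulate X^x over a horizon of length T - t

X = np.full(n_paths, x)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    X = X + a * X * dt + b * X * dW

u_mc = g(X).mean()                     # Monte Carlo estimate of u(t, x) = E[g(X^x(T-t))]
print(u_mc, x * np.exp(a * (T - t)))   # closed form E[X^x(T-t)] = x e^{a(T-t)} for comparison
```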

13
Q

Markov family
DEFINITION

A

A family of stochastic processes Xˣ,
x ∈ R, is a TIME-HOMOGENEOUS MARKOV FAMILY
((Xˣ(t))_{t ≥ 0})_{x ∈ R}, one process for each initial point, written as a superscript x,

IF the following conditions hold:

1) For each t ≥ 0 and each fixed Borel set A ∈ B(R), the map
x ↦ P(Xˣ(t) ∈ A) is Borel measurable
(a technical measurability condition; don't pay too much attention to it)

2) P(Xˣ(0) = x) = 1:
each process starts from x at time 0

3) for all bounded measurable functions g : R → R and for all t, s ≥ 0,

E[g(Xˣ(t + s)) | F_s] = ψ_g(t, Xˣ(s)),

where ψ_g(t, x) := E[g(Xˣ(t))]
(the σ-algebra F_s contains the information up to time s,
and ψ_g(t, x) is the expectation of g of the solution at time t started from x;

so condition 3 says: to predict the process t units further ahead given the history up to time s, we may plug in the current position Xˣ(s) as a new starting point and run the process for time t; see the numerical sketch below)
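
A rough Monte Carlo illustration of condition 3 (my own sketch, not from the notes). It only checks the averaged consequence E[g(Xˣ(t+s))] = E[ψ_g(t, Xˣ(s))] obtained by taking expectations on both sides; for an Euler-Maruyama scheme this holds essentially by construction, since restarting the simulation from Xˣ(s) with fresh noise is the same as continuing the original path. The SDE dX = -X dt + 0.5 dW and g(x) = x² are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)
f = lambda x: -x                        # hypothetical drift
sig = 0.5                               # hypothetical (constant) diffusion coefficient
g = lambda x: x**2

def simulate(x0, horizon, n_paths=50_000, n_steps=200):
    """Euler-Maruyama samples of X^{x0}(horizon), one per path."""
    dt = horizon / n_steps
    X = np.full(n_paths, float(x0)) if np.isscalar(x0) else np.array(x0, dtype=float)
    for _ in range(n_steps):
        X = X + f(X) * dt + sig * rng.normal(0.0, np.sqrt(dt), X.shape[0])
    return X

x, s, t = 1.0, 0.3, 0.5
lhs = g(simulate(x, s + t)).mean()      # E[g(X^x(t+s))]
X_s = simulate(x, s)                    # samples of X^x(s)
rhs = g(simulate(X_s, t)).mean()        # restart from X^x(s), run for time t, average g
print(lhs, rhs)                         # approximately equal
```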

14
Q

solutions of stochastic differential equations are Markov

A

it turns out that the solutions of a stochastic differential equation satisfy the Markov property:

if we take the solution for each starting point, the resulting family satisfies the Markov property

Thm 4.4.3.
For x ∈ R^d, let Xˣ be the solution of equation (4.7). Then the family Xˣ, x ∈ R^d,
is a time-homogeneous Markov family.
