Discrete-time Markov chains Flashcards

1
Q

Define what it means that state j is accessible from state i (for i, j ∈ E)

A

We say that state j is accessible from state i, written i → j, if the chain may ever
visit state j with positive probability, starting from i. In other words, i → j if there
exists m ≥ 0 such that p_ij(m) > 0.

2
Q

Define what it means that states i and j communicate (for i, j ∈ E)

A

States i and j communicate, written i ↔ j, if i → j and j → i, i.e. if there exist
n, m ≥ 0 such that p_ij(n) > 0 and p_ji(m) > 0.
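
A small numerical sketch of both definitions on a finite state space (numpy; the 3-state matrix below is made up for illustration): i → j iff some power P^m with m ≥ 0 has a positive (i, j) entry, and for |E| states it suffices to check m ≤ |E| − 1.

import numpy as np

# Made-up 3-state chain: state 1 is absorbing, so 0 -> 1 but 1 does not lead back to 0.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 1.0, 0.0],
              [0.3, 0.0, 0.7]])

A = (P > 0).astype(int)                    # one-step accessibility
R = np.eye(len(P), dtype=int)              # m = 0: each state is accessible from itself
for _ in range(len(P) - 1):
    R = ((R + R @ A) > 0).astype(int)      # allow paths one step longer

accessible = R > 0                         # accessible[i, j]: i -> j
communicate = accessible & accessible.T    # i <-> j: i -> j and j -> i
print(accessible)
print(communicate)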

3
Q

State the Markov condition

A

P(X_n = j | X_{n−1} = i, X_{n−2} = x_{n−2}, ..., X_0 = x_0) = P(X_n = j | X_{n−1} = i)

4
Q

Define what it means for P to be a stochastic matrix and prove that the transition matrix is a stochastic matrix.

A
  1. P has non-negative entries: p_ij ≥ 0 for all i, j.
  2. P has row sums equal to 1: sum_{j∈E} p_ij = 1 for all i.

Proof (pg 16): p_ij = P(X_{n+1} = j | X_n = i) ≥ 0 since it is a probability, and by the law of total probability sum_{j∈E} p_ij = sum_{j∈E} P(X_{n+1} = j | X_n = i) = P(X_{n+1} ∈ E | X_n = i) = 1.

5
Q

Define the n-step transition matrix

A

p_ij(n) = P(X_{m+n} = j | X_m = i); the n-step transition matrix is P_n = (p_ij(n))_{i,j∈E}.

6
Q

State the Chapman-Kolmogorov equations

A
p_ij(m + n) = sum_{l∈E} p_il(m) p_lj(n),
that is,
P_{m+n} = P_m P_n,
and hence
P_n = P^n.
pg 18
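A quick numerical check of these identities (numpy; the 3-state stochastic matrix below is made up purely for illustration):

import numpy as np

# Any stochastic matrix works here; this one is an arbitrary illustration.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

m, n = 2, 3
P_m = np.linalg.matrix_power(P, m)
P_n = np.linalg.matrix_power(P, n)

# Chapman-Kolmogorov: P_{m+n} = P_m P_n, and P_n = P^n.
print(np.allclose(np.linalg.matrix_power(P, m + n), P_m @ P_n))   # True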
7
Q

Write the marginal distribution of the chain at time n ∈ N_0

A

ν^(n)_i = P(X_n = i)

ν^(n) = ν^(0) P_n = ν^(0) P^n
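
A minimal sketch of propagating the marginal distribution (numpy; the matrix and the choice of ν^(0) below are arbitrary illustrations):

import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
nu0 = np.array([1.0, 0.0, 0.0])             # start in state 0 with probability 1

n = 5
nu_n = nu0 @ np.linalg.matrix_power(P, n)   # nu^(n) = nu^(0) P^n, as a row vector
print(nu_n, nu_n.sum())                     # nu^(n) is again a distribution: sums to 1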

8
Q

Write out the class properties for the cases closed / not closed and finite / infinite

A

pg 38

All states in a communicating class are of the same type: all recurrent or all transient, all positive or all null recurrent, and all of the same period. A class that is not closed is transient. A finite closed class is positive recurrent; an infinite closed class may be transient, null recurrent or positive recurrent.

9
Q

What is the value of the elements of a stationary distribution corresponding to transient states?

A

0

pg 53

10
Q

Define what it means for a vector λ to be a distribution on E

A

λ_j ≥ 0 for all j ∈ E, and sum_{j∈E} λ_j = 1

11
Q

What does it mean for λ to be invariant for P?

A

λP = λ

12
Q

What is the stationary distribution of a Markov chain?

A

A distribution π on E that is invariant for P, i.e. π_j ≥ 0 for all j ∈ E, sum_{j∈E} π_j = 1 and πP = π. If X_0 ∼ π then X_n ∼ π for every n.

13
Q

Stationary distribution of a doubly stochastic matrix

A

π_i = 1/N for all i ∈ E, where N = |E| (the uniform distribution; E finite)
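
A small numerical sketch (numpy; the doubly stochastic 3-state matrix below is an arbitrary illustration): compute π by solving πP = π as a left-eigenvector problem and check that it is uniform.

import numpy as np

# Doubly stochastic example: rows and columns both sum to 1.
P = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

# pi P = pi  <=>  P^T pi^T = pi^T, so take the eigenvector of P^T for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi /= pi.sum()

print(pi)                          # approx [1/3, 1/3, 1/3]
print(np.allclose(pi @ P, pi))     # invariance: pi P = pi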

14
Q

When is X time reversible?

A

Let X be stationary (X_n ∼ π for all n) and set Y_n = X_{N−n} for 0 ≤ n ≤ N.
X is time reversible if the transition matrices of X and Y are the same.
Equivalently, X is reversible if and only if the detailed balance equations hold: π_i p_ij = π_j p_ji for all i, j ∈ E.
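
A minimal numerical check of detailed balance (numpy), using a made-up birth-death chain on {0, 1, 2}; tridiagonal chains of this kind are a standard reversible example.

import numpy as np

# Made-up birth-death chain on {0, 1, 2}.
P = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.6, 0.4]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi /= pi.sum()

# Detailed balance pi_i p_ij = pi_j p_ji  <=>  the matrix F below is symmetric.
F = pi[:, None] * P
print(np.allclose(F, F.T))   # True, so this chain is time reversible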

15
Q

Show by induction that for a discrete Markov chain (X_n)_{n∈N_0}, we have
P(X_{n+m} = x_{n+m} | X_n = x_n, ..., X_0 = x_0) = P(X_{n+m} = x_{n+m} | X_n = x_n)

A

PS1 (1)

16
Q

What’s an effective way to compute p_ij(n)?

A

Diagonalize P through eigenvalue decomposition (PS1 e7)
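
A minimal numerical sketch of this (numpy; the 2-state matrix is made up for illustration, and P is assumed diagonalizable): write P = V diag(d) V^{-1}, so P^n = V diag(d^n) V^{-1} and p_ij(n) is read off without n matrix multiplications.

import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Eigendecomposition P = V diag(d) V^{-1}  =>  P^n = V diag(d**n) V^{-1}.
d, V = np.linalg.eig(P)
n = 10
Pn = np.real(V @ np.diag(d**n) @ np.linalg.inv(V))

print(Pn[0, 1])                                        # p_01(10)
print(np.allclose(Pn, np.linalg.matrix_power(P, n)))   # agrees with direct matrix powers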

17
Q

What is f_00(n)?

A

The probability that the first return to 0, starting from 0, occurs after exactly n steps, i.e. f_00(n) = P(T_0 = n | X_0 = 0).

18
Q

Define what it means for a state to be recurrent

A

j ∈ E is recurrent if P(X_n = j for some n ∈ N | X_0 = j) = 1, i.e. f_jj = 1.

Criterion: j is recurrent if and only if sum_{n=1}^∞ p_jj(n) = ∞.

19
Q

Define what it means for a state to be transient

A

j ∈ E is transient if and only if sum_{n=1}^∞ p_jj(n) < ∞, equivalently f_jj < 1.
In that case p_ij(n) → 0 as n → ∞.
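
A small numerical illustration of the two sum criteria (numpy), for a made-up 2-state chain in which state 0 is transient and state 1 is absorbing, hence recurrent: the partial sums of p_00(n) stay bounded while those of p_11(n) grow without bound.

import numpy as np

P = np.array([[0.5, 0.5],
              [0.0, 1.0]])   # state 0 leaks into the absorbing state 1

N = 200
Pn = np.eye(2)
partial = np.zeros(2)
for _ in range(N):
    Pn = Pn @ P
    partial += np.diag(Pn)   # accumulate p_00(n) and p_11(n) for n = 1, ..., N

print(partial)               # first entry ~ 1 (bounded), second entry = N (diverging)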

20
Q

What is a null recurrent state, and what is a positive recurrent state?

A

A recurrent state i ∈ E is called null if µ_i = ∞ and positive (or non-null) if µ_i < ∞.

21
Q

Define the period of a state

A

d(i) = gcd{ n ≥ 1 : p_ii(n) > 0 }
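
A minimal sketch that estimates d(i) numerically (numpy; the deterministic 2-cycle below is a made-up example, and the function only inspects p_ii(n) up to a finite horizon, so it is an illustration rather than a general algorithm).

import numpy as np
from math import gcd
from functools import reduce

def period(P, i, horizon=50):
    """gcd of the n in {1, ..., horizon} with p_ii(n) > 0; None if there are none."""
    ns, Pn = [], np.eye(len(P))
    for n in range(1, horizon + 1):
        Pn = Pn @ P
        if Pn[i, i] > 0:
            ns.append(n)
    return reduce(gcd, ns) if ns else None

P = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # deterministic 2-cycle: both states have period 2
print(period(P, 0))          # 2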

22
Q

Define the first passage time

A

T_j = min{n ≥ 1 : X_n = j}, with the convention min ∅ = ∞

23
Q

What is the return probability?

A

f_jj = P(T_j < ∞ | X_0 = j) = sum_{n=1}^∞ f_jj(n)

24
Q

Consider a discrete-time homogeneous Markov chain on a countable state space E. Suppose that the Markov chain is irreducible, has a stationary distribution denoted by π and all states are recurrent. What are the values of the stationary distribution?

A

π_i = µ_i^{−1}

for all i ∈ E, where µ_i denotes the mean recurrence time for state i
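
A Monte Carlo sketch of this identity (numpy; the 3-state chain is a made-up illustration): estimate µ_0 by simulating return times to state 0 and compare with 1/π_0, where π is computed as the left eigenvector of P for eigenvalue 1.

import numpy as np

rng = np.random.default_rng(0)

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

def return_time(P, start, rng):
    """Number of steps until the chain, started at `start`, first returns to it."""
    state, steps = start, 0
    while True:
        state = rng.choice(len(P), p=P[state])
        steps += 1
        if state == start:
            return steps

mu0_hat = np.mean([return_time(P, 0, rng) for _ in range(10000)])

vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi /= pi.sum()

print(mu0_hat, 1 / pi[0])   # the two values should be close, since pi_0 = 1/mu_0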

25
Q

Define the first passage probability

A

f_ij(n) = P(T_j = n | X_0=i)

26
Q

What is the mean recurrence time for state i?

A

µ_i = E[T_i | X_0 = i] = sum_{n=1}^∞ n f_ii(n)

for i recurrent, and µ_i = ∞ otherwise

27
Q

Define ergodic state

A

Positive recurrent and aperiodic

28
Q

What does it mean to say a class is closed? And irreducible?

A
Closed: once the chain enters the class it cannot leave it, i.e. p_ij = 0 for all i in the class and j outside it.
Irreducible: all states communicate, i.e. the class (or the whole chain) forms a single communicating class.

29
Q

What does it mean for a state to be aperiodic?

A

d(i) = 1, or the period is undefined because p_ii(n) = 0 for all n ≥ 1