Discrete-time Markov processes Flashcards

(19 cards)

1
Q

What is a Markov process?

A

A stochastic process is a Markov process if the probability of future states depends only on the present state and not on previous states.

2
Q

What is the Markov property?

A

The Markov property states that the conditional distribution of the process at time t_{n+1}, given its history, depends only on its value at time t_n, not on earlier values.

3
Q

What is an intuitive explanation of the Markov property?

A

The Markov property means that given the current state, future events are independent of past events.

4
Q

Why is the Markov property useful?

A

The Markov property simplifies probability calculations, allowing joint probabilities to be expressed as products of simple conditional probabilities.
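As a minimal sketch of this factorization, consider a hypothetical two-state chain (the matrix P and initial distribution pi0 below are made-up numbers for illustration). The joint probability of a path is just an initial probability times a product of one-step conditionals:

```python
import numpy as np

# Hypothetical two-state chain; P and pi0 are made-up numbers.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])    # P[i, j] = P(X_{n+1} = j | X_n = i)
pi0 = np.array([0.5, 0.5])    # assumed initial distribution

# By the Markov property the joint probability factors:
# P(X_0 = 0, X_1 = 1, X_2 = 0) = pi0[0] * p_01 * p_10
joint = pi0[0] * P[0, 1] * P[1, 0]
```

Without the Markov property, this joint probability would require the full conditional P(X_2 = 0 | X_1 = 1, X_0 = 0), which in general depends on the whole history.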

5
Q

What does the law of total probability help with in a Markov process?

A

It helps calculate the probability of reaching a particular state by conditioning on the next state the process will move to.

6
Q

What is a time-homogeneous Markov chain?

A

A time-homogeneous Markov chain is a chain whose transition probabilities do not depend on the time step, i.e., P(X_{n+1} = j | X_n = i) is the same for every n.

7
Q

What is a transition matrix?

A

A transition matrix is a matrix where the (i,j)-th element represents the probability of transitioning from state i to state j in one time step.
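A transition matrix for a hypothetical 3-state chain might look like the sketch below (the numbers are made up); the defining property is that every row sums to 1, since from each state the process must go somewhere:

```python
import numpy as np

# Sketch of a transition matrix for a hypothetical 3-state chain.
# Entry P[i, j] = P(X_{n+1} = j | X_n = i); each row must sum to 1.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

row_sums = P.sum(axis=1)  # stochastic-matrix check: all ones
```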

8
Q

What is the Chapman-Kolmogorov equation?

A

The Chapman-Kolmogorov equation states that p^(m+n)_ij = Σ_k p^(m)_ik p^(n)_kj; in matrix form, P^(m+n) = P^(m) P^(n). In particular, the n-step transition matrix is the n-th power of the one-step transition matrix.
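The matrix form is easy to check numerically. Using a hypothetical two-state one-step matrix, the 3-step matrix built as (2-step matrix) times (1-step matrix) agrees with the direct matrix power:

```python
import numpy as np

# Hypothetical one-step transition matrix for a two-state chain.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Chapman-Kolmogorov in matrix form: P^(m+n) = P^(m) @ P^(n),
# so the 3-step matrix can be built as the 2-step matrix times P.
P2 = P @ P
P3 = P2 @ P
```

Since products of stochastic matrices are stochastic, the rows of P3 still sum to 1.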

9
Q

What is the n-step transition probability?

A

The n-step transition probability p^(n)_ij is the probability that a process currently in state i will be in state j after n steps.

10
Q

What are marginal probabilities in a Markov process?

A

Marginal probabilities specify the distribution of a Markov process at a particular time step without conditioning on previous states.

11
Q

What does the duration of stay in a state in a Markov chain follow?

A

The duration of stay in a state in a Markov chain follows a geometric distribution with parameter (1 - p_ii), where p_ii is the probability of staying in the same state.
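A quick simulation illustrates this (p_ii = 0.8 is an assumed value, not from the source): each step the process stays with probability p_ii and leaves with probability 1 - p_ii, so the stay duration is geometric with success probability 1 - p_ii and mean 1 / (1 - p_ii) = 5.

```python
import random

random.seed(0)
p_ii = 0.8  # hypothetical probability of staying in state i

def stay_duration(p_stay):
    """Steps spent in the state, counting the current one, before leaving."""
    n = 1
    while random.random() < p_stay:
        n += 1
    return n

# Empirical mean should be close to 1 / (1 - p_ii) = 5.
durations = [stay_duration(p_ii) for _ in range(100_000)]
mean = sum(durations) / len(durations)
```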

12
Q

What is the first step decomposition (FSD) technique?

A

The FSD technique involves conditioning on the first step of the Markov process to compute probabilities or expectations of interest.

13
Q

How can FSD be used to calculate the probability of reaching a state?

A

FSD can calculate the probability of reaching a state by conditioning on where the process goes next and applying the law of total probability.

14
Q

What is an example of using FSD to compute a probability?

A

In Example 2.2, FSD is used to compute the probability of reaching state 1 from state 2 in a Markov process by conditioning on the next state.

15
Q

What is the expected duration of stay in a state i?

A

The expected duration of stay in state i is 1 / (1 - p_ii), where p_ii is the probability of staying in state i.

16
Q

How do you solve for the probability of reaching a state using FSD?

A

By recursively writing the probability of reaching a state in terms of the probabilities of reaching it from other states and solving the resulting system of equations.
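As a sketch (this example is made up, not Example 2.2 or 2.4 from the source): take a simple random walk on states {0, 1, 2, 3} with absorbing ends, and let h_i = P(reach state 0 | start in i). First-step decomposition gives h_1 = 0.5 h_0 + 0.5 h_2 and h_2 = 0.5 h_1 + 0.5 h_3, with boundary values h_0 = 1, h_3 = 0. Moving the unknowns to the left yields a linear system:

```python
import numpy as np

# FSD equations for the two transient states, rearranged as A @ h = b:
#   h_1 - 0.5*h_2 = 0.5*h_0 = 0.5
#  -0.5*h_1 + h_2 = 0.5*h_3 = 0.0
A = np.array([[1.0, -0.5],
              [-0.5, 1.0]])
b = np.array([0.5, 0.0])
h1, h2 = np.linalg.solve(A, b)  # expect h1 = 2/3, h2 = 1/3
```

The recursive FSD equations always produce a linear system of this shape; solving it gives the hitting probabilities for all transient states at once.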

17
Q

What does P_C represent in Example 2.4?

A

P_C represents the probability that the Markov process will eventually reach state 1, given an initial distribution.

18
Q

How do you compute P_C in Example 2.4?

A

P_C is computed by taking a weighted sum of the probabilities of reaching state 1 from each initial state, weighted by the initial distribution.

19
Q

How do we handle marginal probabilities in a Markov chain?

A

Marginal probabilities can be computed by multiplying the initial distribution by the n-step transition matrix to get the probability distribution at time n.
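Concretely, with the initial distribution as a row vector, the marginal distribution at time n is pi_n = pi_0 P^n. A small sketch with a hypothetical two-state matrix (the numbers are illustrative only):

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])    # hypothetical one-step transition matrix
pi0 = np.array([1.0, 0.0])    # assumed start: state 0 with probability 1

# Marginal distribution at time n: pi_n = pi_0 @ P^n.
pi3 = pi0 @ np.linalg.matrix_power(P, 3)
```

The result is again a probability distribution, so its entries sum to 1.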