Checking Markov Property and sample problems Flashcards

(15 cards)

1
Q

What is the first step decomposition technique and formula?

A

First step decomposition is a technique for computing expectations and probabilities in Markov chains by conditioning on the outcome of the first step (the next state of the process). It is an application of the law of total expectation, E[X] = E[E[X|Y]]; for example, the expected hitting time h_i from state i satisfies h_i = 1 + sum_j p_ij h_j.
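
As a minimal sketch (in Python, with a hypothetical 3-state chain rather than one from the notes), conditioning on the first step turns an expected-hitting-time question into the linear system (I - Q)h = 1 over the non-target states:

    import numpy as np

    # Hypothetical 3-state chain; we want the expected number of steps to hit state 2.
    P = np.array([[0.0, 0.5, 0.5],
                  [0.5, 0.0, 0.5],
                  [0.0, 0.0, 1.0]])
    target = 2

    # First step decomposition: h_i = 1 + sum_j P[i, j] * h_j for i != target,
    # with h_target = 0.  Rearranged: (I - Q) h = 1, Q = P restricted to the rest.
    others = [i for i in range(P.shape[0]) if i != target]
    Q = P[np.ix_(others, others)]
    h = np.linalg.solve(np.eye(len(others)) - Q, np.ones(len(others)))
    print(dict(zip(others, h.tolist())))   # {0: 2.0, 1: 2.0} for this particular P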

2
Q

How is the law of conditional expectation applied in Markov chains?

A

The law of conditional expectation (the tower law) is used to compute an expectation by conditioning on the state reached after the first step, breaking a complex problem into one simpler subproblem for each possible next state.

3
Q

In Example 5.1, how do we compute the expected time for Geraint to deliver the parcel?

A

We use first step decomposition to condition on Geraint’s next move; this yields a system of simultaneous linear equations for the expected delivery time from each location, which we then solve via the law of total expectation.

4
Q

What is the expected time for Geraint to deliver the parcel in Example 5.1?

A

The expected time for Geraint to deliver the parcel is 2.5 minutes.

5
Q

What is the second question in Example 5.1 about Geraint’s delivery?

A

The second question asks how many times Geraint will visit location B on average before delivering the parcel.
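
One standard way to compute such a visit count (a sketch, not necessarily the method used in the notes, and with a made-up chain rather than the actual map of Example 5.1) is via the fundamental matrix N = (I - Q)^(-1) of an absorbing chain, whose entry N[i, j] is the expected number of visits to transient state j starting from i:

    import numpy as np

    # Hypothetical chain: 0 = A (start), 1 = B, 2 = parcel delivered (absorbing).
    P = np.array([[0.0, 0.5, 0.5],
                  [0.5, 0.0, 0.5],
                  [0.0, 0.0, 1.0]])

    Q = P[:2, :2]                      # transitions among the transient states
    N = np.linalg.inv(np.eye(2) - Q)   # fundamental matrix
    print(N[0, 1])                     # expected visits to B from A (2/3 here)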

6
Q

How can you tell if a process is Markov?

A

To check whether a process is Markov, you can either deduce the Markov property logically from how the process is constructed, or verify it mathematically by showing that P(X_{n+1} = j | X_n = i, X_{n-1}, ..., X_0) depends only on the current state i.
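
Beyond those two routes, a rough numerical sanity check is possible (a sketch, not a proof): estimate P(X_{n+1} = k | X_n = j) and P(X_{n+1} = k | X_n = j, X_{n-1} = i) from one long simulated path, and see whether conditioning on the extra past state changes the estimate:

    import numpy as np

    def compare_conditionals(path, j, k):
        # Estimate P(next = k | now = j) and, for each previous state i,
        # P(next = k | now = j, prev = i).  For a Markov process these agree.
        path = np.asarray(path)
        prev, now, nxt = path[:-2], path[1:-1], path[2:]
        mask = now == j
        print("P(next=%s | now=%s)          ~ %.3f" % (k, j, (nxt[mask] == k).mean()))
        for i in np.unique(prev[mask]):
            m2 = mask & (prev == i)
            print("P(next=%s | now=%s, prev=%s) ~ %.3f" % (k, j, i, (nxt[m2] == k).mean()))

    # Sanity check on a genuine Markov chain: a random walk on {0, 1, 2}.
    rng = np.random.default_rng(0)
    walk = np.cumsum(rng.choice([-1, 1], size=100_000)) % 3
    compare_conditionals(walk, j=1, k=2)   # all the estimates roughly agree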

7
Q

What does it mean if the Markov property does not hold?

A

If the Markov property does not hold, it means that the future state of the process depends on more than just the current state, violating the memoryless property.

8
Q

How can you mathematically prove that a process is not Markov?

A

To prove that a process is not Markov, find a counterexample: specific states for which P(X_{n+1} = k | X_n = j, X_{n-1} = i) takes different values for different choices of the earlier state i, contradicting the Markov property.
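
As an illustration (a standard counterexample, not necessarily the one in the notes), take X_n i.i.d. uniform on {-1, +1} and Y_n = X_n + X_{n-1}. Then P(Y_{n+1} = 2 | Y_n = 0) = 1/4, but conditioning further on Y_{n-1} = -2 gives 1/2 and on Y_{n-1} = +2 gives 0, so {Y_n} is not Markov. Reusing compare_conditionals from card 6 makes this visible empirically:

    # Reuses numpy and compare_conditionals from the sketch under card 6.
    rng = np.random.default_rng(1)
    x = rng.choice([-1, 1], size=100_000)
    y = x[1:] + x[:-1]                  # Y_n = X_n + X_{n-1}
    compare_conditionals(y, j=0, k=2)   # ~0.5 / ~0.25 / ~0.0 depending on prev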

9
Q

How do you decide if a new process derived from a Markov process is also Markov?

A

To determine whether the new process is Markov, check whether the transformation function is bijective, so that the original state X_n can be uniquely recovered from the new state. If it is, the new process is also Markov.
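
A minimal sketch of why (with a hypothetical chain): if f is a bijection on the state space, then Y_n = f(X_n) is the same chain with its states relabelled, and its transition matrix is just P with rows and columns permuted accordingly:

    import numpy as np

    # Hypothetical Markov chain on {0, 1, 2}.
    P = np.array([[0.2, 0.8, 0.0],
                  [0.3, 0.0, 0.7],
                  [0.5, 0.5, 0.0]])

    f = [2, 0, 1]                 # a bijection: f(0)=2, f(1)=0, f(2)=1
    M = np.zeros_like(P)
    for i in range(3):
        M[i, f[i]] = 1.0          # permutation matrix representing f

    P_Y = M.T @ P @ M             # transition matrix of Y_n = f(X_n)
    print(P_Y)                    # the same rows as P, just relabelled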

10
Q

What happens if the transformation function is not bijective in a derived Markov process?

A

If the transformation function is not bijective, the new process may or may not be Markov. You may need to test it using counterexamples.
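
For instance (a constructed illustration, not Example 5.2 itself), merging two states of a Markov chain with the non-injective map f(0)=0, f(1)=f(2)=1 can destroy the Markov property, which the empirical check from card 6 picks up:

    # Reuses numpy and compare_conditionals from the sketch under card 6.
    rng = np.random.default_rng(2)
    P = np.array([[0.0, 0.5, 0.5],
                  [1.0, 0.0, 0.0],
                  [0.5, 0.0, 0.5]])    # hypothetical 3-state Markov chain
    x = np.zeros(100_000, dtype=int)
    for n in range(1, x.size):
        x[n] = rng.choice(3, p=P[x[n - 1]])
    y = np.minimum(x, 1)               # merge states 1 and 2 into the label 1
    compare_conditionals(y, j=1, k=1)  # ~0.25 given prev=0 but ~0.5 given prev=1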

11
Q

In Example 5.2, what is the relationship between X_n and Y_n?

A

Y_n is a function of X_n, and the function is not bijective because several different values of X_n map to the same value of Y_n, so X_n cannot be uniquely recovered from Y_n.

12
Q

Why is {Y_n, n=0,1,2,…} not a Markov process in Example 5.2?

A

The process {Y_n} is not Markov because the transformation from X_n to Y_n is not bijective: a given value of Y_n does not determine X_n, so earlier values of Y_n carry extra information about the current underlying state, and the future depends on more than the present value of Y_n.

13
Q

What is the key feature that makes {Z_n, n=0,1,2,…} a Markov process?

A

{Z_n} is a Markov process because the transformation from X_n to Z_n is bijective, meaning we can uniquely determine X_n from Z_n.

14
Q

How do you check if a new process derived from a Markov process is Markov?

A

To check, determine if the transformation function between the original process and the new process is bijective. If it is, the new process will also be Markov.

15
Q

In Example 5.2, why is {X_n, n=0,1,2,…} a Markov process?

A

Because {X_n} is a sequence of independent random variables, and any sequence of independent random variables satisfies the Markov property: given the present value, the future values are trivially independent of the past ones.
