Markov Chains Flashcards

Flashcards in Markov Chains Deck (47)
1

Define a state and state-space.

2

When do we call a transition matrix a stochastic matrix?

3

What does the entry pij represent?

The probability of transitioning from state i to state j in one time step.
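As a minimal sketch (the matrix values here are illustrative, not from the deck), the entry p_ij is just a row-i, column-j lookup in the transition matrix:

```python
import numpy as np

# Hypothetical 2-state transition matrix: entry P[i, j] = p_ij,
# the probability of moving from state i to state j in one step.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# From state 0, the chance of being in state 1 after one step is p_01:
print(P[0, 1])  # 0.3
```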

4

Define a Markov chain.

5

What shorthand do we use for a Markov chain?

6

What is one way to think of a Markov Chain?

Intuitively, a Markov chain is a random process in which only the current state influences where the chain goes next.
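That intuition can be sketched as a simulation, assuming an illustrative two-state transition matrix: each step samples the next state from the row of P indexed by the current state, and nothing else.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative transition matrix (rows sum to 1).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

def step(state, P, rng):
    # The next state depends only on the current state: sample a
    # successor using row `state` of P as the distribution.
    return rng.choice(len(P), p=P[state])

state = 0
path = [state]
for _ in range(10):
    state = step(state, P, rng)
    path.append(state)
print(path)  # a length-11 trajectory of 0s and 1s
```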

7

What is another way to write property 1 in the following?

X0 has probability distribution λ, where λ = (λi : i ∈ I)

8

What is another way to write property 2 in the following?

The probability of a future event conditioned on the past and present is equal to the probability of the future event conditioned on the present alone.
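In symbols, this property (the Markov property, using the deck's notation pij) reads:

```latex
P(X_{n+1} = j \mid X_0 = i_0, \dots, X_{n-1} = i_{n-1}, X_n = i)
  = P(X_{n+1} = j \mid X_n = i) = p_{ij}.
```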

9

Finish the following theorem.

10

Prove the following theorem.

11

What is another way to write: the future state of a Markov chain is only dependent on its current state?

12

What is the Markov Property theorem?

13

Prove the following theorem.

14

What is a stochastic matrix?

A matrix with non-negative entries whose row sums are all equal to 1.
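The two defining conditions translate directly into a check; a minimal sketch (the helper name `is_stochastic` and the example matrices are mine, not the deck's):

```python
import numpy as np

def is_stochastic(P, tol=1e-9):
    """Check the two defining conditions of a stochastic matrix:
    non-negative entries and row sums equal to 1."""
    P = np.asarray(P, dtype=float)
    nonneg = np.all(P >= 0)
    rows_sum_to_one = np.allclose(P.sum(axis=1), 1.0, atol=tol)
    return bool(nonneg and rows_sum_to_one)

print(is_stochastic([[0.7, 0.3], [0.4, 0.6]]))  # True
print(is_stochastic([[0.7, 0.4], [0.4, 0.6]]))  # False: first row sums to 1.1
```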

15

What do the following probabilities equal when P is a stochastic matrix that generates a Markov chain with entries: [[1-α, α], [β, 1-β]]?
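The probabilities shown on the original card are not recoverable here, but a standard result for this two-state chain (with α + β > 0) is p11(n) = β/(α+β) + α/(α+β)·(1−α−β)^n, which can be checked numerically against the matrix power P^n:

```python
import numpy as np

alpha, beta, n = 0.2, 0.3, 5  # illustrative parameter values
P = np.array([[1 - alpha, alpha],
              [beta, 1 - beta]])

# n-step transition matrix.
Pn = np.linalg.matrix_power(P, n)

# Closed form for p_11(n) (first state, i.e. index 0 here):
closed = beta / (alpha + beta) + alpha / (alpha + beta) * (1 - alpha - beta) ** n

print(abs(Pn[0, 0] - closed) < 1e-12)  # True
```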

16

Finish the following theorem.

17

Prove the following theorem.

18

What does pij(n) stand for?

The n-step transition probability from state i to state j.

19

What is the Chapman-Kolmogorov equations theorem?
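The theorem statement itself is on the original card; as a reminder, the Chapman-Kolmogorov equations say pij(m+n) = Σk pik(m) pkj(n), i.e. P^(m+n) = P^m · P^n. A quick numerical sanity check with an illustrative matrix:

```python
import numpy as np

P = np.array([[0.5, 0.5],
              [0.2, 0.8]])  # illustrative stochastic matrix
m, n = 3, 4

lhs = np.linalg.matrix_power(P, m + n)
rhs = np.linalg.matrix_power(P, m) @ np.linalg.matrix_power(P, n)

print(np.allclose(lhs, rhs))  # True
```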

20

Prove the following theorem.

21

If a Markov chain naturally breaks up into separate pieces what are the pieces called?

Communicating classes
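A sketch of how those pieces can be computed (the function name and example chain are mine): two states sit in the same communicating class exactly when each leads to the other, so we take the reachability closure of the transition graph and group mutually reachable states.

```python
import numpy as np

def communicating_classes(P):
    """Group states into communicating classes: i and j share a class
    iff i leads to j and j leads to i (every state leads to itself)."""
    P = np.asarray(P)
    n = len(P)
    # reach[i, j] = True iff i leads to j; Warshall-style closure.
    reach = (P > 0) | np.eye(n, dtype=bool)
    for k in range(n):
        reach |= reach[:, k][:, None] & reach[k, :][None, :]
    classes, seen = [], set()
    for i in range(n):
        if i in seen:
            continue
        cls = {j for j in range(n) if reach[i, j] and reach[j, i]}
        seen |= cls
        classes.append(sorted(cls))
    return classes

# Chain that breaks into pieces: states 0 and 1 communicate; 2 is absorbing.
P = [[0.5, 0.5, 0.0],
     [0.3, 0.3, 0.4],
     [0.0, 0.0, 1.0]]
print(communicating_classes(P))  # [[0, 1], [2]]
```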

22

Define i leads to j.

23

Define i communicates with j.

24

Finish the following theorem.

25

Prove the following theorem.

26

Define when a communicating class is closed.

27

Define when a state i is absorbing.

28

Define when a class is called irreducible.

29

Define hitting time.

30

What does HA stand for? 

The first time that the Markov chain hits the set A.
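That definition can be sketched by simulation, under an illustrative chain and target set A (the helper name `hitting_time` is mine): HA counts the first time index at which the chain sits in A, so it is 0 when the chain starts inside A.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative chain; A = {2} is the target set.
P = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.3, 0.4],
              [0.0, 0.0, 1.0]])
A = {2}

def hitting_time(start, P, A, rng, max_steps=10_000):
    """H_A = min{n >= 0 : X_n in A}; 0 if we start inside A."""
    state = start
    for n in range(max_steps):
        if state in A:
            return n
        state = rng.choice(len(P), p=P[state])
    return float("inf")  # A not reached within max_steps

print(hitting_time(2, P, A, rng))       # 0: already in A at time 0
print(hitting_time(0, P, A, rng) >= 1)  # True: at least one step is needed
```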