Define a **state** and **state-space.**

When do we call a transition matrix a stochastic matrix?

What does the entry p_{ij} represent?

The probability of transitioning from state i to state j after one time step.
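In symbols (a sketch, writing X_n for the state of the chain at time n):

```latex
p_{ij} = \mathbb{P}(X_{n+1} = j \mid X_n = i)
```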

Define a **Markov chain**.

What shorthand do we use for a Markov chain?

What is one way to think of a Markov Chain?

Intuitively, a Markov chain is a random process in which only the current state influences where the chain goes next.
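This intuition in symbols (a sketch, assuming the standard notation X_0, X_1, … for the chain):

```latex
\mathbb{P}(X_{n+1} = j \mid X_0 = i_0, \dots, X_{n-1} = i_{n-1}, X_n = i)
  = \mathbb{P}(X_{n+1} = j \mid X_n = i)
```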

What is another way to write property 1 in the following?

X_{0} has probability distribution defined by λ where λ = (λ_{i} : i ∈ I)
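Property 1 in symbols (a sketch; λ is a probability distribution on the state-space I):

```latex
\mathbb{P}(X_0 = i) = \lambda_i \quad \text{for all } i \in I,
\qquad \lambda_i \ge 0, \quad \sum_{i \in I} \lambda_i = 1
```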

What is another way to write property 2 in the following?

The probability of a future event conditioned on the past and present is equal to the probability of the future event conditioned on the present alone.
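Property 2 in symbols (a sketch, conditioning on the full history of the chain):

```latex
\mathbb{P}(X_{n+1} = i_{n+1} \mid X_0 = i_0, \dots, X_n = i_n)
  = \mathbb{P}(X_{n+1} = i_{n+1} \mid X_n = i_n) = p_{i_n i_{n+1}}
```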

Finish the following theorem.

Prove the following theorem.

What is another way to write: the future state of a Markov chain is only dependent on its current state?

What is the Markov Property theorem?

Prove the following theorem.

What is a stochastic matrix?

One whose row sums are equal to 1 and whose entries are non-negative.
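In symbols (a sketch, for a matrix P = (p_{ij}) indexed by the state-space I):

```latex
p_{ij} \ge 0 \quad \text{for all } i, j \in I,
\qquad \sum_{j \in I} p_{ij} = 1 \quad \text{for all } i \in I
```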

What do the following probabilities equal when P is a stochastic matrix that generates a Markov chain with entries: [[1-α, α], [β, 1-β]]?
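A sketch of the standard closed form for this two-state chain (assuming the probabilities asked about are p_{11}^{(n)} and p_{12}^{(n)}, and that α + β > 0):

```latex
p_{11}^{(n)} = \frac{\beta}{\alpha+\beta} + \frac{\alpha}{\alpha+\beta}\,(1-\alpha-\beta)^n,
\qquad
p_{12}^{(n)} = \frac{\alpha}{\alpha+\beta} - \frac{\alpha}{\alpha+\beta}\,(1-\alpha-\beta)^n
```

As a check, the two expressions sum to 1 for every n, and at n = 1 they reduce to 1 − α and α, the first row of P.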

Finish the following theorem.

Prove the following theorem.

What does p_{ij}^{(n)} stand for?

The n-step transition probability from state i to state j.
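In symbols (a sketch, where P^n is the n-th power of the transition matrix):

```latex
p_{ij}^{(n)} = \mathbb{P}(X_n = j \mid X_0 = i) = (P^n)_{ij}
```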

What is the Chapman-Kolmogorov equations theorem?
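A sketch of the standard statement of the equations:

```latex
p_{ij}^{(m+n)} = \sum_{k \in I} p_{ik}^{(m)}\, p_{kj}^{(n)}
```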

Prove the following theorem.

If a Markov chain naturally breaks up into separate pieces, what are the pieces called?

Communicating classes

Define **i leads to j**.

Define **i communicates with j.**
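Both definitions in symbols (a sketch; \mathbb{P}_i denotes probability for the chain started from state i):

```latex
i \to j \iff \mathbb{P}_i(X_n = j \text{ for some } n \ge 0) > 0,
\qquad
i \leftrightarrow j \iff i \to j \text{ and } j \to i
```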

Finish the following theorem.

Prove the following theorem.

Define when a communicating class is **closed.**
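A sketch of the standard definition in symbols (for a class C ⊆ I):

```latex
C \text{ is closed} \iff \bigl( i \in C \text{ and } i \to j \implies j \in C \bigr)
```

Informally: once the chain enters a closed class, it can never leave it.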

Define when a state i is **absorbing.**
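A sketch of the standard definition in symbols:

```latex
i \text{ is absorbing} \iff p_{ii} = 1
```

Equivalently, {i} is a closed class consisting of a single state.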

Define when a class is called **irreducible.**

Define **hitting time.**

What does H^{A} stand for?

The first time that the Markov chain hits the set A.
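In symbols (a sketch of the usual definition, with the convention inf ∅ = ∞):

```latex
H^{A} = \inf\{\, n \ge 0 : X_n \in A \,\}
```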