17. Markov Models 1 Flashcards
(20 cards)
What is a stochastic process?
A family of random variables indexed by time; the time index can be discrete or continuous.
What is a discrete-time stochastic process?
A process whose time index takes values in the set of non-negative integers (ℕ₀).
What is a continuous-time stochastic process?
A process whose time index takes values in the non-negative real numbers [0, ∞).
What is a Markov process?
A stochastic process where the future depends only on the present state, not the past.
What is a Markov Chain?
A discrete-time Markov process over a discrete set of states, which can be represented as a chain-structured Bayes network.
What is the Markov assumption?
The future is independent of the past given the present.
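In symbols, for a state sequence X₁, X₂, …: P(Xₜ₊₁ | X₁, …, Xₜ) = P(Xₜ₊₁ | Xₜ) for every time step t.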
What are the main types of systems in Markov models?
Systems are classified along two axes: fully observable vs. partially observable, and autonomous vs. controlled.
Define a recurrent state in a Markov model.
A state to which the chain, after leaving it, returns with probability 1.
What is an absorbing state?
A state that, once entered, cannot be left.
What does it mean for a Markov chain to be ergodic?
The chain is irreducible and all its states are positive recurrent and aperiodic.
What is the joint probability in a Markov model?
P(X₁, …, X_T) = P(X₁) * P(X₂|X₁) * P(X₃|X₂) * … * P(X_T|X_T₋₁).
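A minimal sketch of evaluating this factorisation for a small two-state chain; the state names and all probability values are made-up illustration values, not from the cards:

```python
# Minimal sketch: evaluating P(X1, ..., XT) for a small two-state chain.
# States, initial distribution, and transition probabilities are illustrative.

p_init = {"rain": 0.4, "sun": 0.6}                      # P(X1)
p_trans = {                                              # P(X_t | X_{t-1})
    "rain": {"rain": 0.7, "sun": 0.3},
    "sun":  {"rain": 0.2, "sun": 0.8},
}

def joint_probability(sequence):
    """P(X1, ..., XT) = P(X1) * prod over t of P(X_t | X_{t-1})."""
    prob = p_init[sequence[0]]
    for prev, curr in zip(sequence, sequence[1:]):
        prob *= p_trans[prev][curr]
    return prob

print(joint_probability(["sun", "sun", "rain"]))  # 0.6 * 0.8 * 0.2 = 0.096
```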
What is a first-order Markov model?
Future states depend only on the current state.
What is an m-order Markov model?
Future states depend on the last m states.
What are four ways to represent transition probabilities?
Transition table, state diagram, trellis diagram, matrix representation.
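A brief sketch contrasting two of these representations, a transition table (written as a dictionary) and a transition matrix, for the same illustrative two-state chain:

```python
# Sketch: one transition model written two ways (values are illustrative).
import numpy as np

# 1) Transition table: P(next | current) listed entry by entry
table = {
    ("rain", "rain"): 0.7, ("rain", "sun"): 0.3,
    ("sun", "rain"): 0.2,  ("sun", "sun"): 0.8,
}

# 2) Matrix form: row = current state, column = next state; each row sums to 1
T = np.array([[0.7, 0.3],
              [0.2, 0.8]])

assert np.allclose(T.sum(axis=1), 1.0)  # every row is a probability distribution
```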
What is a stationary distribution?
A distribution P∞ over the states that is unchanged by a transition step: once the chain's state distribution equals P∞, it stays P∞ at all later times.
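A minimal sketch, assuming the same illustrative two-state transition matrix, of finding a stationary distribution by repeatedly applying the transition step until the state distribution stops changing:

```python
# Sketch: approximating a stationary distribution by power iteration.
import numpy as np

T = np.array([[0.7, 0.3],     # row = current state, column = next state
              [0.2, 0.8]])

p = np.array([1.0, 0.0])      # start from an arbitrary initial distribution
for _ in range(1000):
    p = p @ T                 # one chain step: p_{t+1} = p_t @ T

print(p)                      # approx. [0.4, 0.6]; satisfies p = p @ T
```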
What is a Hidden Markov Model (HMM)?
A Markov chain over hidden (unobserved) states in which each hidden state generates an observable evidence variable according to conditional emission probabilities.
In HMMs, what relates evidence variables?
Only the hidden states they depend on: each evidence variable is generated by the hidden state at its time step, so evidence variables are related to one another only through the hidden chain.
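A short sketch of this structure in use: a basic HMM filtering (forward-algorithm) update in which each new observation revises the belief over the hidden state. The transition matrix, emission matrix, and observation sequence are illustrative assumptions:

```python
# Sketch: HMM filtering (forward algorithm) with made-up numbers.
import numpy as np

T = np.array([[0.7, 0.3],        # hidden-state transition probabilities
              [0.2, 0.8]])
E = np.array([[0.9, 0.1],        # emission probabilities P(evidence | state);
              [0.3, 0.7]])       # rows = hidden state, columns = observation
belief = np.array([0.5, 0.5])    # prior over the hidden state

observations = [0, 0, 1]         # indices of the observed evidence values
for obs in observations:
    belief = belief @ T              # predict: push belief through transitions
    belief = belief * E[:, obs]      # update: weight by emission likelihood
    belief = belief / belief.sum()   # normalise back to a probability distribution

print(belief)                    # posterior P(hidden state | evidence so far)
```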
What are some applications of Hidden Markov Models?
Speech recognition, robot localization, classification problems.
What is the Law of Total Probability?
P(A) = Σₙ P(A|Bₙ)P(Bₙ), where the events Bₙ partition the sample space.
What is Bayes Rule?
P(C|E) = P(E|C) * P(C) / P(E).
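A worked sketch combining the two formulas above: Bayes Rule applied to a hypothetical two-class example, with the denominator P(E) computed via the Law of Total Probability. The class names and all numbers are made up:

```python
# Worked sketch: Bayes Rule with P(E) expanded by the Law of Total Probability.

p_c = {"spam": 0.2, "ham": 0.8}                  # prior P(C)
p_e_given_c = {"spam": 0.9, "ham": 0.1}          # likelihood P(E | C)

# Law of Total Probability: P(E) = sum over c of P(E | c) * P(c)
p_e = sum(p_e_given_c[c] * p_c[c] for c in p_c)  # 0.9*0.2 + 0.1*0.8 = 0.26

# Bayes Rule: P(C | E) = P(E | C) * P(C) / P(E)
posterior = {c: p_e_given_c[c] * p_c[c] / p_e for c in p_c}
print(posterior)   # {'spam': ~0.692, 'ham': ~0.308}
```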