Lecture 3a Flashcards
(16 cards)
What is a Markov Process?
A random process in which the future is independent of the past, given the present.
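In symbols, for a discrete-time chain $X_0, X_1, \dots$, this defining (memoryless) property is usually written as:

$$P(X_{n+1} = j \mid X_n = i,\ X_{n-1} = i_{n-1},\ \dots,\ X_0 = i_0) = P(X_{n+1} = j \mid X_n = i)$$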
What are the two properties of the “states” in which a system can be?
a) mutually exclusive: the system can only be in ONE state at each time
b) exhaustive: the system must be in ONE state at ALL times
What does stochastically mean?
It means that something happens randomly in time, e.g. a random walk, or a Markov process (under certain conditions).
The transition between the working state and the failed state is stochastic.
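For illustration only (the symbols λ and μ below are hypothetical per-step probabilities, not values from the lecture), a repairable component with a working state W and a failed state F could have the transition matrix

$$P = \begin{pmatrix} 1-\lambda & \lambda \\ \mu & 1-\mu \end{pmatrix},$$

where λ is the probability of failing during one step and μ the probability of being repaired during one step.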
Does the Markov Model have discrete times?
YES, the process is observed at discrete times.
What is the difference between a general stochastic process and a Markov process? What are the assumptions of this model?
In a general stochastic process, the probability of a future state usually depends on the entire life history of the process.
In a Markov process, the probability of a future state depends ONLY on its present state !! –> The process has NO memory
Is the Transition probability matrix Stochastic?
YES
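Concretely, “stochastic” here means every entry of the matrix is a probability and every row sums to one (from any state the system must go somewhere, possibly back to the same state):

$$p_{ij} \ge 0 \quad \text{and} \quad \sum_j p_{ij} = 1 \ \text{ for every state } i$$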
What types of processes are modeled by a discrete-time, discrete-state Markov process?
It is used to model systems that evolve in discrete time steps and have a countable (often finite) number of states. These processes are characterized by the property that the future state of the system depends only on the current state and not on the sequence of states that preceded it (the Markov property).
–> Random Walk
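A minimal simulation sketch of such a process, a simple random walk on the integers (the step probability p_up and the number of steps are illustrative choices, not values from the lecture):

```python
import random

def random_walk(n_steps: int, p_up: float = 0.5, start: int = 0) -> list[int]:
    """Simple random walk: at each discrete time step the state moves
    +1 with probability p_up and -1 otherwise (Markov: the next state
    depends only on the current one)."""
    path = [start]
    for _ in range(n_steps):
        step = 1 if random.random() < p_up else -1
        path.append(path[-1] + step)
    return path

print(random_walk(10))  # one possible path, e.g. [0, 1, 0, 1, 2, ...]
```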
What is the fundamental quantity to be computed? (Markov Model)
The probability that the system is in a given state at a given time, computed for all possible states and times.
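A minimal sketch of this computation (the 2-state matrix below is a made-up example): the row vector of state probabilities evolves as p(n) = p(0) P^n, i.e. one matrix multiplication per time step.

```python
import numpy as np

# Hypothetical transition matrix (rows sum to 1): states = [working, failed]
P = np.array([[0.9, 0.1],
              [0.6, 0.4]])

p = np.array([1.0, 0.0])  # assume the system starts in the working state
for n in range(1, 6):
    p = p @ P             # p(n) = p(n-1) P
    print(f"step {n}: P(working) = {p[0]:.4f}, P(failed) = {p[1]:.4f}")
```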
What are the prerequisites for a Markov process?
a. Needs to be Stochastic
b. Homogeneous in time
c. States are mutually exclusive and exhaustive
d. Process is memoryless
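Prerequisite (b), homogeneity in time, can be stated in symbols: the one-step transition probabilities do not depend on the step index $n$,

$$P(X_{n+1} = j \mid X_n = i) = p_{ij} \quad \text{for all } n.$$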
Properties of transition probabilities?
a. Probabilities are non-negative
b. Probabilities sum to 1 (along each row)
c. Stochastic matrix
d. Markov property: no memory
Definition of recurrent?
A state is recurrent if the system, starting from that state, will surely (with probability 1) return to it at some point.
Define Transient?
A state is transient if the system, starting from that state, has a non-zero probability of NEVER returning to it.
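Both definitions above can be written compactly with the first return time $T_i = \min\{n \ge 1 : X_n = i\}$ for a chain started at $X_0 = i$:

$$\text{recurrent: } P(T_i < \infty \mid X_0 = i) = 1, \qquad \text{transient: } P(T_i < \infty \mid X_0 = i) < 1$$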
Can we have a Markov Model where ALL states are Transient??
NO (for a finite number of states) - because the system would eventually leave all of them forever, yet it MUST be in some state at Steady State !!
What is an absorbing state?
A state is absorbing if the system cannot leave it once it enters
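In terms of transition probabilities, state $i$ is absorbing exactly when

$$p_{ii} = 1 \quad (\text{and therefore } p_{ij} = 0 \text{ for all } j \ne i).$$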
Occupation of a state?
It is the number of steps the system spends in the state before exiting it.
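For a homogeneous discrete-time chain this occupation (sojourn) time is geometrically distributed. Under one common convention, where $K$ counts the additional steps spent in state $i$ after arriving there, the chain stays with probability $p_{ii}$ at each step, so

$$P(K = k) = p_{ii}^{\,k}\,(1 - p_{ii}), \qquad E[K] = \frac{p_{ii}}{1 - p_{ii}}.$$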