Lecture 3a Flashcards

(15 cards)

1
Q

What is a Markov Process?

A

A random process in which the future is independent of the past, given the present.

2
Q

What two properties must the set of “states” of a system satisfy?

A

a) Mutually exclusive: the system can be in only ONE state at any given time

b) Exhaustive: the system must be in ONE state at ALL times

3
Q

What does “stochastically” mean?

A

It means that something happens randomly in time, e.g. a random walk or a Markov process (under certain conditions).

For example, the transition between the working state and the failed state is stochastic.
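
For instance, a simple random walk can be sketched in a few lines of Python (an illustrative sketch, not from the lecture):

```python
import random

def random_walk(n_steps, p_up=0.5):
    """Simulate a simple random walk: at each step move +1 with
    probability p_up, otherwise -1. Each move's direction is random,
    i.e. the path evolves stochastically in time."""
    position = 0
    path = [position]
    for _ in range(n_steps):
        position += 1 if random.random() < p_up else -1
        path.append(position)
    return path

print(random_walk(10))  # e.g. [0, 1, 0, 1, 2, 1, 0, -1, 0, 1, 2]
```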

4
Q

Does the Markov Model have discrete times?

A

YES, the process is observed at discrete times.

5
Q

What is the difference between a general stochastic process and a Markov process? What are the assumptions of this model?

A

In a general stochastic process, the probability of a future state usually depends on the entire life history of the process.

In a Markov process, the probability of a future state depends only on the present state!! → The process has NO memory.
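
In symbols (standard notation, not from the card):

P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i)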

6
Q

Is the transition probability matrix stochastic?

A

YES
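
That is, in standard notation (the conditions listed on the later card about transition-probability properties):

p_{ij} \ge 0 \quad \text{and} \quad \sum_j p_{ij} = 1 \quad \text{for every row } i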

7
Q

What types of processes are modeled by a discrete-time, discrete-state Markov process?

A

Such processes model systems that evolve in discrete time steps and have a countable (often finite) number of states. They are characterized by the property that the future state depends only on the current state, not on the sequence of states that preceded it (the Markov property).

→ Example: a random walk
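
A minimal Python sketch, assuming a two-state working/failed system (the matrix values are illustrative):

```python
import random

# Assumed illustrative matrix: two states, 0 = working, 1 = failed.
# P[i][j] = probability of moving from state i to state j in one step.
P = [[0.9, 0.1],   # working -> working, working -> failed
     [0.3, 0.7]]   # failed  -> working, failed  -> failed

def simulate(P, state, n_steps):
    """Simulate a discrete-time, discrete-state Markov chain: the next
    state is drawn using only the current state's row (no memory)."""
    states = [state]
    for _ in range(n_steps):
        state = random.choices(range(len(P)), weights=P[state])[0]
        states.append(state)
    return states

print(simulate(P, 0, 10))  # e.g. [0, 0, 0, 1, 1, 0, 0, 0, 0, 1, 1]
```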

8
Q

What is the fundamental quantity to be computed in a Markov model?

A

The probability that the system is in a given state at a given time, for all possible states and times.
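
A minimal sketch of this computation (reusing the assumed two-state matrix from above): starting from an initial probability vector p(0), the state probabilities are propagated one step at a time via p(n+1) = p(n) P.

```python
import numpy as np

P = np.array([[0.9, 0.1],   # assumed working/failed transition matrix
              [0.3, 0.7]])
p = np.array([1.0, 0.0])    # at time 0 the system is surely in state 0

# State probabilities at times 0..5, propagated via p(n+1) = p(n) P:
for n in range(6):
    print(n, p)
    p = p @ P
```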

9
Q

What are the prerequisites for a Markov process?

A

a. Needs to be stochastic

b. Homogeneous in time (see the note after this list)

c. States are mutually exclusive and exhaustive

d. Process is memoryless
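
A note on item b (standard definition, not spelled out on the card): time homogeneity means the transition probabilities do not depend on the step index n,

P(X_{n+1} = j \mid X_n = i) = p_{ij} \quad \text{for all } n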

10
Q

What are the properties of the transition probabilities?

A

a. Probabilities are non-negative
b. Probabilities sum to 1 (along each row)
c. They form a stochastic matrix (checked in the sketch below)
d. Markov property: no memory
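
A minimal Python sketch (the function name is illustrative) that checks properties a–c for a candidate matrix:

```python
import numpy as np

def is_stochastic(P, tol=1e-9):
    """Check that P is a valid (row-)stochastic transition matrix:
    all entries non-negative and every row summing to 1."""
    P = np.asarray(P, dtype=float)
    return bool(np.all(P >= 0)) and bool(np.allclose(P.sum(axis=1), 1.0, atol=tol))

print(is_stochastic([[0.9, 0.1], [0.3, 0.7]]))  # True
print(is_stochastic([[0.5, 0.6], [0.3, 0.7]]))  # False: first row sums to 1.1
```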

11
Q

What is the definition of a recurrent state?

A

A state is recurrent if the system, starting in that state, will surely return to it at some point.

12
Q

Define a transient state.

A

A state is transient if the system, starting in that state, has a finite probability of NEVER returning to it.
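
In symbols (standard notation): writing f_i for the probability that the system ever returns to state i after starting there,

f_i = P(\text{return to } i \mid X_0 = i), \qquad \text{recurrent: } f_i = 1, \qquad \text{transient: } f_i < 1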

13
Q

Can we have a Markov model where ALL states are transient?

A

NO, because then the system would eventually leave all of them for good, yet it MUST be in SOME state at steady state!!

14
Q

What is an absorbing state?

A

A state is absorbing if the system cannot leave it once it enters
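
In matrix terms (standard characterization): state i is absorbing exactly when p_{ii} = 1, so p_{ij} = 0 for every j \ne i.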

15
Q

What is the occupation of a state?

A

It is the number of steps before the system exits the state.
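
A standard consequence (not stated on the card): since the system stays in state i with probability p_{ii} at each step, the occupation time T_i is geometrically distributed, with mean

E[T_i] = \frac{1}{1 - p_{ii}}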
