Lecture 3b Flashcards

(12 cards)

1
Q

Mutually Exclusive Definition:

A

The system can occupy only ONE state at EACH point in time (no two states overlap)

2
Q

Exhaustive

A

The system must be in one of the defined states at ALL times (the states cover every possibility)

3
Q

In a discrete-state process, what does X(3) = 5 mean?

A

The system occupies state number 5 at time t_3

4
Q

What is the goal of analyzing a Markov process?

A

Compute the probability that the system is in a given state at a given time, for all possible states and times
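
In practice this is done by propagating a vector of state probabilities through the transition matrix. A minimal sketch, assuming a discrete-time chain with a known transition matrix (the matrix values below are invented for illustration):

    import numpy as np

    P = np.array([[0.9, 0.1],   # row i holds the probabilities of moving
                  [0.4, 0.6]])  # from state i to each state j in one step
    p = np.array([1.0, 0.0])    # start in state 0 with certainty

    for n in range(1, 4):
        p = p @ P               # p(n) = p(n-1) P
        print(n, p)             # probability of each state at time n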

5
Q

What is the difference between a stochastic process and a Markov process?

A

Stochastic process: the probability of a future state generally depends on the ENTIRE history of the process

Markov process: the probability of a future state depends ONLY on the PRESENT state!
-> the system has no memory
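
In symbols (a standard discrete-time statement of the Markov property, not taken from the card itself):

    P[X(t_{n+1}) = j | X(t_n) = i, X(t_{n-1}), ..., X(t_0)] = P[X(t_{n+1}) = j | X(t_n) = i]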

6
Q

What is the transition probability?

A

The probability that the system, being in state i at time t_m, moves to state j at time t_n

Properties:

  1. They are greater than or equal to zero.
  2. Summed over all destination states j, they must equal one.
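
In symbols, p_ij(t_m, t_n) = P[X(t_n) = j | X(t_m) = i]. A quick check of the two properties on a made-up transition matrix (the values are illustrative only):

    import numpy as np

    P = np.array([[0.7, 0.3],
                  [0.2, 0.8]])                # hypothetical transition matrix
    assert (P >= 0).all()                     # property 1: entries are non-negative
    assert np.allclose(P.sum(axis=1), 1.0)    # property 2: each row sums to one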

7
Q

What does the transition probability depend on?

A

It depends on the time interval between t_m and t_n, not on the individual times themselves (the process is homogeneous in time)
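
In symbols (assuming time homogeneity): p_ij(t_m, t_n) depends only on the difference t = t_n - t_m, so it can be written simply as p_ij(t).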

8
Q

What are the main premises of the Markov process?

A
  1. The process is stochastic.
  2. The process is homogeneous in time.
  3. The states are mutually exclusive and exhaustive.
  4. The process is memoryless: the future depends only on the current state of the system.

9
Q

What is a recurrent state?

A

A state i is recurrent if the system, starting at state i, will SURELY return to it at some point

π_i ≠ 0 (its steady-state probability is nonzero)

10
Q

What is a transient state?

A

A state i is transient if the system, starting at that state, has a nonzero probability of NEVER returning to it

π_i = 0 (its steady-state probability is zero)
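
A rough Monte Carlo illustration on a made-up three-state chain (all values invented; state 2 is absorbing, so states 0 and 1 are transient):

    import numpy as np

    rng = np.random.default_rng(0)
    P = np.array([[0.5, 0.5, 0.0],
                  [0.0, 0.5, 0.5],
                  [0.0, 0.0, 1.0]])

    def returns_to_start(start, steps=200):
        s = start
        for _ in range(steps):
            s = rng.choice(3, p=P[s])   # take one random step
            if s == start:
                return True
        return False

    est = np.mean([returns_to_start(0) for _ in range(2000)])
    print(est)   # about 0.5, well below 1, so state 0 is transient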

11
Q

Can we have a Markov process where all states are transient? Why / why not?

A

We cannot (for a finite-state process): the system eventually leaves the transient states and never returns to them, so at least one state must be recurrent for the system to reach a steady state

12
Q

What is an absorbing state?

A

A state is absorbing if the system cannot leave it once it enters

p_ii = 1
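
A quick way to spot absorbing states in a transition matrix (the matrix below is invented for illustration): look for diagonal entries equal to one.

    import numpy as np

    P = np.array([[1.0, 0.0, 0.0],
                  [0.3, 0.4, 0.3],
                  [0.1, 0.2, 0.7]])
    absorbing = np.where(np.isclose(np.diag(P), 1.0))[0]
    print(absorbing)   # [0]: p_00 = 1, so state 0 is absorbing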
