Lecture 4 Flashcards

(8 cards)

1
Q

What types of processes are modeled by continuous-time discrete-state Markov models?

A

They are used to model systems that evolve continuously over time but have a countable (often finite) number of states. The process has no memory. Typical applications: reliability and maintenance.
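
A minimal sketch, assuming a single repairable component with constant failure rate λ and constant repair rate μ (symbols not from the original card): the state space is {up, down}, the component fails with rate λ and is repaired with rate μ, and the transition rate matrix is

A = \begin{pmatrix} -\lambda & \lambda \\ \mu & -\mu \end{pmatrix}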

2
Q

What are the assumptions of the continuous-time Markov model?

A

Once again, the process is memoryless: the future evolution depends only on the current state, not on the past history.

Continuous time: transitions can happen at any point in time, not only at discrete intervals.
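
In formula form (a standard statement of the Markov property, with notation assumed here): for any states i, j and times s, t,

P\{X(t+s) = j \mid X(s) = i, \text{ past up to } s\} = P\{X(t+s) = j \mid X(s) = i\}

and, because the transition rates are constant, the time spent in a state is exponentially distributed, the only continuous memoryless distribution.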

3
Q

How can you compute the transition rate matrix from the continuous-time transition probability matrix?

A

Transition probability matrix → discrete-time description.
Transition rate matrix → continuous-time description.

The rates follow from the probabilities by taking a time interval so small that at most one event (transition) can happen within it, and dividing the transition probability by the length of that interval.
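
In symbols (notation assumed here): for a vanishingly small interval Δt the transition probabilities and rates are related by

p_{ij}(\Delta t) \approx a_{ij}\,\Delta t \;\; (i \neq j), \qquad a_{ij} = \lim_{\Delta t \to 0} \frac{p_{ij}(\Delta t)}{\Delta t}, \qquad a_{ii} = -\sum_{j \neq i} a_{ij}

Equivalently P(Δt) ≈ I + A·Δt, which is exactly the "at most one event per small interval" idea.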

4
Q

How can you compute the steady-state probabilities in continuous-time discrete-state Markov processes? What do they represent?

A

They are obtained by setting the time derivatives of the state probabilities to zero and solving the resulting balance equations together with the normalization condition (the probabilities must sum to one).

They represent the long-run fraction of time the system spends in each state:
sum over the working states = average fraction of time the system is functioning
sum over the failed states = average fraction of time the system is down (i.e., under repair)
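
As a computational sketch (illustrative NumPy code, not from the lecture; λ and μ as in the assumed two-state example above), the steady-state vector π solves π A = 0 together with Σ_i π_i = 1:

import numpy as np

lam, mu = 0.01, 0.5                      # assumed failure and repair rates (per hour)
A = np.array([[-lam,  lam],              # transition rate matrix: rows sum to zero
              [  mu,  -mu]])

# Steady state: pi @ A = 0 together with sum(pi) = 1.
# Replace one balance equation by the normalization condition and solve.
M = np.vstack([A.T[:-1], np.ones(2)])
b = np.array([0.0, 1.0])
pi = np.linalg.solve(M, b)
print(pi)                                # [mu/(lam+mu), lam/(lam+mu)] ~ [0.980, 0.020]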

5
Q

Define the system failure intensity for a general system modelled by continuous-time discrete-state Markov processes. How can you compute it?

A

SYSTEM FAILURE INTENSITY, W_f
= rate at which system failures occur
= expected number of system failures per unit of time
= rate of leaving a success state towards a failed state

It is computed as rate × probability: the steady-state probability of each success state times the total rate of transitions from that state into failed states, summed over all success states.
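
In symbols (S = set of success states, F = set of failed states, π_i = steady-state probabilities, a_{ij} = transition rates; notation assumed here):

W_f = \sum_{i \in S} \pi_i \sum_{j \in F} a_{ij}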

6
Q

Define the system repair intensity for a general system modelled by continuous-time discrete-state Markov processes. How can you compute it?

A

SYSTEM REPAIR INTENSITY, W_r
= rate at which system repairs occur
= expected number of system repairs per unit of time
= rate of leaving a failed state towards a success state

Again it is rate × probability: the steady-state probability of each failed state times the total rate of transitions from that state into success states, summed over all failed states.
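
With the same assumed notation as for the failure intensity:

W_r = \sum_{i \in F} \pi_i \sum_{j \in S} a_{ij}

At steady state the two intensities balance, W_r = W_f.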

7
Q

How can you compute the average time of occupancy of a state in continuous-time discrete-state Markov processes? What is the probability distribution that describes this time? Why?

A

The occupancy (sojourn) time of state i is the time the system remains in state i before it departs towards another state. Because all transitions out of the state occur with constant rates, this time is exponentially distributed (the exponential is the only continuous memoryless distribution), and its average is the reciprocal of the total rate of leaving the state.
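
In symbols (notation assumed here): if ν_i = Σ_{j≠i} a_{ij} is the total rate of leaving state i, then

T_i \sim \text{Exp}(\nu_i), \qquad E[T_i] = \frac{1}{\nu_i} = \frac{1}{\sum_{j \neq i} a_{ij}}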

8
Q

How can you generally compute the availability of any system using continuous-time discrete-state Markov processes?

A

Partition the states into success (working) states and failed states. The instantaneous availability at time t is the sum of the probabilities of the success states at that time; the steady-state availability is the sum of the steady-state probabilities of the success states.
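
In formula form (with the same assumed split into success states S and failed states F):

A(t) = \sum_{i \in S} P_i(t), \qquad A_{\infty} = \lim_{t \to \infty} A(t) = \sum_{i \in S} \pi_i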