STORM Flashcards

(22 cards)

1
Q

Memorylessness property

A

If X~Exp(λ), information about what happened in the past does not affect the present or the future:
Pr(X>t+u|X>t)=Pr(X>t+u)/Pr(X>t)=Pr(X>u)
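A quick Monte Carlo check of the memorylessness identity; a sketch with arbitrary rate λ=2.0 and thresholds t=0.5, u=0.3 (all values chosen only for illustration):

```python
import random

def tail_prob(samples, t):
    """Empirical estimate of Pr(X > t)."""
    return sum(1 for x in samples if x > t) / len(samples)

random.seed(0)
lam = 2.0  # hypothetical rate parameter
samples = [random.expovariate(lam) for _ in range(100_000)]

t, u = 0.5, 0.3
# Pr(X > t+u | X > t), estimated directly on the conditioned subsample
cond = [x for x in samples if x > t]
lhs = sum(1 for x in cond if x > t + u) / len(cond)
# Pr(X > u), estimated on the full sample
rhs = tail_prob(samples, u)
# By memorylessness, lhs and rhs should agree up to sampling noise.
```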

2
Q

Properties of the exponential distribution:

A

1: If X1,…,Xn are independent exponentials with rates λ1,…,λn, then min(X1,…,Xn) is also an exponential random variable, with rate λ=λ1+…+λn
2: Probability that Xi is the smallest: Pr(Xi=min(X1,…,Xn))=λi/(λ1+…+λn)
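Both properties can be checked by simulation; a sketch with three hypothetical rates λ=(1,2,3), so the minimum should have mean 1/6 and X1 should win with probability 1/6:

```python
import random

random.seed(1)
rates = [1.0, 2.0, 3.0]  # hypothetical rates λ1, λ2, λ3
n = 100_000
wins = [0, 0, 0]
mins = []
for _ in range(n):
    xs = [random.expovariate(l) for l in rates]
    m = min(xs)
    mins.append(m)
    wins[xs.index(m)] += 1  # which Xi was the smallest

total = sum(rates)               # λ1+λ2+λ3 = 6.0
mean_min = sum(mins) / n         # ≈ 1/(λ1+λ2+λ3) = 1/6
p1 = wins[0] / n                 # ≈ λ1/(λ1+λ2+λ3) = 1/6
```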

3
Q

Poisson process properties:

A

1: Interarrival times are exponentially distributed with parameter λ
2: Independent Poisson processes can be added; the sum is a Poisson process with rate λ1+…+λn
3: If a Poisson process with rate λ is randomly split into two subprocesses with probabilities p and 1-p, the results are Poisson processes with rates pλ and (1-p)λ
4: For s<t, N(t)-N(s) is the number of events in (s,t] (so N(s)≤N(t))
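Properties 1 and 3 can be combined in one simulation: generate a Poisson process from exponential interarrival times, split each event with a coin flip, and check the subprocess rates. The rate λ=5, split probability p=0.3, and horizon T=1000 are hypothetical:

```python
import random

random.seed(2)
lam, p, T = 5.0, 0.3, 1000.0  # hypothetical rate, split probability, horizon
t, n_a, n_b = 0.0, 0, 0
while True:
    t += random.expovariate(lam)   # exponential interarrival times (property 1)
    if t > T:
        break
    if random.random() < p:        # random split into two subprocesses (property 3)
        n_a += 1
    else:
        n_b += 1

rate_a = n_a / T   # ≈ p·λ = 1.5
rate_b = n_b / T   # ≈ (1-p)·λ = 3.5
```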

4
Q

Markov chain:

A

Discrete-time stochastic process with state space S. A process is a Markov chain if the state after the next transition depends only on the current state, not on the earlier history.

5
Q

Irreducibility

A

If all states communicate with each other, there is exactly one class, and such a Markov chain is called irreducible

6
Q

Periodicity

A

A state i of a Markov chain has period d if p_ii^(n)=0 whenever n is not divisible by d, and d is the largest integer with this property. A state with period 1 is said to be aperiodic

7
Q

Recurrence

A

A state is called recurrent if the process returns to that state with probability 1; the state will be visited an infinite number of times

8
Q

Positive recurrence

A

If the mean return time to a recurrent state is finite, then that state is called positive recurrent; otherwise it is called null recurrent.

9
Q

If a Markov Chain is irreducible, aperiodic and positive recurrent, then:

A

you can solve π=πP together with Σ_i π_i=1 to find the limiting (stationary) distribution π.
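A minimal numeric sketch of solving π=πP for a hypothetical 3-state chain: rewrite the balance equations as (Pᵀ−I)π=0, replace redundancy with the normalisation constraint, and solve the resulting linear system:

```python
import numpy as np

# Hypothetical 3-state transition matrix (rows sum to 1);
# the chain is irreducible, aperiodic and positive recurrent.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.1, 0.3, 0.6]])

# π = πP is (P.T - I) π = 0; stack the normalisation row Σπ_i = 1
# on top of the balance equations and solve by least squares
# (the system is consistent, so the solution is exact).
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
```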

10
Q

A branching process dies out with probability 1 when:

A

when μ≤1 (excluding the degenerate case where every individual has exactly one offspring); otherwise (μ>1) the branching process lives forever with probability 1-d>0 and dies out with probability d<1

11
Q

How to calculate extinction probability?

A

Solve d=h(d), where h is the probability generating function of the offspring distribution; the extinction probability is the smallest root d in [0,1]
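The smallest root of d=h(d) can be found by fixed-point iteration starting from 0. A sketch with a hypothetical offspring distribution P(0)=0.25, P(1)=0.25, P(2)=0.5, so μ=1.25>1 and the equation 0.25+0.25d+0.5d²=d has roots d=0.5 and d=1:

```python
# Hypothetical offspring distribution: P(0)=0.25, P(1)=0.25, P(2)=0.5,
# with mean offspring number μ = 0.25 + 2*0.5 = 1.25 > 1.
probs = [0.25, 0.25, 0.5]

def h(s):
    """Probability generating function of the offspring distribution."""
    return sum(p * s**k for k, p in enumerate(probs))

# Iterating d <- h(d) from d=0 converges monotonically to the
# smallest root of d = h(d) in [0, 1].
d = 0.0
for _ in range(200):
    d = h(d)
# here d converges to 0.5, the extinction probability
```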

12
Q

What is the main difference between Markov Chain and Markov Process?

A

Markov chain is a specific type of Markov process with a discrete state space. A Markov process is a stochastic process that satisfies the Markov property and can have a continuous state space.

13
Q

What is a Birth-and-Death process?

A

Continuous-time Markov process on state space S. Transitions occur only between neighbouring states: from i to i+1 (birth, rate λi) or from i to i-1 (death, rate μi)

14
Q

Sojourn time:

A

The time the process spends in state i before transitioning to state i+1 or state i-1; exponentially distributed with parameter λi+μi. Hence the expected time in state i is 1/(λi+μi)

15
Q

Transition probabilities:

A

Given that a transition occurs from state i, probability of birth: λi/(λi+μi), probability of death: μi/(λi+μi)
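The sojourn-time and transition-probability facts can be checked together by simulating many jumps out of an interior state. A sketch with hypothetical constant rates λ=2, μ=3, so the expected sojourn is 1/(λ+μ)=0.2 and births occur with probability λ/(λ+μ)=0.4:

```python
import random

random.seed(3)
lam, mu = 2.0, 3.0  # hypothetical birth and death rates in state i

n = 100_000
sojourns, births = [], 0
for _ in range(n):
    sojourns.append(random.expovariate(lam + mu))  # sojourn ~ Exp(λi+μi)
    if random.random() < lam / (lam + mu):         # birth with prob λi/(λi+μi)
        births += 1

mean_sojourn = sum(sojourns) / n   # ≈ 1/(λ+μ) = 0.2
birth_frac = births / n            # ≈ λ/(λ+μ) = 0.4
```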

16
Q

W and L in queueing systems:

A

W: mean time a job spends in the system
L: mean number of jobs in the system

17
Q

Main theorem of open queueing networks:

A

Assuming stability, the stationary joint queue length distribution of an open network has the product form:
p(n)=p(n1,…,ns)=p1(n1)⋯ps(ns)
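A minimal numeric sketch of the product form for s=2 stations, assuming each station's marginal queue-length law is geometric, p_i(n)=(1−ρ_i)ρ_i^n, as for an M/M/1 station (the utilisations ρ=(0.5, 0.8) are hypothetical):

```python
# Hypothetical utilisations ρ_i < 1 for two stations of a stable open network.
rho = [0.5, 0.8]

def marginal(i, n):
    """Assumed M/M/1-style marginal: p_i(n) = (1 - ρ_i) ρ_i^n."""
    return (1 - rho[i]) * rho[i] ** n

def joint(n1, n2):
    """Product form: the joint law factorises into the marginals."""
    return marginal(0, n1) * marginal(1, n2)

p = joint(2, 1)  # = (0.5 * 0.5**2) * (0.2 * 0.8) = 0.02
```

Because the joint law is a product of properly normalised marginals, it sums to 1 over all queue-length pairs.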

18
Q

Selling price:
Purchasing price:
Salvage value:
Understocking cost:
Overstocking cost:

A

r: selling price
w: purchasing price
s: salvage value of unsold items
Cu=r-w: cost of not having enough stock
Co=w-s: cost of having excess stock

19
Q

Net revenue function Y(q,d):

A

if d≥q: Y(q,d)=q·Cu
if d<q: Y(q,d)=(Cu+Co)·d-q·Co
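The two-case formula can be checked against the direct accounting (revenue on units sold, plus salvage on leftovers, minus purchase cost). A sketch with hypothetical prices r=10, w=6, s=2:

```python
# Hypothetical prices: selling price r, purchase price w, salvage value s.
r, w, s = 10.0, 6.0, 2.0
Cu, Co = r - w, w - s   # understocking and overstocking costs

def Y(q, d):
    """Net revenue via the two-case formula from the card."""
    if d >= q:
        return q * Cu
    return (Cu + Co) * d - q * Co

def Y_direct(q, d):
    """Direct accounting: sales + salvage - purchase cost."""
    sold = min(q, d)
    return r * sold + s * max(q - d, 0) - w * q

# e.g. Y(5, 3) and Y(5, 7) agree between the two formulations
```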

20
Q

Principle of marginal analysis:

A

Look at R(q+1)-R(q): increase q by 1 as long as R(q+1)-R(q)>0; once the difference becomes negative, stop and set q*=q
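For the newsvendor revenue the marginal term has the closed form R(q+1)-R(q) = Cu - (Cu+Co)·Pr(D≤q): the (q+1)-th unit earns Cu when demand exceeds q and costs Co otherwise. A sketch of the marginal loop with hypothetical costs Cu=6, Co=2 and demand uniform on {0,…,9}:

```python
# Hypothetical costs and demand distribution.
Cu, Co = 6.0, 2.0
pmf = {d: 0.1 for d in range(10)}  # D uniform on {0,...,9}

def cdf(q):
    """Pr(D <= q) for the discrete demand above."""
    return sum(p for d, p in pmf.items() if d <= q)

# Marginal analysis: increase q while R(q+1) - R(q) > 0,
# using R(q+1) - R(q) = Cu - (Cu + Co) * Pr(D <= q).
q = 0
while Cu - (Cu + Co) * cdf(q) > 0:
    q += 1
# q == 7 here, which matches the critical fractile Cu/(Cu+Co) = 0.75
```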

21
Q

Newsvendor model:

A

q* is the smallest q such that:
Pr(D≤q)≥Cu/(Cu+Co)
This works because Pr(D≤q) is increasing in q, so that R(q) is a concave function of q
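The critical-fractile rule is a simple search over the demand CDF. A sketch with hypothetical values Cu=4, Co=1 (fractile 0.8) and Poisson demand with mean 4:

```python
import math

# Hypothetical costs and Poisson demand.
Cu, Co, mean_demand = 4.0, 1.0, 4.0
fractile = Cu / (Cu + Co)  # = 0.8

def poisson_cdf(q, lam):
    """Pr(D <= q) for D ~ Poisson(lam)."""
    return sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(q + 1))

# q* is the smallest q with Pr(D <= q) >= Cu/(Cu+Co).
q = 0
while poisson_cdf(q, mean_demand) < fractile:
    q += 1
# q == 6 for these parameters
```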

22
Q

Total cost
Fixed ordering cost per order
Demand rate
Order quantity
Holding cost per time period
EOQ that minimizes total cost
Lead time
Reorder point
Expected demand during lead time
Backorder cost per time period
Demand during lead time
Cost of lost sales per unit

A

C(q): total cost
K: fixed ordering cost per order
D: demand rate
q: order quantity
h: holding cost per time period
q*: EOQ that minimizes total cost
L: lead time
r: reorder point
μ_L: expected demand during lead time
b: backorder cost per time period
D_L: demand during lead time
c: cost of lost sales per unit
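With these symbols, the basic EOQ model has total cost C(q)=K·D/q + h·q/2 and minimiser q*=√(2KD/h). A sketch with hypothetical parameters K=100, D=1000, h=2:

```python
import math

# Hypothetical parameters in the card's notation:
K, D, h = 100.0, 1000.0, 2.0   # ordering cost, demand rate, holding cost

q_star = math.sqrt(2 * K * D / h)        # EOQ minimising C(q) = K*D/q + h*q/2
cost = K * D / q_star + h * q_star / 2   # total cost per period at q*
# at the optimum the ordering and holding terms are equal,
# so cost == h * q_star
```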