STORM Flashcards
Memorylessness property
If X~Exp(λ), information about what happened in the past affects neither the present nor the future:
Pr(X>t+u|X>t)=Pr(X>t+u)/Pr(X>t)=Pr(X>u)
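A minimal sketch (not from the cards) that checks this identity by simulation; λ, t and u below are assumed example values:

```python
# Sketch (assumed example values): estimate Pr(X>t+u | X>t) and Pr(X>u)
# by simulation and compare both with the closed form exp(-lam*u).
import numpy as np

rng = np.random.default_rng(0)
lam, t, u = 2.0, 0.5, 1.0
x = rng.exponential(scale=1.0 / lam, size=1_000_000)  # numpy's scale = 1/lambda

conditional = (x > t + u).sum() / (x > t).sum()  # Pr(X>t+u | X>t)
unconditional = (x > u).mean()                   # Pr(X>u)
print(conditional, unconditional, np.exp(-lam * u))  # all three ~ 0.135
```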
Properties of the exponential distribution (X1,…,Xn independent, Xi~Exp(λi)):
1: min(X1,…,Xn) is also an exponential random variable, with rate λ=λ1+…+λn
2: Probability that Xi is the smallest: Pr(Xi=min(X1,…,Xn))=λi/(λ1+…+λn)
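Both properties can be verified numerically; this sketch assumes example rates λ1=1, λ2=2, λ3=3:

```python
# Sketch: X1~Exp(1), X2~Exp(2), X3~Exp(3) independent (assumed example rates).
import numpy as np

rng = np.random.default_rng(1)
rates = np.array([1.0, 2.0, 3.0])
x = rng.exponential(scale=1.0 / rates, size=(1_000_000, 3))

print(x.min(axis=1).mean(), 1.0 / rates.sum())  # property 1: E[min] = 1/(sum of rates)
print(np.bincount(x.argmin(axis=1)) / len(x))   # property 2: empirical Pr(Xi smallest)
print(rates / rates.sum())                      # ... vs lambda_i / (sum of rates)
```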
Poisson process properties:
1: Interarrival times are exponentially distributed
2: Adding Poisson processes is possible when they are independent, new Poisson process: λ1+…+λn
3: If Poisson Process with rate λ is randomly split into two subprocesses with probabilities p and 1-p, new Poisson processes: pλ and (1-p)λ
4: For s<t, N(t)-N(s) is the number of events in (s,t]; it is Poisson distributed with mean λ(t-s), independent of the history up to time s
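A sketch checking the splitting property (and the resulting rates) via counts on an interval [0,T]; λ, p and T are assumed example values:

```python
# Sketch: counts over [0, T] of a rate-lam Poisson process split with
# probability p (all numbers are assumed example values).
import numpy as np

rng = np.random.default_rng(2)
lam, p, T, runs = 3.0, 0.3, 10.0, 200_000
n = rng.poisson(lam * T, size=runs)   # N(T) of the original process
n1 = rng.binomial(n, p)               # events sent to subprocess 1
n2 = n - n1                           # events sent to subprocess 2

print(n1.mean() / T, p * lam)         # empirical rate of subprocess 1 vs p*lam
print(n2.mean() / T, (1 - p) * lam)   # empirical rate of subprocess 2 vs (1-p)*lam
print(np.corrcoef(n1, n2)[0, 1])      # ~0: the split counts are independent
```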
Markov chain:
A discrete-time stochastic process with state space S is a Markov chain if the state after the next transition depends only on the current state, not on the earlier history (the Markov property).
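As an illustration (an assumed example chain, not from the cards), a Markov chain is fully specified by its transition matrix P, and simulating it only ever looks at the current state:

```python
# Sketch: the next state is sampled from the row of the current state only.
import numpy as np

P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])   # assumed example chain on S = {0, 1, 2}
rng = np.random.default_rng(3)

state, visits = 0, np.zeros(3)
for _ in range(100_000):
    state = rng.choice(3, p=P[state])   # depends only on the current state
    visits[state] += 1
print(visits / visits.sum())   # long-run visit frequencies, ~(0.25, 0.5, 0.25)
```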
Irreducibility
If all states communicate with each other, there is exactly 1 class, and such a Markov chain is called irreducible
Periodicity
A state i of a Markov chain has period d if p_ii^(n)=0 whenever n is not divisible by d, while d is the largest integer with this property. A state with period 1 is said to be aperiodic
Recurrence
A state is called recurrent if the process returns to that state with probability 1; the state will be visited an infinite number of times
Positive recurrence
If the mean return time to a recurrent state is finite, then that state is called positive recurrent; otherwise it is called null recurrent.
If a Markov Chain is irreducible, aperiodic and positive recurrent, then:
you can solve π=πP, together with the normalization Σ πi=1, to find the unique limiting distribution π.
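A sketch of solving this system numerically for the example chain above, replacing one redundant equation by the normalization:

```python
# Sketch: pi = pi P  <=>  (P^T - I) pi^T = 0; drop one redundant equation
# and add the normalization sum(pi) = 1 (same assumed example chain as above).
import numpy as np

P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
n = P.shape[0]
A = np.vstack([(P.T - np.eye(n))[:-1], np.ones(n)])  # last row: normalization
b = np.zeros(n)
b[-1] = 1.0
pi = np.linalg.solve(A, b)
print(pi)   # (0.25, 0.5, 0.25), matching the simulated visit frequencies
```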
A branching process dies out with probability 1 when:
When μ≤1, where μ is the mean number of offspring per individual (excluding the trivial case in which every individual has exactly one offspring). Otherwise the branching process lives forever with probability 1-d>0 and dies out with probability d<1.
How to calculate extinction probability?
Solve d=h(d), where h is the probability generating function of the offspring distribution; the extinction probability is the smallest root in [0,1].
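A sketch with an assumed offspring distribution Pr(0)=0.25, Pr(1)=0.25, Pr(2)=0.5, so μ=1.25>1 and extinction is not certain:

```python
# Sketch: fixed-point iteration d <- h(d) starting from d = 0 converges to
# the smallest root of d = h(d) in [0, 1], i.e. the extinction probability.

def h(d, probs=(0.25, 0.25, 0.5)):
    """PGF h(d) = sum_k Pr(offspring = k) * d**k (assumed example distribution)."""
    return sum(p * d**k for k, p in enumerate(probs))

d = 0.0
for _ in range(200):
    d = h(d)
print(d)   # 0.5 here: mu = 0.25*1 + 0.5*2 = 1.25 > 1, so d < 1
```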
What is the main difference between Markov Chain and Markov Process?
A Markov chain is a specific type of Markov process with a discrete state space, evolving in discrete time steps. A Markov process is any stochastic process that satisfies the Markov property; it may have a continuous state space and/or evolve in continuous time.
What is a Birth-and-Death process?
A continuous-time Markov process on state space S in which transitions only occur between neighbouring states: from i to i+1 (birth, rate λi) or from i to i-1 (death, rate μi)
Sojourn time:
The time the process spends in state i before transitioning to state i+1 or i-1; exponentially distributed with parameter λi+μi. Hence, the expected time spent in state i is 1/(λi+μi)
Transition probabilities:
Given that a transition occurs from state i, probability of birth: λi/(λi+μi), probability of death: μi/(λi+μi)
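Both facts are the minimum-of-exponentials properties from above in disguise; a simulation sketch with assumed rates λi=2 and μi=3:

```python
# Sketch: competing exponential clocks in state i (assumed rates lam_i, mu_i);
# the first clock to ring decides both the sojourn time and the direction.
import numpy as np

rng = np.random.default_rng(4)
lam_i, mu_i, runs = 2.0, 3.0, 500_000
birth = rng.exponential(1.0 / lam_i, runs)   # candidate time of a birth
death = rng.exponential(1.0 / mu_i, runs)    # candidate time of a death

print(np.minimum(birth, death).mean(), 1.0 / (lam_i + mu_i))  # expected sojourn
print((birth < death).mean(), lam_i / (lam_i + mu_i))  # Pr(next move is a birth)
```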
W and L in queueing systems:
W: mean time a job spends in a system
L: mean number of jobs in a system
Main theorem of open queueing networks:
Assuming stability, the stationary joint queue length distribution of an open network has the product form:
p(n)=p(n1,…,ns)=p1(n1)…ps(ns)
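As an illustration of the product form in an assumed special case (an open Jackson network of single-server stations), each factor is geometric:

```python
# Sketch, assumed special case: open Jackson network of single-server
# stations, where each marginal is geometric,
# p_i(n_i) = (1 - rho_i) * rho_i**n_i with utilization rho_i < 1.
import math

def p_joint(n, rho):
    """p(n1,...,ns) = product of the per-station marginals."""
    return math.prod((1 - r) * r**k for k, r in zip(n, rho))

rho = [0.5, 0.8]              # assumed utilizations of two stations
print(p_joint((0, 0), rho))   # Pr(both stations empty) = 0.5 * 0.2 = 0.1
print(p_joint((2, 1), rho))   # Pr(2 jobs at station 1, 1 job at station 2)
```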
Selling price: r
Purchasing price: w
Salvage value: s (value of unsold items)
Understocking cost: Cu=r-w (cost of not having enough stock)
Overstocking cost: Co=w-s (cost of having excess stock)
Net revenue function Y(q,d):
if d≥q: Y(q,d)=qCu
if d<q: Y(q,d)=(Cu+Co)d-qCo
Principle of marginal analysis:
Consider the marginal revenue R(q+1)-R(q): increase q by 1 as long as this difference is positive; once it becomes negative, stop and set q*=q
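A sketch of this procedure with assumed Poisson(20) demand and Cu=4, Co=1; the marginal-revenue expression in the comment follows from Y(q,d) above:

```python
# Sketch: marginal analysis with assumed Poisson(20) demand, Cu = 4, Co = 1.
# From Y(q,d) above: R(q+1) - R(q) = Cu*Pr(D > q) - Co*Pr(D <= q).
from scipy.stats import poisson

cu, co, mean_demand = 4.0, 1.0, 20.0

def marginal(q):
    F = poisson.cdf(q, mean_demand)   # Pr(D <= q)
    return cu * (1 - F) - co * F

q = 0
while marginal(q) > 0:   # keep ordering one more unit while it pays off
    q += 1
print(q)                 # q*: the last q before the marginal gain turns negative
```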
Newsvendor model:
q* is the smallest q such that:
Pr(D≤q)≥Cu/(Cu+Co)
Works because Pr(D≤q) is increasing in q, so that R(q) is a concave function of q
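The same assumed numbers via the quantile rule directly:

```python
# Sketch: for discrete distributions, scipy's ppf returns exactly the
# smallest q with Pr(D <= q) >= the given probability.
from scipy.stats import poisson

cu, co, mean_demand = 4.0, 1.0, 20.0   # same assumed numbers as above
q_star = int(poisson.ppf(cu / (cu + co), mean_demand))
print(q_star)                          # equals the marginal-analysis answer
```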
Total cost: C(q)
Fixed ordering cost per order: K
Demand rate: D
Order quantity: q
Holding cost per time period: h
EOQ that minimizes total cost: q*
Lead time: L
Reorder point: r
Expected demand during lead time: μ_L
Backorder cost per time period: b
Demand during lead time: D_L
Cost of lost sales per unit: c
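Putting the symbols together (a sketch assuming the standard EOQ total cost, which is not spelled out on the cards):

```python
# Sketch, assuming the standard EOQ total cost C(q) = K*D/q + h*q/2;
# the minimizer is q* = sqrt(2*K*D/h) (all numbers are assumed examples).
import math

K, D, h = 100.0, 1200.0, 6.0   # order cost, demand rate, holding cost
q_star = math.sqrt(2 * K * D / h)
print(q_star)                            # 200.0 units per order
print(K * D / q_star + h * q_star / 2)   # minimal cost: 1200.0 per time period
```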