Discrete distribution Flashcards
(24 cards)
Bernoulli random variable
can take only two values, 0 and 1
if a random variable X has a Bernoulli distribution with parameter p, then its p.m.f. is:
px(1) = p, px(0) = 1 - p
Expectation and variance of Bernoulli random var
E(X) = p, Var(X) = p(1-p)
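These two formulas can be checked numerically straight from the p.m.f.; a minimal Python sketch (the value p = 0.3 is an arbitrary choice, not from the cards):

```python
# Bernoulli(p): check E(X) = p and Var(X) = p(1 - p) directly from the p.m.f.
p = 0.3  # arbitrary example value
pmf = {0: 1 - p, 1: p}

mean = sum(x * pr for x, pr in pmf.items())                 # E(X) = p
var = sum((x - mean) ** 2 * pr for x, pr in pmf.items())    # Var(X) = p(1 - p)
```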
Binomial distribution
In each case there is a fixed number of trials, each with two possible outcomes called success and failure; each trial can be described as a Bernoulli random variable, and we are interested in counting the total number of successes
if a random variable X has a binomial distribution with parameters n - the number of trials - and p - the probability of success in each trial - then its p.m.f. is:
px(x) = n!/(x!(n-x)!) * p^x * (1-p)^(n-x) = C(n,x) * p^x * (1-p)^(n-x)
expectation and variance of binomial distribution
E(X) = np Var(X) = np(1-p)
cumulative distribution
Fx(x) = P(X<=x) = sum of px(a) over all a <= x
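The binomial p.m.f. and its c.d.f. can be computed directly; a small Python sketch using the standard library (n = 10, p = 0.4 are arbitrary example values):

```python
import math

def binom_pmf(x, n, p):
    # P(X = x) = C(n, x) * p^x * (1 - p)^(n - x)
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

def binom_cdf(x, n, p):
    # F_X(x) = P(X <= x): sum the p.m.f. over a = 0, 1, ..., x
    return sum(binom_pmf(a, n, p) for a in range(x + 1))

n, p = 10, 0.4
mean = sum(x * binom_pmf(x, n, p) for x in range(n + 1))  # should equal n*p = 4
```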
binomial distribution in Matlab
p.m.f : binopdf(x, n, p)
c.d.f. : binocdf(x,n,p)
alfa quantile: binoinv(alfa, n, p)
binornd(n, p, [1, m]) - generates m random observations from the binomial distribution
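A rough Python analogue of the random-generation call (a hypothetical helper, not a translation of MATLAB's internals): each binomial observation is drawn as the number of successes in n Bernoulli(p) trials.

```python
import random

def binornd(n, p, m, seed=None):
    # hypothetical analogue of MATLAB's binornd(n, p, [1, m]):
    # each observation is the number of successes in n Bernoulli(p) trials
    rng = random.Random(seed)
    return [sum(rng.random() < p for _ in range(n)) for _ in range(m)]

draws = binornd(20, 0.5, 5000, seed=1)
avg = sum(draws) / len(draws)  # should be close to n*p = 10
```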
geometric distribution
in the geometric distribution there is a sequence of independent success/failure Bernoulli trials, but the number of trials is not fixed; X counts the number of trials up to and including the first success
p.m.f of geometric distribution
px(x) = (1-p)^(x-1)*p
expectation and variance
E(X) = 1/p Var(X) = (1-p)/p^2
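Both moments can be verified numerically by truncating the infinite sums; a Python sketch (p = 0.4 and the truncation point are arbitrary choices):

```python
def geom_pmf(x, p):
    # P(X = x) = (1 - p)^(x - 1) * p for x = 1, 2, ...
    return (1 - p) ** (x - 1) * p

p = 0.4
xs = range(1, 200)  # truncate the infinite sums; the tail beyond 200 is negligible
total = sum(geom_pmf(x, p) for x in xs)                  # ≈ 1
mean = sum(x * geom_pmf(x, p) for x in xs)               # ≈ 1/p
var = sum((x - mean) ** 2 * geom_pmf(x, p) for x in xs)  # ≈ (1-p)/p^2
```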
poisson distribution
used to represent more general count data: the number of times an event occurs in a finite interval of time or space - used mainly when the event is regarded as rare
p.m.f of Poisson distribution with rate parameter b
px(x) = P(X=x) = (e^-b*b^x)/x!
expectation and variance of Poisson distribution
E(X) = Var(X) = b
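The same truncated-sum check works for the Poisson p.m.f.; a Python sketch (b = 3 is an arbitrary rate value):

```python
import math

def pois_pmf(x, b):
    # P(X = x) = e^(-b) * b^x / x!
    return math.exp(-b) * b**x / math.factorial(x)

b = 3.0
xs = range(60)  # truncate; the tail beyond 60 is negligible for b = 3
total = sum(pois_pmf(x, b) for x in xs)                  # ≈ 1
mean = sum(x * pois_pmf(x, b) for x in xs)               # ≈ b
var = sum((x - mean) ** 2 * pois_pmf(x, b) for x in xs)  # ≈ b
```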
the Poisson distribution in Matlab
p.m.f. - poisspdf(x, lambda)
c.d.f. - poisscdf(x, lambda)
alfa quantile - poissinv(alfa, lambda)
joint probability mass function of two discrete random variables X and Y
p.m.f(x,y) = P{(X=x) & (Y=y)}
marginal distribution of X’s p.m.f. and Y’s p.m.f.
px(x) = sum over y of px,y(x,y), py(y) = sum over x of px,y(x,y)
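Marginalising a joint p.m.f. is just summing out the other variable; a Python sketch with a made-up 2x2 joint table (the probabilities are illustrative, not from the cards):

```python
# joint p.m.f. of (X, Y) as a dict; the probabilities are made up for illustration
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

px, py = {}, {}
for (x, y), pr in joint.items():
    px[x] = px.get(x, 0.0) + pr  # marginal of X: sum over y
    py[y] = py.get(y, 0.0) + pr  # marginal of Y: sum over x
```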
multinomial distribution
this is the generalisation of the binomial distribution to the case where there are more than two possible outcomes in each trial - we have n trials and k possible outcomes
p.m.f of multinomial distribution with n trials and probabilities p1, p2, …, pk
px(x1, x2, x3, … , xk) = (n!/(x1!x2!…xk!)) * p1^x1 * p2^x2 * … * pk^xk
conditional distribution for multinomial random variable
(X2,…,Xk)|X1 = x1 ~ multinomial(n - x1; p2/(1-p1), …, pk/(1-p1))
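The multinomial p.m.f. above can be implemented directly and sanity-checked by summing it over all count vectors; a Python sketch (n = 5 and the three probabilities are arbitrary example values):

```python
import math
from itertools import product

def multinom_pmf(xs, n, ps):
    # p.m.f. of Multinomial(n; p1, ..., pk) at the count vector (x1, ..., xk)
    if sum(xs) != n:
        return 0.0
    coef = math.factorial(n)
    for x in xs:
        coef //= math.factorial(x)  # multinomial coefficient n!/(x1!...xk!)
    prob = float(coef)
    for x, p in zip(xs, ps):
        prob *= p ** x
    return prob

n, ps = 5, (0.2, 0.3, 0.5)
# the p.m.f. should sum to 1 over all count vectors with x1 + x2 + x3 = n
total = sum(multinom_pmf(xs, n, ps)
            for xs in product(range(n + 1), repeat=3))
```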
what does it mean that variables are identically distributed
if n random variables have the same probability distribution
Expectation and variance of independent and identically distributed random variables
for i.i.d. random variables X1, …, Xn, each with mean a and variance b^2, where X’(n) is the sample mean:
X’(n) = (1/n) * sum(Xi)
E(X’(n)) = a
Var(X’(n)) = b^2/n
standard error
sqrt(Var(X’(n))) = b/sqrt(n),
denoted by SE(X’(n))
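The b/sqrt(n) formula can be seen in simulation by comparing the empirical spread of many sample means against it; a Python sketch (normal draws and the values a = 0, b = 2, n = 25 are arbitrary choices):

```python
import math
import random
import statistics

# spread of the sample mean X'(n) for i.i.d. draws with mean a and std b
a, b, n = 0.0, 2.0, 25
rng = random.Random(7)

sample_means = [sum(rng.gauss(a, b) for _ in range(n)) / n
                for _ in range(4000)]
empirical_se = statistics.pstdev(sample_means)
theoretical_se = b / math.sqrt(n)  # SE = b/sqrt(n) = 0.4
```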
central limit theorem
for i.i.d. random variables X1, X2, …, Xn, each with mean a and variance b^2, the distribution of X’(n) tends to a normal distribution as n -> inf:
X’(n) ≈ N(a, b^2/n) or
sum(Xi) ≈ N(na, nb^2)
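A quick simulation illustrates the theorem: means of uniform draws (which are far from normal individually) cluster like a normal distribution. A Python sketch, assuming Uniform(0,1) inputs with mean 1/2 and variance 1/12, with n = 50 chosen arbitrarily:

```python
import random
import statistics

# CLT sketch: the mean of n = 50 Uniform(0,1) draws should be
# approximately N(1/2, 1/(12*50))
rng = random.Random(42)
n = 50
means = [sum(rng.random() for _ in range(n)) / n for _ in range(5000)]

avg = statistics.fmean(means)   # close to 1/2
sd = statistics.pstdev(means)   # close to sqrt(1/600)
# for a normal distribution about 68% of values fall within one sd of the mean
within_1sd = sum(abs(m - avg) <= sd for m in means) / len(means)
```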