Mean of a function of a random variable: E(X) = μ
E(aX + b) = a × E(X) + b
Variance of a function of a random variable: Var(X) = σ²
∴ Var(aX + b) = a² × Var(X)
∴ SD(aX + b) = |a| × SD(X)
Note: Var(X) = E(X²) − [E(X)]²
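The two linear-transformation rules above can be checked directly from a probability distribution. A minimal sketch, using a fair six-sided die for X (an illustrative assumption, not from the notes):

```python
# Verifies E(aX + b) = a × E(X) + b and Var(aX + b) = a² × Var(X)
# by computing both sides from the pmf. X = score on a fair die
# (illustrative assumption).

def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum(p * (x - m) ** 2 for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}        # X: fair die
a, b = 3, 2
Y = {a * x + b: p for x, p in die.items()}   # Y = 3X + 2

print(mean(Y), a * mean(die) + b)            # both ≈ 12.5
print(var(Y), a ** 2 * var(die))             # both ≈ 26.25
```

Note that b shifts the mean but does not change the variance, which is why it vanishes from the Var rule.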
Sum and difference of independent random variables
For T = X + Y
E(T) = E(X + Y) = E(X) + E(Y)
Var(T) = Var(X + Y) = Var(X) + Var(Y)
For T = X - Y
E(T) = E(X - Y) = E(X) - E(Y)
Var(T) = Var(X - Y) = Var(X) + Var(Y)
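The key point above, that variances add for both the sum and the difference, can be verified by building the exact distribution of X + Y and X − Y from the joint probabilities. Two independent fair dice are an illustrative assumption:

```python
# Checks E(X ± Y) = E(X) ± E(Y), and that variances ADD for both the
# sum and the difference of independent random variables.
from itertools import product

def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum(p * (x - m) ** 2 for x, p in pmf.items())

die = {x: 1 / 6 for x in range(1, 7)}

def combine(px, py, op):
    """Pmf of op(X, Y) for independent X, Y (joint prob = product)."""
    out = {}
    for (x, p), (y, q) in product(px.items(), py.items()):
        v = op(x, y)
        out[v] = out.get(v, 0) + p * q
    return out

T_sum = combine(die, die, lambda x, y: x + y)
T_diff = combine(die, die, lambda x, y: x - y)

print(mean(T_sum), mean(T_diff))   # ≈ 7.0 and ≈ 0.0
print(var(T_sum), var(T_diff))     # both ≈ 5.83 = Var(X) + Var(Y)
```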
Any random variables vs independent random variables
For any random variables:
E(aX + bY) = a × E(X) + b × E(Y)
For independent random variables:
Var(aX + bY) = a² × Var(X) + b² × Var(Y)
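The general linear-combination rules can be checked the same way. A sketch assuming a fair coin for X and a fair die for Y (illustrative choices):

```python
# Checks E(aX + bY) = a × E(X) + b × E(Y) and, for INDEPENDENT X and Y,
# Var(aX + bY) = a² × Var(X) + b² × Var(Y).
from itertools import product

def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def var(pmf):
    m = mean(pmf)
    return sum(p * (x - m) ** 2 for x, p in pmf.items())

coin = {0: 0.5, 1: 0.5}                  # X: fair coin
die = {x: 1 / 6 for x in range(1, 7)}    # Y: fair die
a, b = 4, -2

lc = {}                                  # pmf of aX + bY
for (x, p), (y, q) in product(coin.items(), die.items()):
    v = a * x + b * y
    lc[v] = lc.get(v, 0) + p * q

print(mean(lc), a * mean(coin) + b * mean(die))      # equal
print(var(lc), a**2 * var(coin) + b**2 * var(die))   # equal (independence)
```

The squares on a and b mean the variance terms never cancel, which is why Var(X − Y) adds rather than subtracts.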
Linear functions and combinations of normally distributed random variables
Linear combinations of independent normal variables are also normally distributed.
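A quick simulation sketch: if X ~ N(5, 2²) and Y ~ N(3, 1²) are independent, then 2X − Y should be N(2×5 − 3, 2²×2² + 1²×1²) = N(7, 17). All parameter values here are illustrative assumptions:

```python
# Simulates 2X − Y for independent normals X ~ N(5, 2²), Y ~ N(3, 1²)
# and compares the sample mean and SD with the predicted N(7, 17).
import random
import statistics

random.seed(42)
samples = [2 * random.gauss(5, 2) - random.gauss(3, 1)
           for _ in range(100_000)]

print(statistics.mean(samples))    # ≈ 7
print(statistics.stdev(samples))   # ≈ sqrt(17) ≈ 4.12
```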
The distribution of the sum of 2 independent Poisson variables
If X ~ Po(λₓ) and Y ~ Po(λᵧ) are independent, then
E(X + Y) = E(X) + E(Y) = λₓ + λᵧ
Var(X + Y) = Var(X) + Var(Y) = λₓ + λᵧ
The mean and variance of X + Y are equal, and given that X and Y are independent, X + Y has a Poisson distribution: X + Y ~ Po(λₓ + λᵧ).
Note:
A linear combination of independent Poisson variables of the form aX + bY does not, in general, have a Poisson distribution (only the plain sum, a = b = 1, does).
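The claim that X + Y ~ Po(λₓ + λᵧ) can be checked numerically: convolving the Po(λₓ) and Po(λᵧ) pmfs reproduces the Po(λₓ + λᵧ) pmf term by term. The λ values below are illustrative assumptions:

```python
# Compares the convolution of Po(λx) and Po(λy) with Po(λx + λy):
# P(X + Y = k) = Σᵢ P(X = i) P(Y = k − i) should equal the Po(λx + λy) pmf.
from math import exp, factorial

def po_pmf(lam, k):
    return exp(-lam) * lam ** k / factorial(k)

lx, ly = 2.0, 3.5

for k in range(6):
    conv = sum(po_pmf(lx, i) * po_pmf(ly, k - i) for i in range(k + 1))
    print(k, conv, po_pmf(lx + ly, k))   # the two columns agree
```

Algebraically this is the binomial theorem at work: the convolution sum collapses to e^(−(λx+λy)) (λx+λy)^k / k!.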