SU2 - The Properties of Random Variables, The Normal and Its Related Distributions Flashcards
What is the expected value of a random variable?
It is the weighted average of all possible values of X, also known as the mean.
What is the Expectation of Sums?
E(X1 + X2) = E(X1) + E(X2)
What is the result of E(c1X1+c2X2)?
E(c1X1) + E(c2X2) = c1E(X1) + c2E(X2)
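A minimal Python sketch (not part of the original card; the two distributions and constants are arbitrary assumptions) checking linearity of expectation by simulation:

```python
# Verify E(c1*X1 + c2*X2) = c1*E(X1) + c2*E(X2) by Monte Carlo simulation.
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(loc=2.0, scale=1.0, size=1_000_000)   # X1 ~ N(2, 1)
x2 = rng.exponential(scale=3.0, size=1_000_000)       # X2 ~ Exponential with mean 3
c1, c2 = 2.0, -0.5

lhs = np.mean(c1 * x1 + c2 * x2)            # estimate of E(c1*X1 + c2*X2)
rhs = c1 * np.mean(x1) + c2 * np.mean(x2)   # estimate of c1*E(X1) + c2*E(X2)
print(lhs, rhs)  # the two estimates agree up to simulation noise
```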
What is the expectation of a binomial distribution?
np
What is Jensen’s inequality?
For a convex function g, g(E(X)) ≤ E(g(X)); the inequality reverses for a concave g.
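A small numerical illustration (assumed example, not from the deck) using the convex function g(x) = x^2:

```python
# For convex g(x) = x^2, Jensen's inequality says g(E(X)) <= E(g(X)).
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=1_000_000)  # any non-degenerate X works

g_of_mean = np.mean(x) ** 2     # g(E(X))
mean_of_g = np.mean(x ** 2)     # E(g(X))
print(g_of_mean, mean_of_g)     # g(E(X)) <= E(g(X)); here the gap equals Var(X)
```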
What is the result of E (aX + b)?
E (aX + b) = aE(X) + b
What is the expectation of a Bernoulli distribution?
p
Is the expectation of a function of a random variable equal to the function of the random variable's expectation?
E[g(X)] = g(E[X])?
No, not in general; by Jensen's inequality, equality holds only when g(X) is a linear function.
Is the expectation of the product of random variables equal to the product of their expectations?
No; E(XY) equals E(X)E(Y) only when the random variables in the product are independent.
What is the Cauchy distribution?
The Student's t distribution with 1 degree of freedom. In this case the expectation does not exist.
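A rough simulation sketch (assumed illustration, not part of the original card) of what "no expectation" means in practice:

```python
# Because the Cauchy distribution has no expectation, its running sample mean never settles down.
import numpy as np

rng = np.random.default_rng(2)
draws = rng.standard_cauchy(size=100_000)
running_mean = np.cumsum(draws) / np.arange(1, draws.size + 1)
# For a distribution with a finite mean these values would converge;
# for the Cauchy they keep jumping as occasional huge draws arrive.
print(running_mean[[99, 999, 9_999, 99_999]])
```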
What is the variance?
To measure how “spread out” the values of a random variable are
Var(X) = E[(X − μ)^2] = E(X^2) − μ^2
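A minimal numerical check (assumed example; the normal distribution chosen here is arbitrary) that the two variance expressions agree:

```python
# Compare E[(X - mu)^2] with E(X^2) - mu^2.
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(loc=5.0, scale=2.0, size=1_000_000)
mu = np.mean(x)

var_definition = np.mean((x - mu) ** 2)     # E[(X - mu)^2]
var_shortcut   = np.mean(x ** 2) - mu ** 2  # E(X^2) - mu^2
print(var_definition, var_shortcut)         # both close to the true variance, 4
```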
For any constant c, what is the value of Var(c)?
Var(c) = 0; a constant takes only one value, so there is no variance at all.
For any constants a and b, Var(aX+b) = ??
For any constants a and b, Var(aX+b) = a^2 Var(X)
When we add b to X, we only shift the distribution of X laterally. The shape of the distribution of X stays the same; therefore, the variance of X remains unchanged
For any constants a and b, sd(aX + b) = ??
|a| sd (X)
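A quick check (assumed example) of both scaling rules, Var(aX + b) = a^2 Var(X) and sd(aX + b) = |a| sd(X):

```python
# Adding b only shifts the distribution; it does not change the spread.
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(loc=1.0, scale=3.0, size=1_000_000)
a, b = -2.0, 10.0

print(np.var(a * x + b), a**2 * np.var(x))     # variances agree
print(np.std(a * x + b), abs(a) * np.std(x))   # standard deviations agree
```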
What does standardization mean?
To transform X into Z = (X − μ)/σ, re-centring the expectation to 0 and normalising the variance to 1, i.e. E(Z) = 0 and Var(Z) = 1.
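A minimal sketch (assumed example; the gamma distribution is an arbitrary choice) of standardization in code:

```python
# Standardizing X as Z = (X - mu) / sigma gives E(Z) = 0 and Var(Z) = 1.
import numpy as np

rng = np.random.default_rng(5)
x = rng.gamma(shape=2.0, scale=3.0, size=1_000_000)

z = (x - np.mean(x)) / np.std(x)   # standardized version of X
print(np.mean(z), np.var(z))       # approximately 0 and 1
```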
If X1 and X2 are uncorrelated/independent, Var(X1 + X2) = ?
Var(X1) + Var(X2)
Is the variance of the sum of random variables always equal to the sum of their variances?
No, unless the variables are uncorrelated (e.g. independent); in general Var(X1 + X2) = Var(X1) + Var(X2) + 2Cov(X1, X2).
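A small demonstration (assumed example; Y is deliberately constructed to be correlated with X) of why the covariance term matters:

```python
# For correlated X and Y, Var(X + Y) = Var(X) + Var(Y) + 2*Cov(X, Y).
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(size=1_000_000)
y = 0.8 * x + rng.normal(size=1_000_000)   # Y is correlated with X by construction

lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * np.cov(x, y)[0, 1]
print(lhs, rhs)   # agree; Var(X) + Var(Y) alone would understate the true variance
```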
What happens to the covariance when X and Y are independent?
If X and Y are independent, then Cov(X,Y) = 0, and E(XY) = E(X)E(Y)
What is the correlation coefficient?
The correlation coefficient can be thought of as a standardised covariance that does not depend on units of measurement.
corr(X,Y) = 1 (−1) means a perfect positive (negative) linear relationship.
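A brief check (assumed example) that the correlation is covariance standardised by the two standard deviations, and that it is unit-free:

```python
# corr(X, Y) = Cov(X, Y) / (sd(X) * sd(Y)); rescaling units leaves it unchanged.
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(size=1_000_000)
y = 2.0 * x + rng.normal(size=1_000_000)

corr_manual = np.cov(x, y)[0, 1] / (np.std(x, ddof=1) * np.std(y, ddof=1))
print(corr_manual, np.corrcoef(x, y)[0, 1])   # same value
print(np.corrcoef(100 * x, 0.001 * y)[0, 1])  # unchanged after changing units
```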
What is the variance of a Bernoulli distribution?
Var(X) = p(1 − p)
What is the alternate expression for Cov(X, Y)?
E(XY) − μ_X μ_Y
E(XY) - E(X)E(Y)
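A minimal numerical check (assumed example) of the shortcut formula:

```python
# Cov(X, Y) = E(XY) - E(X)E(Y).
import numpy as np

rng = np.random.default_rng(8)
x = rng.normal(size=1_000_000)
y = x + rng.normal(size=1_000_000)

cov_shortcut = np.mean(x * y) - np.mean(x) * np.mean(y)  # E(XY) - E(X)E(Y)
print(cov_shortcut, np.cov(x, y, ddof=0)[0, 1])          # both approximately 1
```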
What are the three properties of covariances?
- For any constant a, Cov(a,X) = 0.
- For random variable Z, Cov(X+Y,Z) = Cov(X,Z)+Cov(Y,Z).
- For any constants a1 and a2, Cov(a1 X,a2 Y) = a1 a2 Cov(X,Y).
Does zero correlation imply independence?
No. Independence implies zero correlation, but zero correlation does not imply independence in general (it does for jointly normal random variables).
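A classic counter-example (assumed illustration, not from the original card): X standard normal and Y = X^2 are clearly dependent, yet uncorrelated.

```python
# Y is a deterministic function of X, hence dependent, but corr(X, Y) is (approximately) 0.
import numpy as np

rng = np.random.default_rng(9)
x = rng.normal(size=1_000_000)
y = x ** 2

print(np.corrcoef(x, y)[0, 1])   # close to 0 despite the dependence
```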
What happens if you add a constant to a random variable in a covariance?
Does not affect its covariance with another random variable
If E(X|Y) = c for some constant c, what is E(X)?
By the law of iterated expectations, E(X) = E(E(X|Y)) = E(c) = c.
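A small sketch (assumed example; the conditional means 4 and 2 are arbitrary) of the law of iterated expectations:

```python
# E(X) = E(E(X | Y)): averaging the conditional means recovers the overall mean.
import numpy as np

rng = np.random.default_rng(10)
y = rng.integers(0, 2, size=1_000_000)            # Y is 0 or 1, each with probability 1/2
x = rng.normal(loc=np.where(y == 1, 4.0, 2.0))    # E(X | Y=1) = 4, E(X | Y=0) = 2

inner = np.where(y == 1, 4.0, 2.0)                # E(X | Y), known by construction
print(np.mean(x), np.mean(inner))                 # both approximately 3
```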