Chapter 1 Info Flashcards

(33 cards)

3
Q

What is the sample space in probability?

A

The set of all possible outcomes of an experiment, denoted by Ω.

4
Q

What is an event in probability?

A

A subset of the sample space Ω, representing a collection of outcomes.

5
Q

What are Kolmogorov’s axioms of probability?

A
  1. Non-negativity: P(A) ≥ 0 for any event A.
  2. Normalization: P(Ω) = 1.
  3. Countable additivity: P(∪A_i) = ΣP(A_i) for mutually disjoint events A_i.

6
Q

When are two events A and B independent?

A

When P(A ∩ B) = P(A)P(B).
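A minimal Python sketch (illustrative, standard library only): two fair coin tosses, with A = "first toss is heads" and B = "second toss is heads".

from itertools import product

# Sample space: all ordered pairs of two fair coin tosses, each equally likely.
omega = list(product("HT", repeat=2))
prob = {w: 1 / len(omega) for w in omega}

A = {w for w in omega if w[0] == "H"}   # first toss is heads
B = {w for w in omega if w[1] == "H"}   # second toss is heads

P = lambda event: sum(prob[w] for w in event)

# Independence: P(A ∩ B) equals P(A) * P(B).
print(P(A & B), P(A) * P(B))            # both 0.25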

7
Q

What is conditional probability P(A|B)?

A

P(A|B) = P(A ∩ B) / P(B), provided P(B) > 0.
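An illustrative Python sketch (standard library only) computing P(A|B) by enumeration for one roll of a fair die, with A = "roll is even" and B = "roll is greater than 3".

from fractions import Fraction

omega = range(1, 7)                      # one roll of a fair die
prob = {w: Fraction(1, 6) for w in omega}

A = {w for w in omega if w % 2 == 0}     # even roll
B = {w for w in omega if w > 3}          # roll greater than 3

P = lambda event: sum(prob[w] for w in event)

# P(A|B) = P(A ∩ B) / P(B), defined because P(B) > 0.
print(P(A & B) / P(B))                   # 2/3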

8
Q

What is the law of total probability?

A

P(A) = ΣP(A|B_j)P(B_j), where B_j are mutually exclusive and exhaustive events.
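A small numeric sketch with hypothetical values: a partition B_1, B_2, B_3 of Ω together with conditional probabilities for A.

# Hypothetical partition probabilities P(B_j) and conditionals P(A|B_j).
p_B = [0.5, 0.3, 0.2]           # mutually exclusive, exhaustive: sums to 1
p_A_given_B = [0.1, 0.4, 0.8]

# Law of total probability: P(A) = Σ P(A|B_j) P(B_j).
p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))
print(p_A)                      # 0.05 + 0.12 + 0.16 = 0.33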

9
Q

State Bayes’ Theorem.

A

P(B_i|A) = P(A|B_i)P(B_i) / ΣP(A|B_j)P(B_j), where B_j are mutually exclusive and exhaustive.
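Continuing the same hypothetical numbers used for the law of total probability, Bayes' theorem recovers each P(B_i|A) from the same inputs.

p_B = [0.5, 0.3, 0.2]            # hypothetical partition probabilities
p_A_given_B = [0.1, 0.4, 0.8]

# Denominator is the law of total probability: P(A) = Σ P(A|B_j) P(B_j).
p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))

# Bayes' theorem: P(B_i|A) = P(A|B_i) P(B_i) / P(A).
posterior = [pa * pb / p_A for pa, pb in zip(p_A_given_B, p_B)]
print(posterior, sum(posterior))  # posteriors sum to 1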

10
Q

What is a random variable?

A

A function X: Ω → ℝ that assigns a real number to each outcome in the sample space.

11
Q

What is the cumulative distribution function (CDF) of a random variable X?

A

F_X(x) = P(X ≤ x). Properties: F_X(-∞) = 0, F_X(∞) = 1, and F_X is right-continuous and non-decreasing.

12
Q

What is the probability mass function (PMF) for a discrete random variable?

A

p_X(x_i) = P(X = x_i), where x_i are the possible values of X.

13
Q

What is the probability density function (PDF) for a continuous random variable?

A

The PDF is the function f_X that describes the relative likelihood of a continuous random variable taking values near a given point. It must satisfy:

  1. f_X(x) ≥ 0 for all x (non-negativity).
  2. The total area under the curve is 1: ∫_{-∞}^∞ f_X(x) dx = 1.
  3. The probability that X lies between two points a ≤ b is P(a ≤ X ≤ b) = ∫_a^b f_X(x) dx.
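
A hedged check of these conditions for one concrete density, the standard exponential (chosen only for illustration), assuming scipy is installed.

import math
from scipy.integrate import quad

# Standard exponential PDF: f(x) = e^{-x} on x >= 0 (zero elsewhere).
f = lambda x: math.exp(-x)

total, _ = quad(f, 0, math.inf)     # condition 2: area over the support is 1
p_1_to_2, _ = quad(f, 1, 2)         # condition 3: P(1 <= X <= 2) = e^{-1} - e^{-2}

print(round(total, 6), round(p_1_to_2, 6))   # 1.0 and ≈ 0.232544
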
14
Q

What is the expectation of a discrete random variable X?

A

E[X] = Σx_i p_X(x_i). For continuous X: E[X] = ∫_{-∞}^∞ x f_X(x) dx.
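A brief sketch of both forms, using a fair die for the discrete case and, for illustration, the standard exponential for the continuous case; assumes scipy is available.

import math
from scipy.integrate import quad

# Discrete: fair six-sided die, E[X] = Σ x_i p(x_i) = 3.5.
values = range(1, 7)
pmf = {x: 1 / 6 for x in values}
e_discrete = sum(x * pmf[x] for x in values)

# Continuous: standard exponential, E[X] = ∫ x e^{-x} dx over x >= 0, which is 1.
e_continuous, _ = quad(lambda x: x * math.exp(-x), 0, math.inf)

print(e_discrete, round(e_continuous, 6))   # 3.5 and ≈ 1.0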

15
Q

What is the variance of a random variable X?

A

Var(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2.
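A short sketch checking that the two expressions agree for a fair die (an illustrative choice, not from the card).

# Fair die: p(x) = 1/6 for x = 1..6.
values = range(1, 7)
p = 1 / 6

e_x = sum(x * p for x in values)               # E[X] = 3.5
e_x2 = sum(x**2 * p for x in values)           # E[X^2] = 91/6

var_definition = sum((x - e_x) ** 2 * p for x in values)   # E[(X - E[X])^2]
var_shortcut = e_x2 - e_x**2                               # E[X^2] - (E[X])^2

print(round(var_definition, 6), round(var_shortcut, 6))    # both ≈ 2.916667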

16
Q

What is the covariance between two random variables X and Y?

A

Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y].

17
Q

What is the correlation between X and Y?

A

Corr(X, Y) = Cov(X, Y) / √(Var(X)Var(Y)), with -1 ≤ Corr(X, Y) ≤ 1.
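A hedged numpy sketch estimating both covariance and correlation from simulated data; the linear model used here is only an illustration.

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 2 * x + rng.normal(size=100_000)     # illustrative dependence: Cov(X, Y) ≈ 2

cov_xy = np.cov(x, y)[0, 1]              # sample covariance
corr_xy = np.corrcoef(x, y)[0, 1]        # sample correlation = cov / √(Var(X)Var(Y))

print(round(cov_xy, 3), round(corr_xy, 3))   # ≈ 2 and ≈ 2/√5 ≈ 0.894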

18
Q

What is the joint CDF of two random variables X and Y?

A

F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y).

19
Q

What is the joint PMF for discrete X and Y?

A

p_{X,Y}(x_i, y_j) = P(X = x_i, Y = y_j).

20
Q

What is the joint PDF for continuous X and Y?

A

f_{X,Y}(x, y) such that F_{X,Y}(x, y) = ∫_{-∞}^x ∫_{-∞}^y f_{X,Y}(u, v) dv du.

21
Q

When are X and Y independent?

A

When p_{X,Y}(x, y) = p_X(x)p_Y(y) (discrete) or f_{X,Y}(x, y) = f_X(x)f_Y(y) (continuous).

22
Q

What is the conditional PMF of X given Y = y_j?

A

p_{X|Y}(x_i|y_j) = p_{X,Y}(x_i, y_j) / p_Y(y_j).

23
Q

What is the conditional PDF of X given Y = y?

A

f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y).

24
Q

What is the conditional expectation E[X|Y = y]?

A

For discrete X: E[X|Y = y] = Σx_i p_{X|Y}(x_i|y). For continuous X: E[X|Y = y] = ∫x f_{X|Y}(x|y) dx.

25
Q

State the law of iterated expectations.

A

E[X] = E[E[X|Y]], where the outer expectation is over Y.
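
A Monte Carlo sketch under an illustrative two-stage model (assuming numpy): Y ~ Uniform(0, 1) and X|Y ~ Poisson(10Y), so E[X] = E[E[X|Y]] = E[10Y] = 5.

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

y = rng.uniform(0, 1, size=n)          # outer variable Y
x = rng.poisson(10 * y)                # X | Y = y ~ Poisson(10y), so E[X|Y] = 10Y

print(round(x.mean(), 3), round((10 * y).mean(), 3))   # both ≈ 5 = E[E[X|Y]]
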
26
Q

What is the multinomial distribution?

A

A generalization of the binomial distribution to m+1 categories: for n independent trials with category probabilities p_1, ..., p_{m+1} (summing to 1), P(N_1 = n_1, ..., N_{m+1} = n_{m+1}) = n!/(n_1! ... n_{m+1}!) p_1^{n_1} ... p_{m+1}^{n_{m+1}}, where n_1 + ... + n_{m+1} = n.
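
A minimal numpy sketch drawing multinomial counts and comparing the observed category frequencies to the probabilities p_i (illustrative values).

import numpy as np

rng = np.random.default_rng(0)
p = [0.5, 0.3, 0.2]                       # illustrative category probabilities, sum to 1
n = 1_000

counts = rng.multinomial(n, p)            # one draw of (N_1, N_2, N_3); counts sum to n
print(counts, counts.sum(), counts / n)   # frequencies ≈ p for large n
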
27
Q

What is the bivariate normal distribution?

A

A joint distribution for X and Y with PDF f_{X,Y}(x, y) = (1/(2πσ_Xσ_Y√(1-ρ^2))) exp[-1/(2(1-ρ^2)) {(x-μ_X)^2/σ_X^2 - 2ρ(x-μ_X)(y-μ_Y)/(σ_Xσ_Y) + (y-μ_Y)^2/σ_Y^2}].
28
Q

What are the marginal distributions of a bivariate normal?

A

X ~ N(μ_X, σ_X^2) and Y ~ N(μ_Y, σ_Y^2).
29
Q

What is the conditional distribution of X given Y = y in a bivariate normal?

A

X|Y=y ~ N(μ_X + ρ(σ_X/σ_Y)(y - μ_Y), σ_X^2(1 - ρ^2)).
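
A rough simulation sketch (assuming numpy) that checks the conditional mean and variance formulas for an illustrative bivariate normal with μ_X = μ_Y = 0, σ_X = σ_Y = 1, ρ = 0.8, conditioning on a thin slice around y = 1.

import numpy as np

rng = np.random.default_rng(0)
rho = 0.8
cov = [[1.0, rho], [rho, 1.0]]                    # unit variances, correlation rho

xy = rng.multivariate_normal([0.0, 0.0], cov, size=2_000_000)
x, y = xy[:, 0], xy[:, 1]

# Approximate conditioning on Y = 1 by keeping draws with y close to 1.
x_given_y = x[np.abs(y - 1.0) < 0.01]

# Theory: E[X|Y=1] = rho * 1 = 0.8 and Var(X|Y=1) = 1 - rho^2 = 0.36.
print(round(x_given_y.mean(), 2), round(x_given_y.var(), 2))
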
30
Q

What is the covariance matrix for a random vector X?

A

Σ = Cov(X) with entries Σ_{i,j} = Cov(X_i, X_j). It is symmetric and positive semi-definite.
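
A quick numpy sketch: estimate a covariance matrix from simulated data and confirm symmetry and positive semi-definiteness (all eigenvalues ≥ 0).

import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(10_000, 3))         # 10,000 observations of a 3-dimensional vector
data[:, 2] = data[:, 0] + data[:, 1]        # introduce dependence for illustration

sigma = np.cov(data, rowvar=False)          # 3x3 sample covariance matrix

print(np.allclose(sigma, sigma.T))          # symmetric
print(np.linalg.eigvalsh(sigma) >= -1e-12)  # eigenvalues non-negative (PSD)
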
31
Q

What is the PDF of a multivariate normal distribution?

A

f_X(x) = (1/((2π)^(k/2) |Σ|^(1/2))) exp[-1/2 (x - μ)^T Σ^{-1} (x - μ)], where k is the dimension of X.
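
A hedged check (assuming numpy and scipy) that evaluating the formula directly agrees with scipy.stats.multivariate_normal for an illustrative μ and Σ.

import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, -1.0])
sigma = np.array([[2.0, 0.5], [0.5, 1.0]])     # illustrative covariance matrix
x = np.array([0.5, 0.0])
k = len(mu)

# Direct evaluation of the formula.
diff = x - mu
quad_form = diff @ np.linalg.inv(sigma) @ diff
pdf_formula = np.exp(-0.5 * quad_form) / ((2 * np.pi) ** (k / 2) * np.linalg.det(sigma) ** 0.5)

print(np.isclose(pdf_formula, multivariate_normal(mean=mu, cov=sigma).pdf(x)))  # True
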
32
Q

What is the iterated conditional variance formula?

A

Var(X) = E[Var(X|Y)] + Var(E[X|Y]).
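
A Monte Carlo sketch using the same illustrative model as in the iterated-expectations card (assuming numpy): Y ~ Uniform(0, 1) and X|Y ~ Poisson(10Y), so E[X|Y] = Var(X|Y) = 10Y.

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

y = rng.uniform(0, 1, size=n)
x = rng.poisson(10 * y)                 # X | Y = y ~ Poisson(10y)

lhs = x.var()                           # Var(X)
rhs = (10 * y).mean() + (10 * y).var()  # E[Var(X|Y)] + Var(E[X|Y]) = 5 + 100/12

print(round(lhs, 2), round(rhs, 2))     # both ≈ 13.33
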