Probability & Statistics Flashcards

1
Q

Definition 1.1 (Sample Space)

A

The sample space is the set of all possible outcomes of the experiment. It is denoted by S.

2
Q

Definition 1.2 (Event)

A

An event is a subset of the sample space. The event occurs if the actual outcome is an element of this subset.

3
Q

Definition 1.3 (Simple Event)

A

An event is a simple event if it consists of a single element of the sample space S.

4
Q

Meaning of Disjoint events

A

We say two sets A and B are disjoint if they have no element in common, i.e., A ∩ B = ∅

5
Q

De Morgan’s laws

A

(A ∪ B)^c = A^c ∩ B^c
(A ∩ B)^c = A^c ∪ B^c
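
These laws can be checked mechanically on a small finite universe. A minimal Python sketch (the universe S and the sets A, B are arbitrary choices for illustration):

# Verify De Morgan's laws on a small finite universe.
S = set(range(10))
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}
complement = lambda X: S - X
assert complement(A | B) == complement(A) & complement(B)  # (A ∪ B)^c = A^c ∩ B^c
assert complement(A & B) == complement(A) | complement(B)  # (A ∩ B)^c = A^c ∪ B^c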

6
Q

Definition 2.1 (Kolmogorov’s axioms for probability)

A

a) For every event A we have P(A) ≥ 0,

b) P(S) = 1,

c) If A1, A2, …, An are pairwise disjoint events, then
P(A1 ∪ A2 ∪ … ∪ An) = P(A1) + P(A2) + … + P(An)

7
Q

Proposition 2.2

A

If A is an event then
P(Ac) = 1 - P(A)

8
Q

Corollary 2.3

A

P(∅) = 0

9
Q

Corollary 2.4

A

If A is an event then P(A) ≤ 1

10
Q

Proposition 2.5

A

If A and B are events and A ⊆ B then
P(A) ≤ P(B)

11
Q

Proposition 2.6

A

The probability of a finite event is the sum of the probabilities of the corresponding simple events.

12
Q

Proposition 2.7 (inclusion-exclusion for two events)

A

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
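
With equally likely outcomes, the identity can be verified by brute-force counting. A minimal Python sketch using one fair die (the events A and B are arbitrary choices):

from fractions import Fraction

S = set(range(1, 7))                   # one fair die, outcomes equally likely
P = lambda E: Fraction(len(E), len(S))
A = {1, 2, 3}                          # "at most 3"
B = {2, 4, 6}                          # "even"
assert P(A | B) == P(A) + P(B) - P(A & B)  # 5/6 on both sides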

13
Q

Proposition 2.8 (Inclusion-exclusion for three events)

A

P(A∪B∪C) = P(A)+P(B)+P(C)−P(A∩B)−P(A∩C)−P(B∩C)+P(A∩B∩C)

14
Q

Theorem 3.1(a) Ordered with replacement (repetition allowed)

A

n^r

15
Q

Theorem 3.1 (b) Ordered without replacement (no repetition)

A

n! / (n − r)!

16
Q

Definition 4.1 (Conditional Probability)

A

If E1 and E2 are events and P(E1) ≠ 0 then the conditional probability of E2 given E1, usually denoted by P(E2|E1), is

P(E2|E1) = P(E1 ∩ E2) / P(E1)
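
For equally likely outcomes this reduces to counting outcomes inside E1. A minimal Python sketch with two fair dice (the events are arbitrary choices):

from fractions import Fraction
from itertools import product

S = set(product(range(1, 7), repeat=2))  # two fair dice: 36 equally likely outcomes
P = lambda E: Fraction(len(E), len(S))
E1 = {w for w in S if w[0] + w[1] == 7}  # "sum is 7"
E2 = {w for w in S if w[0] == 3}         # "first die shows 3"
print(P(E1 & E2) / P(E1))                # P(E2|E1) = (1/36)/(6/36) = 1/6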

17
Q

Theorem 3.1 (c) Unordered without replacement (no repetition)

A

nCr, the binomial coefficient: n! / (r!(n − r)!)
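
All three counting rules of Theorem 3.1 are available in Python's standard library. A minimal sketch with n = 5, r = 3 (values chosen arbitrarily):

import math

n, r = 5, 3
print(n ** r)           # (a) ordered, with replacement: 125
print(math.perm(n, r))  # (b) ordered, without replacement: n!/(n−r)! = 60
print(math.comb(n, r))  # (c) unordered, without replacement: nCr = 10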

18
Q

Definition 4.2 (Independence)

A

We say that the events E1 and E2 are (pairwise) independent if

P(E1 ∩ E2) = P(E1)P(E2)

19
Q

Definition 4.4 a). Three events E1, E2, and E3 are called pairwise independent
if:

A

P(E1 ∩ E2) = P(E1)P(E2),
P(E1 ∩ E3) = P(E1)P(E3),
P(E2 ∩ E3) = P(E2)P(E3).

20
Q

Definition 4.4 b) Three events E1, E2, and E3 are called mutually independent
if:

A

P(E1 ∩ E2 ∩ E3) = P(E1)P(E2)P(E3),
together with the three pairwise independence conditions of part a)

21
Q

Definition 4.6. Two events E1 and E2 are said to be conditionally independent given an event E3 if:

A

P(E1 ∩ E2|E3) = P(E1|E3)P(E2|E3)

22
Q

Definition 5.1 (Random Variable)

A

A random variable is a function from the sample space S to ℝ.

23
Q

Definition 5.2 (Discrete Random Variables)

A

A random variable X is discrete if the set of values that X takes
is either finite or countably infinite.

24
Q

Definition 5.3 (Probability Mass Function)

A

The probability mass function (p.m.f.) of a discrete random
variable X is the function which given input x has output P(X = x)

25
Q

Proposition 5.4 If X is a discrete random variable which takes values x1, x2, x3, …, then

A

P(X = x1) + P(X = x2) + P(X = x3) + · · · = 1, i.e. the outputs of the p.m.f. sum to 1.

26
Q

Definition 6.1 Expectation

A

If X is a discrete random variable which takes values x1, x2, x3, . . ., then the expectation of X (or the expected value of X) is defined by
E(X) = x1P(X = x1) + x2P(X = x2) + x3P(X = x3) + · · · .

27
Q

Proposition 6.2 If m ≤ X(ω) ≤ M for all ω ∈ S, then

A

m ≤ E(X) ≤ M

28
Q

Proposition 6.3 (Expectation of a function)

A

E( f(X) ) = f(x1)P(X = x1) + f(x2)P(X = x2) + f(x3)P(X = x3) + · · ·

29
Q

Definition 6.4 (Moments)

A

The nth moment of the random variable X is the expectation E(X^n)

30
Q

Definition 6.5 (Variance)

A

Var(X) = [x1 − E(X)]^2 P(X = x1) + [x2 − E(X)]^2 P(X = x2)
+ [x3 − E(X)]^2 P(X = x3) + …

31
Q

Proposition 6.6 (Variance formula)

A

Var(X) = E(X^2) − [E(X)]^2
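
Both the defining formula (Definition 6.5) and this shortcut can be computed directly from a p.m.f. A minimal Python sketch using a fair die as the p.m.f. (an arbitrary choice):

from fractions import Fraction

pmf = {k: Fraction(1, 6) for k in range(1, 7)}  # fair die
E = sum(x * p for x, p in pmf.items())          # E(X) = 7/2
E2 = sum(x**2 * p for x, p in pmf.items())      # E(X^2) = 91/6
var_def = sum((x - E)**2 * p for x, p in pmf.items())
assert var_def == E2 - E**2                     # both give 35/12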

32
Q

Proposition 6.7 (Expectation of a linear function)

A

E(aX + b) = aE(X) + b

33
Q

Proposition 6.8 (Variance of a linear function)

A

Var(aX + b) = a^2Var(X)

34
Q

What is a Bernoulli(p) distribution:

A

A random variable X has the Bernoulli(p) distribution if it takes only the values 0 and 1, with P(X = 1) = p and P(X = 0) = 1 − p

35
Q

Bernoulli distribution Expectation and Variance

A

E(X) = p, Var(X) = p(1 − p)

36
Q

What is the Binomial distribution?

A

A discrete random variable X has the Binomial(n, p) distribution, denoted X ∼ Bin(n, p), if its p.m.f. is:

P(X = k) = nCk × p^k × (1 − p)^(n−k), for k = 0, 1, …, n

37
Q

Binomial distribution Expectation and Variance

A

E(X) = np, Var(X) = np(1 − p)
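
A minimal Python sketch checking the Bin(n, p) p.m.f. against these formulas, with n = 10 and p = 0.3 chosen arbitrarily:

import math

n, p = 10, 0.3
pmf = {k: math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
mean = sum(k * q for k, q in pmf.items())
var = sum((k - mean)**2 * q for k, q in pmf.items())
print(sum(pmf.values()))      # ≈ 1 (the p.m.f. sums to 1)
print(mean, n * p)            # both ≈ 3.0
print(var, n * p * (1 - p))   # both ≈ 2.1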

38
Q

What is the Geometric distribution?

A

A discrete random variable X has the Geometric(p) distribution, denoted X ∼ Geom(p), if its p.m.f. is:

P(X = k) = p(1 − p)^(k−1), for k = 1, 2, 3, …

39
Q

Geometric distribution Expectation and Variance

A

E(X) = 1/p
Var(X) = (1 − p)/p^2
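
The infinite sums defining E(X) and Var(X) can be approximated by truncating the p.m.f. A minimal Python sketch with p = 0.25 (an arbitrary choice):

p = 0.25
pmf = {k: p * (1 - p)**(k - 1) for k in range(1, 200)}  # truncated Geom(p) p.m.f.
mean = sum(k * q for k, q in pmf.items())
var = sum((k - mean)**2 * q for k, q in pmf.items())
print(mean, 1 / p)          # both ≈ 4.0
print(var, (1 - p) / p**2)  # both ≈ 12.0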

40
Q

What is the Hypergeometric distribution?

A

A bag contains n balls, m white balls and n − m black balls. You pick l balls at
random without replacement. Let X denote the number of white balls you pick.

P(X = k) = [mCk × (n−m)C(l−k)] / nCl

41
Q

Hypergeometric distribution Expectation and Variance

A

E(X) = l × (m/n)
Var(X) = l × (m/n) × ((n − m)/n) × ((n − l)/(n − 1))
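
A minimal Python sketch checking these formulas exactly with Fractions, for n = 20 balls, m = 8 white, l = 5 drawn (values chosen arbitrarily):

import math
from fractions import Fraction

n, m, l = 20, 8, 5
pmf = {k: Fraction(math.comb(m, k) * math.comb(n - m, l - k), math.comb(n, l))
       for k in range(min(m, l) + 1)}
mean = sum(k * q for k, q in pmf.items())
var = sum((k - mean)**2 * q for k, q in pmf.items())
assert mean == Fraction(l) * m / n                                        # = 2
assert var == Fraction(l) * m / n * (n - m) / n * Fraction(n - l, n - 1)  # = 18/19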

42
Q

What is the Negative Binomial distribution?

A

Consider a sequence of independent Bernoulli(p) trials. Given a fixed integer r, let X be the random variable which counts the number of failures seen before the r-th success occurs.

P(X = k) = (k+r−1)C(r−1) × p^r × (1 − p)^k, for k = 0, 1, 2, …

43
Q

Negative Binomial distribution Expectation and Variance

A

E(X) = r(1 − p)/p
Var(X) = r(1 − p)/p^2
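
A minimal Python sketch checking these formulas by truncating the p.m.f., with r = 3 and p = 0.4 (arbitrary choices):

import math

r, p = 3, 0.4
pmf = {k: math.comb(k + r - 1, r - 1) * p**r * (1 - p)**k for k in range(400)}
mean = sum(k * q for k, q in pmf.items())
var = sum((k - mean)**2 * q for k, q in pmf.items())
print(mean, r * (1 - p) / p)    # both ≈ 4.5
print(var, r * (1 - p) / p**2)  # both ≈ 11.25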

44
Q

What is the Uniform distribution?

A

A uniform distribution is a probability distribution in which all outcomes are equally likely.

45
Q

(discrete) Uniform distribution Expectation and Variance

A

For X uniform on {a, a + 1, …, b}, write m = a and n = b − a. Then:
E(X) = m + n/2
Var(X) = n(n + 2)/12

46
Q

What is the Poisson distribution?

A

A random variable X has the Poisson(λ) distribution, for a fixed positive real number λ > 0, if it takes values k = 0, 1, 2, 3, … with

P(X = k) = (λ^k / k!) × e^(−λ)

47
Q

Poisson distribution Expectation and Variance

A

E(X) = λ, Var(X) = λ
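
A minimal Python sketch checking both facts by truncating the p.m.f. at k = 100, with λ = 3.5 (an arbitrary choice):

import math

lam = 3.5
pmf = {k: lam**k / math.factorial(k) * math.exp(-lam) for k in range(100)}
mean = sum(k * q for k, q in pmf.items())
var = sum((k - mean)**2 * q for k, q in pmf.items())
print(mean, var)  # both ≈ 3.5 = λ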

48
Q

Cumulative distribution function

A

The cumulative distribution function (c.d.f.) of a random variable X is the function which given t has output
P(X ≤ t).

49
Q

Moment Generating function

A

Let X be a discrete random variable which takes integer values. The moment generating function (m.g.f.) of X is the function which given t has output E(e^(tX))

50
Q

Definition 3 (Continuous random variables)

A

We say that a random variable X is a continuous random variable if there exists a continuous function f_X from ℝ to [0, ∞) with the following property:

P(a ≤ X ≤ b) = ∫_a^b f_X(t) dt

51
Q

Definition 4 (Expectation and Variance of crv)

A

E(X) = ∫ t f_X(t) dt (integrating over all of ℝ)
Var(X) = E(X^2) − (E(X))^2

52
Q

(continuous) Uniform Expectation and Variance

A

E(X) = (a+b)/2
Var(X) = (b-a)^2/12

53
Q

Exponential distribution

A

A continuous random variable X has the Exponential(λ) distribution if its density is f_X(t) = λe^(−λt) for t ≥ 0 (and f_X(t) = 0 for t < 0)

54
Q

Exponential distribution Expectation and Variance

A

E(X) = 1/λ
Var(X) = 1/λ^2
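
A minimal Python sketch approximating E(X) and Var(X) by a crude Riemann sum over the density (λ = 2 is an arbitrary choice):

import math

lam = 2.0
f = lambda t: lam * math.exp(-lam * t)   # Exponential(λ) density, t ≥ 0
dt = 1e-4
ts = [i * dt for i in range(1, 200_000)] # integrate over (0, 20]; the tail is negligible
mean = sum(t * f(t) * dt for t in ts)
m2 = sum(t * t * f(t) * dt for t in ts)
print(mean, 1 / lam)             # both ≈ 0.5
print(m2 - mean**2, 1 / lam**2)  # both ≈ 0.25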

55
Q

Joint Probability mass function

A

Let X and Y be two discrete random variables defined on the same sample space and taking values x1, x2, … and y1, y2, … respectively. The joint probability mass function of X and Y is the function:

(xk, yl) ↦ P( (X = xk) ∩ (Y = yl) )

56
Q

Proposition 10.2 Marginal Probability

A

P(X = xk) = Σ_l P(X = xk, Y = yl)

and similarly with the roles of X and Y exchanged.

The idea is that if we only care about the probability of X
taking a particular value, we need to sum over all possible values of Y

57
Q

Proposition 10.3

A

If g(X, Y ) is a real-valued function of the two discrete random variables X and Y then the expectation of g(X, Y ) is obtained as

E( g(X, Y) ) = Σ_k Σ_l g(xk, yl) P(X = xk, Y = yl)

58
Q

Theorem 10.4

A

If X and Y are discrete random variables then

E(X + Y ) = E(X) + E(Y )
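
Propositions 10.2 and 10.4 can be checked together on one small joint p.m.f. A minimal Python sketch (the table of probabilities is an arbitrary choice that sums to 1):

from fractions import Fraction

joint = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
         (1, 0): Fraction(2, 8), (1, 1): Fraction(2, 8)}
# marginals (Proposition 10.2): sum over the other variable
pX = {x: sum(q for (a, b), q in joint.items() if a == x) for x in (0, 1)}
pY = {y: sum(q for (a, b), q in joint.items() if b == y) for y in (0, 1)}
EX = sum(x * q for x, q in pX.items())                 # 1/2
EY = sum(y * q for y, q in pY.items())                 # 5/8
EXY = sum((x + y) * q for (x, y), q in joint.items())  # E(X + Y), via Prop. 10.3
assert EXY == EX + EY                                  # 9/8 on both sides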

59
Q

Independence for Random Variables

A

Two discrete random variables X and Y are independent if
the events “X = xk” and “Y = yl” are independent for all possible values xk, yl

60
Q

Covariance of X, Y

A

The covariance of X and Y is defined by:

Cov(X, Y ) = E( [X − E(X)][Y − E(Y )] ).

61
Q

Correlation coefficient of X and Y

A

If Var(X) > 0 and Var(Y ) > 0:

Corr(X, Y) = Cov(X, Y) / √( Var(X) Var(Y) )

62
Q

Formula for Covariance (easy)

A

Cov(X, Y ) = E(XY ) − E(X)E(Y )
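
Reusing the same style of small joint p.m.f., the shortcut can be checked against the defining formula above. A minimal Python sketch (the table is an arbitrary choice):

from fractions import Fraction

joint = {(0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
         (1, 0): Fraction(2, 8), (1, 1): Fraction(2, 8)}
EX = sum(x * q for (x, y), q in joint.items())       # 1/2
EY = sum(y * q for (x, y), q in joint.items())       # 5/8
EXY = sum(x * y * q for (x, y), q in joint.items())  # 1/4
cov_def = sum((x - EX) * (y - EY) * q for (x, y), q in joint.items())
assert cov_def == EXY - EX * EY                      # both −1/16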
