Probability Flashcards

1
Q

Define a sample space

A

For an experiment, the sample space is the set of all outcomes

2
Q

Define an event

A

A subset of the sample space

3
Q

Define equally likely for a finite sample space

A

The outcomes of a finite sample space Ω are equally likely if P(A) = |A|/|Ω| for every event A ⊆ Ω

4
Q

Define a permutation

A

An ordering of distinguishable objects. For n distinguishable objects, there are n! different orderings.

5
Q

Define the binomial coefficient

A

The number of orderings of m of one object and n − m of another is the binomial coefficient nCm = n!/[m!(n − m)!]

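A quick numeric check of this formula, using only Python's standard library:

```python
import math

# Binomial coefficient: n!/(m!(n - m)!) counts the orderings of m of
# one object and n - m of another.
def binomial(n, m):
    return math.factorial(n) // (math.factorial(m) * math.factorial(n - m))

# Agrees with the standard library's math.comb.
assert binomial(5, 2) == math.comb(5, 2) == 10
```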
6
Q

Define a probability space

A

A triple (Ω, F, P), where Ω is the sample space, F is a collection of events (subsets of Ω), and P is a function from F to the real numbers satisfying the probability axioms

7
Q

Give the axioms of F

A

F1: Ω ∈ F
F2: If A ∈ F, then Aᶜ ∈ F
F3: If Ai ∈ F for all i ≥ 1, then the countable union A1 ∪ A2 ∪ … ∈ F

8
Q

Give the axioms of P

A

P1: P(A) ≥ 0 for all A ∈ F
P2: P(Ω) = 1
P3: If Ai ∈ F for i ≥ 1 and Ai ∩ Aj = ∅ for all i ≠ j, then P(A1 ∪ A2 ∪ …) = P(A1) + P(A2) + …

9
Q

Define conditional probability

A

For P(B) > 0, the conditional probability of A given B is: P(A|B) = P(A∩B)/P(B)

10
Q

Define a partition

A

{B_1, B_2, …} partitions Ω if Ω = B_1 ∪ B_2 ∪ … and B_i ∩ B_j = ∅ whenever i ≠ j.

11
Q

Give the law of total probability

A

For a partition of Ω: B1, B2, … with P(Bi) > 0 for each i, P(A) = Σi P(A|Bi)P(Bi), summed over all i.

12
Q

Give Bayes’ Theorem

A

For a partition of Ω: B1, B2, … with P(Bi) > 0 for each i, and P(A) > 0:
P(Bk|A) = P(A|Bk)P(Bk)/P(A)

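A small worked example in Python; the base rate and test accuracy below are made-up numbers for illustration only:

```python
# Illustrative numbers (not from the source): a rare condition B1 with
# base rate 1%, a test that is positive (event A) with probability 95%
# if B1 holds and 5% otherwise.
p_b = [0.01, 0.99]          # P(B1), P(B2): a two-set partition of Ω
p_a_given_b = [0.95, 0.05]  # P(A | B1), P(A | B2)

# Law of total probability gives the denominator P(A).
p_a = sum(pa * pb for pa, pb in zip(p_a_given_b, p_b))

# Bayes' theorem: P(B1 | A) = P(A | B1) * P(B1) / P(A).
posterior = p_a_given_b[0] * p_b[0] / p_a
```

Note how the posterior (about 0.16) stays small despite the accurate test, because the base rate P(B1) is low.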
13
Q

Define independence of two events

A

A and B are independent if P(A∩B) = P(A)P(B)

14
Q

Define independence of a family of events

A

A family {Ai, i∈I} of events is independent if P(⋂(i∈J) Ai) = ∏(i∈J) P(Ai) for all finite subsets J of I

15
Q

Define a discrete random variable

A

A discrete random variable X on a probability space (Ω, F, P) is a function X: Ω -> R where:
Im(X) := {X(ω), ω∈Ω} is a countable set
For each x in R, {ω∈Ω : X(ω)=x} is in F

16
Q

Define the probability mass function

A

The function p_X: R -> [0,1] defined by p_X(x) = P(X = x)

17
Q

Define the Bernoulli distribution

A

X~Ber(p) if P(X=1) = p and P(X=0) = 1 - p

18
Q

Define the Binomial distribution

A

X~Bin(n,p) if P(X=k) = nCk · p^k · (1-p)^(n-k) for k = 0, 1, …, n.

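A sketch of this pmf in Python; the parameters n = 10, p = 0.3 are arbitrary:

```python
import math

def binom_pmf(n, p, k):
    # P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

# The pmf sums to 1 over k = 0, ..., n.
total = sum(binom_pmf(10, 0.3, k) for k in range(11))
```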
19
Q

Define the Geometric distribution

A

X~Geom(p) if P(X=k) = p(1-p)^(k-1) for k = 1, 2, 3, …

20
Q

Define the Uniform distribution

A

On a finite set {x1, x2, x3, …, xn}, X is uniform if P(X=xi) = 1/n for i = 1, 2, 3, …, n

21
Q

Define the Poisson distribution

A

For λ > 0, X~Poisson(λ) if P(X=k) = (λ^k e^(-λ))/k! for k = 0, 1, 2, …
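A sketch of this pmf in Python; the parameter λ = 2.0 is arbitrary:

```python
import math

def poisson_pmf(lam, k):
    # P(X = k) = lam^k * e^(-lam) / k!
    return lam ** k * math.exp(-lam) / math.factorial(k)

# The pmf over k = 0, 1, 2, ... sums to 1 (the truncated tail is negligible).
total = sum(poisson_pmf(2.0, k) for k in range(50))
```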

22
Q

Define expectation of a discrete variable

A

E[X] = Σ x·P(X=x), summed for all x in Im(X), provided the sum converges absolutely
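A minimal example of the weighted sum, using a fair six-sided die:

```python
# Expectation of a fair six-sided die as a weighted sum over its support.
support = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in support}

# E[X] = sum of x * P(X = x) over all x in Im(X).
mean = sum(x * pmf[x] for x in support)
```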

23
Q

Define independence of discrete random variables

A

Two discrete random variables X and Y are independent if for all x and y in the reals, the events {X=x} and {Y=y} are independent

24
Q

Define joint distributions

A

Suppose X and Y are discrete random variables on (Ω,F,P). The joint probability mass function of X and Y is p_{X,Y}: R^2 -> [0,1] defined by p_{X,Y}(x,y) = P(X=x, Y=y)

25
Q

Give Stirling’s approximation

A

n! ∼ (2π)^(1/2) · n^(n + 1/2) · e^(−n) as n -> ∞
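A numeric check of the approximation; n = 20 is an arbitrary test value:

```python
import math

def stirling(n):
    # n! ~ sqrt(2*pi) * n^(n + 1/2) * e^(-n)
    return math.sqrt(2 * math.pi) * n ** (n + 0.5) * math.exp(-n)

# The ratio to the true factorial approaches 1 as n grows.
ratio = stirling(20) / math.factorial(20)
```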

26
Q

Define marginal distribution

A

For X, Y with joint probability mass function p_{X,Y}, the marginal distribution of X is given by p_X(x) = Σ_y p_{X,Y}(x,y), summing over all values of y

27
Q

Define the conditional probability mass function

A

For an event B with P(B) > 0, the conditional probability mass function of X given B is p_{X|B}(x) = P({X=x}∩B)/P(B)

28
Q

Define conditional expectation

A

The conditional expectation of X given B is E[X|B] = Σ x·p_{X|B}(x), summed for all x in Im(X)

29
Q

Give the law of total probability for expectations

A

For a partition {Bi, i∈I} of Ω with each P(Bi) > 0: E[X] = Σ E[X|Bi]P(Bi), summing over i in I, defined whenever E[X] exists

30
Q

Give the formula for linearity of expectation

A

E[X+Y] = E[X] + E[Y]

31
Q

Define variance

A

The variance of a discrete random variable X is Var(X) = E[(X-E[X])^2], provided both of these expectations exist.

32
Q

Define standard deviation

A

The square root of variance

33
Q

Define covariance

A

For discrete random variables X and Y, cov(X,Y) = E[XY] - E[X]E[Y]
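A worked example of this formula; the joint pmf below is made up for illustration:

```python
# Covariance via cov(X, Y) = E[XY] - E[X]E[Y] for a small joint pmf
# on {0, 1} x {0, 1}; the probabilities are illustrative only.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

e_x = sum(x * p for (x, y), p in joint.items())
e_y = sum(y * p for (x, y), p in joint.items())
e_xy = sum(x * y * p for (x, y), p in joint.items())
cov = e_xy - e_x * e_y  # positive: X and Y tend to agree
```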

34
Q

Define the probability generating function

A

G_X(s) = E[s^X] = Σ p_k s^k, summed for all k from zero to infinity, where p_k = P(X = k).
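As a check, for X ~ Poisson(λ) the pgf has the standard closed form exp(λ(s − 1)); the truncated defining series should agree with it (λ = 2.0, s = 0.5 are arbitrary):

```python
import math

# Compare the defining series sum p_k * s^k (truncated) against the
# closed-form Poisson pgf exp(lam * (s - 1)).
lam, s = 2.0, 0.5
series = sum((lam ** k * math.exp(-lam) / math.factorial(k)) * s ** k
             for k in range(50))
closed_form = math.exp(lam * (s - 1))
```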

35
Q

Define a random variable

A

A function X: Ω -> R such that {X ≤ x} := {ω ∈ Ω : X(ω) ≤ x} ∈ F for all x ∈ R

36
Q

Define a cumulative distribution function

A

The cumulative distribution function of a random variable X is the function F: R -> [0,1] defined by F(x) = P(X ≤ x)

37
Q

Define a continuous random variable X

A

A random variable whose cdf can be written as F_X(x) = the integral of f_X(u)du from -infinity to x, for some function f_X.

38
Q

Define a probability density function of X

A

The function f_X: R -> R in the integrand of the cdf of a continuous random variable

39
Q

Define a continuous uniform distribution

A

X ~ Unif([a,b]) if X is a random variable with pdf f_X(x) = 1/(b-a) for a ≤ x ≤ b, and 0 otherwise

40
Q

Define an exponential distribution

A

For λ > 0, X ~ Exp(λ) if X is a random variable with pdf f_X(x) = λe^(-λx) for x ≥ 0, and 0 otherwise

41
Q

Define a normal distribution

A

For μ in the reals and σ^2 > 0, X ~ N(μ, σ^2) if X is a random variable with pdf f_X(x) = (1/√(2πσ^2))·exp(-(x-μ)^2/(2σ^2))
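A sketch of this pdf in Python; a quick sanity check is that the standard normal (μ = 0, σ² = 1) peaks at height 1/√(2π):

```python
import math

def normal_pdf(x, mu, sigma2):
    # f_X(x) = (1/sqrt(2*pi*sigma^2)) * exp(-(x - mu)^2 / (2*sigma^2))
    return math.exp(-(x - mu) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)

# The standard normal peaks at x = mu with height 1/sqrt(2*pi).
peak = normal_pdf(0.0, 0.0, 1.0)
```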

42
Q

Define the expectation of a continuous random variable

A

E[X] = integral of x·f_X(x)dx from -infinity to infinity, provided the integral of |x|·f_X(x)dx is finite.

43
Q

Define the joint cdf of two continuous distributions

A

F_XY(x,y) = P(X≤x,Y≤y)

44
Q

Define jointly continuous

A

X and Y are jointly continuous if F_{X,Y}(x,y) = the double integral of f_{X,Y}(u,v) dv du, with u from -∞ to x and v from -∞ to y, for some function f_{X,Y} ≥ 0.

45
Q

Define independence of jointly continuous random variables

A

X and Y are independent if f_{X,Y}(x,y) = f_X(x)f_Y(y) for all x and y

46
Q

Define a random sample

A

A selection of independent random variables X1, X2, …, Xn, each with the same distribution

47
Q

Define the sample mean

A

The sum of the variables in the sample, divided by the number of variables in the sample: (1/n)(X1 + X2 + … + Xn)

48
Q

Give the weak law of large numbers

A

For every ε > 0, P(|(1/n)(X1 + … + Xn) - μ| > ε) converges to 0 as n -> ∞
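A small simulation of this (fair coin flips, so μ = 0.5; the values of ε, the sample sizes, and the run count are arbitrary):

```python
import random

random.seed(0)

# For fair coin flips (mu = 0.5), estimate P(|sample mean - mu| > eps)
# for small and large n; the probability shrinks as n grows.
def deviation_rate(n, eps=0.1, runs=500):
    bad = 0
    for _ in range(runs):
        mean = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) > eps:
            bad += 1
    return bad / runs

rate_small = deviation_rate(4)
rate_large = deviation_rate(1000)
```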

49
Q

Give Markov’s inequality

A

For a nonnegative random variable X and any t > 0: P(X ≥ t) ≤ E[X]/t
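An empirical check of the inequality; the choice of X ~ Exp(1) and t = 3 is arbitrary:

```python
import random

random.seed(1)

# Markov's inequality: for nonnegative X and t > 0, the tail P(X >= t)
# should never exceed E[X]/t.  Here X ~ Exp(1), estimated from samples.
samples = [random.expovariate(1.0) for _ in range(100_000)]
t = 3.0
tail = sum(x >= t for x in samples) / len(samples)
bound = (sum(samples) / len(samples)) / t  # sample mean estimates E[X] = 1
```

The bound is loose here (tail ≈ e^(−3) ≈ 0.05 against a bound of about 1/3), which is typical of Markov's inequality.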

50
Q

Give Chebyshev’s inequality

A

For a random variable Z with finite variance and any t > 0: P(|Z - E[Z]| ≥ t) ≤ var(Z)/t^2
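An empirical check of this inequality too; Z ~ Unif([0,1]) and t = 0.4 are arbitrary choices:

```python
import random

random.seed(1)

# Chebyshev's inequality for Z ~ Unif([0,1]): the tail P(|Z - E[Z]| >= t)
# should never exceed var(Z)/t^2.  E[Z] = 1/2 and var(Z) = 1/12 here,
# both estimated from samples.
samples = [random.random() for _ in range(100_000)]
n = len(samples)
mean = sum(samples) / n
var = sum((z - mean) ** 2 for z in samples) / n

t = 0.4
tail = sum(abs(z - mean) >= t for z in samples) / n
bound = var / t ** 2
```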