SOA Probability Flashcards

1
Q

If A ⊂ B, then A ∩ B = ?

A

A ∩ B = A

2
Q

Probability Generating Function (PGF) is defined as

P_X(t) =

A

P_X(t) = E[t^X]

3
Q

E[X(X − 1)] is the second factorial moment of which generating function?

A

PGF - Probability Generating Function: E[X(X − 1)] = P″_X(1)
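
The card above can be sketched numerically; a minimal check, assuming a fair six-sided die as the example distribution (an illustrative choice, not from the card):

```python
# Illustrative check (the fair-die example is an assumption): the second
# derivative of the PGF at t = 1 equals E[X(X-1)], the second factorial moment.
pmf = {x: 1 / 6 for x in range(1, 7)}  # P(X = x) for a fair six-sided die

def pgf(t):
    """P_X(t) = E[t^X] = sum over x of P(X = x) * t^x."""
    return sum(p * t ** x for x, p in pmf.items())

# Second factorial moment computed directly from the pmf: E[X(X-1)] = 70/6.
factorial_moment = sum(p * x * (x - 1) for x, p in pmf.items())

# Numerical second derivative of the PGF at t = 1 (central difference).
h = 1e-4
pgf_second_deriv = (pgf(1 + h) - 2 * pgf(1) + pgf(1 - h)) / h ** 2

print(round(factorial_moment, 4), round(pgf_second_deriv, 4))
```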

4
Q

E [X ∣ j ≤ X ≤ k] - continuous case

A

Integrate the numerator from j to k:

( ∫ x • f_X(x) dx )

÷

( Pr( j ≤ X ≤ k ) )
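
A minimal worked sketch of this ratio, assuming X ~ Exponential with mean 1 and the interval [1, 2] (both illustrative choices), using midpoint-rule integration:

```python
import math

# Hypothetical example (not from the card): E[X | 1 <= X <= 2] for
# f_X(x) = e^(-x), computed as
#   (integral of x * f_X(x) from j to k) / Pr(j <= X <= k).
j, k, n = 1.0, 2.0, 100_000

def f(x):
    return math.exp(-x)  # density of Exponential(theta = 1)

# Midpoint-rule approximations of numerator and denominator.
dx = (k - j) / n
xs = [j + (i + 0.5) * dx for i in range(n)]
numerator = sum(x * f(x) for x in xs) * dx    # ~ integral of x f(x) dx
denominator = sum(f(x) for x in xs) * dx      # ~ Pr(1 <= X <= 2)

cond_mean = numerator / denominator
# Closed form for comparison: (2e^-1 - 3e^-2) / (e^-1 - e^-2).
exact = (2 * math.exp(-1) - 3 * math.exp(-2)) / (math.exp(-1) - math.exp(-2))
print(round(cond_mean, 6), round(exact, 6))
```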

5
Q

Percentile for Discrete Random Variables

A

F_X(π_p) ≥ p

i.e. the CDF at π_p has to be at least equal to (or greater than) the percentile p

6
Q

E [X | j ≤ X ≤ k] - Discrete Case

A

Sum the numerator from x = j to k:

( ∑ x • Pr( X = x ) )

÷

( Pr[ j ≤ X ≤ k ] )

7
Q

Percentile for Continuous Random Variable

A

CDF F_X(π_p) = p

The CDF at π_p has to equal the percentile p exactly

8
Q

Finding mode of discrete random variable

A

Calculate the probability of each possible value and choose the value that gives the largest probability

9
Q

Finding mode of continuous random variable

A

Take the derivative of the density function, set it equal to 0, and solve for the mode (i.e. find the local maximum of the function)

10
Q

Cumulative Distribution Function (CDF) of a probability density function (PDF)

A

Integrate from the lowest value of X to the variable x itself:

F_X(x) = ∫ f(t) dt  (from the lower bound of the support to x)

11
Q

Chebyshev’s Inequality

A

Pr( |X − µ| ≥ kσ ) ≤ 1 / k²

12
Q

How to break up the inequality in Chebyshev’s Inequality

A

Pr( |X-µ| ≥ kσ )

=

Pr( (X-µ) ≥ kσ ) + Pr( (X-µ) ≤ -kσ )
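
A quick Monte Carlo sketch of the bound; the Uniform(0, 1) example is an illustrative assumption:

```python
import random

# Illustrative Monte Carlo check (Uniform(0,1) is an assumed example):
# Chebyshev's bound Pr(|X - mu| >= k*sigma) <= 1/k^2.
random.seed(42)
samples = [random.random() for _ in range(200_000)]
mu = 0.5
sigma = (1 / 12) ** 0.5  # SD of Uniform(0, 1)

for k in (1.5, 2, 3):
    tail = sum(abs(x - mu) >= k * sigma for x in samples) / len(samples)
    # The bound always holds, though it is usually far from tight.
    assert tail <= 1 / k ** 2
    print(k, round(tail, 4), round(1 / k ** 2, 4))
```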

13
Q

Univariate Transformation CDF METHOD

From X to Y

A

1.) Given the PDF of X, find the CDF of X

2.) Perform the transformation, where F_Y( y ) = P( Y ≤ y ), with substitution

3.) Restate the CDF of Y using the CDF of X,

then substitute the CDF of X found in step 1 into the CDF of Y

4.) Take the derivative of the CDF of Y to find the PDF of Y
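
The steps above can be sketched on a toy transformation; assuming X ~ Uniform(0, 1) and Y = X² (illustrative choices), the derived CDF is F_Y(y) = √y, which the simulation below compares against the empirical CDF:

```python
import bisect
import random

# A minimal sketch of the CDF method (example not from the card):
#   Step 1: F_X(x) = x on (0, 1).
#   Step 2: F_Y(y) = Pr(Y <= y) = Pr(X^2 <= y) = Pr(X <= sqrt(y)).
#   Step 3: F_Y(y) = F_X(sqrt(y)) = sqrt(y) on (0, 1).
#   Step 4: f_Y(y) = d/dy sqrt(y) = 1 / (2 sqrt(y)).
random.seed(1)
ys = sorted(random.random() ** 2 for _ in range(100_000))

# Compare the empirical CDF of Y against the derived F_Y(y) = sqrt(y).
for y in (0.04, 0.25, 0.64):
    empirical = bisect.bisect_right(ys, y) / len(ys)
    print(y, round(empirical, 3), round(y ** 0.5, 3))
```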

14
Q

Univariate Transformation PDF METHOD

From X to Y

A

1.) Get PDF of X if not given

2.) Find PDF of Y using the formula

f_Y( y ) = f_X( g⁻¹( y ) ) • | (d/dy) g⁻¹( y ) |

3.) Integrate PDF of Y to get CDF of Y if required

15
Q

Discrete Uniform PMF

A

1 / ( b − a + 1 )

16
Q

Discrete Uniform E[X]

A

( a + b ) / 2

17
Q

Discrete Uniform Var[X]

A

[ ( b − a + 1 )² − 1 ]

÷

12
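
The discrete uniform mean and variance formulas can be checked by brute-force enumeration; the support {3, …, 10} below is an arbitrary example:

```python
# Hypothetical check (values are assumptions): verify the Discrete Uniform
# formulas E[X] = (a + b)/2 and Var[X] = ((b - a + 1)^2 - 1)/12 by
# enumerating the support {a, a+1, ..., b}.
a, b = 3, 10
support = list(range(a, b + 1))
p = 1 / (b - a + 1)                  # PMF: constant on the support

mean = sum(x * p for x in support)
var = sum((x - mean) ** 2 * p for x in support)

assert abs(mean - (a + b) / 2) < 1e-12
assert abs(var - ((b - a + 1) ** 2 - 1) / 12) < 1e-12
print(mean, round(var, 4))
```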

18
Q

Bernoulli’s E[X]

A

p

19
Q

Bernoulli’s Var[X]

A

pq

20
Q

Bernoulli’s MGF

A

pe^t + q

21
Q

Bernoulli Variance Shortcut for Y = (a − b)X + b

A

( b − a )² • pq

22
Q

Property of Expected One RV: E[c]=

A

E[c]=c, c = constant

23
Q

Property of Expected One RV: E[c⋅g(X)]=

A

E[c⋅g(X)]= c ⋅ E[g(X)]

24
Q

Property of Expected One RV: E[g_1(X) + g_2(X) + … + g_k(X)] =

A

E[g_1(X) + g_2(X) + … + g_k(X)]

=

E[g_1(X)] + E[g_2(X)] + … + E[g_k(X)]

25
Q

Variance formula for One RV: Var[X] and Var[g(X)]

A

Var[X] = E[(X − μ)²] = E[X²] − (E[X])²

Var[g(X)] = E[(g(X) − E[g(X)])²] = E[g(X)²] − (E[g(X)])²
26
Q

Property of Variance One RV: Var[c] =

A

Var[c] = 0, c = constant

27
Q

Property of Variance One RV: Var[aX + b] =

A

Var[aX + b] = a² • Var[X], a, b = constants

28
Q

Coefficient of Variation for One RV: CV[X] =

A

CV[X] = SD[X] ÷ E[X]
29
Q

Binomial Mean E[X]

A

E[X] = np

30
Q

Binomial Variance Var[X]

A

Var[X] = npq

31
Q

Binomial MGF

A

M_X(t) = ( pe^t + q )^n

32
Q

Binomial PGF

A

P_X(t) = ( pt + q )^n
33
Q

Hypergeometric PMF: Pr(X = x)

A

Pr(X = x) = [ ( m choose x ) • ( N − m choose n − x ) ] ÷ ( N choose n )

x = successes out of m = total successes; n − x = failures out of N − m = total failures

N = population size; n = sample size
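
A minimal sketch of this PMF using Python's math.comb; N = 20, m = 7, n = 5 are arbitrary example values:

```python
from math import comb

# Hypothetical example (parameter values are assumptions): Hypergeometric PMF
#   Pr(X = x) = C(m, x) * C(N - m, n - x) / C(N, n)
# for a population of N = 20 with m = 7 successes and sample size n = 5.
N, m, n = 20, 7, 5

def hypergeom_pmf(x):
    return comb(m, x) * comb(N - m, n - x) / comb(N, n)

# The PMF should sum to 1 over the support, and E[X] = n*m/N.
total = sum(hypergeom_pmf(x) for x in range(max(0, n - (N - m)), min(n, m) + 1))
mean = sum(x * hypergeom_pmf(x) for x in range(0, min(n, m) + 1))

print(round(total, 10), round(mean, 6))
assert abs(total - 1) < 1e-12
assert abs(mean - n * m / N) < 1e-12
```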
34
Q

Geometric PMF

A

Pr(X = x) = (1 − p)^(x−1) • p

Where the first success is observed on the xth trial

Define X as the number of trials to get the first success

35
Q

Geometric E[X]

A

E[X] = 1 / p
36
Q

Geometric Var[X]

A

Var[X] = ( 1 − p ) ÷ p²

37
Q

Geometric MGF

A

M_X(t) = ( pe^t ) ÷ ( 1 − (1 − p)e^t ), for t < −ln(1 − p)

38
Q

Memoryless Property of a Distribution

A

The memoryless property states that, for positive integer c,

Pr( X − c = x | X > c ) = Pr( X = x )

39
Q

Negative Binomial PMF

A

Pr(X = x) = ( x − 1 choose r − 1 ) • p^r • (1 − p)^(x−r)

r = the desired number of "successes"; p = probability of a success

X = number of "trials" until the rth "success", e.g. X represents the number of coin tosses necessary for three heads to occur
40
Q

Negative Binomial E[X]

A

E[X] = r ÷ p

41
Q

Negative Binomial Var[X]

A

Var[X] = r • ( 1 − p ) ÷ p²

42
Q

Negative Binomial MGF

A

M_X(t) = [ ( pe^t ) ÷ ( 1 − (1 − p)e^t ) ]^r
43
Q

Geometric Distribution for Pr( X ≥ x )

A

∑ (1 − p)^(k−1) • p, summed from k = x to ∞

=

(1 − p)^(x−1)
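
A quick numerical check of this telescoping identity; p = 0.3 and x = 4 are arbitrary example values:

```python
# Hypothetical check (parameter values are assumptions): for the geometric
# distribution, the tail Pr(X >= x) = sum over k >= x of (1-p)^(k-1) * p
# collapses to the closed form (1-p)^(x-1).
p, x = 0.3, 4
tail_sum = sum((1 - p) ** (k - 1) * p for k in range(x, 500))  # truncated series
closed_form = (1 - p) ** (x - 1)

print(round(tail_sum, 10), round(closed_form, 10))
assert abs(tail_sum - closed_form) < 1e-12
```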
44
Q

Geometric Distribution Derivation

A

Independent Bernoulli trials:

Pr(X = x)

= Pr( first "success" on xth "trial" )

= Pr( "failure" on the first x − 1 "trials" and "success" on xth "trial" )

= Pr( "failure" on the first x − 1 "trials" ) • Pr( "success" on the xth "trial" )

= (1 − p)^(x−1) • p

45
Q

Geometric Alternative Form (failures)

A

Let Y be the number of "failures" before the first "success", rather than the number of "trials"

number of "trials" = number of "failures" + number of "successes"

X = Y + 1 ⇒ Y = X − 1

e.g. let Y be the number of rolls before getting…

46
Q

Negative Binomial Distribution Derivation

A

Independent Bernoulli trials:

Pr( X = x )

= Pr( rth "success" on xth "trial" )

= Pr( r − 1 "successes" on the first x − 1 "trials" ∩ "success" on xth "trial" )

= Pr( r − 1 "successes" on the first x − 1 "trials" ) • Pr( "success" on xth "trial" )

47
Q

Negative Binomial Alternative Form (failures)

A

Let Y be the number of "failures" before the rth "success"

Pr(Y = y) = Pr( X − r = y ) = Pr( X = y + r )

e.g. let Y represent the number of tails before getting the third head
48
Q

Exponential MGF

A

M_X(t) = 1 / ( 1 − θt ), t < 1 / θ

49
Q

Exponential Var[X]

A

X ∼ Exponential(θ): Var[X] = θ²

X ∼ Exponential(λ): Var[X] = 1 / λ²

50
Q

Exponential E[X]

A

X ∼ Exponential(θ): E[X] = θ

X ∼ Exponential(λ): E[X] = 1 / λ

51
Q

Gamma PDF

A

f_X(x) = ( 1 / Γ(α) ) • ( x^(α−1) / θ^α ) • e^(−x/θ)

if α is a positive integer, then Γ(α) = (α − 1)!
52
Q

Gamma E[X]

A

E[X] = αθ

53
Q

Gamma Var[X]

A

Var[X] = αθ²

54
Q

Gamma MGF

A

M_X(t) = [ 1 / ( 1 − θt ) ]^α

55
Q

Normal Distribution PDF

A

f_X(x) = [ 1 / ( σ • sqrt(2π) ) ] • e^( −(x − μ)² / (2σ²) )

56
Q

Standard Normal Distribution

A

f_Z(z) = ( 1 / sqrt(2π) ) • e^( −z² / 2 )

Z = ( X − μ ) / σ
57
Q

Joint Density Function for Pr( X + Y < 2 ): inner integral limits

A

Double integration: determine the limits for the inner integral from the picture of the region

∫ ( ∫ f_X,Y(x, y) dy ) dx

58
Q

Pr( X ≤ c | Y = y )

A

Integrate from −∞ to c:

∫ f_X|Y(x | y) dx

=

∫ [ f_X,Y(x, y) / f_Y(y) ] dx

59
Q

Pr( X ≤ x | Y ≤ y )

A

Pr( X ≤ x ∩ Y ≤ y ) / Pr( Y ≤ y )

60
Q

Weighted Average of CDF

A

F_Y(y) = a_1 • F_C1(y) + a_2 • F_C2(y)
61
Q

Weighted Average of Survival Function

A

S_Y(y) = a_1 • S_C1(y) + a_2 • S_C2(y)

62
Q

To construct the mixed (or unconditional) distribution of Y

A

Pr(Y = y) = Pr( Y = y | X = x_1 ) • Pr( X = x_1 ) + Pr( Y = y | X = x_2 ) • Pr( X = x_2 ) + …, summing over all values of X

63
Q

Pr(A | B) + Pr(A′ | B) =

A

[ Pr(A ∩ B) + Pr(A′ ∩ B) ] / Pr(B)

64
Q

Pr(A | B) + Pr(A′ | B) = (simplified)

A

Pr(B) / Pr(B) = 1
65
Q

Double Expectation

A

E[X] = E[ E[X | Y] ]
66
Q

Law of Total Variance

A

Var[X] = E[ Var[X | Y] ] + Var[ E[X | Y] ]

"EVVE"
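
The EVVE identity can be verified exactly on a small mixture; the fair-coin/uniform setup below is an illustrative assumption, computed with exact fractions:

```python
from fractions import Fraction as F

# Hypothetical exact check (the mixture is an assumed example) of
# Var[X] = E[Var[X|Y]] + Var[E[X|Y]] ("EVVE"). Let Y be a fair coin with
# X | Y=0 ~ Uniform(0, 1) and X | Y=1 ~ Uniform(0, 3).
half = F(1, 2)
cond_mean = {0: F(1, 2), 1: F(3, 2)}   # E[X | Y] = (a + b) / 2
cond_var = {0: F(1, 12), 1: F(9, 12)}  # Var[X | Y] = (b - a)^2 / 12

e_var = half * cond_var[0] + half * cond_var[1]    # E[Var[X|Y]] = 5/12
mean = half * cond_mean[0] + half * cond_mean[1]   # E[E[X|Y]] = E[X] = 1
var_e = half * (cond_mean[0] - mean) ** 2 + half * (cond_mean[1] - mean) ** 2

# Direct computation from the mixture, using E[U(0,b)^2] = b^2 / 3.
e_x2 = half * F(1, 3) + half * F(3, 1)
var_direct = e_x2 - mean ** 2

print(e_var + var_e, var_direct)  # both equal 2/3
assert e_var + var_e == var_direct == F(2, 3)
```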
67
Q

Pr( X = x | Y ≤ y ) =

A

Pr( X = x ∩ Y ≤ y ) / Pr( Y ≤ y )

68
Q

Pr( X = x | Y = y )

A

Pr( X = x ∩ Y = y ) / Pr( Y = y )

69
Q

Cov[X, X]

A

E[X • X] − E[X] • E[X] = E[X²] − (E[X])² = Var[X]
70
Q

Cov[a, X]

A

0

71
Q

Cov[a, b]

A

0

72
Q

Cov[aX, bY]

A

ab • Cov[X, Y]

73
Q

Var[aX]

A

Cov[aX, aX] = a² • Cov[X, X] = a² • Var[X]

74
Q

Cov[X + a, Y + b]

A

Cov[X, Y]
75
Q

Cov[aX + bY, cP + dQ]

A

ac • Cov[X, P] + ad • Cov[X, Q] + bc • Cov[Y, P] + bd • Cov[Y, Q]
76
Q

Var[aX + bY]

A

a² • Var[X] + 2ab • Cov[X, Y] + b² • Var[Y]
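
A numerical sketch of this variance expansion; the correlated sample below is an illustrative assumption:

```python
import random

# Illustrative check (the sample data is an assumption):
# Var[aX + bY] = a^2*Var[X] + 2ab*Cov[X,Y] + b^2*Var[Y], with both sides
# computed from the same biased (divide-by-n) sample moments.
random.seed(7)
n = 50_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x + random.gauss(0, 0.5) for x in xs]  # Y is correlated with X

def cov(u, v):
    """Biased sample covariance with divide-by-n moments."""
    mu = sum(u) / len(u)
    mv = sum(v) / len(v)
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / len(u)

a, b = 2.0, -3.0
zs = [a * x + b * y for x, y in zip(xs, ys)]
lhs = cov(zs, zs)  # Var[aX + bY] directly
rhs = a ** 2 * cov(xs, xs) + 2 * a * b * cov(xs, ys) + b ** 2 * cov(ys, ys)

print(round(lhs, 6), round(rhs, 6))
```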
77
Q

Cov[X, Y] =

A

E[XY] − E[X] • E[Y]

78
Q

ρ_X,Y = Corr[X, Y], Coefficient of Correlation

A

Cov[X, Y] / ( SD[X] • SD[Y] )
79
Q

Multivariate Transformation CDF Method

Case 1: Transforming two variables (X and Y) to one variable (W)

A

1.) Using the equation of transformation, W = g(X, Y), express F_W(w) = Pr(W ≤ w) = Pr[ g(X, Y) ≤ w ]

2.) Calculate F_W(w) = Pr(W ≤ w) by integrating over the region defined by the domain of f_X,Y(x, y) and g(X, Y) ≤ w

3.) Differentiate F_W(w) to get f_W(w) if required

80
Q

Multivariate Transformation PDF Method

Case 2: Transforming two variables (X1 and X2) to two variables (W1 and W2)

A

1.) Find f_X1,X2(x1, x2) if not given

2.) Introduce a dummy variable (for Case 1 only)

3.) Find the inverses of the equations of transformation, h1(w1, w2) and h2(w1, w2)

4.) Calculate the determinant of the Jacobian matrix, J, and take its absolute value

5.) Find f_W1,W2(w1, w2) using the formula:

f_W1,W2(w1, w2) = f_X1,X2[ h1(w1, w2), h2(w1, w2) ] • |J|, J ≠ 0
81
Q

Combination Formula

A

ₙCₖ = n! / [ (n − k)! • k! ]

82
Q

Permutation Formula

A

ₙPₖ = n! / (n − k)!
83
Q

Combinatorial Probability

A

Probability = (number of permutations or combinations satisfying the requirements) ÷ (total number of permutations or combinations)
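
The three formulas above can be sketched with Python's math module (math.comb and math.perm); the card-drawing scenario is an invented example:

```python
from math import comb, factorial, perm

# Illustrative demonstration (example values are assumptions) of
#   nCk = n! / ((n-k)! * k!)   and   nPk = n! / (n-k)!
n, k = 10, 4
n_choose_k = factorial(n) // (factorial(n - k) * factorial(k))
n_perm_k = factorial(n) // factorial(n - k)

assert n_choose_k == comb(n, k) == 210
assert n_perm_k == perm(n, k) == 5040

# Combinatorial probability (hypothetical scenario): draw 4 cards from 10
# where 3 are "special"; probability all 3 specials appear is
# C(3,3) * C(7,1) / C(10,4).
prob = comb(3, 3) * comb(7, 1) / comb(10, 4)
print(n_choose_k, n_perm_k, round(prob, 4))
```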