Probability Flashcards

1
Q

Definition of sample space

A

In an experiment, the set of all possible outcomes is the sample space (S)

2
Q

Definition of an event and written form

A

An event E is any subset of S (any collection of outcomes), written E = {x ∈ S : x ∈ E}

3
Q

7 set operations

A

Union, intersection, complement, commutative, associative, distributive, De Morgan's Laws

4
Q

Definition of disjoint/ mutually exclusive events

A

Events A and B are mutually exclusive when their intersection is the empty set

5
Q

Definition of pairwise mutually exclusive

A

Subsets A1,A2,A3,… of S are pairwise mutually exclusive when, for every pair i ≠ j, Ai ∩ Aj is the empty set

6
Q

Partition

A

If A1,A2,A3,… are pairwise mutually exclusive and their union is all of S, then the collection {A1,A2,A3,…} partitions S

7
Q

Definition of sigma algebra

A

B, a collection of subsets of S, is a sigma algebra if it satisfies the following properties:
1) The empty set is contained in B
2) If A is in B, then A^c is in B
3) If A1,A2,A3,… are in B, then ∪Ai is in B

8
Q

What is the largest number of sets in a sigma algebra, B, on a sample space S with n outcomes?

A

2^n
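As a quick sanity check (my own sketch, not part of the original card): the largest sigma algebra on a finite sample space is its power set, which has 2^n members and is closed under complement and (finite) union.

```python
from itertools import chain, combinations

def power_set(s):
    """All subsets of s, each as a frozenset."""
    s = list(s)
    return [frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))]

S = {1, 2, 3}
B = power_set(S)

# The power set of an n-element set has 2^n members.
assert len(B) == 2 ** len(S)

# Sigma-algebra properties, checked directly on this finite space:
assert frozenset() in B                       # contains the empty set
assert all(frozenset(S) - A in B for A in B)  # closed under complement
assert all(A | C in B for A in B for C in B)  # closed under union
print(len(B))  # 8
```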

9
Q

Definition of probability

A

Given a sample space S with sigma algebra B, a probability function (or probability measure) is any real-valued function P with domain B that satisfies the Kolmogorov Axioms.

10
Q

Kolmogorov Axioms

A

1) P(A) >= 0 for any A in B
2) P(S) = 1
3) If A1,A2,A3,… in B are pairwise mutually exclusive, then P(∪Ai) = SUM P(Ai)
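For a finite sample space the axioms can be checked directly; a minimal sketch (the fair-die example is my own illustration):

```python
# Fair six-sided die: S = {1,...,6}, B = all subsets, P(A) = |A|/6.
S = {1, 2, 3, 4, 5, 6}

def P(A):
    return len(A) / len(S)

# Axiom 1: nonnegativity (spot-check a few events)
assert P({1}) >= 0 and P(set()) >= 0
# Axiom 2: P(S) = 1
assert P(S) == 1
# Axiom 3: additivity for disjoint events
A, B = {1, 2}, {5}
assert A & B == set()
assert P(A | B) == P(A) + P(B)
print(P(A | B))  # 0.5
```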

11
Q

1) Gamma Distribution with different parameterizations
2) Expected values and variances of those distributions
3) Gamma function
4) Properties of the Gamma function
5) MGF

A
12
Q

1) Exponential Distribution with different Parameterizations
2) Different expected values and variances
3) MGF

A
13
Q

1) Bernoulli Distribution
2) Expected value and variance
3) MGF

A
14
Q

1) Geometric Distribution with different parameterizations
2) Expected values and variances
3) MGF (Also special rule to help solve this)

A
15
Q

1) Poisson Distribution
2) Expected value and variance
3) MGF

A
16
Q

1) Binomial Distribution
2) Expected value and variances
3) MGF

A
17
Q

1) Beta Distribution
2) Expected value and variance
3) Beta Function
4) Expectation of nth term

A
18
Q

1) Bivariate Normal
2) Conditional expectation
3) Conditional variance

A
19
Q

1) Normal and standard normal Distributions
2) Expected values and variances
3) MGFs

A
20
Q

1) Continuous Uniform Distribution
2) Expected value and variance
3) MGF

A
21
Q

1) Multinomial Distribution
2) Expected value and variance
3) Multinomial Theorem
4) cov(x_i,x_j)

A
22
Q

Bonferroni Inequality

A

Pr(A∩B)≥Pr(A)+Pr(B)-1
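A numeric illustration with a fair die (the events are my own example): with A = evens and B = {1,2,3,4}, the bound Pr(A∩B) ≥ Pr(A) + Pr(B) − 1 can be checked exactly.

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}           # fair die
def P(E):
    return Fraction(len(E), len(S))

A = {2, 4, 6}                     # even outcomes
B = {1, 2, 3, 4}
lhs = P(A & B)                    # Pr(A ∩ B) = 2/6 = 1/3
rhs = P(A) + P(B) - 1             # 1/2 + 2/3 - 1 = 1/6
assert lhs >= rhs                 # Bonferroni: 1/3 >= 1/6
print(lhs, rhs)
```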

23
Q

Table of ordered, non-ordered, with replacement, without replacement

A
24
Q

Fundamental Theorem of Counting

A

For a job which consists of k tasks, if there are n_i ways to accomplish the ith task, then the job can be accomplished in n1·n2·…·nk ways
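The theorem can be checked by brute-force enumeration with itertools (a sketch; the task counts are made up):

```python
from itertools import product

# A "job" with k = 3 tasks having 2, 3, and 4 ways respectively.
ways = [2, 3, 4]
tasks = [range(n) for n in ways]

# Enumerate every way to do the whole job.
all_jobs = list(product(*tasks))

n1n2n3 = 1
for n in ways:
    n1n2n3 *= n

assert len(all_jobs) == n1n2n3   # 2 * 3 * 4 = 24
print(len(all_jobs))             # 24
```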

25
Inequality between unordered with replacement and without replacement
Opposite of this
26
Binomial Theorem
27
Pascal's Formula
28
Three useful properties of binomial coefficients
29
Bayes' rule (for 2 sets and generally)
30
Definition of conditional independence
A is conditionally independent of C given B if P[A|B,C]=P[A|B]
31
Total Law of Probability
32
Definition of a random variable
A function from a sample space S into the real numbers. Formally: P[X = xi] = P({sj ∈ S : X(sj) = xi})
33
Conditions for a function to be a CDF (iff)
34
If RV's have the same cdf then...
X and Y are identically distributed
35
Can a RV be both discrete and continuous?
Yes: a RV can have a mixed distribution, with both a discrete part and a continuous part
36
If X and Y are identically distributed (FX(x)=FY(x)) then does this mean X=Y?
No
37
A function f(x) is a pdf (or pmf) iff
38
Definition of absolutely continuous x
X is absolutely continuous when its cdf FX is both continuous and differentiable for all x (so X has a pdf fX = FX')
39
For a RV x, and Y=g(x), what does fy(y) equal? (A transformation of random variable)
40
Let X have cdf FX(x), let Y=g(X), and let X={x: fX(x)>0} and Y={y: y=g(x) for some x in X} What if g is an increasing/decreasing function on X?
1) If g is an increasing function on X, FY(y) = FX(g^(-1)(y))
2) If g is a decreasing function on X and X is a continuous random variable, FY(y) = 1 - FX(g^(-1)(y))
41
Is it always true that E[g(x)]=g(E[x])?
No
42
How to find a moment generating function MX(t) and the moments of a probability distribution
MX(t) = E[e^(tX)]; the nth moment is E[X^n] = (d^n/dt^n) MX(t) evaluated at t = 0
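For the Bernoulli(p) case, MX(t) = (1 − p) + p·e^t, so moments can be recovered by differentiating at t = 0; a sketch using central finite differences to approximate the derivatives (the parameter value and tolerances are my own choices):

```python
import math

p = 0.3  # arbitrary Bernoulli parameter

def M(t):
    # Bernoulli(p) MGF: E[e^{tX}] = (1 - p) + p e^t
    return (1 - p) + p * math.exp(t)

# First moment E[X] = M'(0), approximated by a central difference.
h = 1e-5
first_moment = (M(h) - M(-h)) / (2 * h)
assert abs(first_moment - p) < 1e-8     # E[X] = p

# Second moment E[X^2] = M''(0); for Bernoulli, E[X^2] = p as well.
h2 = 1e-4
second_moment = (M(h2) - 2 * M(0) + M(-h2)) / h2 ** 2
assert abs(second_moment - p) < 1e-6
print("moments recovered")
```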
43
Three properties of MGFs
44
3 mathematical properties
45
Explain hypergeometric distribution in words
From N total objects, of which M belong to option 1, K objects are selected without replacement; the distribution describes the number of option-1 objects among the K selected.
46
Explain the negative binomial in words
Explains how many trials need to occur to obtain n successes
47
Memoryless Property
48
Shapes of Beta distributions
49
General idea of Poisson process
50
Two major ways to determine exponential families with respective terms explained
51
Definition of curved and full exponential family
A curved exponential family is one in which the number of free parameters is less than k; a full exponential family is one in which the number of parameters equals k (k being the number of terms in the exponential-family form)
52
Definitions of location, scale, and location-scale families
A location family takes a pdf f(x) and forms the family of pdfs f(x - μ), indexed by the location parameter μ (any real number). A scale family takes a pdf f(x) and forms the family of pdfs (1/σ)f(x/σ), indexed by the positive scale parameter σ. A location-scale family combines the two: (1/σ)f((x - μ)/σ).
53
Markov Inequality
54
Chebyshev's Inequality
55
How to find E[X2|X1] given a joint probability function for the discrete and continuous cases
56
Given f(X1, X2) what is E[X2] (for discrete and continuous case)
57
If X1 and X2 are independent and Z=X1 + X2 then what does this say about the mgf's for these variables?
Mz(t)=MX1(t)MX2(t)
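For two independent Bernoulli(p) variables, Z = X1 + X2 is Binomial(2, p); computing M_Z directly from the pmf of Z and comparing it to the product of the individual MGFs verifies the identity numerically (the parameter and test points are my own):

```python
import math

p = 0.4  # arbitrary success probability

def mgf_bernoulli(t):
    # MGF of a single Bernoulli(p): E[e^{tX}] = (1 - p) + p e^t
    return (1 - p) + p * math.exp(t)

def mgf_z(t):
    # MGF of Z = X1 + X2 computed directly from its pmf:
    # P(Z=0) = (1-p)^2, P(Z=1) = 2p(1-p), P(Z=2) = p^2
    return (1 - p) ** 2 + 2 * p * (1 - p) * math.exp(t) + p ** 2 * math.exp(2 * t)

# Independence implies M_Z(t) = M_X1(t) * M_X2(t).
for t in (-1.0, 0.0, 0.5, 2.0):
    assert math.isclose(mgf_z(t), mgf_bernoulli(t) * mgf_bernoulli(t))
print("ok")
```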
58
What is the equation for bivariate transformations for the continuous case?
59
If a joint probability function can be factorized what does this say about its factors?
They are independent
60
Basic idea of Hierarchical models
You are given f(X|Y) and f(Y) and need to find f(X)
61
E[X] in terms of conditional expectations for hierarchical models
EY[EX[X|Y]]
62
Var(Y) in terms of conditional expectations for hierarchical models
EX[VarY(Y|X)]+VarX(EY[Y|X])
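A minimal numeric check of the decomposition on a small discrete joint distribution (the pmf table is invented for illustration): Var(Y) should equal E[Var(Y|X)] + Var(E[Y|X]).

```python
# Joint pmf of (X, Y) on a 2x2 grid (invented numbers summing to 1).
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def E(f):
    return sum(p * f(x, y) for (x, y), p in joint.items())

# Marginal of X and conditional moments of Y given X.
px = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
ey_given = {x: sum(p * y for (xx, y), p in joint.items() if xx == x) / px[x] for x in px}
ey2_given = {x: sum(p * y * y for (xx, y), p in joint.items() if xx == x) / px[x] for x in px}
var_given = {x: ey2_given[x] - ey_given[x] ** 2 for x in px}

var_y = E(lambda x, y: y * y) - E(lambda x, y: y) ** 2

e_var = sum(px[x] * var_given[x] for x in px)             # E[Var(Y|X)]
ey = sum(px[x] * ey_given[x] for x in px)                 # E[E[Y|X]] = E[Y]
var_e = sum(px[x] * (ey_given[x] - ey) ** 2 for x in px)  # Var(E[Y|X])

assert abs(var_y - (e_var + var_e)) < 1e-9
print(round(var_y, 4))  # 0.24
```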
63
Two forms of covariance
64
Correlation equation
65
cov(aX,bY)=
ab·cov(X,Y)
66
cov(X+Y,W+Z)=
cov(X,Z)+cov(X,W)+cov(Y,W)+cov(Y,Z)
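Sample covariance is bilinear too, so both identities can be verified on arbitrary data (the data are randomly generated; numpy is assumed available):

```python
import numpy as np

def cov(a, b):
    # Sample covariance (the normalization constant cancels in the identity).
    return np.mean((a - a.mean()) * (b - b.mean()))

rng = np.random.default_rng(0)
X, Y, W, Z = (rng.normal(size=100) for _ in range(4))

# cov(X+Y, W+Z) expands into the four pairwise covariances.
lhs = cov(X + Y, W + Z)
rhs = cov(X, Z) + cov(X, W) + cov(Y, W) + cov(Y, Z)
assert np.isclose(lhs, rhs)

# And cov(aX, bY) = ab * cov(X, Y):
a, b = 2.0, -3.0
assert np.isclose(cov(a * X, b * Y), a * b * cov(X, Y))
print("bilinearity holds")
```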
67
Given data, basic difference between Classical and Bayesian approach
The classical approach treats the parameter as a fixed, unknown constant and uses the simulated or given distribution of the data to learn about it. The Bayesian approach treats the parameter as a random variable: it starts from prior knowledge of this variable and updates it to a posterior using the collected data. This method relies on exchangeability (conditional independence) of the samples.
68
Cauchy-Schwarz Inequality
69
Holder's Inequality
70
Jensen's Inequality
71
If x1,x2,x3,... are mutually independent, what does this say about the expectation of a product of transformations of the x's, and about the MGF of their sum?
72
If X1,X2,X3,... are independent then what does this say about the transformation of these vectors?
73
What is the "mission" of a statistician?
To learn from data (by obtaining a sample) in order to make judgments about the unknown (through populations and their parameters)
74
What is the connection between a sample and a population?
Probability (a measure of randomness/stochasticity)
75
What is a statistic
A summary of the sample
76
Equation for S2
77
Convergence in Probability
78
WLLN
79
Convergence in Distribution
80
Convergence almost surely
81
SLLN
82
Definition of consistent
A statistic (estimator) is consistent when it converges in probability to the true parameter value
83
Central Limit Theorem
84
Comparison of convergence almost surely and convergence in probability
CAS is stronger, if CAS holds then CIP holds, but not always the converse
85
Comparison between convergence in probability and convergence in distribution
CIP is stronger because if CIP holds then CID holds, but not always the converse
86
Slutsky's Theorem
87
Three things necessary for proving x follows a t distribution
1. X̄ (the sample mean) and S² are independent 2. X̄ ~ N(μ, σ²/n) 3. (n-1)S²/σ² ~ χ²(n-1)
88
Things to remember:
a) If each Xi follows a normal, then Xi² follows what distribution?
b) SUM(Xi²) (sum to n) follows what distribution?
c) A chi-squared distribution (with p degrees of freedom) follows what distribution with certain parameters?
d) SUM(Xi - X̄) = ?
e) SUM(Xi - μ)² = ?
f) S² = ? (not just the usual equation)
89
t statistic and distribution
90
1) Binomial to Poisson 2) Binomial to Bernoulli 3) Binomial to Normal
91
Bernoulli to Binomial
SUM Xi
92
Hypergeometric to Binomial
p = M/N, n = K, N → infinity
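A deterministic check of this limit (my own sketch): hold p = M/N and n = K fixed, let N grow, and watch the hypergeometric pmf approach the binomial pmf (the grid of N values is arbitrary).

```python
from math import comb

def hyper_pmf(k, N, M, K):
    # P(k type-1 objects in K draws without replacement from N objects, M of type 1)
    return comb(M, k) * comb(N - M, K - k) / comb(N, K)

def binom_pmf(k, n, p):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

p, n, k = 0.5, 4, 2
errors = []
for N in [10, 100, 1000, 10000]:
    M = int(p * N)
    errors.append(abs(hyper_pmf(k, N, M, n) - binom_pmf(k, n, p)))

# The approximation error shrinks monotonically as N grows.
assert errors == sorted(errors, reverse=True)
assert errors[-1] < 1e-3
print(errors[-1] < errors[0])  # True
```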
93
1) Beta to Normal 2) Beta to Continuous Uniform
1) alpha = beta → infinity 2) alpha = beta = 1
94
1) Negative Binomial to Poisson 2) Negative Binomial to Geometric
95
1) Geometric to Negative Binomial 2) Geometric to itself
96
1) Poisson to Normal 2) Poisson to itself
97
1) Normal to itself 2) Normal to standard normal 3) Normal to lognormal
98
1) Gamma to Exponential 2) Gamma to Normal 3) Gamma to Beta 4) Gamma to Chi-squared
99
1) Exponential to Continuous uniform 2) Exponential to Gamma 3) Exponential to Chi-squared
100
1) Chi-squared to itself 2) Chi-squared to F 3) Chi-squared to Exponential
101
1) Standard Normal to Cauchy 2) Standard Normal to Chi-Squared
102
F to Chi-squared
103
1) t to F 2) t to Standard normal 3) t to Cauchy
104
Cauchy to itself (two ways)
1) SUM Xi 2) 1/X
105
1) Hypergeometric distribution 2) Expected value and variance
106
1) Negative Binomial distribution 2) Expected value and variance 3) MGF
107
1) Cauchy distribution 2) Expected value and variance 3) MGF
108
1) Chi-squared distribution 2) Expected value and variance 3) MGF
109
1) F distribution 2) Expected value and variance 3) MGF
110
1) Lognormal distribution 2) Expected value and variance 3) MGF
111
1) t distribution 2) expected value and variance 3) MGF