Probability Flashcards

1
Q

Probability

What is the distribution function of a random variable X? (D)

A

F(x)=P(X<=x)

2
Q

Probability

Define the median of a distribution X (D)

A

The point m where P(X<=m)>=1/2 and P(X>=m)>=1/2; if this holds over an interval, the midpoint of that interval

3
Q

Probability

What is the definition of convergence in distribution? (D)

A

For every point x at which F is continuous, Fn(x) -> F(x)

4
Q

Probability

What is the definition of convergence in probability? (D)

A

For all ε>0,

P( |Xn-X| > ε ) -> 0 as n->∞

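A quick Monte Carlo sketch of this definition (not part of the deck; the choice Xn = X + Z/n with Z standard normal is illustrative): for fixed ε, the estimated P(|Xn - X| > ε) should shrink to 0 as n grows.

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.1
trials = 100_000

def prob_exceed(n):
    # Monte Carlo estimate of P(|Xn - X| > eps) when Xn - X = Z/n, Z ~ N(0,1)
    z = rng.standard_normal(trials)
    return np.mean(np.abs(z) / n > eps)

estimates = [prob_exceed(n) for n in (1, 10, 100)]
print(estimates)  # decreasing towards 0
```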
5
Q

Probability

What is the definition of convergence almost surely? (D)

A

P ( Xn -> X ) = 1

So with probability 1: for all ε>0 there exists an N such that |Xn-X| < ε for all n >= N

6
Q

Probability

What is that annoying lemma you need for the convergence-of-random-variables proofs? (T)

A

If An is an increasing sequence of events, A1 ⊆ A2 ⊆ A3 ⊆ … (A1 has the smallest probability), then
lim[n->∞] P (An) = P ( U An )

7
Q

Probability
What is the rough proof that:
For an increasing sequence of events An
lim[n->∞] P (An) = P ( U An ) (T)

A

Start on the RHS. Write U An as a disjoint union (e.g. Cn = An \ An-1), so the probability of the union becomes a sum. Write the infinite sum as a limit of partial sums, take the limit outside, then collapse the finite disjoint union back into P(An).

8
Q

Probability

Give the rough proof that convergence in probability => convergence in distribution (T)

A

Note that if Xn <= x then either X <= x+ε or |Xn-X| > ε, and these events overlap.

So Fn(x) <= F(x+ε) + P(|Xn-X|>ε). Similarly show Fn(x) >= F(x-ε) - P(|Xn-X|>ε). Since P(|Xn-X|>ε) -> 0, squeezing at the continuity points of F gives Fn(x) -> F(x)

9
Q

Probability

Give the rough proof that convergence almost surely => convergence in probability (T)

A

Let AN = { |Xn-X| < ε for all n >= N }. Almost sure convergence guarantees that, with probability 1, AN occurs for some N. The AN are increasing, so P ( U AN ) = 1 = lim[N->∞] P(AN) by the annoying lemma. But AN implies |XN-X| < ε, so P( |XN-X| > ε ) <= 1 - P(AN) -> 0

10
Q

Probability

Give an example as to why convergence in probability =/=> convergence almost surely (T)

A
Let Xn be independent variables st. P(Xn=1)=1/n and P(Xn=0)=(n-1)/n.
Then P( |Xn-0| <= ε ) >= P( Xn = 0 ) = (n-1)/n -> 1
So Xn->0 in probability. Since Xn is discrete, {Xn->0} = {Xn=0 eventually} = U BN

where BN = { Xn = 0 for all n>=N }. Then for any N,K>0
P(BN) <= P ( Xn = 0 for all n=N,…,N+K )
= ((N-1)/N)(N/(N+1)) … ((N+K-1)/(N+K)) = (N-1)/(N+K)
Since K is arbitrary, P(BN)=0

So lim[N->∞] P (BN) = 0 = P( U BN ), so Xn-/->0 a.s

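The telescoping product in this card can be checked exactly with rational arithmetic (N and K below are illustrative choices, not values from the deck): the product equals (N-1)/(N+K), which vanishes as K grows, forcing P(BN)=0.

```python
from fractions import Fraction

# prod_{n=N}^{N+K} (n-1)/n telescopes to (N-1)/(N+K)
N, K = 5, 1000
prod = Fraction(1)
for n in range(N, N + K + 1):
    prod *= Fraction(n - 1, n)

print(prod, float(prod))  # (N-1)/(N+K), small for large K
```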
11
Q

Probability

Give an example as to why convergence in distribution =/=> convergence in probability (T)

A

Let Y be st. P(Y=0)=1/2=P(Y=1) and let Xn have the same distribution as Y but with P(Xn=Y)=0 for all n (e.g. Xn = 1-Y)
Xn clearly -> Y in distribution, but P ( |Xn-Y| > 1/2 ) = 1 for all n, so no convergence in probability

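A simulation sketch of this counterexample (not part of the deck): with Y ~ Bernoulli(1/2) and Xn = 1 - Y, the two have the same law but always differ by exactly 1.

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=100_000)  # Y ~ Bernoulli(1/2)
xn = 1 - y                            # same law as Y, but never equal to it

print(y.mean(), xn.mean())            # both roughly 0.5: same distribution
print(np.all(np.abs(xn - y) == 1))    # |Xn - Y| = 1 always, so P(|Xn-Y|>1/2)=1
```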
12
Q

Probability

Under what conditions does convergence in distribution => convergence in probability (T)

A

When Xn converges in distribution to a constant c

13
Q

Probability

State Markov’s inequality (T)

A

For non-negative X and z>0

P(X>=z)<=E[X]/z

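A Monte Carlo sanity check of Markov's inequality (not part of the deck; the Exponential(1) choice, with E[X] = 1, is illustrative): the empirical P(X >= z) should sit below E[X]/z for every z > 0.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=1.0, size=200_000)  # non-negative, E[X] = 1

for z in (0.5, 1.0, 2.0, 4.0):
    lhs = np.mean(x >= z)   # estimated P(X >= z)
    rhs = x.mean() / z      # Markov bound E[X]/z
    print(z, lhs, "<=", rhs)
```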
14
Q

Probability

Prove Markov’s inequality (T)

A

Let Xz be st.
Xz = 0 when 0 <= X < z, and Xz = z when X >= z

So X >= Xz

E[X] >= E[Xz] = 0*P(X<z) + z*P(X>=z) = z*P(X>=z), then rearrange

15
Q

Probability

State Chebyshev’s inequality (T)

A

For Y with finite mean and variance and ε>0

P ( |Y-E[Y]| >= ε ) <= Var(Y)/ε^2

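A Monte Carlo sanity check of Chebyshev's inequality (not part of the deck; Y ~ Uniform(0,1), with mean 1/2 and variance 1/12, is an illustrative choice): the empirical tail probability should stay below Var(Y)/ε^2.

```python
import numpy as np

rng = np.random.default_rng(3)
y = rng.uniform(0, 1, size=200_000)
mean, var = y.mean(), y.var()

for eps in (0.2, 0.3, 0.4):
    lhs = np.mean(np.abs(y - mean) >= eps)  # estimated P(|Y - E[Y]| >= eps)
    rhs = var / eps**2                      # Chebyshev bound
    print(eps, lhs, "<=", rhs)
```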
16
Q

Probability

Prove Chebyshev’s inequality (T)

A

P( |Y-E[Y]| >= ε ) = P( (Y-E[Y])^2 >= ε^2 ) <= E[ (Y-E[Y])^2 ]/ε^2 by Markov = Var(Y)/ε^2

17
Q

Probability

State the weak law of large numbers (T)

A

Let Xi be i.i.d with mean μ and Sn=X1+…+Xn

Then Sn/n -P-> μ
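A simulation sketch of the weak law (not part of the deck; i.i.d. Uniform(0,1) draws with μ = 1/2 are an illustrative choice): for fixed ε, the estimated P(|Sn/n - μ| > ε) should shrink as n grows.

```python
import numpy as np

rng = np.random.default_rng(4)
eps = 0.05

def exceed_prob(n, trials=2000):
    # estimate P(|Sn/n - mu| > eps) over many independent runs
    means = rng.uniform(0, 1, size=(trials, n)).mean(axis=1)
    return np.mean(np.abs(means - 0.5) > eps)

probs = [exceed_prob(n) for n in (10, 100, 1000)]
print(probs)  # decreasing towards 0
```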

18
Q

Probability

Prove the weak law of large numbers (T)

A

E[Sn/n]=μ by linearity. Assuming finite variance σ^2: Var(Sn/n) = Var(Sn)/n^2 = ( Var(X1)+Var(X2)+…+Var(Xn) )/n^2 by independence = nσ^2/n^2 = σ^2/n

By Chebyshev, P ( |Sn/n - μ| > ε ) <= Var(Sn/n)/ε^2 = σ^2/(nε^2) -> 0 as n->∞

19
Q

Probability

State the strong law of large numbers (T)

A

Let Xi be i.i.d with mean μ and Sn=X1+…+Xn

Then Sn/n -> μ almost surely

20
Q

Probability

State the central limit theorem (T)

A

Let Xi be i.i.d with mean μ and variance σ^2. Let Sn=X1+…+Xn

Then (Sn - nμ)/(σ root(n)) -d-> N(0,1)
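A simulation sketch of the CLT (not part of the deck; i.i.d. Uniform(0,1), with μ = 1/2 and σ^2 = 1/12, is an illustrative choice): the normalized sums should have mean ≈ 0 and standard deviation ≈ 1.

```python
import numpy as np

rng = np.random.default_rng(5)
n, trials = 200, 10_000
mu, sigma = 0.5, np.sqrt(1 / 12)

s = rng.uniform(0, 1, size=(trials, n)).sum(axis=1)  # many copies of Sn
z = (s - n * mu) / (sigma * np.sqrt(n))              # normalized fluctuations

print(z.mean(), z.std())  # close to the N(0,1) values 0 and 1
```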

21
Q

Probability

What 3 things does the CLT tell us? (Q)

A
  1. That the distribution of Sn concentrates around nμ
  2. That the fluctuations of Sn are of order root(n)
  3. That the asymptotic distribution of these fluctuations is normal