10. Asymptotics Flashcards

1
Q

Convergence in probability

A

lim Pr{|Xn - X| < epsilon} = 1 for every epsilon > 0
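For example: if Xn is uniform on [0, 1/n], then for any epsilon > 0 we have Pr{|Xn - 0| >= epsilon} = 0 as soon as n > 1/epsilon, so lim Pr{|Xn - 0| < epsilon} = 1 and Xn converges in probability to 0.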

2
Q

Convergence in mean of order k

A

lim E{|Xn-X|^k}=0

if k = 2 this is called convergence in quadratic mean
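For example: if X1, ..., Xn are iid with mean m and finite variance v, the sample mean X* satisfies E{|X* - m|^2} = v/n -> 0, so X* converges to m in quadratic mean.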

3
Q

Convergence in distribution

A

lim Fn(x) = F(x) at every point x where F is continuous
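A standard example of why this is the weakest notion: let X ~ N(0,1) and Xn = -X for every n. Then Fn(x) = F(x) for all x, so Xn converges in distribution to X, yet Pr{|Xn - X| >= epsilon} = Pr{2|X| >= epsilon} does not go to 0, so there is no convergence in probability.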

4
Q

Relationship between convergences

A
  • convergence in k-th mean implies convergence in probability, which in turn implies convergence in distribution
  • convergence in distribution to a constant (a degenerate limit) implies convergence in probability to that constant (see the sketch below)
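Sketch of the second point: if the limit is a constant c, its cdf F is 0 below c and 1 from c on, so c - epsilon and c + epsilon are continuity points; then Pr{|Xn - c| > epsilon} <= Fn(c - epsilon) + 1 - Fn(c + epsilon) -> 0 + 1 - 1 = 0, i.e. convergence in probability to c.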
5
Q

Proof that conv. in k-th mean implies conv. in prob

A

Set Y = Xn - X. For any j > 0,
E{|Xn - X|^k} = E{|Y|^k} = int |y|^k f_Y(y) dy
= int_{|y| >= j} |y|^k f_Y + int_{|y| < j} |y|^k f_Y
>= j^k int_{|y| >= j} f_Y = j^k Pr{|Y| >= j}
(a Markov-type inequality). Taking j = epsilon,
0 <= Pr{|Xn - X| >= epsilon} <= (1/epsilon^k) E{|Xn - X|^k} -> 0
so Xn converges in probability to X.
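A minimal simulation sketch (my own addition; the exponential distribution and the constants are arbitrary choices) checking the Markov-type bound Pr{|Y| >= j} <= E{|Y|^k} / j^k used in the proof:

  import numpy as np

  rng = np.random.default_rng(0)
  y = rng.exponential(scale=1.0, size=100_000)   # draws playing the role of |Y| = |Xn - X|
  k, j = 2, 3.0
  lhs = np.mean(y >= j)                          # empirical Pr{|Y| >= j}
  rhs = np.mean(y ** k) / j ** k                 # empirical E{|Y|^k} / j^k
  print(lhs, rhs)                                # the first number should not exceed the second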

6
Q

Central Limit th.

A

(X* - m)/sqrt(v/n) converges in distribution to N(0,1)

where X* is the sample mean of n iid observations with mean m and finite variance v
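A minimal simulation sketch (my own addition; the exponential distribution, n and the number of replications are arbitrary choices) illustrating the statement:

  import numpy as np

  rng = np.random.default_rng(1)
  n, reps = 200, 10_000
  m, v = 1.0, 1.0                                # mean and variance of the Exponential(1) distribution
  x = rng.exponential(scale=1.0, size=(reps, n))
  z = (x.mean(axis=1) - m) / np.sqrt(v / n)      # (X* - m) / sqrt(v/n) for each replication
  print(z.mean(), z.var())                       # should be close to 0 and 1
  print(np.mean(z <= 1.96))                      # should be close to the N(0,1) value 0.975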

7
Q

Slutsky th.

A

Let Xn and Yn be two sequences, with Xn convergent in distribution to X and Yn convergent in probability to a constant y. Then:

  • Xn +/- Yn converges in distribution to X +/- y
  • Xn Yn converges in distribution to Xy
  • Xn / Yn converges in distribution to X/y, provided y != 0
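A classical application: if sqrt(n)(X* - m) converges in distribution to N(0, v) and the sample standard deviation Sn converges in probability to sqrt(v), then by Slutsky sqrt(n)(X* - m)/Sn converges in distribution to N(0, 1), which justifies replacing the unknown variance with its estimate in the standardized sample mean.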
8
Q

Mann-Wald th.

A

If Xn is such that
sqrt(n) (Xn - @) converges in distribution to N(0, v(@))
and g is a differentiable function with g'(@) != 0, then
sqrt(n) (g(Xn) - g(@)) converges in distribution to N(0, v(@) * [g'(@)]^2)

It allows one to construct an asymptotically normal estimator of g(@) from an asymptotically normal estimator of @.
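For example: take g(x) = x^2 with @ != 0. Then g'(@) = 2@, so sqrt(n)(Xn^2 - @^2) converges in distribution to N(0, v(@) * 4@^2).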

9
Q

Consistency

A

An estimator Tn of g(@) is consistent if it converges in probability (or, more strongly, in quadratic mean) to g(@)
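For example: if X1, ..., Xn are iid Uniform(0, @), the estimator Tn = max(X1, ..., Xn) is consistent for @, since for 0 < epsilon < @ we have Pr{|Tn - @| >= epsilon} = Pr{Tn <= @ - epsilon} = (1 - epsilon/@)^n -> 0.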

10
Q

Asymptotically normal estimator

A

An estimator Tn of @ is asymptotically normal if, for some V, sqrt(n) (Tn - @) converges in distribution to N(0, V)
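For example: by the central limit theorem, the sample mean X* is an asymptotically normal estimator of the mean m with V = v, since sqrt(n)(X* - m) converges in distribution to N(0, v).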
