Asymptotic Theory Flashcards

1
Q

What does it mean for an estimator to be consistent?

A

The estimator converges in probability to the true parameter.
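
A compact way to write this (my notation, not from the card), with \hat{\theta}_n the estimator and \theta_0 the true parameter:
\hat{\theta}_n \xrightarrow{p} \theta_0, \quad \text{i.e.} \quad \lim_{n \to \infty} P\big(|\hat{\theta}_n - \theta_0| > \varepsilon\big) = 0 \ \text{for every } \varepsilon > 0.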

2
Q

What is the Weak Law of Large Numbers? (Theorem 12)

A

Given a sequence of i.i.d. random variables with a finite first moment, the sample mean (the first-moment estimator) converges in probability to the expectation of the random variable Y.
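
A sketch of the statement in symbols (my notation, assuming the standard i.i.d. setup): if Y_1, \dots, Y_n are i.i.d. with E|Y_i| < \infty, then
\bar{Y}_n = \frac{1}{n}\sum_{i=1}^{n} Y_i \xrightarrow{p} E[Y_i].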

3
Q

What is Markov’s Inequality?

A

For a non-negative random variable X and any a > 0, P(X ≥ a) ≤ E[X]/a.
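
A quick worked instance (my own example, not from the card): taking a = 2E[X] gives P(X ≥ 2E[X]) ≤ E[X]/(2E[X]) = 1/2.
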
4
Q

What is the WLLN for the vector case?

A

A sequence of random vectors Zn converges in probability to z if, for all epsilon greater than zero, the limit as n goes to infinity of the probability that the norm of Zn - z is smaller than epsilon equals 1, i.e. lim_(n→∞) P(||Zn - z|| < ε) = 1.
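
A compact statement of the vector WLLN itself (my notation, mirroring the scalar case): if Z_1, \dots, Z_n are i.i.d. random vectors with E\lVert Z_i \rVert < \infty, then
\bar{Z}_n = \frac{1}{n}\sum_{i=1}^{n} Z_i \xrightarrow{p} E[Z_i].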

5
Q

What is the Central Limit Theorem?

A

Given a sequence of i.i.d. random variables with finite mean and variance, the square root of n multiplied by the difference between the sample mean and the true mean converges in distribution to a Normal with mean zero and variance sigma squared.
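
In symbols (my notation, under the i.i.d. assumptions of the card): with E[Y_i] = \mu and Var(Y_i) = \sigma^2 < \infty,
\sqrt{n}\,(\bar{Y}_n - \mu) \xrightarrow{d} N(0, \sigma^2).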

6
Q

What is the WLLN fundamentally about?

A

It is about the convergence in probability of estimators to the point parameters they estimate.

7
Q

What is the CLT fundamentally about?

A

It links estimators, the parameters they estimate, and the root-n rate of convergence to a limiting Normal distribution.

8
Q

What are the three assumptions of the Lindeberg Levy CLT?

A

That the sequence of random variables is i.i.d., that the first moment exists, and that the second moment exists (both finite).
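
The same three assumptions written symbolically (my notation, not from the card): (1) Y_1, Y_2, \dots are i.i.d.; (2) E[Y_i] = \mu with |\mu| < \infty; (3) Var(Y_i) = \sigma^2 < \infty.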

9
Q

What are the two assumptions for WLLN to hold?

A

That the sequence of random variables is i.i.d. and that its first moment exists, i.e. E|Y| < ∞ (the expectation is finite).

10
Q

What is the Continuous Mapping Theorem?

A

Given a sequence of random vectors Zn and a continuous function g, if Zn converges in probability to z, then g(Zn) converges in probability to g(z).
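
A small worked instance (my own example): if \bar{Y}_n \xrightarrow{p} \mu and g(u) = u^2, which is continuous, then g(\bar{Y}_n) = \bar{Y}_n^2 \xrightarrow{p} \mu^2.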

11
Q

State Lyapunov’s Inequality.

A

(E[|A|^r])^(1/r) ≤ (E[|A|^p])^(1/p) for 1 ≤ r ≤ p
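
A useful special case (my own note, not from the card): taking r = 1 and p = 2 gives E|A| ≤ (E[|A|^2])^(1/2), so a finite second moment implies a finite first moment.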

12
Q

State Minkowski’s inequality.

A

For any random m×n matrices X and Y with p ≥ 1, we have that
(E[|X+Y|^p])^(1/p) ≤ (E[|X|^p])^(1/p) + (E[|Y|^p])^(1/p)
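
A quick special case (my own note, not from the card): with p = 1 this reduces to E[|X+Y|] ≤ E[|X|] + E[|Y|], a triangle inequality in expectation.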
