Lecture 1 Flashcards

1
Q

Define the concept of Weak Consistency.

A

An estimator based on a sample of n observations is weakly consistent if it converges in probability to the true parameter value as n -> inf.
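
A standard illustration (not part of the original card, stated under the usual i.i.d. assumption): if X1, ..., Xn are i.i.d. draws with finite mean μ, the sample mean X̄n = (1/n) ∑ Xi converges in probability to μ by the weak law of large numbers, so X̄n is a weakly consistent estimator of μ.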

2
Q

Define Convergence in Probability.

A

A sequence of random variables Xn converges in probability to X if for all δ>0,

lim P(|Xn - X| > δ) = 0 as n -> inf.
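
A minimal simulation sketch (my own illustration, not from the lecture; the Exponential(1) distribution, δ = 0.1, the seed, and the sample sizes are arbitrary choices). It estimates P(|X̄n - μ| > δ) for the sample mean by Monte Carlo; the estimated probability shrinking toward 0 as n grows is convergence in probability in action:

    import numpy as np

    rng = np.random.default_rng(0)
    mu, delta, reps = 1.0, 0.1, 2_000    # true mean of Exp(1), tolerance delta, Monte Carlo replications

    for n in (10, 100, 1_000, 10_000):
        draws = rng.exponential(scale=1.0, size=(reps, n))   # reps independent samples of size n
        xbar = draws.mean(axis=1)                            # sample mean of each replication
        prob = np.mean(np.abs(xbar - mu) > delta)            # estimate of P(|Xbar_n - mu| > delta)
        print(f"n={n:>6}   estimated P(|Xbar_n - mu| > {delta}) = {prob:.4f}")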

3
Q

Define Almost Sure convergence.

A

A sequence of random variables Xn converges almost surely to X if for all δ>0,

lim P(|Xm - X| > δ for some m≥n) = 0 as n -> inf.

4
Q

What is an alternative way of writing almost sure convergence?

A

P(lim |Xn - X| = 0 as n -> inf) = 1.

5
Q

Between convergence in probability and almost sure convergence, which implies the other, and why?

A

Almost sure convergence implies convergence in probability, but not the reverse. For any δ > 0 the event {|Xn - X| > δ} is contained in the event {|Xm - X| > δ for some m ≥ n}, so P(|Xn - X| > δ) ≤ P(|Xm - X| > δ for some m ≥ n). If the larger probability tends to 0, which is almost sure convergence, then the smaller one does too, which is convergence in probability.
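
A standard counterexample for the failure of the reverse implication (my own addition, not from the original card): let the Xn be independent with P(Xn = 1) = 1/n and P(Xn = 0) = 1 - 1/n. For any δ in (0,1), P(|Xn - 0| > δ) = 1/n -> 0, so Xn converges in probability to 0; but ∑ 1/n = inf, so by the second Borel-Cantelli lemma Xn = 1 for infinitely many n with probability one, and Xn does not converge to 0 almost surely.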

6
Q

Give an alternative definition of convergence in probability.

A

For all δ > 0 and ε > 0, there exists some n0(δ, ε) such that for all n ≥ n0,

P(|Xn - X| > δ) < ε.

7
Q

Define the concept of uniform convergence.

A

A sequence of random variables {Xn(θ)} indexed by a parameter θ converges uniformly (in θ) to a r.v. X(θ) if, for all ε > 0,

lim P(sup_θ |Xn(θ) - X(θ)| < ε) = 1 as n -> inf.
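
A typical use (an illustration I am adding, not from the original card): Xn(θ) is often a sample objective function such as Qn(θ) = (1/n) ∑ q(Zi, θ) and X(θ) its population counterpart Q(θ) = E[q(Z, θ)]. Uniform convergence then says that sup_θ |Qn(θ) - Q(θ)| is small with probability approaching one, which is the notion of convergence used when proving consistency of extremum estimators.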

8
Q

Define the concept of complete convergence.

A

We say that Xn converges completely to X if for all δ > 0,

∑ P(|Xn - X| > δ) < inf.

where the sum is taken over all n. In other words, for every δ > 0 the probabilities P(|Xn - X| > δ) must be summable: their sum over n is finite.
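
A quick worked example (illustration only): if P(|Xn - X| > δ) ≤ C/n² for every δ > 0 and some constant C, then ∑ P(|Xn - X| > δ) ≤ C ∑ 1/n² < inf, so Xn converges completely to X. If instead P(|Xn - X| > δ) equals 1/n, the series diverges and complete convergence fails, even though the individual probabilities still tend to 0.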

9
Q

What are the relationships between almost sure convergence, convergence in probability, and complete convergence?

A

Complete convergence implies almost sure convergence, and almost sure convergence implies convergence in probability. Neither implication can be reversed in general.
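
A sketch of the first implication (an added explanation, using the definitions in the earlier cards): if ∑ P(|Xn - X| > δ) < inf, the tail sums ∑ over m ≥ n of P(|Xm - X| > δ) tend to 0 as n -> inf. By the union bound, P(|Xm - X| > δ for some m ≥ n) ≤ ∑ over m ≥ n of P(|Xm - X| > δ), so the left-hand side also tends to 0, which is exactly the definition of almost sure convergence given in card 3. The second implication is the event-inclusion argument from card 5.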

10
Q

Define Slutzky’s Theorem.

A

Let Xn be a sequence of k×1 random vectors that converges in probability to X, and let g(·) be a function that is continuous on the domain of Xn. Then:
g(Xn) converges in probability to g(X).
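
A minimal simulation sketch (my own illustration, not from the lecture; the Normal(μ, 1) draws, the choice g = exp, δ = 0.5, the seed, and the sample sizes are arbitrary). Since X̄n converges in probability to μ and exp(·) is continuous, the theorem gives exp(X̄n) converging in probability to exp(μ), so the estimated exceedance probability should shrink as n grows:

    import numpy as np

    rng = np.random.default_rng(1)
    mu, delta, reps = 2.0, 0.5, 2_000    # true mean, tolerance delta, Monte Carlo replications
    g = np.exp                           # a continuous function g(.)

    for n in (10, 100, 1_000, 10_000):
        xbar = rng.normal(loc=mu, scale=1.0, size=(reps, n)).mean(axis=1)   # Xbar_n for each replication
        prob = np.mean(np.abs(g(xbar) - g(mu)) > delta)                     # estimate of P(|g(Xbar_n) - g(mu)| > delta)
        print(f"n={n:>6}   estimated P(|g(Xbar_n) - g(mu)| > {delta}) = {prob:.4f}")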
