Lecture 2 Flashcards

1
Q

Given a parameter estimator θn, when is such an estimator unbiased for θ?

A

An estimator is unbiased if E[θn] = θ
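A quick Monte Carlo sketch (my own illustration, not from the lecture), assuming i.i.d. Normal(θ, 1) data and the sample mean as θn:

import numpy as np

# Monte Carlo check that the sample mean is unbiased: averaging the estimator
# over many replications should recover the true theta.
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 50, 20_000

estimates = np.array([rng.normal(theta, 1.0, size=n).mean() for _ in range(reps)])
print("average of theta_n over replications:", estimates.mean())  # close to theta = 2.0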

2
Q

Given a parameter estimator θn, when is such an estimator asymptotically unbiased for θ?

A

An estimator is asymptotically unbiased if lim as n->inf. E[θn] = θ
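An illustration (my example, not necessarily the lecture's): the divide-by-n variance estimator satisfies E[σ2n] = ((n-1)/n)·σ^2, so it is biased for every finite n but asymptotically unbiased. A minimal simulation:

import numpy as np

# The divide-by-n variance estimator has expectation ((n-1)/n) * sigma^2,
# so its bias vanishes as n grows (asymptotic unbiasedness).
rng = np.random.default_rng(1)
sigma2, reps = 4.0, 10_000

for n in (5, 50, 500):
    est = np.array([np.var(rng.normal(0.0, np.sqrt(sigma2), size=n)) for _ in range(reps)])
    print(n, "mean of estimator:", round(est.mean(), 3), "theory:", round((n - 1) / n * sigma2, 3))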

3
Q

Do consistency and unbiasedness imply each other? Give an example.

A

No, neither implies the other. For example, θn = X1 (using only the first observation to estimate the mean) is unbiased but not consistent, while the divide-by-n variance estimator (1/n) ∑(Xi - X̄)^2 is biased but consistent.
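A simulation sketch of both directions (illustrative parameter choices, not from the lecture):

import numpy as np

# theta_hat = X_1 (first observation) is unbiased for the mean but NOT consistent:
# P(|X_1 - theta| > delta) does not shrink as n grows, while the sample mean's does.
rng = np.random.default_rng(2)
theta, delta, reps = 1.0, 0.5, 10_000

for n in (10, 1_000):
    samples = rng.normal(theta, 1.0, size=(reps, n))
    x1, xbar = samples[:, 0], samples.mean(axis=1)
    print(n,
          "P(|X_1 - theta| > delta):", (np.abs(x1 - theta) > delta).mean(),
          "P(|Xbar - theta| > delta):", (np.abs(xbar - theta) > delta).mean())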

4
Q

Define Markov’s Inequality. (Theorem 3)

A

For any r > 0 for which E|X|^r < inf and any δ > 0,
P(|X| > δ) <= δ^(-r) * E|X|^r
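An empirical sanity check of the bound (illustrative choice of X and r, not from the lecture):

import numpy as np

# Check P(|X| > delta) <= delta^(-r) * E|X|^r for X standard normal and r = 2.
rng = np.random.default_rng(3)
x = rng.normal(size=1_000_000)
r = 2.0

for delta in (1.0, 2.0, 3.0):
    lhs = (np.abs(x) > delta).mean()
    rhs = delta ** (-r) * (np.abs(x) ** r).mean()
    print("delta =", delta, " P(|X| > delta) =", round(lhs, 4), " bound =", round(rhs, 4))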

5
Q

Define convergence in r-th mean and its assumptions.

A

Assuming E|X|^r < inf and E|Xn|^r < inf, we say that Xn converges to X in r-th mean if

lim (n -> inf) E[|Xn - X|^r] = 0
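A minimal sketch of the definition in action (my example): Xn = X + Zn/sqrt(n) with Zn standard normal, so E|Xn - X|^2 = 1/n -> 0, i.e. Xn converges to X in 2nd mean.

import numpy as np

# Verify E|X_n - X|^r -> 0 for X_n - X = Z / sqrt(n), Z standard normal, r = 2.
rng = np.random.default_rng(4)
reps, r = 100_000, 2

for n in (10, 100, 1_000):
    diff = rng.normal(size=reps) / np.sqrt(n)   # X_n - X
    print(n, "E|Xn - X|^r =", round((np.abs(diff) ** r).mean(), 5), "(theory 1/n =", 1 / n, ")")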

6
Q

Does convergence in r-th mean imply convergence in probability? How can we prove it?

A

Yes. By Markov's inequality, P(|Xn - X| > δ) <= δ^(-r) * E|Xn - X|^r, and the right-hand side goes to 0 by r-th mean convergence, so Xn converges to X in probability.

7
Q

Define consistency in r-th mean.

A

An estimator θn is r-th mean consistent for θ if
E[|θn - θ|^r] -> 0 as n -> inf.
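A sketch for r = 2 (mean-square consistency), assuming i.i.d. N(μ, σ^2) data and the sample mean as θn (my illustration): E|X̄n - μ|^2 = σ^2/n -> 0.

import numpy as np

# Mean-square error of the sample mean shrinks like sigma^2 / n.
rng = np.random.default_rng(5)
mu, sigma, reps = 1.0, 2.0, 10_000

for n in (10, 100, 1_000):
    xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    print(n, "E|theta_n - theta|^2 =", round(((xbar - mu) ** 2).mean(), 5),
          "(theory:", sigma ** 2 / n, ")")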

8
Q

Name an important implication of r-th mean consistency when r >= 1. Give a proof.

A

It implies asymptotic unbiasedness: for r >= 1, |E[θn] - θ| = |E[θn - θ]| <= E|θn - θ| <= (E|θn - θ|^r)^(1/r) by Jensen's (Lyapunov's) inequality, and the last term goes to 0 by r-th mean consistency, so E[θn] -> θ.

9
Q

Say Xn converges to c in r-th mean and g(·) is a continuous function. What does this imply? Which implication might seem true at first but isn't, and why?

A

It implies that g(Xn) converges in probability to g(c). It doesn’t imply that g(Xn) converges to g(c) in r-th mean because E[|g(Xn)|^r] might not exist.
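A simulation sketch (illustrative choices of Xn, c, and g, not from the lecture): Xn = 2 + Zn/sqrt(n) converges to c = 2 in 2nd mean, and g(x) = 1/x is continuous at 2, so g(Xn) -> g(c) = 0.5 in probability.

import numpy as np

# P(|g(X_n) - g(c)| > delta) should shrink to 0 as n grows.
rng = np.random.default_rng(6)
c, delta, reps = 2.0, 0.1, 100_000

for n in (10, 100, 1_000):
    xn = c + rng.normal(size=reps) / np.sqrt(n)
    print(n, "P(|g(Xn) - g(c)| > delta) =", (np.abs(1 / xn - 1 / c) > delta).mean())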

10
Q

Explain Theorem 7

A

Too complicated to write.

11
Q

What is a linear process? Give examples of linear processes.

A

Given a sequence of non-random m×m matrices {Aj}, ui is a linear process if

ui = ∑ Aj e(i-j), with the sum indexing from j = 0 to infinity and ∑ ||Aj|| < infinity.

Examples include AR(p), MA(q), and ARMA(p,q) processes.
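A scalar sketch (my parameter choices): an AR(1) ui = φ·u(i-1) + ei with |φ| < 1 is a linear process with Aj = φ^j, since ui = ∑ φ^j e(i-j) and ∑ |φ^j| < infinity. Below it is built directly from a truncated version of that sum.

import numpy as np

# Build an AR(1) as a (truncated) linear process u_i = sum_j phi^j * e_{i-j}
# and compare its sample variance with the theoretical value 1 / (1 - phi^2).
rng = np.random.default_rng(7)
phi, n, trunc = 0.6, 5_000, 200

e = rng.normal(size=n + trunc)
A = phi ** np.arange(trunc)                                  # A_j = phi^j
u = np.array([A @ e[i + trunc - np.arange(trunc)] for i in range(n)])
print("sample var:", round(u.var(), 3), "theory:", round(1 / (1 - phi ** 2), 3))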

12
Q

Given a linear process ui, what is its autocovariance structure?

A

The autocovariance is E(ui uj') = ∑ Ak S A'(k+j-i) for i ≤ j, with the sum indexing from k = 0 to infinity, where S is the covariance matrix of the innovations ei. (Review Examples 10 and 11.)
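A scalar MA(1) check of the formula (my parameter choices): ui = ei + a·e(i-1), i.e. A0 = 1, A1 = a, Aj = 0 otherwise, with innovation variance S = 1, so the formula gives lag-0 covariance 1 + a^2, lag-1 covariance a, and 0 beyond.

import numpy as np

# Compare sample autocovariances of an MA(1) with the values implied by
# E(u_i u_{i+h}') = sum_k A_k S A'_{k+h}.
rng = np.random.default_rng(8)
a, n = 0.5, 200_000

e = rng.normal(size=n + 1)
u = e[1:] + a * e[:-1]

for h, theory in [(0, 1 + a ** 2), (1, a), (2, 0.0)]:
    sample = (u[: n - h] * u[h:]).mean()
    print("lag", h, "sample:", round(sample, 4), "theory:", theory)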
