Lecture 4 Flashcards

1
Q

State the relation between two sufficient conditions for UI (uniform integrability).

A
2
Q

State and prove the relation between two sufficient conditions for UI.

A
3
Q

Are transformations of independent variables also independent?

Are transformations of uncorrelated variables also uncorrelated?

A
  1. Transformations of independent variables are also independent.
  2. Transformations of uncorrelated variables are NOT also uncorrelated.
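A quick numerical illustration of point 2, using a standard counterexample (the choice X ~ N(0,1) is illustrative, not from the lecture): X and X² are uncorrelated, yet the transformed pair |X| and X² is strongly correlated.

```python
import numpy as np

# Standard counterexample: for X ~ N(0,1), X and Y = X^2 are
# uncorrelated (Cov(X, X^2) = E[X^3] = 0), yet the transformed
# pair |X| and Y is strongly correlated.
rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)
y = x**2

corr_xy = np.corrcoef(x, y)[0, 1]                   # ~ 0
corr_transformed = np.corrcoef(np.abs(x), y)[0, 1]  # ~ 0.9

print(f"corr(X, X^2)   = {corr_xy:.3f}")
print(f"corr(|X|, X^2) = {corr_transformed:.3f}")
```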
4
Q

State formally and intuitively the condition defining a martingale difference sequence.

A

Intuitively: the expectation conditional on the past is 0.
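Formally, a standard statement (the notation u_i and the conditioning set are assumptions, not taken from the lecture): {u_i} is a martingale difference sequence if

```latex
\mathbb{E}\left[\, u_i \mid u_{i-1}, u_{i-2}, \ldots \,\right] = 0 \quad \text{for all } i .
```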

5
Q

State the relationship between independence, martingale differences, and uncorrelatedness.

A
  1. Independence implies martingale difference.
  2. Martingale difference implies uncorrelated.
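The second implication can be sketched via the law of iterated expectations (a standard argument; notation assumed): for j > i, u_i is part of the past at time j, so

```latex
\mathbb{E}[u_j u_i]
  = \mathbb{E}\big[\, \mathbb{E}[u_j u_i \mid u_{j-1}, u_{j-2}, \ldots] \,\big]
  = \mathbb{E}\big[\, u_i \, \mathbb{E}[u_j \mid u_{j-1}, u_{j-2}, \ldots] \,\big]
  = \mathbb{E}[u_i \cdot 0] = 0 .
```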
6
Q

State and prove the relationship between uncorrelatedness and martingale difference sequences.

A
7
Q

Formally state the WLLN for independent UI sequences.

A
8
Q

Formally state and prove the WLLN for independent UI sequences.

A
9
Q

State a relaxation for one of the assumptions of the WLLN for independent UI sequences.

A
10
Q

How do you use the WLLN for independent UI sequences when the mean is not equal to 0? Provide all the details.

A
11
Q

State Khinchin’s WLLN.

A
12
Q

State and prove Khinchin’s WLLN.

A
13
Q

State Kolmogorov’s SLLN.

A
14
Q

Describe how we can still use the SLLN if we relax the identical-distribution assumption.

A
15
Q

Describe why we cannot use Khinchin’s Theorem in multiple regression with deterministic z_i.

A
16
Q

Describe why we cannot use Chebyshev’s Theorem (Th 7) in multiple regression with deterministic z_i.

A
17
Q

Show the consistency of beta hat in multiple regression with deterministic z_i.

A
18
Q

Formally define a generalized linear process.

A
19
Q

Clarify in words the difference (and relationship) between a linear process and a generalized linear process.

A

In a linear process, we assume the innovations e_i have a finite second moment and are uncorrelated.

In a GLP, we only assume that the e_i are UI. A linear process is therefore by definition a GLP, since the assumptions on a linear process imply that its innovations are uniformly integrable.
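One common way to write the shared form (the coefficient notation psi_j and the absolute-summability condition are assumptions, not necessarily the lecture's):

```latex
y_i = \sum_{j=0}^{\infty} \psi_j \, e_{i-j}, \qquad \sum_{j=0}^{\infty} |\psi_j| < \infty,
```

where a linear process additionally assumes the e_i have finite second moments and are uncorrelated, while a GLP only assumes {e_i} is UI.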

20
Q

Formally state the theorem regarding the convergence in first mean of GLP.

21
Q

Formally state and prove the theorem regarding the convergence in first mean of GLP.

22
Q

Give an example of regressors z_i such that Q_n / n does not have a finite limit.

23
Q

State the theorem relating the convergence of Beta hat in second mean and the eigenvalues of the Q matrix.

24
Q

State and prove the theorem relating the convergence of Beta hat in second mean and the eigenvalues of the Q matrix.

25
Q

Relate the min eigenvalue of a matrix to its elements.

26
Q

State another condition that suffices for Beta hat consistency.

27
Q

State and prove another condition that suffices for Beta hat consistency.

28
Q

Describe the relationship between the second moment and probability, in cases where the second moment exists and where it doesn't.

29
Q

State and show a condition that guarantees Beta hat consistency regardless of the errors.

30
Q

Show that the LSE in y_i = alpha + Beta x_i + u_i is consistent.
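A simulation sketch of what card 30 asks for (the true values alpha = 1, beta = 2 and the i.i.d. N(0,1) errors are illustrative assumptions, not the lecture's setup): the OLS estimates approach the truth as n grows, which is what consistency predicts.

```python
import numpy as np

# OLS on y_i = alpha + beta * x_i + u_i; the estimation error
# should shrink as the sample size grows (consistency).
rng = np.random.default_rng(1)
alpha, beta = 1.0, 2.0  # illustrative true values

def ols_error(n):
    x = rng.uniform(-1.0, 1.0, n)
    u = rng.standard_normal(n)
    y = alpha + beta * x + u
    X = np.column_stack([np.ones(n), x])  # design matrix with intercept
    a_hat, b_hat = np.linalg.lstsq(X, y, rcond=None)[0]
    return abs(a_hat - alpha) + abs(b_hat - beta)

for n in (100, 10_000, 1_000_000):
    print(n, ols_error(n))
```

This only illustrates consistency numerically; the card asks for the formal argument.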
31
Q

Give an example of a linear regression that is not consistent and show why it is not consistent.

32
Q

Define O(f_n) and o(f_n).

33
Q

Define Op(f_n) and op(f_n).

34
Q

Give a simpler definition of op(1).
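For cards 32–34, the standard definitions, stated here as a memory aid (notation is assumed, not taken from the lecture):

```latex
\begin{aligned}
X_n = O(f_n) &\iff \exists M:\ |X_n| \le M f_n \text{ for all large } n, \\
X_n = o(f_n) &\iff X_n / f_n \to 0, \\
X_n = O_p(f_n) &\iff \forall \varepsilon > 0\ \exists M_\varepsilon:\ \sup_n P\!\left(|X_n / f_n| > M_\varepsilon\right) < \varepsilon, \\
X_n = o_p(f_n) &\iff X_n / f_n \xrightarrow{p} 0, \\
X_n = o_p(1) &\iff X_n \xrightarrow{p} 0 .
\end{aligned}
```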
35
Q

State two relationships between convergence in probability to a constant and stochastic order of magnitude.

36
Q

State and prove the two relationships between convergence in probability to a constant and stochastic order of magnitude.

37
Q

Provide a more intuitive definition of the difference between Op() and op().

38
Q

State the three results relating different stochastic orders of magnitude.

39
Q

What is the direction of implication between Op(fn) and op(fn)? Prove it.

40
Q

State and prove the relationship between the stochastic orders of magnitude of fn and gn if fn/gn goes to 0.

41
Q

State and prove the relationship between stochastic order of magnitude and rate of convergence, provided a moment exists.

42
Q

State the 4 results relating the stochastic orders of magnitude of combinations of RVs.

43
Q

State and prove the stochastic order of magnitude of the product of two RVs (for big and little o).

44
Q

State and prove the stochastic order of magnitude of the sum of two RVs (for big and little o).

45
Q

State and prove the stochastic order of magnitude of the product of RVs that are Op(fn) and op(gn).

46
Q

What is the rate of convergence of Beta hat in LSE? Show it.