Econometrics Flashcards

(71 cards)

1
Q

1.3: Time Series Data Set

A

A time series data set consists of observations on a variable or several variables over
time.

2
Q

1.3: Cross-Sectional Data Set

A

A sample of individuals, households, firms, cities,
states, countries, or a variety of other units, taken at a given point in time.

3
Q

1.3: Pooled Cross Section Data Set

A

A data configuration where
independent cross sections, usually collected at different
points in time, are combined to produce a single
data set.

4
Q

1.3: Panel Data Set

A

A data set constructed from repeated cross
sections over time. With a balanced panel, the same
units appear in each time period. With an unbalanced
panel, some units do not appear in each time period,
often due to attrition.
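
To make these four configurations concrete, here is a minimal pandas sketch (not part of the original cards; the people, years, and wage/growth numbers are invented purely for illustration):

```python
import pandas as pd

# All names and numbers below are invented, purely for illustration.

# Cross-sectional data set: different units observed at one point in time.
cs_2019 = pd.DataFrame({"person": ["A", "B", "C"],
                        "year": 2019,
                        "wage": [10.0, 12.5, 9.0]})

# Time series data set: one unit observed over several time periods.
ts = pd.DataFrame({"year": [2018, 2019, 2020],
                   "gdp_growth": [2.1, 1.8, -3.0]}).set_index("year")

# Pooled cross section: independent cross sections from different years,
# stacked into a single data set (different people sampled each year).
cs_2020 = pd.DataFrame({"person": ["D", "E", "F"],
                        "year": 2020,
                        "wage": [11.0, 13.0, 9.5]})
pooled = pd.concat([cs_2019, cs_2020], ignore_index=True)

# Panel data set: the *same* people followed over time.  The panel is
# balanced when every person appears in every time period.
panel = pd.DataFrame({"person": ["A", "A", "B", "B", "C", "C"],
                      "year":   [2019, 2020] * 3,
                      "wage":   [10.0, 10.6, 12.5, 13.0, 9.0, 9.4]})
is_balanced = panel.groupby("person")["year"].nunique().eq(2).all()
print("balanced panel:", is_balanced)   # True for this toy panel
```

The key contrast is that the pooled cross section stacks independent samples of different people drawn in different years, while the panel follows the same people across both years.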

5
Q

2.1: Simple Linear Regression Model

A

y = β₀ + β₁x + u, where y is the dependent (explained) variable, x is the independent (explanatory) variable, u is the error term capturing all factors other than x that affect y, β₀ is the intercept parameter, and β₁ is the slope parameter.

6
Q

2.1: The Zero Conditional Mean Assumption

A

E(u|x) = 0: the expected value of the error term u is zero for every value of x, so the average of the unobserved factors is unrelated to the explanatory variable. This is the key assumption for giving β₁ a ceteris paribus interpretation.

7
Q

2.1: Population Regression Function (PRF)

A

E(y|x) = β₀ + β₁x. The population regression function shows that the average value of y is a linear function of x: a one-unit increase in x changes the expected value of y by β₁.

8
Q

2.2: Equation of the Slope Parameter

A

β̂₁ = Σᵢ(xᵢ − x̄)(yᵢ − ȳ) / Σᵢ(xᵢ − x̄)², the sample covariance between x and y divided by the sample variance of x. It is defined only if Σᵢ(xᵢ − x̄)² > 0, i.e. the xᵢ are not all equal.

9
Q

2.2: Equation for the Intercept Parameter

A

β̂₀ = ȳ − β̂₁x̄. The OLS regression line therefore passes through the point of sample means (x̄, ȳ).

10
Q

2.2: OLS Regression Line / Sample Regression Function

A

ŷ = β̂₀ + β̂₁x. The OLS regression line (sample regression function) is the estimated version of the population regression function E(y|x) = β₀ + β₁x; ŷ is the fitted value of y at a given x.

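As a numerical check on the slope, intercept, and fitted-line formulas in the three cards above, here is a minimal numpy sketch with invented x and y values (any small data set would do):

```python
import numpy as np

# Invented toy data, for illustration only.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

# Slope: sample covariance of x and y over the sample variation in x.
beta1_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
# Intercept: the line passes through the point of sample means.
beta0_hat = y.mean() - beta1_hat * x.mean()
# Fitted values from the OLS regression line.
y_hat = beta0_hat + beta1_hat * x

# The same estimates from a library routine (degree-1 polynomial fit).
slope, intercept = np.polyfit(x, y, deg=1)
assert np.isclose(beta1_hat, slope) and np.isclose(beta0_hat, intercept)
print(beta1_hat, beta0_hat)
```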
11
Q

2.3: Total Sum of Squares, Explained Sum of Squares and Residual Sum of Squares

A

SST = Σᵢ(yᵢ − ȳ)² measures the total sample variation in y; SSE = Σᵢ(ŷᵢ − ȳ)² measures the variation explained by the regression; SSR = Σᵢûᵢ² measures the variation left in the residuals. They satisfy SST = SSE + SSR.

12
Q

2.3: Coefficient of Determination

A

R² = SSE/SST = 1 − SSR/SST. The coefficient of determination is the fraction of the sample variation in y that is explained by x, so 0 ≤ R² ≤ 1.

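Continuing with the same invented numbers, a short sketch of the three sums of squares and the R² identity:

```python
import numpy as np

# Same invented toy data as in the previous snippet.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x
u_hat = y - y_hat                      # OLS residuals

sst = np.sum((y - y.mean()) ** 2)      # total sum of squares
sse = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
ssr = np.sum(u_hat ** 2)               # residual sum of squares

assert np.isclose(sst, sse + ssr)      # SST = SSE + SSR
r_squared = sse / sst                  # equivalently 1 - ssr / sst
print(r_squared)
```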
13
Q

2.4: Δy in level-level, log-level, level-log, and log-log models

A

Level-level: Δy = β₁Δx.
Log-level: %Δy ≈ (100·β₁)Δx (semi-elasticity).
Level-log: Δy ≈ (β₁/100)%Δx.
Log-log: %Δy ≈ β₁%Δx (β₁ is the elasticity of y with respect to x).

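As a worked illustration (with a made-up coefficient, not taken from these cards): in a log-level model log(wage) = β₀ + β₁educ + u with β̂₁ = 0.08, one more year of education is predicted to raise wage by roughly 100 · 0.08 = 8%; if instead the regressor were log(sales) in a log-log model, the same 0.08 would be read as an elasticity, i.e. a 1% increase in sales is associated with about a 0.08% increase in the dependent variable.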
14
Q

2.5: The Four Assumptions for Unbiasedness of OLS

A

SLR.1: the model is linear in parameters, y = β₀ + β₁x + u. SLR.2: the data are a random sample from the population. SLR.3: there is sample variation in x (the xᵢ are not all equal). SLR.4: zero conditional mean, E(u|x) = 0. Under SLR.1–SLR.4, E(β̂₀) = β₀ and E(β̂₁) = β₁.

15
Q

2.5: Proof of Unbiasedness of the OLS Slope Parameter

A

Write β̂₁ = Σᵢ(xᵢ − x̄)yᵢ / SSTₓ, where SSTₓ = Σᵢ(xᵢ − x̄)². Substituting yᵢ = β₀ + β₁xᵢ + uᵢ and simplifying gives β̂₁ = β₁ + Σᵢ(xᵢ − x̄)uᵢ / SSTₓ. Conditioning on the sample values of x, E(β̂₁) = β₁ + Σᵢ(xᵢ − x̄)E(uᵢ|x) / SSTₓ = β₁ under SLR.1–SLR.4.

16
Q

2.5: Proof of Unbiasedness of the OLS Intercept Parameter

A

β̂₀ = ȳ − β̂₁x̄. Substituting ȳ = β₀ + β₁x̄ + ū gives β̂₀ = β₀ + (β₁ − β̂₁)x̄ + ū. Conditioning on the sample values of x, E(β̂₀) = β₀ + x̄·E(β₁ − β̂₁) + E(ū) = β₀, since β̂₁ is unbiased and E(ū) = 0.

17
Q

2.5: Sample Variances of the OLS Estimators

A

Under SLR.1–SLR.5, conditional on the sample values of x: Var(β̂₁) = σ²/SSTₓ and Var(β̂₀) = σ²(n⁻¹Σᵢxᵢ²)/SSTₓ, where SSTₓ = Σᵢ(xᵢ − x̄)². More sample variation in x and a smaller error variance both make the estimators more precise.

18
Q

2.5: Definition of Homoskedasticity

A

Var(u|x) = σ²: the error term has the same variance for every value of the explanatory variable (Assumption SLR.5).

19
Q

2.5: Definition of Heteroskedasticity

A

Var(u|x) is not constant: the variance of the error term differs across values of the explanatory variable, e.g. the variance of unobserved factors affecting savings may increase with income.

20
Q

2.5: Unbiased Estimator of the Error Variance and Standard Error of Regression

A

σ̂² = SSR/(n − 2) = (1/(n − 2))Σᵢûᵢ². Under SLR.1–SLR.5, E(σ̂²) = σ², so σ̂² is unbiased for the error variance. The standard error of the regression is σ̂ = √σ̂².

21
Q

2.5: Standard Error of the Estimated Slope Parameter

A

se(β̂₁) = σ̂/√SSTₓ = σ̂/(Σᵢ(xᵢ − x̄)²)^½, obtained by replacing the unknown σ in sd(β̂₁) = σ/√SSTₓ with its estimate σ̂.

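Again using the invented data from the earlier snippets, the error-variance estimate, the standard error of the regression, and se(β̂₁) can be computed directly:

```python
import numpy as np

# Same invented toy data as in the earlier snippets.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])
n = x.size

sst_x = np.sum((x - x.mean()) ** 2)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / sst_x
b0 = y.mean() - b1 * x.mean()
u_hat = y - (b0 + b1 * x)

sigma2_hat = np.sum(u_hat ** 2) / (n - 2)  # unbiased estimator of the error variance
ser = np.sqrt(sigma2_hat)                  # standard error of the regression
se_b1 = ser / np.sqrt(sst_x)               # standard error of the estimated slope
print(sigma2_hat, ser, se_b1)
```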
22
Q

2.6: Regression Through the Origin

A

Imposing a zero intercept gives the line ỹ = β̃₁x, with slope β̃₁ = Σᵢxᵢyᵢ / Σᵢxᵢ². If the true intercept β₀ ≠ 0, β̃₁ is generally a biased estimator of β₁, so regression through the origin is appropriate only when y = 0 whenever x = 0.

23
Q

3.1: General Multiple Linear Regression Model

A

y = β₀ + β₁x₁ + β₂x₂ + … + βₖxₖ + u, where β₀ is the intercept, βⱼ measures the change in y from a one-unit change in xⱼ holding the other explanatory variables fixed, and u is the error term containing factors other than x₁, …, xₖ.

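For the multiple regression model, a minimal sketch with invented data; solving the least-squares problem for the design matrix (one common computational route, not necessarily how these cards present it) recovers an intercept and the two slopes:

```python
import numpy as np

# Invented toy data for illustration: n = 6 observations, k = 2 regressors.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
y  = np.array([3.1, 3.0, 6.2, 5.9, 9.1, 8.8])

# Design matrix with a leading column of ones for the intercept β0.
X = np.column_stack([np.ones_like(x1), x1, x2])

# OLS estimates: the least-squares solution to X @ beta = y.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # [beta0_hat, beta1_hat, beta2_hat]
```

Each estimated slope is interpreted as the effect of its own regressor on y holding the other regressor fixed.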
24
Q

3.1: The Zero Conditional Mean Assumption for Multiple Regression

A

E(u|x₁, x₂, …, xₖ) = 0: the expected value of the error term is zero for any values of the explanatory variables, which requires that no important factor correlated with any xⱼ has been omitted and that the functional form is correctly specified.

25
3.2: Sample Regression Function for Multiple Regression
26
3.2: Equation for the Slope Parameter for Multiple Regression
27
3.2: Comparison of Simple and Multiple Regression Estimators
28
3.3: The Four Assumptions for Unbiasedness of Multiple OLS
29
3.4: Sample Variance of Simple OLS Estimator
30
3.4: Sampling Variances of Multiple OLS Slope Parameters
31
3.4: Standard Deviation and Standard Error of Parameters
32
4.1: Assumption MLR.6: Normality
33
4.1: Normal Sampling Distribution Theorem
34
4.2: *t* Distribution for the Standardized Estimators
35
4.2: The *t* Statistic for a Parameter
36
4.2: Testing Against One-Sided Alternatives
37
4.2: Testing Against Two-tailed Alternatives
38
4.3: Confidence Interval for Population Parameters
39
4.4: Test Statistic for Testing Parameters
40
4.4: Comparing Two Parameters Method 2
41
4.5: F Statistic for the Multiple Linear Restrictions Test
42
4.5: Steps for Testing Multiple Linear Restrictions
43
4.5: R-Squared Form of the F Statistic
44
4.5: Testing the Overall Significance of a Regression
45
5.1: Assumption MLR.4′: Zero Mean and Zero Correlation
46
5.1: Consistency of OLS Theorem
47
5.1: Deriving the Inconsistency in OLS
48
5.2: Asymptotic Normality of OLS
49
5.2: Lagrange Multiplier Statistic
50
6.1: Beta Coefficients
51
6.2: Making Logarithmic Approximations Accurate
52
6.2: Models with Interaction Terms
53
6.3: Adjusted R-Squared
54
6.3: Using Adjusted R-Squared to Choose between Functional Forms
55
6.4: Confidence Interval for Prediction
56
6.4: Confidence Interval for a Particular Value
57
6.4: Predicting y with a Logarithmic Dependent Variable
58
7.2: Dummy Variables on the Intercept
59
7.3: Uncentered Coefficient of Determination
60
7.3: Ordinal Information Using Dummy Variables
61
7.4: Dummy Variables on the Slope
62
7.4: Testing for Differences in Regression Functions across Groups
63
7.5: Linear Probability Model (LPM)
64
8.2: Heteroskedasticity-Robust Variance for Simple Regression
65
8.2: Heteroskedasticity-Robust Variance for Multiple Regression
66
8.2: Computing Heteroskedasticity-Robust LM Tests
67
8.3: The Breusch-Pagan Test for Heteroskedasticity
68
8.3: A Special Case of the White Test for Heteroskedasticity
69
9.1: Regression Specification Error Test (RESET)
70
9.1: The Davidson-MacKinnon Test
71