Quantitative Methods Flashcards

1
Q

Linear Regression Equation

A

Linear regression equation: Y_i = b_0 + b_1 X_i + ε_i, i = 1, …, n, where Y_i – dependent variable, X_i – independent variable, b_0 – intercept, b_1 – slope coefficient
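
A minimal NumPy sketch of estimating b_0 and b_1 by ordinary least squares; the data arrays are made-up illustrations, not taken from the flashcards.

```python
import numpy as np

# Hypothetical sample data (illustration only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.3])

# OLS estimates: b1 = Cov(X, Y) / Var(X), b0 = mean(Y) - b1 * mean(X)
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()

y_hat = b0 + b1 * x       # fitted values
residuals = y - y_hat     # estimates of the error terms ε_i

print(f"b0 = {b0:.4f}, b1 = {b1:.4f}")
```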

2
Q

Confidence Interval

A

Confidence interval for the slope: b̂_1 ± t_c × s_(b̂_1), where t_c – critical t-value, b̂_1 – estimated slope coefficient, s_(b̂_1) – standard error of the slope
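
A short sketch of this interval using SciPy for the critical t-value; the coefficient, standard error, and sample size are hypothetical values for illustration.

```python
from scipy import stats

# Hypothetical regression output (illustration only)
b1_hat = 0.76      # estimated slope coefficient
se_b1 = 0.12       # standard error of the slope
n, k = 60, 1       # observations and independent variables

# Critical t-value for a 95% two-tailed interval with n - k - 1 df
t_c = stats.t.ppf(0.975, df=n - k - 1)

lower, upper = b1_hat - t_c * se_b1, b1_hat + t_c * se_b1
print(f"95% CI for b1: ({lower:.4f}, {upper:.4f})")
```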

3
Q

Hypothesis Test (b1)

A

Hypothesis test: t = (b̂_1 − b_1) / s_(b̂_1) = (estimated coefficient − H_0 value) / standard error, where the standard error can be recovered as coefficient ÷ t-statistic from the regression output. If |t| > t_c, reject the null hypothesis that b_1 = ___ (if testing for statistical significance from zero, b_1 = 0).
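
A sketch of the slope t-test under the same hypothetical regression output used above; the numbers are illustrative only.

```python
from scipy import stats

# Hypothetical regression output (illustration only)
b1_hat = 0.76      # estimated slope coefficient
se_b1 = 0.12       # standard error of the slope
b1_null = 0.0      # hypothesized value under H0 (0 for a significance test)
n, k = 60, 1

t_stat = (b1_hat - b1_null) / se_b1
t_c = stats.t.ppf(0.975, df=n - k - 1)   # 5% two-tailed critical value

print(f"t = {t_stat:.3f}, critical = {t_c:.3f}, reject H0: {abs(t_stat) > t_c}")
```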

4
Q

Standard Error of the Estimate (SEE)

A

Standard error of the estimate of a regression model: SEE = (Unexplained variation (residual sum of squares) / (n − k − 1))^(1/2) = √MSE; for a simple regression with one independent variable this is √(SSE / (n − 2))
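
A minimal sketch computing the SEE from a vector of residuals; the residuals below are hypothetical, not from any flashcard example.

```python
import numpy as np

# Hypothetical residuals from a fitted simple regression (illustration only)
residuals = np.array([0.3, -0.5, 0.1, 0.4, -0.2, -0.1, 0.2, -0.2])
n, k = len(residuals), 1

sse = np.sum(residuals ** 2)          # unexplained (residual) sum of squares
see = np.sqrt(sse / (n - k - 1))      # standard error of the estimate = sqrt(MSE)
print(f"SSE = {sse:.4f}, SEE = {see:.4f}")
```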

5
Q

Regression Degrees of Freedom

A

Regression degrees of freedom = k = number of independent variables

6
Q

Residual Degrees of Freedom

A

Residual degrees of freedom = total df – regression df = n – (k+1)

7
Q

MSS Regression

A

MSS Regression = Regression SS / Regression df

8
Q

MSS Residual

A

MSS Residual = Residual SS / Residual df

9
Q

F =

A

F = MSS Regression / MSS Residual
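
A sketch tying cards 5 through 9 together: given hypothetical sums of squares, compute the degrees of freedom, mean sums of squares, and F-statistic. All inputs are made up for illustration.

```python
from scipy import stats

# Hypothetical ANOVA inputs (illustration only)
n, k = 60, 2            # observations, independent variables
rss = 460.0             # regression (explained) sum of squares
sse = 310.0             # residual (unexplained) sum of squares

df_reg = k              # regression df = number of independent variables
df_res = n - (k + 1)    # residual df = total df - regression df

msr = rss / df_reg      # MSS regression
mse = sse / df_res      # MSS residual

f_stat = msr / mse
f_crit = stats.f.ppf(0.95, df_reg, df_res)   # 5% critical value
print(f"F = {f_stat:.2f}, critical = {f_crit:.2f}, significant: {f_stat > f_crit}")
```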

10
Q

Correlation =

A

Correlation: r = Cov(X, Y) / (s_X s_Y), where s is the standard deviation (the square root of variance)
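
A quick NumPy check of this formula against the built-in correlation function; the paired observations are hypothetical.

```python
import numpy as np

# Hypothetical paired observations (illustration only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 1.5, 3.5, 3.0, 5.0])

cov_xy = np.cov(x, y, ddof=1)[0, 1]
r = cov_xy / (np.std(x, ddof=1) * np.std(y, ddof=1))

# Should match NumPy's built-in correlation matrix
print(f"r = {r:.4f}, np.corrcoef = {np.corrcoef(x, y)[0, 1]:.4f}")
```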

11
Q

T-Test (correlation is different from zero)

A

T-test (correlation is different from zero): t = r√(n − 2) / √(1 − r²), where r is the sample correlation and the test has n − 2 degrees of freedom; if |t| > t_c, reject the null hypothesis
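
A short sketch of this test; the sample correlation and sample size are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical sample correlation and sample size (illustration only)
r, n = 0.42, 40

t_stat = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)
t_c = stats.t.ppf(0.975, df=n - 2)   # 5% two-tailed critical value

print(f"t = {t_stat:.3f}, critical = {t_c:.3f}, reject H0: {abs(t_stat) > t_c}")
```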

12
Q

Multiple Linear Regression

A

Multiple linear regression: Y_i = b_0 + b_1 X_1i + b_2 X_2i + ⋯ + b_k X_ki + ε_i, i = 1, 2, …, n
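
A minimal sketch of fitting a multiple regression by least squares with NumPy; the simulated data and true coefficients are illustrative assumptions, not part of the card.

```python
import numpy as np

# Hypothetical data with two independent variables (illustration only)
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 0.8 * x1 - 0.5 * x2 + rng.normal(scale=0.3, size=n)

# Design matrix with an intercept column; solve for [b0, b1, b2] by least squares
X = np.column_stack([np.ones(n), x1, x2])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
print("b0, b1, b2 =", np.round(coefs, 3))
```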

13
Q

Durbin Watson

A

Durbin–Watson test for serial correlation of the residuals: DW ≈ 2 indicates no serial correlation, DW < 2 suggests positive serial correlation, and DW > 2 suggests negative serial correlation. Compare DW to the lower and upper critical values (d_l, d_u): reject the null of no positive serial correlation if DW < d_l, fail to reject if DW > d_u, and the test is inconclusive in between.
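
A sketch of computing the DW statistic from residuals as the sum of squared first differences over the sum of squared residuals; the residual series is made up for illustration.

```python
import numpy as np

# Hypothetical regression residuals ordered in time (illustration only)
e = np.array([0.4, 0.3, 0.1, -0.2, -0.3, -0.1, 0.2, 0.5, 0.3, -0.4])

# DW = sum of squared first differences of residuals / sum of squared residuals
dw = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)
print(f"DW = {dw:.3f}  (about 2: none, below 2: positive, above 2: negative)")
```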

14
Q

Multicollinearity (definition)

A

Multicollinearity – a regression assumption violation that occurs when two or more independent variables (or combinations of independent variables) are highly but not perfectly correlated with each other

15
Q

Heteroscedasticity (how to notice)

A

Heteroscedasticity – non-constant variance of the residuals across observations, which produces incorrect (unreliable) standard errors; noticeable when the scatter of the residuals widens or narrows with the level of the independent variable

16
Q

Multicollinearity (how to notice)

A

Multicollinearity – high R² (and a significant F-statistic) combined with low, individually insignificant t-stats
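
A complementary diagnostic not named on the card: regress one independent variable on the others and check the auxiliary R² (the idea behind the variance inflation factor). The simulated data below is a hypothetical illustration in which x2 is built to track x1 closely.

```python
import numpy as np

# Hypothetical independent variables; x2 is constructed to be highly correlated with x1
rng = np.random.default_rng(1)
n = 100
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + rng.normal(scale=0.1, size=n)
x3 = rng.normal(size=n)

# Regress x1 on the other independent variables and check the R^2;
# an R^2 near 1 (VIF = 1 / (1 - R^2) far above 1) signals multicollinearity
X = np.column_stack([np.ones(n), x2, x3])
coefs, *_ = np.linalg.lstsq(X, x1, rcond=None)
resid = x1 - X @ coefs
r2 = 1 - resid.var() / x1.var()
print(f"R^2 of x1 on the others = {r2:.3f}, VIF = {1 / (1 - r2):.1f}")
```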

17
Q

Mean-Reverting Level =

A

Mean-reverting level = b_0/(1-b_1 )
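
A one-line check, assuming hypothetical AR(1) intercept and slope estimates b0 and b1 with |b1| < 1:

```python
b0, b1 = 0.5, 0.8                     # hypothetical AR(1) parameters (illustration only)
mean_reverting_level = b0 / (1 - b1)
print(mean_reverting_level)           # 2.5: the level the series tends back toward
```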

18
Q

Unit Root

A

First use a unit root test on each of the two time series to determine whether either has a unit root: 1) if neither has a unit root, you can safely use linear regression to analyze the relationship between the two time series; 2) if only one has a unit root, you cannot use linear regression to analyze the relationship; 3) if both have a unit root, you need to establish whether the two time series are cointegrated before relying on the regression.
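
A sketch of step 1 using the Augmented Dickey–Fuller test from statsmodels as one common unit root test (the card does not name a specific test); both series are simulated for illustration, one stationary and one a random walk.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

# Simulated series (illustration only): y1 is a stationary AR(1), y2 is a random walk
rng = np.random.default_rng(2)
e1, e2 = rng.normal(size=200), rng.normal(size=200)
y1 = np.zeros(200)
for t in range(1, 200):
    y1[t] = 0.5 * y1[t - 1] + e1[t]
y2 = np.cumsum(e2)

for name, series in [("y1", y1), ("y2", y2)]:
    adf_stat, p_value, *_ = adfuller(series)
    print(f"{name}: ADF = {adf_stat:.2f}, p = {p_value:.3f}, "
          f"unit root not rejected: {p_value > 0.05}")
```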

19
Q

Testing for ARCH

A

Test for autoregressive conditional heteroscedasticity (ARCH): regress the squared residuals on their own lagged values, ε̂_t² = a_0 + a_1 ε̂_(t−1)² + u_t, and test whether a_1 is statistically different from 0 (a_1 plays the role of b_1 in a regression where the lagged squared error term is the independent variable)
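
A sketch of the ARCH(1) test described above: fit the auxiliary regression of squared residuals on their first lag and t-test a_1. The residual series is simulated purely for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical regression residuals ordered in time (illustration only)
rng = np.random.default_rng(3)
resid = rng.normal(size=120) * (1 + 0.5 * np.abs(rng.normal(size=120)))

# Auxiliary regression: e_t^2 = a0 + a1 * e_{t-1}^2 + u_t
e2 = resid ** 2
y, x = e2[1:], e2[:-1]
X = np.column_stack([np.ones(len(x)), x])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
a0, a1 = coefs

# t-test on a1 using the standard error from this auxiliary regression
u = y - X @ coefs
se_a1 = np.sqrt(np.sum(u ** 2) / (len(y) - 2) / np.sum((x - x.mean()) ** 2))
t_stat = a1 / se_a1
t_c = stats.t.ppf(0.975, df=len(y) - 2)
print(f"a1 = {a1:.3f}, t = {t_stat:.2f}, ARCH present: {abs(t_stat) > t_c}")
```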