Quantitative Methods Flashcards
Linear Regression Equation
Linear regression equation: Y_i = b_0 + b_1 X_i + ε_i, i = 1, …, n, where Y_i is the dependent variable, X_i the independent variable, b_0 the intercept, and b_1 the slope coefficient
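A minimal sketch of fitting the card's equation by ordinary least squares with numpy; the data values are hypothetical, chosen only to illustrate:

```python
import numpy as np

# Hypothetical sample data (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Slope: b1 = Cov(X, Y) / Var(X); intercept: b0 = mean(Y) - b1 * mean(X)
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
```

For this data the fitted line is approximately Y = 0.05 + 1.99 X.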
Confidence Interval
Confidence interval: b̂_1 ± t_c × s_b̂_1, where t_c is the critical t-value, b̂_1 the estimated slope coefficient, and s_b̂_1 the standard error of the coefficient
Hypothesis Test (b1)
Hypothesis test: t = (b̂_1 − b_1) / s_b̂_1 = (estimated coefficient − H_0 value) / (standard error). The standard error can be backed out of ANOVA output as coefficient / t-statistic. If |t| > t_c, reject the null hypothesis b_1 = ___ (if testing for statistical significance from zero, the null is b_1 = 0)
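A sketch of the slope t-test in numpy, using the same hypothetical data as above and H_0: b_1 = 0; the standard error here is computed directly from the residuals rather than read off ANOVA output:

```python
import numpy as np

# Hypothetical sample data (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(x)

# Fit the simple regression
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
resid = y - (b0 + b1 * x)

# SEE = sqrt(SSE / (n - 2)); standard error of b1 = SEE / sqrt(sum((x - xbar)^2))
see = np.sqrt((resid ** 2).sum() / (n - 2))
s_b1 = see / np.sqrt(((x - x.mean()) ** 2).sum())

# t-statistic against H0: b1 = 0
t_stat = (b1 - 0.0) / s_b1
```

Compare |t_stat| against the critical t with n − 2 degrees of freedom; here the slope is clearly significant.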
Standard Error of the Estimate (SEE)
Standard error of the estimate of a regression model: SEE = (Unexplained variation (residual sum of squares) / (n − 2))^(1/2) for simple regression; in general, divide by the residual degrees of freedom, n − (k+1)
Regression Degrees of Freedom
Regression degrees of freedom = # of independent variables
Residual Degrees of Freedom
Residual degrees of freedom = total df – regression df = n – (k+1)
MSS Regression
MSS Regression = Regression SS / Regression df
MSS Residual
MSS Residual = Residual SS / Residual df
F =
F = MSS Regression / MSS Residual
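The ANOVA cards above can be sketched end-to-end in numpy; the data are the same hypothetical values used earlier, with k = 1 independent variable:

```python
import numpy as np

# Hypothetical sample data (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n, k = len(x), 1  # k = number of independent variables

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

ssr = ((y_hat - y.mean()) ** 2).sum()  # explained (regression) sum of squares
sse = ((y - y_hat) ** 2).sum()         # unexplained (residual) sum of squares

msr = ssr / k              # MSS Regression: regression df = k
mse = sse / (n - (k + 1))  # MSS Residual: residual df = n - (k+1)
f_stat = msr / mse
```

Note that SSR + SSE equals the total sum of squares, and the F-statistic is compared against the critical F with (k, n − (k+1)) degrees of freedom.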
Correlation =
Correlation: r = Cov(X,Y) / (s_x s_y), where s is the sample standard deviation (square root of variance)
T-Test (correlation is different from zero)
T-test (correlation is different from zero): t = r√(n−2) / √(1−r²), with n − 2 degrees of freedom, where r is the sample correlation. If |t| > t_c, reject the null hypothesis that the correlation is zero
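A sketch of the correlation t-test on the same hypothetical data; for a simple regression this t-statistic equals the slope t-statistic:

```python
import numpy as np

# Hypothetical sample data (illustrative only)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(x)

r = np.corrcoef(x, y)[0, 1]
t_stat = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)
# Compare |t_stat| against the critical t with n - 2 degrees of freedom
```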
Multiple Linear Regression
Multiple Linear Regression: Y_i= b_0+b_1 X_1i+b_2 X_2i+⋯+b_k X_ki+ε_i,i=1,2,…,n
Durbin Watson
Durbin-Watson: tests residuals for serial correlation. If DW ≈ 2, residuals are not serially correlated; if DW < 2, positively correlated; if DW > 2, negatively correlated. Reject the null of no serial correlation when DW falls below the lower critical value d_l; fail to reject when DW is above the upper critical value d_u; between the two the test is inconclusive
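The DW statistic is the ratio of the sum of squared successive residual differences to the sum of squared residuals; a minimal sketch, assuming a hypothetical residual series from some fitted regression:

```python
import numpy as np

# Hypothetical residuals from a fitted regression (illustrative only);
# they alternate in sign, suggesting negative serial correlation
resid = np.array([0.5, -0.3, 0.4, -0.6, 0.2, -0.1, 0.3, -0.4])

# DW = sum((e_t - e_{t-1})^2) / sum(e_t^2)
dw = (np.diff(resid) ** 2).sum() / (resid ** 2).sum()
```

For this alternating series DW comes out above 2, consistent with negative serial correlation.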
Multicollinearity (definition)
Multicollinearity – a regression assumption violation that occurs when two or more independent variables (or combinations of independent variables) are highly but not perfectly correlated with each other
Heteroscedasticity (how to notice)
Heteroscedasticity – the variance of the residuals is not constant across observations, which makes the reported standard errors unreliable. Noticed by plotting residuals against the independent variable(s) or fitted values (a fanning pattern), or formally with a Breusch-Pagan test