# 2.) Ordinary Least Squares Flashcards

Why use Ordinary Least Squares?

- ) OLS is relatively easy to use.
- ) The goal of minimizing the sum of squared errors is quite appropriate from a theoretical point of view.
- ) OLS estimates have a number of useful characteristics.

Ordinary Least Squares (OLS)

is a regression estimation technique that calculates the estimated slope coefficients so as to minimize the sum of the squared residuals.
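The definition above can be sketched numerically. This is a minimal illustration with made-up data, using the closed-form single-regressor OLS formulas; the variable names are not from the text.

```python
import numpy as np

# Illustrative data (made up for this sketch)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form OLS for one regressor:
#   beta1 = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2)
#   beta0 = ybar - beta1 * xbar
beta1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
beta0 = y.mean() - beta1 * x.mean()

# The residuals from these estimates have the smallest possible
# sum of squares among all candidate (beta0, beta1) pairs.
residuals = y - (beta0 + beta1 * x)
print(beta0, beta1)
print(np.sum(residuals ** 2))  # the minimized sum of squared residuals
```

Any other choice of intercept and slope would produce a larger sum of squared residuals on this sample.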

An estimator is …

a mathematical technique that is applied to a sample of data to produce real-world numerical estimates of the true population regression coefficients (or parameters). So, OLS is an estimator, and an estimated slope coefficient produced by OLS is an estimate.

What are the three reasons for using OLS?

- ) It is the simplest of all econometric estimation techniques.
- ) Minimizing the summed, squared residuals is a reasonable goal for an estimation technique.
- ) It has the following two useful characteristics:

a. ) The sum of the residuals is exactly zero.

b. ) OLS can be shown to be the “best” estimator possible under a set of specific assumptions.
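Characteristic (a) can be checked directly: whenever the model includes an intercept, OLS residuals sum to zero (up to floating-point error). The sketch below uses simulated data and illustrative coefficient values.

```python
import numpy as np

# Simulated sample: an intercept column plus two regressors
rng = np.random.default_rng(0)
n = 50
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)

# OLS estimates via least squares
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ beta_hat

# With an intercept in the model, the residuals sum to zero
print(residuals.sum())  # essentially 0 (floating-point noise)
```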

K =

the number of independent variables

i =

goes from 1 to N and indicates the ith observation of each independent variable.

The biggest difference between a single-independent-variable regression model and a multivariate regression model is …

the interpretation of the latter’s slope coefficients, often called partial regression coefficients: they are defined so as to allow a researcher to distinguish the impact of one independent variable from that of the other independent variables.

You should always include beta naught in a regression equation,…

but you should not rely on estimates of beta naught for inference.

Total Sum of Squares

is the sum of the squared deviations of Y around its mean, used as a measure of the amount of variation to be explained by the regression.

For OLS, the total sum of squares has two components….

- ) Variation that can be explained by the regression.

- ) Variation that cannot.

TSS = ESS + RSS

This is usually called the decomposition of variance.

Decomposition of the variance in Y

The variation of Y around its mean (Yi - Yavg) can be decomposed into two parts:

- ) (Yhat - Yavg) = the difference between the estimated value of Y (Yhat) and the mean value of Y (Yavg).
- ) (Yi - Yhat) = the difference between the actual value of Y and the estimated value of Y.
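The identity TSS = ESS + RSS can be verified numerically for any fitted OLS line. The data below are made up for illustration.

```python
import numpy as np

# Illustrative data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

# Fit OLS by the closed-form single-regressor formulas
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

TSS = np.sum((y - y.mean()) ** 2)      # total variation of Y around its mean
ESS = np.sum((y_hat - y.mean()) ** 2)  # variation explained by the regression
RSS = np.sum((y - y_hat) ** 2)         # unexplained (residual) variation

print(TSS, ESS + RSS)  # the two should match
```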

Explained Sum of Squares (ESS) =

Measures the amount of the squared deviation of Yi from its mean that is explained by the regression line; this variation is attributable to the fitted line.

Residual Sum of Squares (RSS)

This is the unexplained portion of TSS (that is, unexplained in an empirical sense by the estimated regression line).

The smaller the RSS is relative to the TSS…

the better the estimated regression line fits the data.
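One common summary of this idea is R-squared, defined as 1 - RSS/TSS: as RSS shrinks relative to TSS, the measure rises toward 1. The data and the two candidate fits below are made up for illustration.

```python
import numpy as np

def r_squared(y, y_hat):
    # 1 - RSS/TSS: the share of total variation explained by the fit
    rss = np.sum((y - y_hat) ** 2)
    tss = np.sum((y - y.mean()) ** 2)
    return 1 - rss / tss

y = np.array([2.0, 4.0, 6.0, 8.0])
y_hat_good = np.array([2.1, 3.9, 6.1, 7.9])  # small residuals
y_hat_poor = np.array([3.0, 3.0, 7.0, 7.0])  # larger residuals

print(r_squared(y, y_hat_good))  # close to 1
print(r_squared(y, y_hat_poor))  # noticeably lower
```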