Chapter 10 Flashcards

1
Q

Variation in Y

A
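A standard reading (sketch, assuming the usual regression notation): the variation in Y is the total dispersion of the dependent variable around its mean, which the regression attempts to explain:

Variation in Y = Σ(Yi − Ybar)^2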
2
Q

Definition - Simple Linear Regression

A

Explains variation in a DEPENDENT variable in terms of the variation in a SINGLE INDEPENDENT variable.

3
Q

Definition - Dependent

A

Variable whose variation is explained by the independent variable.

Also called the EXPLAINED variable or the predicted variable.

4
Q

Definition - Independent

A

variable used to explain the variation of the dependent variable.

5
Q

Ordinary Least Squares Line

A
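A common formulation (sketch, with b0 and b1 the estimated intercept and slope): the ordinary least squares (OLS) line is the line that minimizes the sum of squared residuals,

minimize Σ(Yi − b0 − b1*Xi)^2

i.e., it makes the squared vertical distances between the observations and the fitted line as small as possible.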
6
Q

Error Term

A
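One standard statement (sketch, using the usual notation): the error term εi is the part of Yi not explained by the regression,

Yi = b0 + b1*Xi + εi

The observed (sample) counterpart of the error term is the residual, Yi − Yhat.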
7
Q

Intercept Interpretation for ABC stock excess return of -2.3

A
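A typical reading (assuming the standard example in which ABC's excess return is regressed on the market's excess return): when the independent variable (the market's excess return) equals zero, the predicted excess return of ABC stock is −2.3%.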
8
Q

Slope interpretation for ABC stock excess return of 0.64%

A
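A typical reading (same assumed setup as the intercept card): for every 1% increase in the independent variable (the market's excess return), the excess return of ABC stock is predicted to increase by 0.64%.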
9
Q

What does it mean when a variable has a hat?

A

A hat indicates a predicted (estimated) value.
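For example (sketch notation): Yhat = b0 + b1*X is the predicted value of Y computed from the estimated regression line.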

10
Q

Slope Coefficient

A
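A standard formula (sketch, using the usual OLS notation):

b1 = Cov(X, Y) / Var(X) = Σ(Xi − Xbar)(Yi − Ybar) / Σ(Xi − Xbar)^2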
11
Q

Intercept

A

Regression line passes through (Xbar, Ybar).
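It follows that a standard estimate (sketch, with b1 the estimated slope) is:

b0 = Ybar − b1*Xbar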

12
Q

Assumptions of Linear Regression

A

(1) Linear relationship between the dependent and independent variables

(2) Variance of error terms is constant (homoskedasticity)

(3) Error terms are independently distributed (i.e. uncorrelated with each other)

(4) Error terms are normally distributed.

Violation of (3) is called serial correlation or autocorrelation.

13
Q

homoskedasticity

A

case where prediction errors all have the same constant variance

14
Q

heteroskedasticity

A

variance of the error terms not being constant.

15
Q

total sum of squares (SST)

A
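A standard formula (sketch, using the usual notation): the total variation of the dependent variable around its mean,

SST = Σ(Yi − Ybar)^2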
16
Q

ANOVA definition

A

Analysis of variance (ANOVA) is a statistical procedure for analyzing the total variability of the dependent variable
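A typical ANOVA table layout (sketch, with k independent variables and n observations):

Source       df           Sum of squares   Mean square
Regression   k            SSR              MSR = SSR / k
Error        n − k − 1    SSE              MSE = SSE / (n − k − 1)
Total        n − 1        SST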

17
Q

sum of squares regression (SSR)

A
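A standard formula (sketch): the explained variation, i.e., the sum of squared deviations of the predicted values from the mean of Y,

SSR = Σ(Yhat − Ybar)^2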
18
Q

mean square regression (MSR)

A

k = 1 for a simple regression (k = number of independent variables)
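A standard formula (sketch):

MSR = SSR / k

so for a simple regression (k = 1), MSR = SSR.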

19
Q

sum of squared errors (SSE)

A
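A standard formula (sketch): the unexplained variation, i.e., the sum of squared residuals,

SSE = Σ(Yi − Yhat)^2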
20
Q

Relationship between SST, SSR, and SSE

A
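The standard decomposition:

SST = SSR + SSE

Total variation = explained variation + unexplained variation.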
21
Q

mean square error (MSE)

A

k = 1 for a simple regression (k = number of independent variables)
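A standard formula (sketch):

MSE = SSE / (n − k − 1)

which for a simple regression (k = 1) is SSE / (n − 2).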

22
Q

coefficient of determination (R2)

Definition
Example R2 of 0.63
Relationship between R2 and the correlation coefficient, r

A
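Standard results (sketch, using the usual notation): R2 = SSR / SST, the proportion of the variation in the dependent variable explained by the independent variable. An R2 of 0.63 means 63% of the variation in Y is explained by the variation in X. For a simple linear regression, R2 = r^2, the square of the correlation coefficient between X and Y (r takes the sign of the slope coefficient).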
23
Q

Standard Error of Estimate (SEE)

A
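A standard formula (sketch): the SEE measures the typical distance of the observations from the regression line,

SEE = sqrt(SSE / (n − 2)) = sqrt(MSE)

A smaller SEE indicates a better fit.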
24
Q

The F-Statistic

Definition
Tail
Formula

A
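A standard summary (sketch): the F-statistic tests whether the slope coefficient(s) are significantly different from zero, i.e., whether the regression as a whole explains the dependent variable. It is always a one-tailed (right-tail) test, with

F = MSR / MSE

with k and n − k − 1 degrees of freedom; for a simple regression, F equals the square of the slope coefficient's t-statistic.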
25
Q

Hypothesis Test of a Regression Coefficient

A
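A standard formulation (sketch): to test H0: b1 = hypothesized value,

t = (estimated b1 − hypothesized b1) / s_b1

with n − 2 degrees of freedom; reject H0 if |t| exceeds the critical two-tailed t-value.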
26
Q

Hypothesis Test of a Regression Coefficient (t b1)

Standard Error of Slope Coefficient

A
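A standard formula (sketch, with SEE as defined on the earlier card):

s_b1 = SEE / sqrt(Σ(Xi − Xbar)^2)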
27
Q

t-test for simple linear regression is equivalent to what?

A

The t-test for a simple linear regression is equivalent to a t-test for the correlation coefficient between X and Y:
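A standard form of that test statistic (sketch):

t = r * sqrt(n − 2) / sqrt(1 − r^2)

with n − 2 degrees of freedom.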

where r is the correlation coefficient.

28
Q

For a simple regression, this is the predicted (or forecast) value of Y:

A
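A standard formula (sketch, with Xp a hypothetical forecast value of the independent variable):

Yhat = b0 + b1*Xp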
29
Q

Confidence Intervals for Predicted Values

Definition and formula (two parts)

A

Confidence intervals estimate a prediction interval around a predicted value.
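A common form of the interval (sketch, with tc the critical two-tailed t-value and sf the standard error of the forecast):

Yhat ± tc * sf, with sf^2 = SEE^2 * [1 + 1/n + (X − Xbar)^2 / ((n − 1) * sx^2)]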

where:

SEE2 = variance of the residuals = the square of the standard error of estimate

sx2 = variance of the independent variable

X = value of the independent variable for which the forecast was made

30
Q

Functional Forms

A

When the relationship between X and Y is NOT linear, fitting a linear model produces biased predictions.

If you transform one or both variables by taking their natural log, you might make the relationship between the transformed variables linear.

31
Q

Natural Log Transformation

  1. Log-Lin
  2. Lin-Log
  3. Log-Log
A

Y is the dependent variable
X is the independent variable

  1. Take the natural log of the Y variable only.
    The dependent variable is logarithmic, while the independent variable is linear.
  2. Take the natural log of X only.
    The dependent variable is linear, while the independent variable is logarithmic.
  3. Take the natural log of both X and Y.
    Both the dependent variable and the independent variable are logarithmic.
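The corresponding model forms (sketch, using the usual notation):

1. Log-Lin: ln(Yi) = b0 + b1*Xi
2. Lin-Log: Yi = b0 + b1*ln(Xi)
3. Log-Log: ln(Yi) = b0 + b1*ln(Xi)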
32
Q

Log-Lin Model

A
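A standard summary (sketch): ln(Yi) = b0 + b1*Xi + εi. The slope is interpreted as the relative (percentage) change in Y for an absolute (one-unit) change in X.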
33
Q

Lin-Log Model

A
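A standard summary (sketch): Yi = b0 + b1*ln(Xi) + εi. The slope is interpreted as the absolute change in Y for a relative (percentage) change in X.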
34
Q

Log-Log Model

A
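A standard summary (sketch): ln(Yi) = b0 + b1*ln(Xi) + εi. The slope is interpreted as the relative change in Y for a relative change in X (an elasticity).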