Simple Linear Regression Flashcards

1
Q

dependent variable - y

A

the variable we are seeking to explain

2
Q

independent variable - x

A

the explanatory variable

3
Q

linear regression

A

assumes a linear relationship between the dependent variable (y) and the independent variable (x)

4
Q

variation of y

A

total variation of y: SST = sum(yi - ȳ)^2

to test how well x explains y, compare it with SSE = sum(yi - ŷi)^2
ŷi = the predicted value of y on the regression line

if SSE = SST, the regression line explains none of the variation of y
if SSE < SST, part of the variation of y is explained by the regression line

regression model: yi = b0 + b1*xi + εi

b0 = intercept
b1 = slope coefficient = cov(x,y)/var(x)
both are known as regression coefficients
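
A minimal Python sketch of the slope and intercept formulas above, using made-up toy data (the numbers are illustrative only, not from the cards):

import numpy as np

# Toy data (made up for illustration)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Slope: b1 = cov(x, y) / var(x), using the sample covariance and sample variance
b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
# Intercept: the fitted line passes through the point of means (x̄, ȳ)
b0 = y.mean() - b1 * x.mean()

print(b0, b1)  # the two regression coefficients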

5
Q

least squares method

A

residuals = the vertical distances of the observed points from the regression line

best-fit line = the line that minimizes the sum of the squared deviations between the observed values of y and the predicted values of y
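
A quick check of the criterion, reusing the toy data from the sketch above: the least-squares coefficients give a smaller sum of squared residuals than nearby alternative lines (only two perturbations are tried here).

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

def sse(b0, b1):
    """Sum of squared residuals for a candidate line."""
    resid = y - (b0 + b1 * x)
    return np.sum(resid ** 2)

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()

# The OLS line beats lines with a shifted intercept or slope
print(sse(b0, b1), sse(b0 + 0.1, b1), sse(b0, b1 * 1.05))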

6
Q

intercept (interpretation)

A

ŷ = b0 when x = 0 (the intercept is the predicted value of y when x equals zero)

b1 = the expected change in y for a one-unit change in x

7
Q

linearity

A

the relationship between x and y is linear in the parameters: b0 and b1 cannot be multiplied or divided by another regression parameter

8
Q

homoskedasticity

A

the variance of the error term is the same for all observations; if the error variance differs across subsets of observations (more than one variance), the assumption is violated

9
Q

independence

A

the (x, y) pairs should be independent of each other: we should not be able to predict one observation's error from another's

10
Q

normality

A

the error term is normally distributed

11
Q

analysis of variance

A
  1. SST = SSE + SSR
  2. sum(yi - ȳ)^2 = sum(yi - ŷi)^2 + sum(ŷi - ȳ)^2
  3. total sum of squares = sum of squared errors (unexplained) + regression sum of squares (explained)
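
A minimal numeric check of this decomposition, reusing the toy data and coefficients from the earlier sketches:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)       # total variation
sse = np.sum((y - y_hat) ** 2)          # unexplained variation
ssr = np.sum((y_hat - y.mean()) ** 2)   # explained variation

print(np.isclose(sst, sse + ssr))  # True: SST = SSE + SSR
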
12
Q

coefficient of determination

A

measure of fit, not a statistical test (R^2)

R^2 = SSR/SST = explained variation / total variation

how much of the variation of the dependent variable is explained by the variation of the independent variable
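
Continuing the toy example, R^2 follows directly from the sums of squares; in simple linear regression it also equals the squared correlation between x and y.

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)
ssr = np.sum((y_hat - y.mean()) ** 2)

r_squared = ssr / sst
print(r_squared, np.corrcoef(x, y)[0, 1] ** 2)  # the two values agree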

13
Q

F-test of coefficient

A

F = MSR/MSE (a ratio of two variances)

F = (SSR/k) / (SSE/(n - (k + 1)))

k = number of slope coefficients
k + 1 = number of regression coefficients (including the intercept)
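
A sketch of the F-statistic on the same toy data (k = 1 in simple linear regression); the p-value uses scipy's F distribution, an extra dependency not mentioned in the cards.

import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n, k = len(x), 1

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

msr = np.sum((y_hat - y.mean()) ** 2) / k        # mean square regression
mse = np.sum((y - y_hat) ** 2) / (n - (k + 1))   # mean square error

f_stat = msr / mse
p_value = stats.f.sf(f_stat, k, n - (k + 1))     # right-tail probability
print(f_stat, p_value)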

14
Q

Standard error of the estimate

A

SEE = (sum(yi - ŷi)^2 / (n - 2))^(1/2) = MSE^(1/2)

the lower the SEE, the more accurate the regression

SEE estimates the standard deviation of the error term
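
The same toy data again, just to show the SEE arithmetic:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n = len(x)

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
resid = y - (b0 + b1 * x)

see = np.sqrt(np.sum(resid ** 2) / (n - 2))  # standard error of the estimate
print(see)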

15
Q

hypothesis test of b1

A

t = (b̂1 - B1) / s(b̂1), with n - 2 degrees of freedom

s(b̂1) = standard error of the slope = SEE / sqrt(sum(xi - x̄)^2)

H0: b1 = 0
Ha: b1 ≠ 0
b̂1 = the estimated slope coefficient; B1 = the hypothesized value of the slope
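
A sketch of the t-test for the slope on the toy data, leaning on scipy only for the p-value:

import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
n = len(x)

b1 = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)
b0 = y.mean() - b1 * x.mean()
resid = y - (b0 + b1 * x)

see = np.sqrt(np.sum(resid ** 2) / (n - 2))
s_b1 = see / np.sqrt(np.sum((x - x.mean()) ** 2))  # standard error of the slope

t_stat = (b1 - 0) / s_b1                            # H0: b1 = 0
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)     # two-tailed test
print(t_stat, p_value)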

16
Q

hypothesis test of b0

A

performed only when the intercept has an economic meaning (usually it does not)

t = (b̂0 - B0) / s(b̂0), where b̂0 is the estimated intercept and B0 the hypothesized value

17
Q

log-lin model

A

ln(yi) = b0 + b1*xi

x is often a time trend; the slope is interpreted as the relative (percentage) change in y for a one-unit change in x
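
A small log-lin sketch with made-up, roughly exponential data; np.polyfit on (x, ln y) returns the slope and intercept.

import numpy as np

# Made-up series that grows at a fairly steady percentage rate
t = np.arange(1.0, 9.0)
y = np.array([105.0, 110.0, 116.0, 121.0, 128.0, 134.0, 141.0, 148.0])

# Fit ln(y) = b0 + b1 * t
b1, b0 = np.polyfit(t, np.log(y), 1)
print(b0, b1)  # b1 ~ the continuously compounded growth rate per period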

18
Q

lin-log model

A

y = b0 + b1*ln(x)

used when y and x are on significantly different scales
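
A matching lin-log sketch, again with made-up numbers where x spans a much wider scale than y.

import numpy as np

x = np.array([100.0, 500.0, 1000.0, 5000.0, 10000.0, 50000.0])
y = np.array([2.0, 3.4, 4.1, 5.5, 6.2, 7.6])

# Fit y = b0 + b1 * ln(x)
b1, b0 = np.polyfit(np.log(x), y, 1)
print(b0, b1)  # b1 = change in y for a one-unit change in ln(x)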

19
Q

residual

A

error = yi - ŷi (the difference between the observed and the predicted value of y)
