Flashcards in BIO 330 Deck (379)
241
extrapolating linear regression
DO NOT extrapolate beyond data, can't assume relationship continues to be linear
242
linear regression Ho
Slope is zero (β = 0), number of dees cannot be predicted from predator mass
243
linear regression Ha
slope is not zero (β ≠ 0), number of dees can be predicted from predator mass (2 sided)
244
Hypothesis testing of linear regression
testing about the slope:
–t-test approach
–ANOVA approach
245
Putting linear regression into words
Dee rate = 3.4 - 1.04(predator mass)
Number of dees decreases by about 1 per kilogram increase in predator mass
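A minimal Python sketch of how this fitted line turns into predictions, using the coefficients from this card (the function name and example masses are illustrative only):

```python
# Fitted line from the card: dee rate = 3.4 - 1.04 * (predator mass in kg)
intercept, slope = 3.4, -1.04

def predicted_dee_rate(predator_mass_kg):
    """Predicted number of 'dee' notes for a predator of the given mass."""
    return intercept + slope * predator_mass_kg

# Each extra kilogram of predator mass lowers the prediction by about 1 dee:
print(predicted_dee_rate(1.0))  # 2.36
print(predicted_dee_rate(2.0))  # 1.32
```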
246
testing about the slope, t-test approach
test statistic t = (b - β_0) / SE_b
SE_b = √(MSresidual / Σ(Xi - Xbar)^2)
MSresidual = Σ(Yi - Yhat_i)^2 / (n - 2)
critical t = t_α(2), df
df = n - 2
compare the test statistic to the critical value
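A minimal Python sketch of the t-test approach, computed directly from the formulas on this card; the x/y data are made up for illustration:

```python
import numpy as np
from scipy import stats

# toy data (illustrative only)
x = np.array([0.1, 0.5, 1.0, 1.5, 2.0, 3.0])
y = np.array([3.5, 3.0, 2.3, 1.9, 1.2, 0.4])
n = len(x)

# least-squares slope and intercept, and fitted values
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean())**2)
a = y.mean() - b * x.mean()
y_hat = a + b * x

# MS_residual and the standard error of the slope
ms_resid = np.sum((y - y_hat)**2) / (n - 2)
se_b = np.sqrt(ms_resid / np.sum((x - x.mean())**2))

# test statistic and two-sided critical value at alpha = 0.05, df = n - 2
t_stat = (b - 0) / se_b                      # beta_0 = 0 under Ho
t_crit = stats.t.ppf(1 - 0.05 / 2, df=n - 2)
print(t_stat, t_crit)  # reject Ho if |t_stat| > t_crit
```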
247
testing about the slope, ANOVA approach
source of variation: regression, residual, total
sum of squares, df, mean squares, F-ratio
248
calculating testing about the slope, ANOVA approach
SSregression = Σ(Yhat_i - Ybar)^2
SSresidual = Σ(Yi - Yhat_i)^2
MSregression = SSreg/df, df = 1
MSresidual = SSres/df, df = n - 2
F-ratio = MSreg/MSres
SStotal = Σ(Yi - Ybar)^2
df total = n - 1
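A minimal Python sketch of the same ANOVA table, reusing the illustrative x/y data from the t-test sketch above:

```python
import numpy as np
from scipy import stats

# toy data (illustrative only)
x = np.array([0.1, 0.5, 1.0, 1.5, 2.0, 3.0])
y = np.array([3.5, 3.0, 2.3, 1.9, 1.2, 0.4])
n = len(x)

# least-squares fit
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean())**2)
a = y.mean() - b * x.mean()
y_hat = a + b * x

# sums of squares and their degrees of freedom
ss_reg = np.sum((y_hat - y.mean())**2)   # df = 1
ss_res = np.sum((y - y_hat)**2)          # df = n - 2
ss_tot = np.sum((y - y.mean())**2)       # df = n - 1

# mean squares, F-ratio, and P-value
ms_reg = ss_reg / 1
ms_res = ss_res / (n - 2)
f_ratio = ms_reg / ms_res
p_value = stats.f.sf(f_ratio, 1, n - 2)
print(f_ratio, p_value)
```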
249
interpreting ANOVA approach to linear regression
If Ho is true, MSreg ≈ MSres, so the F-ratio is close to 1; if the regression explains real variation, MSreg > MSres and F is large
250
% of variation in Y explained by X
R^2 = SSreg/SStotal
R^2 × 100% of the variation in Y can be predicted from X
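For simple linear regression, R^2 also equals the square of the Pearson correlation between X and Y; a minimal sketch reusing the illustrative data from the sketches above:

```python
import numpy as np

x = np.array([0.1, 0.5, 1.0, 1.5, 2.0, 3.0])
y = np.array([3.5, 3.0, 2.3, 1.9, 1.2, 0.4])

r = np.corrcoef(x, y)[0, 1]
r_squared = r**2   # equals SS_regression / SS_total for a simple linear regression
print(f"{r_squared:.1%} of the variation in Y is explained by X")
```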
251
Outliers, linear regression
create a non-normal distribution of Y-values, violate the assumption of equal variance in Y, and have a strong effect on the slope and intercept; try not to transform the data
252
linear regression assumptions
linear relationship
normality of Y at each X
variance of Y same for every X
random sampling of Y's
253
detecting non-linearity
look at the scatter plot, look at residual plot
254
checking residuals
should be symmetric above/below zero
should have more points close to the line (zero) than far from it
equal variance at all values of x
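A minimal matplotlib sketch of a residual plot for checking these points (toy data, illustrative only):

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.array([0.1, 0.5, 1.0, 1.5, 2.0, 3.0])
y = np.array([3.5, 3.0, 2.3, 1.9, 1.2, 0.4])

# fit the least-squares line and compute residuals
b, a = np.polyfit(x, y, 1)      # slope, intercept
residuals = y - (a + b * x)

# residuals vs X: look for symmetry around zero, no fanning, no curvature
plt.scatter(x, residuals)
plt.axhline(0, linestyle="--")
plt.xlabel("X")
plt.ylabel("Residual")
plt.show()
```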
255
non-linear regression
used when the relationship is not linear and transformations don't work; many options exist, so aim for simplicity
256
quadratic curves
Y = a + bX + cX^2
when c is negative, curve is humped
when c is positive, curve is u shaped
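A minimal sketch of fitting a quadratic curve with numpy.polyfit and reading the sign of c (toy data, illustrative only):

```python
import numpy as np

x = np.array([0, 1, 2, 3, 4, 5], dtype=float)
y = np.array([1.0, 3.2, 4.1, 4.0, 3.1, 1.2])   # humped pattern

# polyfit with degree 2 returns coefficients from highest power down: [c, b, a]
c, b, a = np.polyfit(x, y, 2)
print(a, b, c)   # c < 0 for these data, so the fitted curve is humped
```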
257
multiple explanatory variables
improve detection of treatment effects
investigate effects of ≥2 treatments + interactions
adjust for confounding variables when comparing ≥2 groups
258
GLM
general linear model; multiple explanatory variables can be included (even categorical); response variable (Y) = linear model + error
259
least-squares regression GLM
Y = a + bX
error = residuals
260
single-factor ANOVA GLM
Y = µ + A
error = variability within groups
µ = grand mean
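One common way to fit both of these GLM forms in Python is the statsmodels formula interface; a minimal sketch with made-up data (the column names and group means are illustrative, not from the course example):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# regression form of the GLM: Y = a + bX + error
df_reg = pd.DataFrame({"x": np.linspace(0, 3, 30)})
df_reg["y"] = 3.4 - 1.04 * df_reg["x"] + rng.normal(0, 0.3, 30)
reg_fit = smf.ols("y ~ x", data=df_reg).fit()
print(reg_fit.params)        # constant (intercept) and coefficient (slope)

# single-factor ANOVA form of the GLM: Y = grand mean + group effect + error
df_anova = pd.DataFrame({
    "group": np.repeat(["A", "B", "C"], 10),
    "y": np.concatenate([rng.normal(m, 1, 10) for m in (5, 6, 8)]),
})
anova_fit = smf.ols("y ~ C(group)", data=df_anova).fit()
print(sm.stats.anova_lm(anova_fit))   # SS, df, MS, F, P table
print(anova_fit.rsquared)             # SS_group / SS_total
```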
261
GLM hypotheses
Ho: response = constant; response is same among treatments
Ha: response = constant + explanatory variable
262
constant
constant = intercept or grand mean
263
variable
variable = explanatory variable × its coefficient
264
ANOVA results, GLM
source of variation: Companion, Residual, Total
SS, df, MS, F, P
265
ANOVA, GLM F-ratio
MScomp. / MSres.
266
ANOVA, GLM R^2
R^2 = SScom. / SStot.
% of variation that is explained
267
ANOVA, GLM, reject Ho
Model with treatment variable fits the data better than the null model but only 25% of the variation is explained
268
Multiple explanatory variables, goals
improve detection of treatment effects
adjust for effects of confounding variables
investigate multiple variables and their interaction
269
design feature for improving detection of treatment effects
blocking
270