From Quizzes Flashcards

0
Q

The purpose of defining families of IVs in a model reduction exercise is to:

A

Lets theory play a role in the analysis
Reduces the likelihood of a Type II error
Maximises the chance that the theoretically most important predictors remain in the model
Decreases the number of hypothesis tests conducted

1
Q

Is it possible to obtain an approximate hypothesis test concerning the effect of an IV on a DV from the confidence interval? How?

A

Yes: if zero does not lie within the 95% CI, we reject the null hypothesis that the population slope is zero.
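A minimal sketch of this rule, using a made-up slope and standard error (the values are illustrative, not from any real dataset):

```python
# Sketch: approximate hypothesis test from a 95% confidence interval.
# If zero lies outside b +/- 1.96*SE, reject H0: population slope = 0.

def ci_excludes_zero(b, se, z=1.96):
    """Return True if the 95% CI (b - z*se, b + z*se) excludes zero."""
    lower, upper = b - z * se, b + z * se
    return not (lower <= 0 <= upper)

print(ci_excludes_zero(0.80, 0.30))  # CI = (0.212, 1.388): reject H0
print(ci_excludes_zero(0.20, 0.30))  # CI = (-0.388, 0.788): fail to reject
```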

2
Q

ANOVA - regression sum of squares gives us…

A

The part of the variance in the DV that has been accounted for by the regression model.

3
Q

ANOVA table - std dev given by…

A

Take the square root of the mean square error (residual).
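As a quick sketch with hypothetical residuals from a fitted model (one predictor, five observations, so the residual degrees of freedom are n - k - 1 = 3):

```python
import math

# Sketch: the ANOVA table's residual SD is the square root of the
# mean square error. Toy residuals (hypothetical values).
residuals = [1.0, -2.0, 0.5, -0.5, 1.0]
n, k = len(residuals), 1                   # n observations, k predictors

ss_residual = sum(e ** 2 for e in residuals)   # sum of squared residuals
ms_residual = ss_residual / (n - k - 1)        # mean square error, df = 3
sd_residuals = math.sqrt(ms_residual)

print(round(sd_residuals, 3))
```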

4
Q

Assumptions of regression and how we assess them?

A
  1. Independent observations - normally met by design
  2. Normal distribution of residuals - via a P-P plot, expected (y-axis) vs observed (x-axis); is there clear snaking around the line?
  3. Linear relationship between IV and DV - an even number of points above and below the line? any consistent trend or pattern in the residuals?
  4. Constant variance (homoscedasticity) - any fanning in the residual plot?
5
Q

How do we get R-square from the sums of squares?

A

Divide the regression sum of squares by the total sum of squares: R² = SS regression / SS total.
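A worked sketch on tiny made-up data, fitting a simple least-squares line by hand and then forming the ratio of the sums of squares:

```python
# Sketch: R^2 = SS_regression / SS_total for a simple linear fit
# (tiny illustrative data; slope and intercept computed from the
# standard least-squares formulas).
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
intercept = ybar - slope * xbar
yhat = [intercept + slope * xi for xi in x]   # fitted values

ss_total = sum((yi - ybar) ** 2 for yi in y)
ss_regression = sum((yh - ybar) ** 2 for yh in yhat)
r_squared = ss_regression / ss_total

print(round(r_squared, 3))
```

For this toy data the ratio works out to 0.6; taking its square root would give the correlation, which connects to the card on recovering r from R-square below.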

6
Q

Effect of a confounding variable?

A

The effect of x1 is reduced or eliminated after controlling for x2

7
Q

Effect of suppression

A

The effect of x1 is only apparent or is increased after controlling for x2

8
Q

Effect of interaction variable

A

The effect of x1 is estimated to be larger or smaller depending upon the value of x2
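This can be sketched directly from the model equation. With an interaction term, y-hat = b0 + b1·x1 + b2·x2 + b3·(x1·x2), so the slope of x1 is b1 + b3·x2 (all coefficients below are hypothetical):

```python
# Sketch: with an interaction term, the slope of x1 depends on x2.
# Model: y_hat = b0 + b1*x1 + b2*x2 + b3*(x1*x2)  (made-up coefficients).
b0, b1, b2, b3 = 1.0, 2.0, 0.5, -0.8

def slope_of_x1(x2):
    """Conditional effect of x1 at a given value of x2: b1 + b3*x2."""
    return b1 + b3 * x2

print(slope_of_x1(0))           # 2.0: effect of x1 when x2 = 0
print(round(slope_of_x1(2), 1)) # 0.4: smaller effect at higher x2
```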

9
Q

Direct & indirect effects of variables?

A

The effect of x1 on Y operates (at least partially) through x2

10
Q

Collinearity

A

This occurs when two or more IVs are so highly correlated that one can be predicted (almost) exactly from one or more of the others

11
Q

Tolerance and VIF tell us? What values?

A

Statistical indicators of collinearity. Tolerance regresses x1 on all the other IVs and gives the proportion of variance in x1 NOT accounted for by the other IVs. A tolerance below .1 means there is little to distinguish x1 from a combination of the other IVs. VIF is the inverse of tolerance (1/tolerance), so VIF above 10 flags the same problem.
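In the simplest case of only two IVs, the R-square from regressing x1 on x2 is just their squared correlation, so the calculation reduces to a couple of lines (the correlation below is made up for illustration):

```python
# Sketch: with only two IVs, tolerance of x1 = 1 - r^2, where r is the
# correlation between x1 and x2; VIF = 1 / tolerance.
r12 = 0.95                      # hypothetical correlation between the IVs

tolerance = 1 - r12 ** 2        # variance in x1 NOT shared with x2
vif = 1 / tolerance

print(round(tolerance, 4))      # 0.0975: below .10, flags collinearity
print(round(vif, 2))            # 10.26: above 10, the same flag
```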

12
Q

Formula for the number of possible pairwise comparisons

A

(k(k-1))/2 where k is the number of levels
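The formula can be sanity-checked by enumerating the pairs explicitly, e.g. for a factor with four levels:

```python
from itertools import combinations

# Sketch: k(k-1)/2 pairwise comparisons among k factor levels,
# checked against an explicit enumeration of the pairs.
def n_pairwise(k):
    return k * (k - 1) // 2

levels = ["A", "B", "C", "D"]           # k = 4 levels
pairs = list(combinations(levels, 2))   # ('A','B'), ('A','C'), ...

print(n_pairwise(len(levels)))  # 6
print(len(pairs))               # 6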

13
Q

In multiple regression, when we control for potentially confounding variables, we…

A

Estimate the effect of each IV with the other IV(s) held constant at unspecified values

14
Q

In a logistic regression model, which column gives us the odds ratio?

A

Exp(b)
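A small sketch of why that column is the odds ratio: exponentiating a log-odds coefficient b gives the multiplicative change in odds for a one-unit increase in the predictor (the coefficient below is hypothetical):

```python
import math

# Sketch: in logistic regression output, Exp(B) = e^b is the odds
# ratio per one-unit increase in the predictor (made-up coefficient).
b = 0.693                       # log-odds (logit) coefficient

odds_ratio = math.exp(b)        # roughly 2: odds about double per unit
print(round(odds_ratio, 2))
```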

15
Q

How do we find the correlation for a model from the R-square statistic?

A

Take the square root of the R-square statistic to get the correlation.

Q8 practice exam questions

16
Q

Variance fitted last (the unique variance explained by an IV) is obtained by?

A

Squaring the part (semipartial) correlation for the corresponding IV.
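As a sketch for the two-IV case, the part correlation of x1 can be computed from the zero-order correlations via sr1 = (r_y1 - r_y2·r_12) / sqrt(1 - r_12²); squaring it gives the unique variance (all correlations below are made up):

```python
import math

# Sketch: unique ("fitted last") variance for x1 is the squared part
# (semipartial) correlation. Two-IV case, from zero-order correlations
# (hypothetical values).
r_y1, r_y2, r_12 = 0.60, 0.40, 0.30   # y~x1, y~x2, x1~x2

part_r1 = (r_y1 - r_y2 * r_12) / math.sqrt(1 - r_12 ** 2)
unique_variance_x1 = part_r1 ** 2     # variance in y explained only by x1

print(round(part_r1, 3))              # about 0.503
print(round(unique_variance_x1, 3))   # about 0.253
```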

17
Q

If the effect of one IV (e.g. Anger) is suppressed by one or more IVs, we would expect:

A

The partial regression coefficient for anger to be larger after controlling for the other IV(s)

18
Q

The SD of the residuals of a regression model can be found where in the ANOVA table output?

A

By taking the square root of the mean square residual.