Final Flashcards

1
Q

Repeated-Measures ANOVA

  • used when? (2)
A
  1. independence assumption violated
  2. 3+ conditions
2
Q

Repeated Measures ANOVA

  • SS calculated? (4)
A

SS total

SS within treatments

SS subjects

SS error

3
Q

Repeated Measures ANOVA

formulas:

  • SS total
  • SS within treatments
  • SS subjects
  • SS error
A

SS total = Σx² - (Σx)²/N

SS within treatments = n Σ(x̄w - x̄)²

SS subjects = k Σ(x̄s - x̄)²

SS error = SS total - SS within treatments - SS subjects
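These four SS formulas can be checked numerically. A minimal sketch, assuming a made-up 4-subject × 3-condition dataset (all values are illustrative only):

```python
# Repeated-measures ANOVA SS partition for a small made-up dataset:
# 4 subjects (rows) x 3 conditions (columns).
data = [
    [2, 4, 6],
    [3, 5, 7],
    [1, 4, 5],
    [2, 3, 6],
]
n = len(data)        # number of subjects
k = len(data[0])     # number of conditions
scores = [x for row in data for x in row]
N = n * k
grand_mean = sum(scores) / N

# SS total = Sum(x^2) - (Sum x)^2 / N
ss_total = sum(x * x for x in scores) - sum(scores) ** 2 / N

# Treatment term (the deck's "SS within treatments"): n * Sum((xbar_w - xbar)^2)
cond_means = [sum(row[j] for row in data) / n for j in range(k)]
ss_treat = n * sum((m - grand_mean) ** 2 for m in cond_means)

# SS subjects = k * Sum((xbar_s - xbar)^2)
subj_means = [sum(row) / k for row in data]
ss_subj = k * sum((m - grand_mean) ** 2 for m in subj_means)

# SS error = SS total - treatment SS - SS subjects
ss_error = ss_total - ss_treat - ss_subj
print(ss_total, ss_treat, ss_subj, ss_error)
```

The partition is additive: the four components printed at the end sum back to SS total by construction.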

4
Q

Repeated Measures ANOVA

  • MS formulas
  • df formulas
A

MStreatment

dftreatment = k - 1

MSerror

dferror = (k-1)(n-1)

Calculating SSerror this way removes the effect of individual differences (SSsubjects)
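The df formulas above feed directly into MS and F. A sketch using assumed example values (SS treatment = 32, SS error = 4/3, from a hypothetical k = 3, n = 4 design; MS = SS/df):

```python
# MS and F for a repeated-measures ANOVA, using assumed example values:
# k = 3 conditions, n = 4 subjects, treatment SS = 32, error SS = 4/3.
k, n = 3, 4
ss_treatment = 32.0
ss_error = 4.0 / 3.0

df_treatment = k - 1              # k - 1
df_error = (k - 1) * (n - 1)      # (k-1)(n-1)
ms_treatment = ss_treatment / df_treatment
ms_error = ss_error / df_error
F = ms_treatment / ms_error       # F is approximately 72 for these values
print(df_treatment, df_error, F)
```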

5
Q

Repeated Measures ANOVA

  • assumptions (2)
A

1) Normality
2) Sphericity

6
Q

Repeated Measures ANOVA ASSUMPTIONS

2) Sphericity

A

assumption of equal variances & equal covariances

(NOT robust to violations)

7
Q

Factorial ANOVA

  • simplest case?
A

2 between-subjects IVs with 2 levels each

8
Q

Main effect

A

mean difference among the levels of one factor

9
Q

Interaction

A

mean differences between treatment conditions (cells) differ from what the overall main effects of the factors would predict

  • the effect of one factor depends on the level of the 2nd factor
10
Q

Two-Factor ANOVA

(3) hypothesis tests?

A

1) main effect of A
2) main effect of B
3) A x B interaction

11
Q

Two-Factor ANOVA

  • SS
A

SS total

SS BG

SS WG

SS A

SS B

SS AxB

12
Q

Two Factor ANOVA

Formulas:

  • SS total
  • SS BG
  • SS WG
  • SS A
  • SS B
  • SS AxB
A

SS total = ΣXtotal² - (ΣXtotal)²/N

SS BG = ΣT²/n - G²/N

SS WG = SST - SSBG

SS A = ΣTrow²/nrow - G²/N

SS B = ΣTcolumn²/ncolumn - G²/N

SS AxB = SS BG - SS A - SS B
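These six SS formulas can be checked numerically. A sketch assuming a made-up 2 × 2 between-subjects design with n = 2 scores per cell (all data invented for illustration):

```python
# Two-factor (2x2, between-subjects) SS partition on made-up data, n = 2 per cell.
# cells[a][b] holds the scores for level a of factor A and level b of factor B.
cells = [[[1, 3], [4, 6]],
         [[5, 7], [2, 4]]]
n = 2                        # scores per cell
a_levels, b_levels = 2, 2
scores = [x for row in cells for cell in row for x in cell]
N = len(scores)
G = sum(scores)              # grand total

ss_total = sum(x * x for x in scores) - G ** 2 / N

# SS BG = Sum(T^2)/n - G^2/N, where T is each cell total
cell_totals = [sum(cell) for row in cells for cell in row]
ss_bg = sum(t * t for t in cell_totals) / n - G ** 2 / N
ss_wg = ss_total - ss_bg

# SS A uses row totals; n_row = scores per level of A
row_totals = [sum(x for cell in row for x in cell) for row in cells]
n_row = n * b_levels
ss_a = sum(t * t for t in row_totals) / n_row - G ** 2 / N

# SS B uses column totals; n_col = scores per level of B
col_totals = [sum(x for row in cells for x in row[j]) for j in range(b_levels)]
n_col = n * a_levels
ss_b = sum(t * t for t in col_totals) / n_col - G ** 2 / N

ss_axb = ss_bg - ss_a - ss_b
print(ss_total, ss_bg, ss_wg, ss_a, ss_b, ss_axb)
```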

13
Q

Two-Factor ANOVA

  • MS
  • df
  • F
A

df for each source:

  • A → a - 1
  • B → b - 1
  • AxB → (a - 1)(b - 1)
  • WG → ab(n - 1)

MS = SS/df for each source

F = MSeffect/MSWG

  • FA, FB, FAxB
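The df, MS, and F steps can be sketched with assumed SS values (SS A = 2, SS B = 0, SS AxB = 18, SS WG = 8, taken from a hypothetical 2 × 2 design with n = 2 per cell):

```python
# df, MS, and F for a 2x2 between-subjects ANOVA, using assumed example SS values.
a, b, n = 2, 2, 2
ss = {"A": 2.0, "B": 0.0, "AxB": 18.0, "WG": 8.0}

# df formulas from the card
df = {"A": a - 1, "B": b - 1, "AxB": (a - 1) * (b - 1), "WG": a * b * (n - 1)}

# MS = SS/df for every source; each effect's F divides by MS(WG)
ms = {key: ss[key] / df[key] for key in ss}
F = {key: ms[key] / ms["WG"] for key in ("A", "B", "AxB")}
print(df, F)
```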
14
Q

Correlation

A

describes the linear relationship between 2 ordinal/interval-level variables

15
Q

Correlation

  1. relationship must be?
  2. restricted range & reliability of measures limits…?
A

1) linear
2) the magnitude of the correlation coefficient

16
Q

Correlation

  • (3) assumptions
A

1) Normality
2) Linearity
3) Homoscedasticity

17
Q

Correlation ASSUMPTIONS

3) Homoscedasticity

A

assumes the variance around the regression line is the same for all X values

(equal spread)

18
Q

Regression

A

technique to find the line of best fit

19
Q

Correlation suggests we can….

A

predict Y values for given values of X

20
Q

If correlation is perfect, all points will..

If NOT?

A
  • If correlation is perfect, all points will fall on the regression line
  • If NOT, the regression line must be calculated
21
Q

Regression

Notation for:

  1. predicted values of Y
  2. residuals
  3. regression coefficient (slope)
  4. Y intercept
A

1) Y'
2) Y - Y'
3) b1
4) b0

22
Q

Regression

  • residuals
A

errors of underprediction & overprediction

23
Q

Sum of squared residuals is?

A

Minimal

(regression line = best-fitting line)

24
Q

Regression

  • formula for predicted values of Y (Y’)
A

Y' = b1X + b0

25
Q

Correlation & Regression

formulas:

  • b1
  • b0
A

b1 = r (Sy/Sx)

b0 = Ȳ - b1X̄
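Both formulas (b1 = r·Sy/Sx and b0 = Ȳ − b1·X̄) can be verified on made-up data; the cross-check against the direct least-squares slope is an extra step not on the card:

```python
# Slope and intercept from r and the standard deviations (made-up data).
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(x)
mx, my = sum(x) / n, sum(y) / n
sx = (sum((v - mx) ** 2 for v in x) / (n - 1)) ** 0.5
sy = (sum((v - my) ** 2 for v in y) / (n - 1)) ** 0.5
r = sum((a - mx) * (b - my) for a, b in zip(x, y)) / ((n - 1) * sx * sy)

b1 = r * (sy / sx)     # b1 = r (Sy/Sx)
b0 = my - b1 * mx      # b0 = Ybar - b1 * Xbar

# Cross-check against the direct least-squares slope:
b1_direct = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
print(b1, b0, b1_direct)
```

The two slope computations agree because r·(Sy/Sx) algebraically reduces to Sxy/Sxx.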
26
Q

Correlation & Regression

  • calculate? (9)
A

1) r
2) b1
3) b0
4) Y'
5) SSY' = Σ(Y' - Ȳ)²
6) SSY-Y' = Σ(Y - Y')²
7) MSpredicted
8) MSresidual
9) Fobtained
27
Q

Regression

  • predicted/explained variance
A

variability in Y predicted by X

SSY' = Σ(Y' - Ȳ)²
28
Q

Regression

  • residual/error variance
A

variability in Y NOT predicted by X

SSY-Y' = Σ(Y - Y')²
29
Q

Non-Parametric Tests

A

make no assumptions about the shape of the distribution
30
Q

Chi-Square Test for Goodness of Fit

  • uses?
  • determines?
A

uses sample data to test a hypothesis about the shape/proportions of a population distribution

  • determines how well sample proportions fit the population proportions specified by the null hypothesis
31
Q

Chi-Square Test for Goodness of Fit

  • null hypothesis
  • alternative hypothesis
A

Ho: differences in frequency are due to chance
H1: differences in frequency are NOT due to chance
32
Q

Chi-Square Test for Goodness of Fit

  • formula
  • df?
A

X² = Σ[(O - E)²/E]

df = c - 1

  • c = # of categories
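A worked sketch of the formula on made-up counts; equal expected proportions under Ho is an assumption made for this example, not something the card specifies:

```python
# Chi-square goodness of fit: X^2 = Sum((O - E)^2 / E), df = c - 1.
observed = [30, 10, 20]                          # made-up counts, c = 3 categories
n = sum(observed)
expected = [n / len(observed)] * len(observed)   # null: equal proportions (assumed)

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
df = len(observed) - 1
print(chi2, df)   # chi2 = 10.0, df = 2 for these counts
```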
33
Q

Chi-Square Test for 2 Independent Samples

A

used to test whether there is an association between 2 frequency variables
34
Q

Chi-Square Test for 2 Independent Samples

  • null & alternative hypotheses
A

Ho: treatment is independent of outcome
H1: treatment is NOT independent of outcome
35
Q

Chi-Square Test for 2 Independent Samples

  • formula
  • df?
A

X² = Σ[(O - E)²/E]

E = (R)(C)/N

df = (R - 1)(C - 1)
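A worked sketch on a made-up 2 × 2 frequency table, computing each expected count as E = (row total × column total)/N:

```python
# Chi-square test for 2 independent samples, made-up 2x2 frequency table.
table = [[10, 20],
         [30, 40]]
row_totals = [sum(row) for row in table]
col_totals = [sum(table[i][j] for i in range(len(table))) for j in range(len(table[0]))]
N = sum(row_totals)

chi2 = 0.0
for i, row in enumerate(table):
    for j, o in enumerate(row):
        e = row_totals[i] * col_totals[j] / N   # E = (R)(C)/N
        chi2 += (o - e) ** 2 / e

df = (len(table) - 1) * (len(table[0]) - 1)     # (R-1)(C-1)
print(chi2, df)
```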