Midterm material (anything unclear) Flashcards
Explain how to calculate [AB] in a 2 way ANOVA if this is your data set:
        B1      B2
A1      7; 9    9; 3
A2      5; 8    3; 2
[(7+9)^2+(9+3)^2+(5+8)^2+(3+2)^2]/2
Numerator = add up (each AB cell total)^2
Denominator = number of people per cell
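A minimal Python sketch of this bracket-term computation, using the cell totals from the data set above (the dictionary layout and variable names are just for illustration):

```python
# [AB] = sum of (each AB cell total)^2, divided by the number of people per cell.
cells = {
    ("A1", "B1"): [7, 9],
    ("A1", "B2"): [9, 3],
    ("A2", "B1"): [5, 8],
    ("A2", "B2"): [3, 2],
}
n_per_cell = 2

AB = sum(sum(scores) ** 2 for scores in cells.values()) / n_per_cell
print(AB)  # (16^2 + 12^2 + 13^2 + 5^2) / 2 = 594 / 2 = 297.0
```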
How would you calculate [X] in a 2 way ANOVA table if this is your data set: 8; 8; 3; 6; 10
(8)^2+(8)^2+(3)^2+(6)^2+(10)^2
Square every individual score, then add the squares up
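A matching sketch for [X], in the same illustrative style as above:

```python
# [X] = square every individual score, then sum the squares.
scores = [8, 8, 3, 6, 10]
X = sum(y ** 2 for y in scores)
print(X)  # 64 + 64 + 9 + 36 + 100 = 273
```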
What do eta squared and omega squared calculate? What's the difference between the two?
They both represent the proportion of variation in DV that’s explained by the IV
Eta squared is positively biased (it tends to overestimate the population effect size, especially in small samples), while omega squared corrects for that bias
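A short sketch of the standard one-way formulas behind these two effect sizes (the function and argument names are mine, not from the card):

```python
# Eta squared: proportion of total variability attributed to the effect.
def eta_squared(ss_effect, ss_total):
    return ss_effect / ss_total

# Omega squared: same idea, corrected for the positive bias of eta squared.
def omega_squared(ss_effect, df_effect, ms_error, ss_total):
    return (ss_effect - df_effect * ms_error) / (ss_total + ms_error)
```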
What is a small, medium and large effect for eta squared and omega squared?
Small = 0.01
Medium = 0.06
Large = 0.14
If the mean square for an effect were equal to the mean square error in a study, would that indicate a significant or nonsignificant effect?
If MS = MS_error, then F = MS / MS_error = 1, which would not indicate a significant effect
List some tests that can be used to assess normality
- Tests for skewness: SE_{skewness} ≈ \sqrt{\frac{6}{N}}; compute t_{skewness} = \frac{skewness}{SE_{skewness}} and reject H_0 if t_{skewness} is over 3.2
- Tests for normality: the Kolmogorov-Smirnov (K-S) test
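A rough SciPy illustration of both checks (the data array is made up; the 3.2 cutoff is the one from the card):

```python
import numpy as np
from scipy import stats

y = np.array([7, 9, 9, 3, 5, 8, 3, 2])  # made-up scores

# Skewness test: skewness / SE_skewness, with SE_skewness ~ sqrt(6/N)
t_skew = stats.skew(y) / np.sqrt(6 / len(y))  # reject normality if |t_skew| > 3.2

# Kolmogorov-Smirnov test against a normal with the sample mean and SD
ks_stat, ks_p = stats.kstest(y, "norm", args=(y.mean(), y.std(ddof=1)))
```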
What are the 5 assumptions for ANOVA
- Assumption of independence - the value of experimental error for one person is independent of the value of experimental error for other people in the sample; one person's score on the dependent variable is not going to influence another person's score on the dependent variable
- Identical distribution (within group) - we assume we don't know more about any one participant's score than we do about the others'; we assume E_{ij} (experimental error) stems from the same distribution for all participants in the sample
- Identical distribution (between groups)
- Homogeneity of variance - every population whose means we are comparing has the exact same variance
- Normal distribution - E_{ij} and the dependent variable come from a normal distribution
List some remedies that can be used if the homogeneity of variance assumption is not met
- Perform a data transformation: √Y (weak); log(Y) (mild); 1/Y (strong); or a Box-Cox transformation
- Use nonparametric tests: Kruskal-Wallis ANOVA
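A small SciPy sketch of these remedies (made-up data; only two groups shown for brevity):

```python
import numpy as np
from scipy import stats

g1 = np.array([7.0, 9.0, 9.0, 3.0])
g2 = np.array([5.0, 8.0, 3.0, 2.0])

# Data transformations (scores must be positive for log, 1/Y, and Box-Cox)
sqrt_y = np.sqrt(g1)               # weak
log_y = np.log(g1)                 # mild
recip_y = 1 / g1                   # strong
boxcox_y, lam = stats.boxcox(g1)   # Box-Cox: estimates the power (lambda) from the data

# Nonparametric alternative: Kruskal-Wallis ANOVA on the raw groups
h_stat, p_value = stats.kruskal(g1, g2)
```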
Explain each part of the linear model outlining outcome variable Y:
Y_{ij} = µ + α_j + E_{ij}
Each person's score (Y_{ij}) is the grand mean (µ), plus the deviation of their group's mean from the grand mean (α_j, the treatment effect), plus some experimental error (E_{ij})
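A toy simulation of this model (the grand mean, treatment effects, and error SD below are made up):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = 10.0                         # grand mean
alpha = {"A1": 2.0, "A2": -2.0}   # each group's deviation from the grand mean
n_per_group = 5

for group, a_j in alpha.items():
    e_ij = rng.normal(0, 1.5, n_per_group)   # experimental error E_ij
    y_ij = mu + a_j + e_ij                   # Y_ij = mu + alpha_j + E_ij
    print(group, np.round(y_ij, 2))
```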
What can violations of the independence assumption lead to? Does increasing sample size fix this?
Can lead to underestimation of the true variability, which in turn can lead to an increased rate of false positives (Type I errors)
Increasing sample size DOESN’T fix this
What happens if your MS_R is inflated?
It inflates your F ratio
What does violation of the normality assumption do
It tends to produce Type I error rates that are lower than the nominal value
Why can't you do a family of t-tests instead of a 2-way ANOVA?
Because running a whole family of t-tests greatly inflates the probability of making at least one Type I error when the null hypothesis is true (the familywise error rate)
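A worked example of how that probability piles up (assuming c independent tests, each at α = .05): α_{familywise} = 1 − (1 − α)^c, so 3 t-tests give 1 − (.95)^3 ≈ .14 and 6 t-tests give 1 − (.95)^6 ≈ .26.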
Define:
Between subjects design
Within subjects design
Between subjects - each participant is assigned to only 1 condition; confounding individual-difference effects should be cancelled out if random assignment is used
Within subjects - each participant takes part in more than one condition; vulnerable to practice effects and boredom effects - these are addressed through counterbalancing