Research Methods III Flashcards
(181 cards)
Covariance
- Reflects the degree to which 2 variables vary together.
- Relationship between two continuous variables on the original raw (unstandardized) scale.
- Scale dependent.
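A minimal sketch of scale dependence, using made-up illustrative scores (not from the cards): rescaling X rescales its covariance with Y.

```python
# Covariance is scale dependent: multiplying X by a constant
# multiplies cov(X, Y) by that same constant.
import numpy as np

x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])  # illustrative raw scores
y = np.array([1.0, 3.0, 2.0, 5.0, 4.0])

# np.cov returns the 2x2 covariance matrix; [0, 1] is cov(x, y)
print(np.cov(x, y)[0, 1])        # covariance in raw units
print(np.cov(x * 100, y)[0, 1])  # 100x larger: scale dependent
```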
Sum of Squares
Compute the deviations of the X (or Y) scores from their mean, square each, and sum them (Σ). Cannot be a negative value.
Sum of Products
Compute the product of the paired X and Y deviations and sum them (Σ). Can be a negative value.
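A minimal sketch of both cards, with the same illustrative scores: SS comes from squared deviations (never negative), SP from products of paired deviations (can be negative), and SP / (n - 1) recovers the sample covariance.

```python
import numpy as np

x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
dx, dy = x - x.mean(), y - y.mean()  # deviations from the mean

ss_x = np.sum(dx ** 2)  # Sum of Squares for X, never negative
ss_y = np.sum(dy ** 2)  # Sum of Squares for Y
sp = np.sum(dx * dy)    # Sum of Products; sign reflects direction

print(ss_x, ss_y, sp)
print(sp / (len(x) - 1))  # equals the sample covariance
```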
Correlation
Standardized (z-score) measure of linear relationship between 2 continuous variables.
- Standardized.
- Z-score.
- Scale invariant.
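A minimal sketch of scale invariance (illustrative data): because r is based on z-scores, a linear rescaling of X leaves it unchanged.

```python
import numpy as np
from scipy.stats import pearsonr

x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([1.0, 3.0, 2.0, 5.0, 4.0])

r1, _ = pearsonr(x, y)
r2, _ = pearsonr(x * 100 + 7, y)  # linear rescaling of X
print(r1, r2)                     # identical: scale invariant
```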
Fisher z-test
Tests whether two correlations from independent samples differ significantly; each r is converted to a Fisher z score before comparison.
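A minimal sketch of the test, assuming the standard formula (each r is transformed with arctanh and the difference is divided by its standard error); the r and n values are illustrative.

```python
import numpy as np
from scipy.stats import norm

def fisher_z_test(r1, n1, r2, n2):
    z1, z2 = np.arctanh(r1), np.arctanh(r2)    # Fisher r-to-z transform
    se = np.sqrt(1 / (n1 - 3) + 1 / (n2 - 3))  # SE of the difference
    z = (z1 - z2) / se
    p = 2 * norm.sf(abs(z))                    # two-tailed p value
    return z, p

print(fisher_z_test(r1=0.45, n1=85, r2=0.20, n2=90))
```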
Effect size of r
- Pearson correlation
- Provides a measure of effect size due to being based on standardized scores.
+/- .1 = small effect, +/- .3 = medium effect, +/- .5 = large effect
Regression
The statistical technique for producing the best-fitting straight line to predict Y.
Regression equation
Yi = b0 + b1Xi + ei
Yi
Dependent or outcome variable, criterion variable
Xi
Independent variable, predictor variable
b1
Regression coefficient for the predictor.
Gradient (slope).
b0
Y-intercept.
The value of Y when X = 0.
ei
The errors (residuals) in prediction based on the regression.
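A minimal sketch tying the terms above to a fitted model (statsmodels OLS on illustrative data): the fit returns b0, b1, and the residuals ei.

```python
import numpy as np
import statsmodels.api as sm

x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])  # Xi, the predictor
y = np.array([1.0, 3.0, 2.0, 5.0, 4.0])   # Yi, the outcome

X = sm.add_constant(x)      # adds the column of 1s for b0
model = sm.OLS(y, X).fit()

b0, b1 = model.params       # y-intercept and slope
print(b0, b1)
print(model.resid)          # the errors in prediction, ei
```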
Assumptions of regression:
Linearity
Based on linear correlations; assumes a linear bivariate relationship between each X and Y, and also between Y and predicted Y.
Assumptions of regression:
Normality
Residuals should be normally distributed, in both their univariate and multivariate distributions.
Y scores are independent and normally distributed (test with Shapiro-Wilk).
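A minimal sketch of the Shapiro-Wilk check (the residuals here are simulated stand-ins; in practice, test the residuals from your fitted model).

```python
import numpy as np
from scipy.stats import shapiro

rng = np.random.default_rng(0)
residuals = rng.normal(0, 1, size=50)  # stand-in for model residuals

W, p = shapiro(residuals)
print(W, p)  # p > .05 -> no evidence against normality
```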
Assumptions of regression:
Independence of scores
Independence of Y (outcome: DV) scores.
Assumptions of regression:
Independence of errors
Errors (residuals) from observations should not be correlated with each other (Durbin-Watson test)
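A minimal sketch of the Durbin-Watson check (simulated stand-in residuals): values near 2 suggest uncorrelated errors; values near 0 or 4 suggest positive or negative autocorrelation.

```python
import numpy as np
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(0)
residuals = rng.normal(0, 1, size=50)  # stand-in for model residuals

print(durbin_watson(residuals))  # ~2 when errors are uncorrelated
```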
Assumptions of regression: Minimal multicollinearity
Predictors (IVs) should not be highly correlated with each other.
No higher than r = .80 for predictors.
Want Variance Inflation Factor (VIF) to be less than 10.
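A minimal sketch of the VIF check against the cutoff on this card (illustrative predictors, one built to correlate with the other).

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = 0.5 * x1 + rng.normal(size=100)  # moderately correlated with x1

X = sm.add_constant(np.column_stack([x1, x2]))
for i in (1, 2):  # skip column 0, the constant
    print(f"VIF for predictor {i}:", variance_inflation_factor(X, i))
```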
Assumptions of regression:
Homoscedasticity
Variance of the residuals is uniform across all values of Y (test with Levene's test). Can be assumed when the sample size is large; cannot be assumed when the sample size is small.
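A minimal sketch of a Levene-style check, under one common approach (not necessarily the course's): split the residuals into low and high predicted-value groups and compare their variances.

```python
import numpy as np
from scipy.stats import levene

rng = np.random.default_rng(0)
predicted = rng.normal(5, 1, size=100)  # stand-in for fitted values
residuals = rng.normal(0, 1, size=100)  # stand-in for model residuals

low = residuals[predicted < np.median(predicted)]
high = residuals[predicted >= np.median(predicted)]
stat, p = levene(low, high)
print(stat, p)  # p > .05 -> no evidence against equal variances
```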
Ordinary Least Squares regression (OLS)
- Yields values for b-weights (regression coefficients) and the y-intercept that will result in the sum of the squared residuals being at the minimum (smallest).
Best fitting line = smallest total error.
Resulting regression line = least-square error solution.
b-weights + y-intercept chosen so that SS residual is at its minimum.
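A minimal sketch of the least-squares property (illustrative data): the OLS slope and intercept give a smaller SS residual than any perturbed line.

```python
import numpy as np

x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([1.0, 3.0, 2.0, 5.0, 4.0])

b1, b0 = np.polyfit(x, y, 1)  # OLS slope and y-intercept

def ss_resid(b0, b1):
    return np.sum((y - (b0 + b1 * x)) ** 2)

print(ss_resid(b0, b1))        # minimum SS residual
print(ss_resid(b0, b1 + 0.2))  # any other slope does worse
```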
Partially standardized regression coefficient
Regression coefficient predicting Y from X.
Standardized on X only, not on Y.
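A minimal sketch (illustrative data): z-scoring X only and refitting gives the partially standardized coefficient, which in simple regression equals r * SD(Y).

```python
import numpy as np
from scipy.stats import pearsonr, zscore

x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([1.0, 3.0, 2.0, 5.0, 4.0])

b_partial, _ = np.polyfit(zscore(x), y, 1)  # slope of Y on z-scored X
r, _ = pearsonr(x, y)
print(b_partial, r * np.std(y))             # the two values match
```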
The Regression Model & Sum of Squares
Squaring each of the deviations and summing across observations yields an SS for each source of variability in Y: SS Total = SS Regression + SS Residual.
ANOVA to test the Regression Model:
Regression
Variability in Y that can be explained by the predictor(s); represents the component of Y that is shared with X1.
ANOVA to test the Regression Model:
Residual
Variability in Y that cannot be explained by the predictor – simply what is ‘left over’ after accounting for X.
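A minimal sketch of the full partition (illustrative data): SS Regression and SS Residual sum to SS Total, and their mean squares form the F test of the model.

```python
import numpy as np
from scipy.stats import f as f_dist

x = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
y = np.array([1.0, 3.0, 2.0, 5.0, 4.0])

b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

ss_total = np.sum((y - y.mean()) ** 2)
ss_reg = np.sum((y_hat - y.mean()) ** 2)  # explained by the predictor
ss_res = np.sum((y - y_hat) ** 2)         # left over after X

df_reg, df_res = 1, len(y) - 2
F = (ss_reg / df_reg) / (ss_res / df_res)
p = f_dist.sf(F, df_reg, df_res)
print(ss_reg + ss_res, ss_total)  # the components sum to SS Total
print(F, p)
```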