Statistics: Inferential Stats Concepts and Terms Flashcards

(47 cards)

1
Q

Inferential Statistics: overview

A

Descriptive stats = summarize data

Inferential Stats = make inferences about a population based on sample drawn from a population

2
Q

Central Limit Theorem

A

The sampling distribution of the mean approaches a normal curve as sample size increases

The mean of the sampling distribution = the population mean

The SD of the sampling distribution = the Standard Error of the Mean

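A minimal simulation sketch of the Central Limit Theorem (assuming NumPy is installed; the skewed exponential population and the sample sizes are invented for illustration), showing that the mean of the sample means tracks the population mean and their SD tracks σ/√N:

```python
import numpy as np

rng = np.random.default_rng(0)
# Deliberately non-normal (skewed) population with mean 2.0
population = rng.exponential(scale=2.0, size=100_000)

for n in (5, 30, 200):
    # Draw 10,000 samples of size n and keep each sample's mean
    samples = population[rng.integers(0, population.size, size=(10_000, n))]
    sample_means = samples.mean(axis=1)
    print(f"n={n:3d}  mean of sample means={sample_means.mean():.3f}  "
          f"SD of sample means={sample_means.std(ddof=1):.3f}  "
          f"sigma/sqrt(n)={population.std() / np.sqrt(n):.3f}")
```
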
3
Q

Type I Error (α)

A

Rejection of a true null hypothesis

Research erroneously shows significant effects

4
Q

Type II Error (β)

A

Retain a false null hypothesis

Research misses actual significant effects

5
Q

Power (1-β)

A

Likelihood of rejecting a false null hypothesis

6
Q

Parametric v Nonparametric Tests:

Measurement Scales

A

Parametric Tests: Interval or Ratio Scales

Non-Parametric Tests: Nominal or Ordinal Scales

7
Q

Parametric v Nonparametric Tests:

Commonalities and Differences

A

Both assume random selection and independent observations

Parametric tests (e.g. t-test, ANOVA) evaluate hypotheses about population means, variances, or other parameters.

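To make the pairing concrete, a small sketch (assuming SciPy; the scores are invented) that runs a parametric t-test on means alongside its rank-based nonparametric counterpart, the Mann-Whitney U test:

```python
from scipy import stats

group_a = [12, 15, 14, 10, 13, 16]  # hypothetical interval-scale scores
group_b = [9, 11, 8, 12, 10, 9]

t_stat, t_p = stats.ttest_ind(group_a, group_b)      # parametric: tests a hypothesis about means
u_stat, u_p = stats.mannwhitneyu(group_a, group_b)   # nonparametric: based on ranks
print(f"t = {t_stat:.2f}, p = {t_p:.3f}")
print(f"U = {u_stat:.1f}, p = {u_p:.3f}")
```
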
8
Q

Parametric Tests:

Assumptions

A

Normal Distribution

Homoscedasticity

9
Q

Homoscedasticity

A

Assumption that the variances of the populations the groups represent are relatively equal

[For studies with more than one group]

10
Q

One-way ANOVA vs Factorial ANOVA vs MANOVA

A

One-way ANOVA: ONE IV, ONE DV

Factorial ANOVA: two-way = 2 IVs, three-way = 3 IVs

MANOVA: used whenever there is more than one DV
(MULTIvariate analysis)

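A minimal one-way ANOVA sketch (one IV with three levels, one DV), assuming SciPy and made-up scores:

```python
from scipy import stats

# One IV (condition, three levels), one DV (score)
treatment_1 = [4, 5, 6, 5, 4]
treatment_2 = [7, 8, 6, 7, 9]
control = [3, 4, 3, 5, 4]

f_stat, p_value = stats.f_oneway(treatment_1, treatment_2, control)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```
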
11
Q

Effect Size:

What is it?

Name two types

A

Measure of the practical or clinical significance of statistically significant results

Cohen’s d

Eta squared (η²)

12
Q

Cohen’s d

A

Effect size in terms of SD (d = 1.0 = 1SD change)

Small effect size = 0.2
Medium effect size = 0.5
Large effect size = 0.8

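A sketch of Cohen's d using one common variant, the pooled-SD formula d = (M1 − M2) / SD_pooled; the helper function and the scores below are invented for illustration:

```python
import numpy as np

def cohens_d(x, y):
    """Cohen's d for two independent groups, using a pooled standard deviation."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    nx, ny = len(x), len(y)
    pooled_var = ((nx - 1) * x.var(ddof=1) + (ny - 1) * y.var(ddof=1)) / (nx + ny - 2)
    return (x.mean() - y.mean()) / np.sqrt(pooled_var)

print(cohens_d([6, 7, 8, 9, 10], [5, 6, 7, 8, 9]))  # ~0.63, between a medium and large effect
```
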
13
Q

Eta squared (η²)

A

Effect size in terms of variance accounted for by treatment

*Variance = σ², so think squared greek letter = variance

14
Q

Bivariate correlation assumptions

A

Linearity

Unrestricted range of scores on both variables

Homoscedasticity

15
Q

Bivariate correlation “language” (X, Y)

A

X = predictor variable

Y = criterion variable

16
Q

Simple Regression Analysis

A

Allows predictions to be made with:

One predictor (X) 
One criterion (Y)
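
A minimal simple-regression sketch, one predictor and one criterion, assuming SciPy's linregress and invented data:

```python
from scipy import stats

x = [1, 2, 3, 4, 5, 6]                 # predictor (X)
y = [2.1, 3.9, 6.2, 8.1, 9.8, 12.2]    # criterion (Y)

result = stats.linregress(x, y)
print(f"Y' = {result.intercept:.2f} + {result.slope:.2f} * X   (r = {result.rvalue:.3f})")

# Use the fitted line to predict Y for a new X value
new_x = 7
print("predicted Y:", result.intercept + result.slope * new_x)
```
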
17
Q

F ratio calculation

A

MSB/MSW

Mean square between divided by mean square within

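A by-hand sketch of the F ratio for three invented groups (NumPy only); it also computes eta squared (SS between / SS total) from the same sums of squares, tying back to the effect-size card above:

```python
import numpy as np

groups = [np.array([4.0, 5, 6, 5, 4]),
          np.array([7.0, 8, 6, 7, 9]),
          np.array([3.0, 4, 3, 5, 4])]

all_scores = np.concatenate(groups)
grand_mean = all_scores.mean()

ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)

msb = ss_between / (len(groups) - 1)                # mean square between
msw = ss_within / (len(all_scores) - len(groups))   # mean square within

print("F =", msb / msw)
print("eta squared =", ss_between / (ss_between + ss_within))  # proportion of variance explained
```
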
18
Q

F ratio range

A

F is always positive; it cannot be negative because it is a ratio of two variances

A larger F ratio = increased likelihood of statistical significance

19
Q

Statistical Power definition

A

Degree to which a statistical test is likely to reject a false null hypothesis (1-β)

Reject false null = show statistical significance

20
Q

Ways to Increase Statistical Power

A

Increase alpha from .01 to .05

Increase sample size

Increase the effects of the IV

Minimize error

Use one-tailed test when appropriate

Use parametric test
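
A hedged sketch of how those factors move power for an independent-samples t-test, assuming the statsmodels package; the effect size (d = 0.5) and group size (n = 30) are invented illustrative values:

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
d, n = 0.5, 30  # assumed medium effect and per-group sample size

print("alpha=.01, two-tailed :", analysis.solve_power(effect_size=d, nobs1=n, alpha=0.01))
print("alpha=.05, two-tailed :", analysis.solve_power(effect_size=d, nobs1=n, alpha=0.05))
print("alpha=.05, n doubled  :", analysis.solve_power(effect_size=d, nobs1=2 * n, alpha=0.05))
print("alpha=.05, one-tailed :", analysis.solve_power(effect_size=d, nobs1=n, alpha=0.05,
                                                      alternative='larger'))
```
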

21
Q

Effects of increasing alpha from .01 to .05

A

Greater likelihood of rejecting null hypothesis

*Greater likelihood of a Type I error

22
Q

Effects of decreasing alpha from .05 to .01

A

Decreased statistical power

However, increased confidence that statistically significant results are correct

23
Q

Nonparametric tests and data distribution

A

Nonparametric tests evaluate hypotheses only about the shape of a distribution

NOT the distribution's mean, variance, or other parameters

24
Q

Two factors that determine critical value for statistical significance

A

alpha (e.g. .05)

degrees of freedom

25
Q

Regression Analysis: assumptions

A

Linear relationship between X and Y

Regression line = "line of best fit"

26
Q

Regression Analysis: coefficient range

A

-1.0 to +1.0

It's a correlational technique

27
Q

Multiple Regression

A

Two or more continuous or discrete predictors (X)

One criterion (Y)

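A minimal multiple-regression sketch with two predictors and one criterion, using only NumPy's least-squares solver (the data are invented):

```python
import numpy as np

# Two predictors (X1, X2) and one criterion (Y); values are made up
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = np.array([4.0, 4.5, 9.0, 9.5, 12.0])

# Add an intercept column and solve for the regression weights
X_design = np.column_stack([np.ones(len(X)), X])
coefs, *_ = np.linalg.lstsq(X_design, y, rcond=None)
print("intercept, b1, b2:", coefs)
```
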
28
Q

Multicollinearity

A

High correlation between two or more predictors

Makes regression coefficients difficult to interpret: if the predictors are correlated, how do you know which X accounts for the change in Y?

29
Q

Forward Stepwise Regression

A

One predictor is added in each subsequent analysis

30
Q

Backward Stepwise Regression

A

Analysis begins with all predictors

One predictor is eliminated in each subsequent analysis

31
Q

When to use Multiple Regression instead of ANOVA

A

When groups are unequal in size

When IVs are measured on a continuous scale

32
Q

Multiple Regression: factors that cause the most Shrinkage

A

Small original sample

Large number of predictors

*Shrinkage shows up when the regression equation is cross-validated on a new sample

33
Q

Structural Equation Modeling (SEM)

A

A family of multivariate techniques

Evaluates the causal (predicted) influences of multiple latent factors

aka "causal modeling"

34
Q

Structural Equation Modeling: 2 techniques

A

Path Analysis

LISREL

35
Q

SEM: Path Analysis

A

Causal relationships among variables are represented in a path diagram

Coefficients indicate the direction and strength of the relationship between pairs of variables

Only recursive (one-way) paths

36
Q

SEM: LISREL (Linear Structural Relations Analysis)

A

LISREL includes:

Recursive (one-way) paths

Nonrecursive (two-way) paths

Latent traits

Measurement error

37
Q

Multivariate Techniques for Data Reduction

A

Factor Analysis

Cluster Analysis

38
Q

Multivariate Data Reduction: Factor Analysis

A

Reduces a larger number of variables to a smaller number of factors

Factors explain the inter-correlations between variables

*e.g. used to develop subscales for tests

39
Q

Multivariate Data Reduction: Cluster Analysis

A

Used to identify, define, or confirm the nature and number of subgroups (clusters)

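A small sketch of one clustering method (k-means via scikit-learn, assumed installed); the two-variable scores are invented and contain two obvious subgroups:

```python
import numpy as np
from sklearn.cluster import KMeans

scores = np.array([[1, 2], [1, 1], [2, 2],
                   [8, 9], [9, 8], [8, 8]], dtype=float)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(labels)  # cluster membership for each case (two groups of three)
```
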
40
Q

ANCOVA: main use

A

Removes variability due to an extraneous variable

41
Q

Interval Recording/Event Sampling

A

Measuring the presence or absence of a behavior during discrete intervals of a set period of time, or during an event

42
Q

Trend Analysis

A

An analysis of variance with a quantitative IV

Assesses linear and nonlinear (e.g. quadratic) trends

43
Q

How to compensate for violation of homogeneity of variances

A

Decrease alpha

Use equal-sized groups

44
Q

Best way to increase External Validity

A

Randomly select participants from the target population

45
Q

Survival Analysis

A

Used to evaluate the length of time to a critical event

e.g. relapse, promotion

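One standard survival-analysis tool is the Kaplan-Meier estimator; below is a hand-rolled sketch with NumPy and invented times-to-relapse (not any particular library's API), where censored cases are those for whom the event was never observed:

```python
import numpy as np

times = np.array([2, 3, 3, 5, 8, 8, 12, 15])   # months of follow-up (invented)
observed = np.array([1, 1, 0, 1, 1, 1, 0, 1])  # 1 = relapse occurred, 0 = censored

survival = 1.0
for t in np.unique(times[observed == 1]):
    at_risk = np.sum(times >= t)               # cases still being followed at time t
    events = np.sum((times == t) & (observed == 1))
    survival *= 1 - events / at_risk           # Kaplan-Meier product-limit step
    print(f"t = {t:>2}: estimated S(t) = {survival:.3f}")
```
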
46
Q

Multiple Regression Analysis: weighting of predictors

A

A predictor is weighted in:

Direct proportion to its correlation with the criterion

Inverse proportion to its correlation with the other predictors

47
Q

Standard Error of the Mean calculation

A

SEM = σ/√N
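
A worked example with invented numbers: if the population SD is σ = 15 and the sample size is N = 100, then SEM = 15/√100 = 15/10 = 1.5.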