Intro to Research Design Flashcards
Primary vs Secondary Source
Primary source - original document or report written directly by the author (e.g., a journal article reporting a study)
Secondary source - summary or interpretation of primary sources (reviews, textbooks, etc.)
Independent variable
Condition manipulated by the researcher that is expected to change a given outcome
Dependent variable
Outcome variable dependent on the independent variable
Operational definitions
Definition of a variable according to its unique meaning within the study (normally the dependent variable)
Ex) Pain intensity, variability in stride velocity
Research objectives
- to evaluate a measuring instrument
- to describe a population or clinical phenomenon
- to explore relationships
- to compare between groups or conditions
Research hypothesis
- statement that predicts relationship between IV and DV
- statement of researcher’s expectations
Null hypothesis
- Statistical hypothesis
- States that “no difference” exists between variables
Sampling bias
When the sample over- or under-estimates the characteristics being investigated
How to reduce sampling bias
- Random Sampling
- Well defined inclusion/exclusion criteria
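Random sampling can be sketched in a few lines. This is a minimal illustration, not part of the original cards; the subject IDs and sample size are hypothetical:

```python
import random

# Hypothetical sampling frame of subject IDs (names are illustrative only)
population = [f"subject_{i}" for i in range(1, 101)]

random.seed(42)  # fixed seed so the draw is reproducible
sample = random.sample(population, k=20)  # simple random sample, no replacement

# Every subject has an equal chance of selection, which reduces sampling bias
print(len(sample))       # 20 subjects drawn
print(len(set(sample)))  # 20 unique subjects (sampling without replacement)
```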
Three characteristics of experiments
- Manipulation of variables
- Random assignment
- Control group
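Random assignment (distinct from random sampling: it allocates already-recruited subjects to groups) can be sketched as follows; the subject list and group labels are assumptions for illustration:

```python
import random

# Hypothetical pool of recruited subjects
subjects = [f"subject_{i}" for i in range(1, 21)]

random.seed(7)
random.shuffle(subjects)  # randomize order so allocation is unbiased

# Split the shuffled list into intervention and control groups
half = len(subjects) // 2
intervention_group = subjects[:half]
control_group = subjects[half:]

print(len(intervention_group), len(control_group))  # 10 10
```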
Intention to treat analysis
Statistical analysis in which subjects are analyzed in the groups to which they were originally assigned, regardless of dropout or noncompliance. It minimizes bias associated with missing data.
How to exert control without it being a true experiment
- Homogeneous subjects
- Blocking
- Matching
- Repeated measures
Homogeneous Subjects
Subjects are similar on a potentially confounding variable
Disadvantage - results can only be generalized to that specific type of subject
Blocking
Build extraneous variables into the study design by using them as an IV
Ex) Age blocked by decade
Matching
Match subjects based on specific characteristics
Ex) For every female gymnast, I will recruit one male gymnast
*Matching is during selection, blocking is after*
Repeated Measures
- All levels of the IV are experienced by all subjects
- Subjects are matched to themselves
- Efficient method for controlling potential inter-subject differences
Threats to validity hierarchy
- Statistical conclusion validity
- Internal validity
- Construct validity
- External validity
Statistical Conclusion Validity
If the statistical analysis is flawed, the study's conclusions are considered invalid
Internal Validity
Are changes in the DV due to factors other than manipulation of the IV?
- History - outside events occurring between tests
- Maturation - subject changes between tests
- Attrition - subject drop out
- Testing effects - the act of testing may change person’s response
- Order effects
- Instrumentation
Construct Validity
How well the operational definition (measurable) represents the underlying construct (not directly measurable).
Asks: how well is the variable “operationally defined”?
External Validity
Can results be generalized to people/setting/times different from those used in the study?
Single Factor Experimental Design
Pretest-posttest control group
- Random
- Observation - Intervention - Observation
- Observation - control - Observation
Posttest only
- Random
- Intervention - Observation
- Used when it is not practical to give a pretest (e.g., educational interventions)
Multi-factor Experimental Design
Factorial
- Two or more IVs
- AxB factorial design – 2 IVs, “A” levels in IV1, “B” levels in IV2
- Ex) “A” walking speeds, “B” environments
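Crossing the two factors from the example above yields every condition in an A×B design. A minimal sketch, assuming hypothetical level names (3 walking speeds × 2 environments):

```python
from itertools import product

# Hypothetical 3x2 factorial design (level names are illustrative assumptions)
walking_speeds = ["slow", "preferred", "fast"]  # IV1: "A" = 3 levels
environments = ["treadmill", "overground"]      # IV2: "B" = 2 levels

# Crossing the factors enumerates all A x B experimental conditions
conditions = list(product(walking_speeds, environments))

print(len(conditions))  # 6 conditions in a 3x2 design
```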
Repeated Measures
1-way
- Time as IV = Observation - Intervention - Observation - Observation - Observation…
- Treatment as IV = Intervention - Observation; Intervention2 - Observation; Intervention3 - Observation…
Mixed
- Two or more IVs
- One variable is repeated across subjects
- One variable has independent groups