Research Design Flashcards

(40 cards)

1
Q

Research

A
  • Investigation through scientific method to establish facts
  • Based on a hypothesis & intended to be generalisable
  • Controlled internally, relies on validity
2
Q

Evaluation

A
  • Use of a framework to determine value of a program/process
  • Intent to improve and make recommendations
  • Controlled externally, relies on feasibility to determine value
3
Q

Scientific method

A

Systematic approach in research to identify problems, collect and analyse data, and develop theory

4
Q

Evidence based practice

A

Integration of best available evidence into practice to improve patient care, build credibility and accountability

5
Q

Research paradigm

A
  • Philosophical model/framework to guide research questions, methods, data collection and analysis
6
Q

Components of research paradigms

A
  • Ontology: study of existence, provides world view to guide study
  • Epistemology: study of knowledge, provides focus
  • Methodology: framework for conducting study
7
Q

3 Philosophical paradigms

A

Positivism
Interpretivism/constructivist
Critical approach

8
Q

Positivism

A
  • Explain truth through scientific method to assess for causal relationships (quantitative)
  • Deductive: theory → conclusion
  • Reductionism/determinism: outcomes do not occur by chance
  • Examples: descriptive (cross sectional etc), RCT
  • Strengths: clear, quick analysis, generalisable, high rigour
  • Limitations: high cost, researcher bias, limited probing
9
Q

Interpretivism/constructivist

A
  • Descriptive, explores meaning
  • Inductive: observation → concepts/meaning
  • Subjective: researcher interpretation, value in dialogue and social constructs
  • Examples: phenomenology, descriptive, ethnography, grounded theory
  • Strengths: low cost, captures complex phenomena, allows member checking
  • Limitations: researcher bias, lack of generalisability, biased subjects, lack of research clarity
10
Q

Critical approach

A
  • Focus on society to critique and challenge power dynamics
  • Goal is to encourage equality, change social structures
  • Examples: emancipatory research (benefits the disadvantaged), action research, feminist research
11
Q

Quantitative design

A
  • Positivist
  • Control: use of comparison group to eliminate extraneous variables and threats to IV such as history, maturation and selection
  • Randomisation: create similar groups to ensure changes are due to intervention
  • Manipulation & blinding

= Quantitative designs have at least 1 of these; best to have all 3 (see the randomisation sketch below)
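A minimal sketch, assuming Python and hypothetical participant IDs, of the simple randomisation described above: random allocation into two groups so that the groups are comparable. The fixed seed and group names are illustrative assumptions, not part of any specific design.

```python
import random

def randomise(participants, seed=42):
    """Simple (unstratified) randomisation into two comparable groups."""
    rng = random.Random(seed)      # fixed seed only so the example is reproducible
    shuffled = list(participants)  # copy so the original list is left untouched
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return {"intervention": shuffled[:midpoint], "control": shuffled[midpoint:]}

# Hypothetical participant IDs, for illustration only
groups = randomise([f"P{i:02d}" for i in range(1, 21)])
print(groups["intervention"])
print(groups["control"])
```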

12
Q

Types of quantitative designs

A

Experimental and non-experimental

13
Q

Experimental

A
  • Manipulation of the IV to observe the effect on the DV
  • Limits confounding factors and establishes causality, BUT requires extensive review & preparation and is costly
14
Q

Types of experimental designs

A
  • RCT: causality through control, randomisation & manipulation (high IV)
  • Quasi-experimental: manipulation but lacks either/both control & randomisation (weak causality)
15
Q

Non-experimental/observational

A
  • No IV manipulation; does not establish causality, only explores relationships between variables
  • Low level of evidence/internal validity, high risk of bias
16
Q

Types of non-experimental designs

A
  • Observational: explores relationships between variables when little is known
  • Descriptive: measures variables of interest
  • Cross sectional: frequency and characteristics of x in a population at a point in time
  • Cohort studies: disease-free population studied over time, with exposed and unexposed groups compared (prospective - defines sample & measures beforehand - or retrospective); see the worked sketch after this card
  • Case-control: retrospective look back for explanatory factors to link exposure to outcome (compare cases & controls)
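A minimal worked sketch, assuming Python and made-up counts, of the exposed-vs-unexposed comparison a prospective cohort study makes. The incidence figures and the risk-ratio summary are illustrative assumptions, not data from any study.

```python
# Hypothetical cohort counts (illustration only): follow a disease-free
# population forward in time and count who develops the outcome in each group.
exposed_total, exposed_cases = 500, 60
unexposed_total, unexposed_cases = 500, 20

incidence_exposed = exposed_cases / exposed_total        # 0.12
incidence_unexposed = unexposed_cases / unexposed_total  # 0.04

# Comparing the groups: the ratio of the two incidences (commonly reported as a
# risk ratio) summarises how much more often the outcome occurs with exposure.
risk_ratio = incidence_exposed / incidence_unexposed     # 3.0
print(f"Incidence (exposed):   {incidence_exposed:.2f}")
print(f"Incidence (unexposed): {incidence_unexposed:.2f}")
print(f"Risk ratio:            {risk_ratio:.1f}")
```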
17
Q

Types of qualitative designs

A

Descriptive
Phenomenology
Ethnography
Grounded theory
Action research
Systematic reviews

18
Q

Descriptive (QL)

A
  • Summary of events/experience with no theory/methodology
  • Small data set, thematic analysis
19
Q

Phenomenology

A
  • explores thoughts, feelings & behaviours to understand meaning
  • No causal inferences
20
Q

Ethnography

A
  • Study of culture from the perspective of its members (emic, insider view), interpreted through the researcher's outsider view (etic); conducted in the field
  • Tradition: single unfamiliar setting over time
  • Focused: pre-identified topic with subcultural groups
  • Auto: study of own culture
21
Q

Grounded theory

A
  • Theory about a phenomenon derived inductively from ("grounded in") the collected data
22
Q

Action research

A
  • Research at the same time as action (change and improvement)
23
Q

Systematic reviews

A
  • Critical assessment and evaluation of research studies about a particular topic
24
Q

Independent vs dependent variable

A
  • I: factor influencing the outcome
  • D: result or outcome being studied
25
Q

Internal validity

A

Accuracy: whether the outcome can be attributed to the cause (strength of the causal relationship)

26
Q

External validity

A

Extent to which the findings can be generalised beyond the sample

27
Q

Construct validity

A
  • Association between the data and the theoretical trait (construct) it is intended to measure
28
Q

Face validity

A
  • Whether the instrument appears to measure what it is supposed to (a subjective measure)
29
Q

Content validity

A
  • Whether the measured content covers the full domain (subjective, but relies on experts)
30
Q

Sources of error in internal validity

A
  • History: events during the study that affect the DV
  • Maturation: changes within a person that affect the DV
  • Selection: poor selection that results in a non-representative sample
  • Instrumentation: measurement or observation inconsistencies
  • Testing effect: repeated testing itself changes responses
  • Mortality: people dropping out
  • Participant reaction bias
  • Experimenter bias
  • Confounding variables
31
Q

Reliability

A
  • How consistent and reproducible the results are (their trustworthiness)
32
Q

Measures of reliability

A
  • Test-retest method: assess the correlation between two uses of the same instrument on the same sample
  • Internal consistency: use of correlation coefficients to assess agreement between different items intended to measure the same thing
  • Alternative forms: 2 similar forms of a test given to the same population to eliminate practice effects
(Worked sketch of the first two below.)
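A minimal sketch, assuming Python/NumPy and made-up scores, of the two correlation-based checks on this card: test-retest reliability as the correlation between two administrations of the same instrument, and internal consistency via Cronbach's alpha (a common coefficient for this purpose; the card itself does not name it).

```python
import numpy as np

# Hypothetical scores, for illustration only
test_1 = np.array([12, 15, 9, 20, 17, 11, 14, 18])   # first administration
test_2 = np.array([13, 14, 10, 19, 18, 10, 15, 17])  # same instrument, same sample, later

# Test-retest reliability: correlation between the two administrations
test_retest_r = np.corrcoef(test_1, test_2)[0, 1]

# Internal consistency: Cronbach's alpha over item-level responses
# (rows = respondents, columns = items intended to measure the same construct)
items = np.array([
    [3, 4, 3, 5],
    [2, 2, 3, 2],
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 4, 5, 5],
])
k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)
total_variance = items.sum(axis=1).var(ddof=1)
cronbach_alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

print(f"Test-retest r:    {test_retest_r:.2f}")
print(f"Cronbach's alpha: {cronbach_alpha:.2f}")
```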
33
Q

Sources of reliability error

A
  • True experimental variability: real differences
  • Random fluctuations: mood, noise, fatigue
  • Systematic error: confounding variables (subjective influence)
  • Inter-observer error
34
Q

Factors influencing reliability

A
  • Length of test
  • Objectivity of the assessment
  • Method of estimating reliability
35
Q

Rigour: 4 components

A

Reliability, validity and trustworthiness of research
  • Credibility/IV
  • Transferability/EV
  • Dependability/reliability
  • Confirmability/objectivity
36
Q

How to ensure credibility/IV

A
  • Selection
  • Triangulation
  • Extensive data collection
  • Member checking, reflection
37
Q

How to ensure transferability/EV

A
  • Appropriate sampling & description of sample and settings
  • Strong IV
38
Q

How to ensure dependability/reliability

A
  • Audit trail
  • External audit
  • Instrument consistency assessment
39
Q

How to ensure confirmability/objectivity

A

Strategies to limit bias:
  • Audit/peer review
  • Triangulation
  • Member checking & reflection
40
Q

NHMRC levels of evidence

A
  • Level 1: systematic review (SR) of RCTs
  • Level 2: RCT
  • Level 3: cohort study, case-control
  • Level 4: cross-sectional, pre-post test