Week 12: Evaluating Research Reports Flashcards

1
Q

Evidence-Based Practice (EBP): health care providers

A
  • incorporate research findings into clinical judgement and treatment decisions
  • evaluate research reports
  • determine if findings provide sufficient evidence to support clinical practices
2
Q

EBP: critical appraisal

A
  • determine scientific merit of research report
  • applicability to clinical decision making
3
Q

When evaluating any research report, what do you examine in order to determine its value?

A
  1. validity of design and analysis
  2. meaningfulness of findings
    - relevance of results to practice
    - it is not enough to answer yes or no; you need the rationale or implications of the answer to evaluate the study’s value
4
Q

Research validity: types (explanatory designs)

A
  • statistical conclusion validity
  • internal validity
  • construct validity
  • external validity
5
Q

Statistical conclusion validity

A
  • is there a relationship between the independent and dependent variables
  • appropriate use of statistical procedures for analyzing data
6
Q

internal validity

A

is there evidence of a causal relationship between IV and DV

7
Q

construct validity

A

to what constructs can results be generalized

8
Q

external validity

A

can the results be generalized to other persons, settings or times

9
Q

Threats to statistical conclusion validity: statistical power

A
  • statistical power: the ability to detect real relationships between the IV and DV (low power means a real relationship may go undetected); power is strongly affected by sample size (see the sketch below)
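A minimal Python sketch of how sample size drives statistical power, assuming a simple two-group comparison; the effect size, alpha, power, and sample-size values are hypothetical and purely illustrative:

```python
# Hedged sketch: power vs. sample size for a two-group (independent t-test)
# comparison. All numbers below are hypothetical.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Sample size per group needed to detect a medium effect (d = 0.5)
# with alpha = 0.05 and 80% power.
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"Required n per group: {n_per_group:.0f}")

# Power actually achieved if only 20 subjects per group are recruited;
# a small sample may leave a real relationship undetected (low power).
achieved_power = analysis.solve_power(effect_size=0.5, alpha=0.05, nobs1=20)
print(f"Power with n = 20 per group: {achieved_power:.2f}")
```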
10
Q

threats to statistical conclusion validity: violated assumptions

A
  • violated assumptions: most statistical tests rest on assumptions about the data; if those assumptions are not met, inferences may be erroneous (see the sketch below)
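A minimal sketch of checking one common assumption (normality) before running a parametric test, using simulated data; the nonparametric fallback shown is just one possible response to a violated assumption:

```python
# Hedged sketch: checking the normality assumption before an independent
# t-test. The data are simulated purely for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(loc=50, scale=10, size=30)    # roughly normal scores
group_b = rng.exponential(scale=10, size=30) + 40  # skewed: violates normality

for name, data in [("A", group_a), ("B", group_b)]:
    _, p = stats.shapiro(data)  # Shapiro-Wilk test of normality
    print(f"Group {name}: Shapiro-Wilk p = {p:.3f}")

# If assumptions look violated, a nonparametric alternative such as the
# Mann-Whitney U test may avoid erroneous inferences.
_, p_value = stats.mannwhitneyu(group_a, group_b)
print(f"Mann-Whitney U p = {p_value:.3f}")
```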
11
Q

Threats to statistical conclusion validity: reliability and variance

A
  • statistical conclusions threatened by extraneous factors that increase variability within data
  • unreliable measurement
  • failure to standardize protocol
  • environmental interference
  • heterogeneity of subjects
12
Q

Threats to statistical conclusion validity: failure to use intention-to-treat analysis

A
  • should maintain randomization to groups
  • analyze according to original group assignment (see the sketch below)
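A minimal pandas sketch contrasting intention-to-treat with per-protocol grouping; the column names, crossover pattern, and outcome scores are hypothetical:

```python
# Hedged sketch: intention-to-treat (ITT) vs. per-protocol analysis.
# All column names and values are hypothetical.
import pandas as pd

trial = pd.DataFrame({
    "assigned_group": ["treatment", "treatment", "control", "control"],
    "received_group": ["treatment", "control",   "control", "treatment"],  # crossovers
    "outcome_score":  [12.0, 9.0, 7.0, 11.0],
})

# ITT: analyze by ORIGINAL random assignment, regardless of what
# participants actually received; this preserves randomization.
itt_means = trial.groupby("assigned_group")["outcome_score"].mean()

# Per-protocol (for contrast): grouping by what was actually received
# breaks randomization and can bias the comparison.
per_protocol_means = trial.groupby("received_group")["outcome_score"].mean()

print(itt_means, per_protocol_means, sep="\n\n")
```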
13
Q

Evaluation process - introduction (what should the introduction have/do)

A
  • establish the problem being investigated and its importance
  • demonstrate that researchers have thoroughly synthesized the literature
  • provide rationale for pursuing this research
  • contribution of this research
  • purpose, aims, or hypotheses
14
Q

Evaluation process: methods

A
  • participants (target population & accessible population)
  • recruitment
  • inclusion/exclusion criteria
  • sample size: authors should describe how estimated (related to power)

design
- appropriate to answer research question
- control for potential confounding variables
- rationale for choices of interventions and measurements
- adequacy of time frame of study

15
Q

evaluation process: data collection (methods)

A
  • operational definition of variables
  • measurement reliability (assessed within the study/based on prior research)
  • measurement validity: prior research
  • data collection described clearly
16
Q

Evaluation process: results

A
  • participants complete protocol as originally designed
  • all participants accounted for in final measurements
  • results address research question
  • effect size
  • results support or refute the proposed hypotheses
17
Q

Evaluation process: discussion

A
  • interpret results and alternative explanations considered for findings
  • address each research question
  • relationship to other research
  • limitations
  • clinical importance of findings (clinical vs. statistical significance; they are not the same)
18
Q

evaluation process: final step

A
  • findings of sufficient strength to inform patient management
  • study participants similar to patient
  • acceptability to patient
  • feasibility of intervention (clinical setting, resources available)
19
Q

External validity considerations
(intervention studies)

A

the description of participants should specify
- recruitment methods
- inclusion and exclusion criteria
- sample size estimates

20
Q

Internal validity considerations: design
(intervention studies)

A

study description should indicate
- type of design
- number of groups
- number, levels of IV
- dependent variables, frequency of measures
- randomization - if repeated measures
- equal treatment for groups except for experimental intervention

21
Q

Internal validity considerations: data collection
(intervention studies)

A

study descriptions should provide
- operational definitions (intervention and measurement procedures)
- replication
- measurement tool reliability and validity
- groups treated equally except for intervention
- bias control = blinding

22
Q

Statistical conclusion validity considerations: data analysis
(intervention studies)

A
  • confidence intervals
  • confirmation that groups are similar at baseline on relevant characteristics (how are differences handled)
  • adherence to study protocol
  • attrition (participants who do not complete the study): flowchart and account of differential attrition between groups
  • intention to treat
23
Q

Meaningfulness of results (intervention studies)

A
  • state whether hypotheses were supported or not
  • provide reasons why hypotheses were or were not supported
  • confidence intervals and effect sizes should be reported (see the sketch below)
  • data should reflect the amount of change or difference
  • if differences are not significant, address the potential for Type II error
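A minimal sketch of reporting an effect size (Cohen's d) with a 95% confidence interval for the difference between two group means; the simulated scores and group sizes are hypothetical:

```python
# Hedged sketch: Cohen's d and a 95% CI for a mean difference.
# Data are simulated purely for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
treatment = rng.normal(loc=55, scale=10, size=40)  # hypothetical outcome scores
control = rng.normal(loc=50, scale=10, size=40)

diff = treatment.mean() - control.mean()
n1, n2 = len(treatment), len(control)

# Cohen's d using the pooled standard deviation.
pooled_sd = np.sqrt(((n1 - 1) * treatment.var(ddof=1) +
                     (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2))
cohens_d = diff / pooled_sd

# 95% CI for the mean difference (pooled-variance t interval).
se = pooled_sd * np.sqrt(1 / n1 + 1 / n2)
t_crit = stats.t.ppf(0.975, df=n1 + n2 - 2)
ci_low, ci_high = diff - t_crit * se, diff + t_crit * se

print(f"Mean difference = {diff:.2f}, d = {cohens_d:.2f}, "
      f"95% CI = ({ci_low:.2f}, {ci_high:.2f})")
```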
24
Q

external validity considerations
(diagnostic studies)

A

participants
- sample representative of patients whom test would apply
- reflect the full range of the condition (variance); often purposive sampling is used
- inclusion and exclusion criteria specified

25
Q

Internal validity considerations: design (diagnostic studies)

A
  • meaningfulness based on criterion validity
  • index test compared with reference standard
  • validity of reference standard must be documented
26
Q

internal validity considerations: data collection (diagnostic studies)

A
  • both the index test and the reference standard are given to all subjects
  • methods of measurement well documented
  • explanation of the means of bias control: blinding testers to the subject’s true diagnosis
27
Q

statistical conclusion validity considerations (diagnostic studies)

A
  • data analysis:
  • sensitivity, specificity, predictive values, likelihood ratios
  • indication of the test’s ability to determine posttest probabilities (see the sketch below)
  • Confidence intervals
  • flow diagram: how attrition affected ratios
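A minimal sketch of computing sensitivity, specificity, predictive values, likelihood ratios, and a posttest probability from a 2x2 table; every count and the pretest probability below are made-up illustrative numbers:

```python
# Hedged sketch: diagnostic accuracy statistics from a hypothetical 2x2 table
# (index test vs. reference standard). All counts are made up.
true_pos, false_pos = 45, 10    # test positive: disease present / absent
false_neg, true_neg = 5, 140    # test negative: disease present / absent

sensitivity = true_pos / (true_pos + false_neg)
specificity = true_neg / (true_neg + false_pos)
ppv = true_pos / (true_pos + false_pos)      # positive predictive value
npv = true_neg / (true_neg + false_neg)      # negative predictive value
lr_pos = sensitivity / (1 - specificity)     # positive likelihood ratio
lr_neg = (1 - sensitivity) / specificity     # negative likelihood ratio

# Posttest probability after a positive test, given a pretest probability.
pretest_prob = 0.25                          # hypothetical prevalence
pretest_odds = pretest_prob / (1 - pretest_prob)
posttest_odds = pretest_odds * lr_pos
posttest_prob = posttest_odds / (1 + posttest_odds)

print(f"Sens={sensitivity:.2f}  Spec={specificity:.2f}  "
      f"PPV={ppv:.2f}  NPV={npv:.2f}  LR+={lr_pos:.2f}  LR-={lr_neg:.2f}")
print(f"Posttest probability after a positive test: {posttest_prob:.2f}")
```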