Midterm Flashcards

(88 cards)

1
Q

evidence-based practice (ebp)

A

the integration of clinical expertise, evidence, and client/caregiver perspectives

2
Q

clinical expertise

A

knowledge, judgment, and critical reasoning acquired through training and professional experience

3
Q

evidence definition (including the types of evidence)

A

the best available information gathered from the scientific literature (external) and from data/observations collected on the individual client (internal)

4
Q

client/caregiver perspectives

A

the unique set of personal and cultural circumstances, values, priorities, and experiences identified by the client and their caregivers

5
Q

purpose of EBP

A

helps clinicians ask questions about assessment, treatment, and interventions

6
Q

Cochrane Library

A

a collection of databases that contain different types of high-quality, independent evidence to inform healthcare decision-making

7
Q

history of the Cochrane Library

A

the library is named after Archie Cochrane, whose frustration with how hard it was to distinguish between scientifically valid and invalid medical therapies inspired its creation

8
Q

centre for evidence-based medicine (cebm)

A

develops, teaches, promotes, and disseminates better evidence for health care

9
Q

history of cebm

A

David Sackett started the first CEBM in Britain

10
Q

David Sackett famous quote

A

“half of what you learn in medical school is dead wrong”

11
Q

Joanna Briggs Institute Model of Evidence-Based Healthcare (JBI Model of EBHC)

A

views evidence-based healthcare as decision-making that considers the feasibility, appropriateness, meaningfulness, and effectiveness of healthcare practices

12
Q

challenges associated with EBP

A
  • evidence changes quickly
  • people have their own agendas in disseminating information and influencing the acceptance of EBP
  • are sources of evidence credible?
  • is the evidence good or bad? (myth, opinion, fact)
  • where do we look for evidence?
  • how do we keep up to date with evidence?
  • what if there is no evidence?
13
Q

5 key EBP steps

A
  1. Ask the right clinical question
  2. Acquire the best evidence
  3. Appraise the evidence
  4. Apply the evidence
  5. Assess your performance
14
Q

epidemiology studies definition and purpose

A
  • nature, risk, prevalence, course of the condition
  • helps us understand conditions and plan services; we can plan intervention better if we know the trajectory of the disease
15
Q

research evidence definition and purpose

A
  • focused on experimental design, carefully controlled interventions, and measurable outcomes
  • helps us understand the efficacy of interventions, diagnostic accuracy of tests, etc.
16
Q

randomized controlled trials (RCTs)

A
  • a study design that randomly assigns participants into an experimental group or a control group
  • only expected difference between the control and experimental groups is the outcome variable being studied
17
Q

research evidence pyramid

A

a hierarchy of study designs ranked by strength of evidence: systematic reviews and meta-analyses at the top, followed by RCTs, cohort studies, case-control studies, case series/reports, and expert opinion at the base
18
Q

qualitative studies definition and purpose

A
  • interviews, surveys, QOL instruments, autobiographical accounts, etc.
  • helps us understand the perspectives of key stakeholders, impact of conditions on lives, tolerance of procedures, etc.
19
Q

PICO(T) question definition

A
  • Problem/patient
  • Intervention (cause, prognostic factor, treatment; What intervention am I considering?)
  • Comparison intervention (if necessary; is x intervention better than y?)
  • Outcomes (What can I hope to achieve? Measurable?)
  • Time: time to demonstrate clinical outcomes
20
Q

PICO(T) question purpose

A

convert a clinical need into an answerable question
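
For illustration, a minimal sketch of how the PICO(T) components slot together into a single answerable question; the clinical scenario below is a hypothetical example, not course material:

```python
# A hypothetical PICO(T) question, assembled programmatically to show
# how the components combine into one answerable question.
pico = {
    "P": "school-age children who stutter",      # problem/patient
    "I": "an intensive fluency program",         # intervention
    "C": "weekly individual therapy",            # comparison
    "O": "percentage of stuttered syllables",    # outcome (measurable)
    "T": "within 12 months",                     # time
}

question = (
    f"In {pico['P']}, does {pico['I']} compared with {pico['C']} "
    f"reduce {pico['O']} {pico['T']}?"
)
print(question)
# In school-age children who stutter, does an intensive fluency program
# compared with weekly individual therapy reduce percentage of stuttered
# syllables within 12 months?
```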

21
Q

popular databases

A
  • CINAHL
  • EMBASE
  • PubMed
  • Web of Science Core
22
Q

CINAHL controlled vocab

A

CINAHL Headings

23
Q

EMBASE controlled vocab

A

Emtree

24
Q

PubMed controlled vocab

A

MeSH

25
Web of Science Core controlled vocab
none
26
controlled vocabulary definition
the article is tagged with standardized terms
27
cons of controlled vocab
  • can miss jargon, slang, and newer terms
  • can miss the most recent articles if subject terms have not yet been assigned
28
keyword searching definition
finding an article using your own words
29
cons of keyword searching
can be inefficient
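
One common workaround for the trade-offs in cards 27 and 29 is to combine controlled vocabulary with keywords in a single search. A minimal sketch using PubMed's standard [MeSH Terms] and [Title/Abstract] field tags; the topic terms are hypothetical:

```python
# Build a PubMed search string that ORs controlled vocabulary (MeSH)
# together with free-text keywords, so the search catches both indexed
# articles and recent ones not yet tagged with subject terms.
mesh_terms = ['"aphasia"[MeSH Terms]']
keywords = ['"aphasia"[Title/Abstract]', '"dysphasia"[Title/Abstract]']

# OR synonyms within one concept; multiple concepts would then be ANDed
concept = "(" + " OR ".join(mesh_terms + keywords) + ")"
print(concept)
# ("aphasia"[MeSH Terms] OR "aphasia"[Title/Abstract] OR "dysphasia"[Title/Abstract])
```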
30
critical appraisal definition
the process of systematically examining research evidence to assess its validity, results, and relevance before using it to inform a decision
31
knowledge needed for critical appraisal
  • understanding what should be in a research paper, clinical guideline, or position paper
  • understanding different research designs
  • recognizing the importance of matching the right design to the research question
32
key components of a research article
  • title
  • abstract
  • introduction
  • methods
  • results
  • discussion/conclusions
  • references
33
research article title
indicates research design (method)
34
research article abstract
  • a succinct description of the article that matches the article itself
  • includes the introduction, methods, results, and discussion
35
research article introduction
  • why is a study needed? what research has been done before? build up to the research questions and aims
  • the PICO question should come back (if the study is about an intervention)
36
research article methods
  • describe the design of the study, materials used, participants, how the data was collected, and how the data was analyzed
  • must describe how the research question was answered
37
research article results
  • main findings of the study, research question answered, statistical analysis if appropriate
  • all data must be accounted for
38
research article discussion/conclusion
  • research question addressed? findings expected or unexpected? do the findings agree with other studies? implications for clinical practice? next phase of research?
  • study limitations
39
reference manager software
  • EndNote
  • Mendeley
  • Zotero
40
equator network
an international initiative that seeks to improve the reliability and value of published health research literature by promoting transparent and accurate reporting and the wider use of reporting guidelines
41
types of study methodologies
  • quantitative
  • qualitative
  • single-blind
  • double-blind
  • placebo-controlled
42
quantitative definition
  • research questions or aims are about numbers
  • differences in scores between interventions x and y
43
qualitative definition
  • research questions or aims about perceptions, beliefs, opinions
  • an in-depth exploration of the lived experience of a condition cannot be obtained from questionnaires alone
44
does researching the efficacy of one intervention compared to another one yield quantitative or qualitative results
quantitative
45
single-blind definition
the participants do not know which treatment they are receiving
46
double-blind definition
neither the investigators nor the participants know which treatment they are giving or receiving
47
placebo-controlled
control group that receives a placebo
48
robey & schultz model (1998)
  • phase 1: exploring therapy procedures and proving that the therapy is safe and worth exploring further
  • phase 2: attempt to define how therapy works; select appropriate clients, assessments, and outcome measures
  • phase 3: design large-scale efficacy studies (RCTs)
  • phase 4: effectiveness study to see if the therapy works clinically
  • phase 5: effectiveness studies continue to look at cost-effectiveness, consumer satisfaction, and effects on QOL
49
types of bias
  • selection bias
  • performance bias
  • detection bias
  • attrition bias
  • reporting bias
  • other bias
50
selection bias
a distortion in a measure of association (such as a risk ratio) due to a sample selection that does not accurately represent the target population
51
how is selection bias avoided
by using random sequence generation and allocation concealment
52
random sequence generation
  • researchers randomly assign participants into groups
  • ideally computer-generated
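
A minimal sketch of what computer-generated random sequence generation can look like in practice; the group labels and sample size are hypothetical:

```python
import random

# Simple randomization with equal group sizes: build the full allocation
# list, then shuffle it with a pseudo-random number generator.
n_per_group = 10
sequence = ["experimental"] * n_per_group + ["control"] * n_per_group
random.shuffle(sequence)  # computer-generated random ordering

for participant_id, group in enumerate(sequence, start=1):
    print(f"participant {participant_id:02d} -> {group}")
```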
53
allocation concealment
the person enrolling participants into the experimental and control groups cannot predict which group the next participant will be allocated to
54
performance bias
  • the effects of unequal treatment between study groups
  • can occur when participants know which group they were assigned to, changing their responses or behaviors and affecting the outcomes of the study
55
how to control for performance bias
double-blinding the participants and clinicians
56
detection bias
when the person or people measuring the outcomes know which participants received the intervention
57
how to control for detection bias
blind the person completing the outcome measurements
58
attrition bias
  • differences between experimental and control groups because of withdrawals from a study
  • withdrawals lead to incomplete outcome data
59
how to control for attrition bias
explain the reasons for withdrawal and account for all of the data
60
intention to treat analysis
analyze all of the available data from the study cohort, keeping participants in the groups to which they were originally assigned, regardless of withdrawal or adherence
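
A toy sketch contrasting intention-to-treat with a per-protocol (completers-only) analysis; the outcome data are hypothetical, and the separate problem of handling truly missing outcomes is set aside:

```python
# Each record: assigned group, whether the participant completed the
# study, and their outcome score (hypothetical values).
participants = [
    {"assigned": "experimental", "completed": True,  "outcome": 12},
    {"assigned": "experimental", "completed": False, "outcome": 5},
    {"assigned": "control",      "completed": True,  "outcome": 7},
    {"assigned": "control",      "completed": True,  "outcome": 6},
]

def mean_outcome(rows):
    return sum(r["outcome"] for r in rows) / len(rows)

# intention-to-treat: everyone analyzed in their assigned group
itt = {g: mean_outcome([r for r in participants if r["assigned"] == g])
       for g in ("experimental", "control")}

# per-protocol: only participants who completed the study
pp = {g: mean_outcome([r for r in participants
                       if r["assigned"] == g and r["completed"]])
      for g in ("experimental", "control")}

print("ITT:", itt, "per-protocol:", pp)
```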
61
reporting bias
  • differences between reported and unreported findings
  • statistically significant differences between intervention groups are more likely to be reported than non-statistically significant differences
62
reasons for other biases
  • can be related to trial design
  • typically in good studies these are mentioned in the limitations
63
placebo outcomes
  • placebo
  • nocebo
64
placebo effect
improvement of an illness following a placebo intervention
65
nocebo effect
worsening of an illness following a placebo intervention
66
critical appraisal skills programme (CASP)
contains a checklist of questions for readers to appraise RCTs critically
67
CONsolidated Standards of Reporting Trials (CONSORT Guidelines)
help researchers report their RCTs completely and transparently
68
difference between CASP and CONSORT Guidelines
  • CASP is for readers
  • CONSORT Guidelines are for researchers
69
types of reviews
  • systematic review
  • scoping review
  • meta-analysis
  • literature/narrative review
  • rapid review
  • living review
  • critical review
  • overview
  • qualitative systematic review
  • state-of-the-art review
  • umbrella review
70
systematic review
  • secondary research
  • a summary of all of the available research or evidence in response to a research question
71
when are systematic reviews necessary
  • often required by research funders to establish the state of existing knowledge
  • guideline development
72
purposes of a systematic review
  • uncover international evidence
  • confirm current practices, address variations, and identify new practices
  • identify and inform areas for further research
  • identify and investigate conflicting results
  • produce statements to guide decision-making
73
most common type of systematic review
effectiveness review
74
effectiveness review definition
a review assessing the effectiveness of an intervention
75
effectiveness definition
the extent to which an intervention (when used appropriately) achieves the intended effect
76
what kind of review is widely used to inform the development of trustworthy clinical guidelines
effectiveness review
77
which questions are used to develop effectiveness reviews
PICO
78
steps of a systematic review
  1. prepare your topic
  2. search for studies
  3. screen your studies
  4. extract the data
  5. analyze and synthesize evidence
  6. rate the quality of evidence
  7. report your findings
79
what is covidence
a systematic review tool used to screen studies and filter out irrelevant ones
80
GRADE acronym
Grading of Recommendations, Assessment, Development, and Evaluations
81
GRADE purpose
used to grade the quality of evidence and make recommendations
82
scoping review
exploratory projects that systematically map the literature available on a topic, identifying key concepts, theories, sources of evidence, and gaps in the research
83
meta-analysis
statistically combines the results of a group of studies to reach a conclusion about the effect of an intervention
84
how are meta-analysis results usually depicted
in a forest plot
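
A minimal sketch of the fixed-effect, inverse-variance pooling behind the summary diamond on a forest plot; the effect sizes and variances below are hypothetical:

```python
import math

# Fixed-effect meta-analysis: weight each study by the inverse of its
# variance, then compute the weighted average effect and its 95% CI.
studies = [
    {"effect": 0.40, "variance": 0.04},
    {"effect": 0.25, "variance": 0.09},
    {"effect": 0.55, "variance": 0.02},
]

weights = [1 / s["variance"] for s in studies]   # w_i = 1 / v_i
pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
se = math.sqrt(1 / sum(weights))                 # standard error of pooled effect
ci = (pooled - 1.96 * se, pooled + 1.96 * se)    # 95% confidence interval

print(f"pooled effect = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```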
85
when not to conduct a meta-analysis
  • heterogeneity of studies
  • poor quality of studies
  • publication bias
  • small number of studies or limited sample size
86
publication bias
selective publication of positive studies and exclusion of negative studies
87
why should you not conduct a meta-analysis if there is a small number of studies or limited sample size
  • fewer studies provide less information to summarize or pool
  • can yield inaccurate, unstable, or erroneous results
  • too few studies impede the exploration of publication bias and can confound conclusions
88
PRISMA Guidelines
provide guidelines for the reporting of systematic reviews evaluating the effects of interventions