EBP Midterm/Final Flashcards

1
Q

Define EBP

A
  • Practice based on the best available evidence, patient preferences, and clinical judgment
  • EBP is a process involving the examination and application of research findings or other reliable evidence that has been integrated with scientific theories.
  • “the conscientious, explicit, and judicious use of theory-derived, research-based information in making decisions about care delivery to individuals or groups of patients and in consideration of individual needs and preferences”
2
Q

Components of EBP
Purpose of EBP

A

Components
- Best evidence
- Clinical Judgment
- Patient preference
Purpose
- Patient outcomes are better when evidence is used as a basis for practice
- Nursing care is more efficient when ineffective processes are replaced
- Errors in decision making become less frequent
- Consumers want evidence-based information to make decisions

3
Q

Individual barriers to EBP and how to overcome them

A

Lack of time
Lack of value placed on research in practice
Lack of knowledge about EBP and research
Lack of technological skills to find evidence
Lack of ability to read research
Resistance to change
*Strategies to overcome
*Strategies need to be aimed at instilling an appreciation for EBP, increasing knowledge, developing necessary skills, and changing behaviors
Devote 15 minutes per day to reading evidence related to a clinical problem.
Sign up for emails that offer summaries of research studies in your area of interest.
Use a team approach to equitably distribute the workload among members.
Bookmark websites that have clinical guidelines to promote faster retrieval of information.
Evaluate available technologies (e.g., tablets) to create time-saving systems that allow quick and convenient retrieval of information at the bedside.
Negotiate release time from patient care duties to collect, read, and share information about relevant clinical problems.
Search for established clinical guidelines because they provide synthesis of existing research.
Make a list of reasons why healthcare providers should value research, and use this list as a springboard for discussions with colleagues.
Invite nurse researchers to share why they are passionate about their work.
Seek support from colleagues.
When disagreements arise about a policy or protocol, find an article that supports your position and share it with others.
When selecting a work environment, ask about the organizational commitment to EBP.
Link measurement of quality indicators to EBP.
Participate in EBP activities to demonstrate professionalism that can be rewarded through promotions or merit raises.
Provide recognition during National Nurses Week for individuals involved in EBP projects.

4
Q

Organizational barriers to EBP and how to overcome them

A

Factors can include organizational management failing to embrace EBP and lack of institutional support, such as financial support or release time
Resistance to change
Lack of resources, including resources to access evidence
*Strategies: directed toward creating and maintaining an environment where EBP can flourish
*Engaging the administration
*Having a culture of safety
*Engaging stakeholders
*Implementing care bundles
*Promoting interprofessional collaboration
*Overcoming research-related barriers
*Ensuring nurses meet EBP competencies
Listen to people’s concerns about change.
When considering an EBP project, select one that interests the staff, has a high priority, is likely to be successful, and has baseline data.
Mobilize talented individuals to act as change agents.
Create a means to reward individuals who provide leadership during change.
Write a proposal for funds to support access to online databases and journals.
Collaborate with a nursing program for access to resources.
Investigate funding possibilities from others (e.g., pharmaceutical companies, grants).
Link organizational priorities with EBP to reduce cost and increase efficiency.
Recruit administrators who value EBP.
Form coalitions with other healthcare providers to increase the base of support for EBP.
Use EBP to meet accreditation standards or gain recognition (e.g., Magnet Recognition).

5
Q

Research-related barriers to EBP and how to overcome them

A

Research-related factors can include the communication gap between researcher and clinician, the technical writing associated with research reports, and lack of dissemination of research findings
*Strategies
Use social media to share research findings.
Write research reports using user-friendly language.
Collaborate with clinicians to identify topics relevant to clinical practice.

6
Q

Level I Hierarchy of Evidence includes…

A

Summaries
Synopses
Meta-analysis
Systematic reviews of experiments/quasi-experiments
Clinical practice guidelines
Best practice articles

7
Q

Level I
Summaries

A

Best practice recommendations based on an appraisal of information about a particular practice question
After stating a clinical question, key findings are identified and ranked. Summaries end with best practice recommendations.
Usually limited to one to three pages, summaries are particularly helpful for nurses to quickly find evidence for practice in their clinical settings.
- JBI

8
Q

Level I
Synopses

A

Brief descriptions of evidence that provide an overview of key points of evidence from multiple sources
Shorter version of a summary
The difference between an abstract and a synopsis is that an abstract summarizes a single study, whereas a synopsis is about more than one study

9
Q

Level I
Systematic review

A

Rigorous and systematic synthesis of research findings from experimental and quasi-experimental studies about a clinical problem.
Like all evidence in this level, systematic reviews involve compiling findings from various single studies.
In a systematic review, the authors provide a very detailed account of how they searched the literature and selected studies to be included in their review, listing inclusion criteria (e.g., studies done in the United States).
No statistical analysis (in contrast to a meta-analysis)
Only published works are included
Cochrane Library

10
Q

Level I
Meta-analysis

A

A research method that estimates the effect of an intervention by using statistical methods to analyze data from both published and unpublished single studies

11
Q

Level I
Clinical Practice Guidelines

A

Statements that include recommendations intended to optimize patient care that are informed by a systematic review of evidence and an assessment of the benefits and harms of alternative care options

12
Q

Level II Hierarchy of Evidence

A

Randomized control trials (RCTs)
Participants are randomly assigned to experimental or control groups

13
Q

Level III Hierarchy of Evidence

A

Quasi-experimental designs
Research designs involving the manipulation of the independent variable but lacking random assignment to experimental and comparison groups

14
Q

Level IV Hierarchy of Evidence includes…

A

Correlational
Cohort Studies
Case-Control Studies
Quantitative findings from mixed-methods

15
Q

Level IV
Correlational studies

A

Nonexperimental designs used to study relationships among two or more variables
When using this design, researchers can claim that as one variable changes, another variable also changes (an association, not a cause-and-effect relationship)

16
Q

Level IV
Cohort studies

A

Epidemiological designs in which participants are selected based on their exposure to a particular factor
Designed to observe patterns of disease in populations.

17
Q

Level IV
Case-control studies

A

Epidemiological studies whereby participants are grouped on the presence or absence of a particular disease or condition and are then compared for similarities and differences

18
Q

Level IV
Mixed methods design

A

Findings from the quantitative part of the study would be considered Level IV, and findings from the qualitative portion of the study would be in a lower level.

19
Q

Level V Hierarchy of Evidence includes…

A

Integrative review
Metasynthesis

20
Q

Level V
Integrative review

A

Scholarly papers that include published nonexperimental studies in the synthesis to answer clinical questions
Although an integrative review may include RCTs and higher-level evidence, the inclusion of nonexperimental studies makes integrative reviews a lower quality of evidence in comparison to systematic reviews

21
Q

Level V
Metasynthesis

A

A systematic review of qualitative studies

22
Q

Level VI Hierarchy of Evidence includes…

A

Single descriptive study
Single qualitative study
Qualitative findings from mixed methods design
EBP Project
QI Project
Case Series Studies
Case Studies
Concept analysis

23
Q

Level VI
Descriptive Survey Design

A

Nonexperimental studies that involve asking questions of a sample of individuals who are representative of a group

24
Q

Level VI
QI Projects

A

Quality improvement (QI) projects: Structured, continuous activities designed to systematically improve the ways care is delivered to patients

25
Q

Level VI
Case series study

A

Epidemiologic reports used to describe rare diseases or outcomes.
Because the purpose of a case series study is to understand the natural progression of disease in a population, there is no control and no intervention

26
Q

Level VI
Case study

A

A description of a single or novel event of interest

27
Q

Level VI
Concept analysis

A

A process that explores the attributes and characteristics of a concept

28
Q

Level VII Hierarchy of Evidence includes…

A

Narrative reviews: Papers based on common or uncommon elements of works without concern for research methods, designs, or settings
Opinion of authorities

29
Q

Components of a Research Article
Abstract

A

First section of a research article
Usually limited to 100–150 words
Most important findings

30
Q

Components of a Research Article
Introduction

A

Contains a statement of the problem and purpose statement
Background of why they did this study, why they wrote the article

31
Q

Components of a Research Article
Review of literature

A

An unbiased, comprehensive, synthesized description of relevant, previously published studies

32
Q

Components of a Research Article
Theoretical framework

A

The structure of a study that links the theory concepts to the study variables; a section of a research article that describes the theory used

33
Q

Components of a Research Article
Methods Section

A

Discussion of the study design, sample, and data collection
Design: e.g., “quasi-experimental, randomized, qualitative”
Sample: the target population and sample size
Data collection: what type of data the researchers collected, e.g., “using a survey to assess ______”
Demographics of the sample are described

34
Q

Components of a Research Article
Results section

A

Methods used to analyze data and characteristics of the sample
Statistical findings “statistically significant”

35
Q

Components of a Research Article
Discussion section

A

Interpretation and discussion of the results
Not every finding is restated
Answers the “we found this, so what?” question
Informs where the next steps should go

36
Q

Components of a Research Article
List of references

A

Use formal guidelines such as Publication Manual of the American Psychological Association

37
Q

Steps in the EBP Process

A

1) Cultivate a spirit of inquiry along with an EBP culture and environment.
2) Ask a relevant clinical question [PICO(T)]
3) Conduct a literature search and collect the best/relevant evidence.
4) Critically appraise the evidence.
5) Integrate the evidence with clinical expertise and patient preferences to make the best clinical decision.
6) Evaluate the outcomes of the EBP change.
7) Disseminate the outcomes.

38
Q

Problem Statement

A

The problem statement formally identifies what problem is being addressed in the study.
A problem statement must include the scope of the research problem, the specific population of interest, the independent and dependent variables, and the goal or question the study intends to answer.
Problem statement: The use of alcohol by college freshmen contributes to alcohol-related injuries and emergency department visits at a state university.

39
Q

Purpose Statement

A

The purpose statement is derived from the problem statement and indicates the aim of the study.
Purpose statement: To determine if brief screening and nursing intervention for alcohol use during freshmen orientation reduces self-reported alcohol use, alcohol-related injuries, and emergency department visits among college freshmen at a state university.

40
Q

Null hypothesis definition

A

A hypothesis stating that there is no relationship between the variables; the statistical hypothesis
Statistical testing is used to either reject or fail to reject this statement.
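Illustrative example (not from the text), matching the alcohol scenario in the surrounding cards: “There is no difference in self-reported alcohol use between college freshmen who receive brief screening and nursing intervention during orientation and those who do not.”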

41
Q

Research Question

A

The research question flows from the problem statement and study purpose, and often it is the interrogatory form of the problem statement.
Research question 1: Is there a difference in self-reported alcohol use between college freshmen who receive brief screening and nursing intervention for alcohol use during fall orientation and the previous class of freshmen students who did not receive brief screening and nursing intervention?

42
Q

Variables (independent/dependent/extraneous)

A
  • Independent: Variable that influences the dependent variable or outcome; intervention, innovation, or treatment that is manipulated by the researcher; X variable
  • Dependent: Outcome or variable that is influenced by the independent variable; Y variable
  • Extraneous: Factors that interfere with the relationship between the independent and dependent variables; confounding variables; Z variable
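  • Illustrative example (not from the text): in the alcohol scenario used in nearby cards, brief screening and nursing intervention is the independent (X) variable, self-reported alcohol use is the dependent (Y) variable, and factors such as prior drinking history would be extraneous (Z) variables.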
43
Q

PICOT Components

A

P = Patient population or patient condition of interest
I = Intervention of interest
C = Comparison of interest
O = Outcome of interest
T = Time (this element is not always included)
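Illustrative PICOT question (hypothetical): In hospitalized older adults (P), does hourly rounding (I), compared with standard rounding (C), reduce the rate of falls (O) over 3 months (T)?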

44
Q

PICOT Question purpose

A

Clinical questions for specific patient problems are identified so that healthcare providers can find clinically relevant information using Internet search engines and databases

45
Q

Searching for Evidence
Boolean operators
Truncation
Keywords
Search Strategy Steps

A

Boolean operators: A set of terms that act as commands for connecting parts of a search strategy such that a database combines terms in the proper order
- When searching for multiple topics in a question, the AND term should be used.
- The operator OR tells the database that articles that use alternative words or phrases (i.e., synonyms) to describe a concept would also be helpful
- The operator NOT, although not used frequently, is used to eliminate or subtract results from a search.
Truncation: A technique of shortening a word and adding a wildcard symbol to tell a database to add variant endings to the word in a search
Keywords: Common words or phrases used to describe a concept
Search Strategy
- Step 1: State the PICOT Question
- Step 2: State the Concepts in Keyword Form
- Step 3: Select Two to Three Databases
- Step 4: Develop a List of Synonyms
- Steps 5 and 6: Use Boolean Operators to Create a Search Statement and Search the Database
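Illustrative search statement (hypothetical topic and keywords): (pressure ulcer* OR pressure injur*) AND (repositioning OR turning). OR combines synonyms for the same concept, AND links the two concepts, and the asterisk is the truncation wildcard that retrieves variant word endings (e.g., injury, injuries).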

46
Q

Key Concepts in Quantitative Designs
Causality
Control
Manipulation
Bias
Confounding

A

Causality: The relationship between a cause and its effect.
Control: Ability to manipulate, regulate, or statistically adjust for factors that can affect the dependent variable.
Manipulation: The ability of researchers to control the independent variable.
Bias: Systematic error in selection of participants, measurement of variables, and/or analysis of data that distorts the true relationship between IV and DV.
Confounding: When extraneous variables influence the relationship between the independent and dependent variables.

47
Q

Key Concepts in Quantitative Designs
Between-groups and within-groups

A

*Between-groups design: Study design where two groups of participants can be compared.
- For example, a researcher who is studying condom use among adolescents may wish to know the practices of high school juniors and seniors as well as college freshmen and sophomores. The researcher could make comparisons among these four groups about the frequency of condom use.
*Within-groups design: Comparisons are made about the same participants at two or more points in time or on two or more measures.
- Using a within-groups design, the researcher would measure the participants’ levels of pain before the intervention, conduct the intervention, and then measure pain levels after the intervention.
Researchers can combine between-groups and within-groups designs within the same study to compare the effectiveness of more than one intervention. In the pain example, the researcher might offer music therapy as an intervention to one group of participants and relaxation therapy as another intervention to a different group of participants

48
Q

Four Essential elements of evaluation

A

Credibility: Refers to the truth or believability of findings.
Transferability: Relates to whether findings from one study can be transferred to a similar context; application of findings to a different situation.
Dependability: Relates to consistency in the findings over time; auditability; findings are reflective of data.
Confirmability: Relates to the rigorous attempts to be objective and the maintenance of audit trails to document the research process; findings can be substantiated by participants.

49
Q

Types of Qualitative Research
Phenomenology:

A

A type of qualitative research that describes the lived experience to achieve understanding of an experience from the perspective of the participants.

What is the lived experience of a woman dying from breast cancer?
Case studies

50
Q

Types of Qualitative Research
Grounded Theory:

A

Grounded Theory: A type of qualitative research that examines the process of a phenomenon and culminates in the generation of a theory.
What is the process of recovery following breast cancer?
Focus groups

51
Q

Types of Qualitative Research
Ethnography:

A

A type of qualitative research that describes a culture.
What are the breast self-exam practices of Amerasian women?
Gatekeeper

52
Q

Types of Qualitative Research
Historical:

A

A type of qualitative research used to examine events or people to explain and understand the past to guide the present and future.
What were the ancient Romans’ basic beliefs regarding diseases of the breast?
Strategic sampling: Sampling in historical research to locate a small group of people who were either witnesses of or participants in the phenomenon being studied.

53
Q

Ethical considerations

A

Consider
- Code of ethics
- Vulnerable populations (respect for persons)
- Anonymity: participants remain anonymous through the use of pseudonyms in place of participant names
- Confidentiality
- Informed consent
- Institutional review board
- Making sure the study does not cause harm
- Coercion
- Assent
- Beneficence: doing good
- Justice
Protect the participants’ rights and interests.
Put participants’ interests first.
Clearly communicate the goals of the research.
Obtain informed consent.
Ensure privacy.
Avoid exploitation.
Share results and reports.

54
Q

Probability sampling

A

Sampling method in which elements in the accessible population have an equal chance of being selected for inclusion in the study
Simple random sampling
Stratified random sampling
Cluster sampling
Systematic random sampling

55
Q

Simple random sampling

A

Randomly selecting elements from the accessible population

56
Q

Stratified random sampling:

A

Selecting elements from an accessible population that has been divided into groups or strata

57
Q

Cluster Sampling

A

Random sampling method of selecting elements from larger to smaller subsets of an accessible population; multistage sampling

58
Q

Systematic random sampling:

A

Sampling method in which every kth element is selected from a numbered list of all elements in the accessible population; the starting point on the list is randomly selected
k = size of sampling frame / size of sample
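Worked example (hypothetical numbers): with a sampling frame of 500 patients and a desired sample of 50, k = 500/50 = 10; randomly select a starting point between 1 and 10 (say, 7), then choose elements 7, 17, 27, and so on.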

59
Q

Nonprobability Sampling Methods:

A

Sampling methods that do not require random selection of elements
Convenience sampling
Quota sampling
Purposive sampling

60
Q

Convenience sampling:

A

Nonprobability sampling method in which elements are selected because they are easy to access

61
Q

Quota sampling:

A

Nonprobability sampling method involving selection of elements from an accessible population that has been divided into groups or strata

62
Q

Purposive sampling:
Snowball:

A

Nonprobability sampling method used in qualitative studies to select a distinct group of individuals who either have lived the experience or have expertise in the event or experience being studied; sampling method to recruit specific persons who could provide inside information
Snowball sampling: Recruitment of participants based on word of mouth or referrals from other participants.

63
Q

Levels of Measurement

A

A system of classifying measurements according to a hierarchy of measurement and the types of statistical tests that are appropriate; levels are nominal, ordinal, interval, and ratio

64
Q

Nominal:

A

The lowest level of measurement whereby data are categorized simply into groups; categorical data
Are you having pain? Yes? No?

65
Q

Ordinal:

A

A continuum of numeric values where the intervals are not meant to be equal; organizes attributes by rank
How much pain are you experiencing? Mild. Moderate. Severe.

66
Q

Interval:

A

A continuum of numeric values with equal intervals that lacks an absolute zero (temperature)

67
Q

Ratio:

A

The highest level of measurement that involves numeric values that begin at absolute zero and have equal intervals (height, weight, time)

68
Q

Validity
Content (and face)

Criterion-related (concurrent, predictive)

Construct (hypothesis testing, convergent, divergent, multitrait-multimethod, Known groups, Factor analysis)

Study validity

A

The degree to which an instrument measures what it is supposed to measure
Content validity: Is the content representative of the content domain under study?
- Face validity: Colleagues or subjects examine an instrument and are asked whether it appears to measure the concept.
- Content validity: Experts about the topic are asked to judge each item on an instrument by assigning a rating to determine its fit with the concept being measured.

Criterion-related validity: To what degree are the “observed score” and the “true score” related?
- Concurrent validity: New instrument is administered at the same time as an instrument known to be valid. Scores of the two instruments are compared. Strong positive correlations indicate good validity.
- Predictive validity: New instrument is given at time 1 and scores are used to predict a different outcome in the future (time 2). Strong positive correlations indicate good validity.

Construct validity: To what extent does the instrument measure the theoretical construct or trait?
- Hypothesis Testing: Hypotheses derived from theories are tested with the new instrument.
- Convergent: New instrument is administered at the same time as an instrument known to be valid. Scores of the two instruments are compared. Strong, positive correlations indicate good validity.
- Divergent: New instrument is administered at the same time as an instrument measuring the opposite of the concept. Scores of the two instruments are compared. Strong negative correlations indicate good validity.
- Multitrait-multimethod: New instrument, established instrument of the same concept, and established instrument of the opposite concept are given at the same time. Strong positive and negative correlations indicate good validity.
- Known groups: New instrument is administered to individuals known to be high or low on the characteristic being measured.
- Factor analysis: Statistical approach to identify items that group together.

Study validity: Ability to accept results as logical, reasonable, and justifiable based on the evidence presented

69
Q

Statistical conclusion validity:
Internal validity:
Construct validity
External validity:
Threats to each

A

*Statistical conclusion validity: The degree that the results of the statistical analysis reflect the true relationship between the independent and dependent variables.
- Low statistical power
- Low reliability of the measures
- Lack of reliability of treatment implementation
*Internal validity: The degree to which one can conclude that the independent variable produced changes in the dependent variable.
- Selection Bias
- History
- Maturation
- Testing
- Instrumentation
- Mortality (attrition)
*Construct validity: The degree to which the instruments used accurately measure the theoretical concepts; validity is threatened when they do not.
- Inadequately defined constructs
- Bias
- Confounding
- Reactivity
- Experimenter expectancies
*External validity: The degree to which the results of the study can be generalized to other participants, settings, and times
- Effects of selection
- Interaction of treatment and selection of subjects
- Interaction of treatment and setting
- Interaction of treatment and history

70
Q

Interpreting p-value based on Alpha

A

p ≤ 0.05 (alpha) indicates strong evidence against the null hypothesis, so you reject it (statistically significant)
p > 0.05 indicates weak evidence against the null hypothesis, so you fail to reject it (not statistically significant)
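Worked example (hypothetical values): with alpha set at 0.05, a reported p = 0.03 leads to rejecting the null hypothesis (statistically significant), whereas p = 0.20 leads to failing to reject it (not statistically significant).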

71
Q

Data Saturation Definition

A

In qualitative research, the time when no new information is being obtained and repetition of information is consistently heard

72
Q

Qualitative Data Analysis:
Coding: open and axial
Data Reduction:

A

Qualitative data analysis: The production of knowledge that results from analysis of words
Coding: Assignment of labels to each line of transcript in qualitative analysis
- Open coding: The grouping of qualitative data into categories that seem logical
- Axial coding: The analysis of categories and labels after completion of open coding
Data reduction: The simplification of large amounts of data obtained from qualitative interviews or other sources

73
Q

Strategies to Establish Trustworthiness of a Study
Credibility: truth or believability of findings

A

Use of well-established research methods
Prolonged engagement
Triangulation
- Use of different research methods in qualitative research to gather and compare data
Thick description
Detailed interviews
Data saturation
Peer debriefing
- A technique used in qualitative research in which the researcher enlists the help of another person, who is a peer, to discuss the data and findings
Member checks
- A strategy used in qualitative studies when the researcher goes back to participants and shares the results with them to ensure the findings reflect what participants said
Constant comparison
Negative case analysis
- A qualitative strategy involving the analysis of cases that do not fit patterns or categories
Reflexivity (reflective journaling)
Persistent observation:
- When the researcher has spent sufficient quality time with participants while attempting to describe and capture the essence of the phenomenon.
Referential adequacy:
- A technique used in qualitative research in which multiple sources of data are compared and the findings hold true

74
Q

Strategies to Establish Trustworthiness
Transferability: application of findings to a different situation

A

Clear explanation of the boundaries/limitations of the study
Thick description
Checking for representativeness of the data
Audit trail

75
Q

Strategies to Establish Trustworthiness
Dependability: findings are reflective of data

A

Audit trail
Peer debriefing
Coding checks that show agreement
Uniformity of responses across subjects
Ability to relate previous research findings to the current study

76
Q

Strategies to Establish Trustworthiness
Confirmability: findings can be substantiated by participants

A

Audit trail
Peer debriefing
Member checks
Self-reflection of the researcher evidenced by journals

77
Q

Impetus for Change

A

New Knowledge
Safety Concerns
- Sentinel Events: A patient safety event resulting in death, permanent harm, or severe temporary harm
Healthcare outcomes
- Structural measures evaluate how care is organized.
- Indicators: Quantitative criteria used to measure outcomes (also referred to as measures)
- Process measures are used to evaluate activities that were performed with respect to patient care.
- Outcome measures are the consequences of the health care provided.
Healthcare costs
Societal need for professional nursing
Conferring With Others

78
Q

Healthcare outcomes
Structural, process, outcomes examples

A

Structural measures evaluate how care is organized.
- Nurse turnover
- RN education/specialty certification
- RN survey options
- Staffing and skill mix
Process measures are used to evaluate activities that were performed with respect to patient care.
- Patient falls
- Pressure injuries
- RN survey options
Outcome measures are the consequences of the health care provided.
- Patient falls
- Catheter-associated bloodstream infections (CABSIs)
- Ventilator-associated pneumonia (VAP)
- Pediatric peripheral intravenous infiltrations
- Pressure injuries
- Assaults by psychiatric patients
- RN survey options
- Catheter-associated urinary tract infections (CAUTIs)

79
Q

Overcoming Barriers to Change

A

Individual barriers can be due to a nurse’s lack of knowledge, skills, time, and confidence.
- Strategies need to be aimed at instilling an appreciation for EBP, increasing knowledge, developing necessary skills, and changing behaviors
Efforts to implement EBP can be thwarted by a lack of organizational support. Sometimes organizational culture and philosophy may promote the status quo and oppose any need for change
- Administrators and clinical leaders can promote improvements in patient care by motivating staff to adopt EBP and removing barriers.
- Communicating expectations for implementation of EBP is an important responsibility of administrators, because they set the tone of the organizational culture
- Organizations can promote a “culture of safety” in which reporting of errors, adverse events, and injuries is encouraged by consistent and helpful responses without fear of punishment or embarrassment
Interprofessional education (IPE)
Research-related barriers are those barriers that are inherent in research that make it difficult for nurses to understand or implement evidence.
Ensure that researchers disseminate findings to practitioners.
Write articles in user-friendly styles.
Display posters in high-traffic areas.
Present research findings at nursing grand rounds or in organizational newsletters.
Include staff nurses in the research process.

80
Q

Differentiating EBP from Research

A

Research
- Generates new knowledge
- Fills gap in literature
- Research question
- Participants
- Designed to describe a phenomenon, find a relationship, or test an intervention
- Analysis of data
- Evaluates findings in light of research question
EBP
- Applies new knowledge to point of care
- Based on evidence in literature
- Clinical question
- Patients
- Designed to change practice in clinical setting
- Analysis of data
- Evaluates practice change by measuring patient outcomes

81
Q

Quantitative vs Qualitative designs
Mixed methods

A

Quantitative research: Research that uses numbers to obtain precise measurements
- Philosophical perspective: One reality that can be objectively viewed by the researcher
- Primarily deductive reasoning
- Role of researcher: controlled and structured
- Strategies: control and manipulation of situations, analysis of numbers with statistical tests, large number of participants
- Examples: Randomized controlled trial, quasi-experimental, correlational, descriptive survey
Qualitative research: Research that uses words to describe human behaviors
- Philosophical perspective: Multiple realities that are subjective, occurring within the context of the situation
- Primarily inductive reasoning
- Role of researcher: Participative and ongoing
- Strategies: naturalistic, analysis of words to identify themes, smaller numbers of participants
- Examples: phenomenological, ethnographic, grounded theory, historical
Mixed Methods: A design that combines both quantitative and qualitative data gathering and evaluation

82
Q

Guidelines for Conducting Ethical Research
Human rights:
Nuremberg Code:
Declaration of Helsinki:
Informed consent:
American Nurses Association (ANA):

A

Human rights: Rights (such as freedom from unlawful imprisonment, torture, and execution) regarded as belonging fundamentally to all persons
International standards
- Nuremberg Code: Ethical code of conduct for research that uses human participants
- Declaration of Helsinki: An international standard providing physician guidelines for conducting biomedical research
Informed consent: An ethical practice requiring researchers to obtain voluntary participation by participants after they have been informed of possible risks and benefits
American Nurses Association (ANA):
- The Nurse in Research: ANA Guidelines on Ethical Values
- Rights of human participants: (1) right to freedom from harm, (2) right to privacy and dignity, and (3) right to anonymity.

83
Q

Institutional Review Boards (IRBs):
Kinds of review

A

Institutional Review Boards (IRBs): Committees that review research proposals to determine whether research is ethical
Two kinds of review
*Full review: necessary when vulnerable populations are involved or when risks are not minimal
*Expedited review: an option when there is minimal risk to human subjects
- Minimal risk: probability and magnitude of anticipated harm or discomfort are not greater in and of themselves than those ordinarily encountered in daily life or during performance of routine physical/psychological exams or tests
Exempt studies
*Applies to certain low-risk studies
*Negates need for informed consent but IRB approval still needed

84
Q

Principles of Ethical Research (Understanding and Applying)
Belmont Report

A

Belmont Report (1979): A report outlining three major principles foundational for the conduct of ethical research with human participants
- Respect for persons: Principle that individuals should be treated as autonomous and that those with diminished autonomy are entitled to protection
- Beneficence: The principle of doing good
- Justice: The principle of equity or fairness in the distribution of burdens and benefits

85
Q

Based on three components, EBP is a lifelong problem-solving approach for addressing clinical practice problems. What are these three components?

A

(1) critical appraisal of relevant research
(2) the clinician’s expertise
(3) the patient’s preferences and values

86
Q

Why Appraising Evidence is Important

A
  • Multiple types of evidence should be gathered during a literature review to ensure that the innovation is safe and effective.
  • Without sufficient evidence about the efficacy of an innovation, changes to practice are unwarranted and further research is needed.
  • Critical appraisal involves judging whether evidence is free of bias and evaluating the strengths and limitations of the study methods. Without this stringent evaluation, nurses could potentially implement practice changes based on unsound evidence.
  • The overall strength of the evidence is the level of evidence plus the quality of evidence
87
Q

Appraising Evidence for Strength

A

The use of evidence hierarchies
- Study limitations: The degree to which included studies examining a specific outcome have adequate protection against bias.
- Directness: The degree to which evidence links interventions directly to a health outcome of importance and whether comparative studies are based on head-to-head studies.
- Consistency: The degree to which included studies find the same outcome or a similar magnitude of effect.
- Precision: The degree of certainty surrounding an estimated effect with respect to a given outcome.
- Reporting bias: Selective publishing or reporting of research findings based on the favorability of study direction or magnitude of effect.

88
Q

Knowledge translation

A

Knowledge translation: A mutually collaborative process that includes synthesis, dissemination, exchange, and ethically sound application of knowledge to improve nursing practice and patient outcomes
Six attributes
- Collaboration
- Action
- Receptivity
- Process
- Translation
- Improved health outcomes

89
Q

Lewin’s Change Theory

A

An early change theory involving unfreeze–change–refreeze
Unfreeze
- Determine what needs to change
- Ensure support from administration
- Create the need for change
Change
- Communicate with staff
- Promote and empower action
Refreeze
- Link new changes to organizational culture
- Identify and implement strategies to sustain the change over the long term
- Offer training and support
- Celebrate successes

90
Q

Transtheoretical model:

A

A model explaining how individuals change their behaviors that includes six stages of change: precontemplation, contemplation, preparation, action, maintenance (sustained change for 6 months), and termination

91
Q

Kotter’s eight stages of change model

A

A model of organizational change that has eight stages
1. Establish urgency about the need to achieve change
2. Create a guiding coalition
3. Develop vision of change
4. Communicate the vision to staff
5. Empower broad-based action
6. Generate short term wins
7. Consolidate gains and produce more change
8. Anchor new approaches in the organizational culture

92
Q

Importance of Sample size (understanding generalizability)
- Power analysis
- Significance level
- Effect size
- Type 1 error
- Homogeneous/heterogeneous

A
  • Generalizability: The applicability of the results of a study to the target population; external validity
  • The degree to which the results of the study can be generalized to other participants, settings, and times.
  • In general, the larger the sample, the more representative it will be.
    *Typically, as the number of elements in the sample increases, the closer the mean of the sample is to the mean of the population.
  • The best way for researchers to estimate sample size is to conduct a power analysis, the most accurate method for determining sample size (see the worked example at the end of this card).
  • Power analysis: A statistical method to determine the acceptable sample size that will best detect the true effect of the independent variable
  • To conduct a power analysis, two factors must be established: significance level and effect size
  • Significance level: The alpha level established before beginning a study (p = .05)
  • Effect size: An estimate of how large a difference will be observed between the groups
  • Some researchers use p = .01 as the significance level, which reduces the risk of a type 1 error.
  • Type 1 error: When the researcher rejects the null hypothesis when it is actually true
  • Another important factor to consider when determining sample size in quantitative studies is the degree to which elements are homogeneous
  • Homogeneous: The degree to which elements are similar or alike
    *Heterogeneous: The degree to which elements are diverse or not alike. In this case, the sample size will need to be larger than when the population is homogeneous.
    *When using terms such as homogeneous and heterogeneous, researchers are describing within-group characteristics. In contrast, representativeness describes between-group characteristics, comparing the sample with the population it came from
    *Attrition rate: Dropout rate; loss of participants before a study is completed; threat of mortality
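  • Worked example (hypothetical values, not from the text): for a two-group comparison with alpha = .05 (two-tailed), power = .80, and a medium effect size (d = 0.5), a power analysis yields roughly 64 participants per group; a smaller expected effect size would require a larger sample.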
93
Q

Collecting Quantitative Data:
Observation:
Questionnaires:
Scales:
Likert scales:
Visual analog scale (VAS):
Physiological measures:

A
  • Observation: A technique of carefully observing phenomena using the five senses to gather data
  • Questionnaires: Instruments consisting of a series of questions designed to gather data from participants
  • Items: Questions on a survey or questionnaire
  • Scales: Used to assign a numeric value or score on a continuum
  • Likert scales: Ordinal-level scales containing seven points on an agree or disagree continuum
  • Visual analog scale (VAS): A ratio-level scale of a 100-millimeter line anchored on each end with words or symbols
  • Physiological measures: Data obtained from the measurement of biological, chemical, and microbiological phenomena
  • Electronic medical records
94
Q

Collecting Qualitative Data:
Participant observation
Covert observation
Interviews
Focus groups
Storytelling

A
  • Participant observation: Role of the researcher in qualitative methods when the researcher is not only an observer but also a participant during data collection
  • Covert observation: When individuals are unaware that they are being observed
  • Interviews: Conversations for collecting data where questions are asked to elicit information; can be done in-person or through a variety of media
  • Focus groups: A strategy to obtain data from a small group of people using interview questions
  • Storytelling: A method of data collection associated with qualitative methods in which researchers and participants tell their stories about the phenomenon of interest