AOR 4: Eval and Research Flashcards

(62 cards)

1
Q

dissemination

A

the process of communicating procedures, findings, or lessons learned from an evaluation to relevant audiences in a timely, unbiased, and consistent fashion

2
Q

true experimental designs

A

include manipulation of at least one independent variable and the research participants are randomly assigned to either the experimental or control group arms of the trial

3
Q

decision-making components

A

based on four components designed to provide the user with the context, input, process, and product information with which to make decisions

4
Q

outcome evaluation

A

focused on the ultimate goal, product, or policy and is often measured in terms of health status, morbidity, and mortality

5
Q

convenience sampling

A

selection of individuals or groups who are available

6
Q

descriptive statistics

A

show what the data reveal and provide simple summaries about the sample's measures
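
The simple summaries above can be sketched with Python's standard library; the score list is invented for illustration.

```python
# Descriptive statistics for an invented sample of scores.
import statistics

scores = [72, 85, 90, 66, 78, 85, 91, 70]

print("n      =", len(scores))
print("mean   =", statistics.mean(scores))    # 79.625
print("median =", statistics.median(scores))  # 81.5
print("mode   =", statistics.mode(scores))    # 85
print("stdev  =", round(statistics.stdev(scores), 2))
```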

7
Q

reliability

A

the consistency, dependability, and stability of the measurement process

8
Q

descriptive analysis

A

designed to describe phenomena specific to a population using descriptive statistics such as raw numbers, percentages, and ratios (exploratory)

9
Q

quasi-experimental designs

A

include manipulation of at least one independent variable and they may contain a comparison group; however, due to ethical or practical reasons, random assignment of participants does not occur

10
Q

test-retest reliability

A

evidence of stability over time
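
Stability over time is often quantified as the correlation between two administrations of the same instrument; a minimal Python sketch with invented scores:

```python
# Test-retest reliability as the Pearson correlation between
# two administrations of the same instrument (scores are invented).
import math

time1 = [10, 12, 9, 15, 11, 14]
time2 = [11, 12, 10, 14, 11, 15]

def pearson_r(x, y):
    """Pearson correlation coefficient between paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print("test-retest r =", round(pearson_r(time1, time2), 3))
```

A value close to 1 indicates stable scores between the two administrations.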

11
Q

nonexperimental designs

A

cross-sectional in nature and do not include manipulation of any kind

12
Q

process questions

A

help the evaluator understand phenomena, such as internal and external forces that affect program activities

13
Q

purposive sampling

A

researcher makes judgments about who to include in the sample based on study needs

14
Q

stratified multistage cluster sampling

A

in several steps, a variable of interest is used to split the sample, and then groups are randomly selected from this sample

15
Q

impact evaluations

A

immediate and observable effects of a program leading to the desired outcomes

16
Q

propriety

A

behave legally, ethically, and with due regard for the welfare of those involved and those affected

17
Q

research

A

organized process in which a researcher uses the scientific method to generate new knowledge

18
Q

list of factors that affect program decisions

A

-political environment
-cultural barriers
-funding limitations
-shifting and variable leadership priorities

19
Q

steps in evaluation practice

A
  1. engage stakeholders
  2. describe the program
  3. focus the evaluation design
  4. gather credible evidence
  5. justify conclusions
  6. ensure use and share lessons learned
20
Q

unit of analysis

A

what or who is being studied or evaluated

21
Q

utilization-focused

A

accomplished for and with a specific population

22
Q

criterion validity

A

refers to a measure's correlation with another measure of the same variable

23
Q

steps involved in qualitative data analysis

A
  1. data reduction (selecting, transforming, focusing, and condensing data)
  2. data display (creating an organized way of arranging data through a diagram or chart)
  3. conclusion drawing and verification (data is revisited multiple times to verify, test, or confirm patterns and themes)
24
Q

stratified random sampling

A

the sample is split into groups based on a variable of interest, and an equal number of potential participants from each group are selected randomly
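
The procedure can be sketched in Python; the population frame and the stratum variable ("site") are invented for illustration.

```python
# Stratified random sampling: split the frame on a variable of interest,
# then randomly draw an equal number of participants from each stratum.
import random

random.seed(42)  # seeded only so the illustration is reproducible

population = [{"id": i, "site": "urban" if i % 2 else "rural"}
              for i in range(100)]

def stratified_sample(frame, key, per_stratum):
    strata = {}
    for person in frame:
        strata.setdefault(person[key], []).append(person)
    sample = []
    for members in strata.values():
        sample.extend(random.sample(members, per_stratum))
    return sample

sample = stratified_sample(population, "site", per_stratum=5)
print(len(sample))  # 10: 5 from each of the 2 strata
```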

25
Q

attainment

A

focused on the program objectives and the program goals

26
Q

feasibility

A

be realistic, prudent, diplomatic, and frugal

27
Q

variables

A

operational forms of a construct

28
Q

five elements for critically ensuring the use of an evaluation

A
  1. design
  2. preparation
  3. feedback
  4. follow-up
  5. dissemination

29
Q

rater reliability

A

addresses the differences among scorers of items and controls for variation due to error introduced by rater perceptions

30
Q

systematic random sampling

A

an inclusive list of the priority population is used, and starting with a random number, every nth potential participant is selected
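
The every-nth selection can be sketched in a few lines of Python; the population list and interval are invented for illustration.

```python
# Systematic random sampling: from an inclusive list, pick a random
# start within the first interval, then take every nth member.
import random

frame = list(range(1, 101))  # invented priority-population list of 100 IDs
n = 10                       # sampling interval

start = random.randrange(n)  # random start within the first interval
sample = frame[start::n]     # every nth member thereafter
print(len(sample))           # 100 / 10 = 10 selections
```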
31
Q

quota sampling

A

selecting individuals who have a certain characteristic, up to a certain number

32
Q

network sampling

A

respondents identify other potential participants who might have the desired characteristics for the study

33
Q

goal-free

A

not based on goals; the evaluator searches for all outcomes, including unintended positive and negative side effects

34
Q

summative evaluation

A

associated with measures or judgments that enable the investigator to draw conclusions; encompasses impact and outcome evaluations

35
Q

utility

A

serve the information needs of intended users

36
Q

multivariate outliers

A

unusual combinations of scores on different variables

37
Q

data analysis plan

A

used to detail how data will be scored and coded, how missing data will be managed, and how outliers will be handled

38
Q

nominal scores

A

cannot be ordered hierarchically but are mutually exclusive (e.g., male and female)

39
Q

logic model

A

takes a variety of forms but generally depicts aspects of a program such as inputs, outputs, and outcomes

40
Q

formative evaluation

A

process that evaluators or researchers use to check the ongoing progress of the evaluation from the planning phase through implementation

41
Q

steps in data analysis and synthesis

A
  1. enter data into the database and check for errors
  2. tabulate the data
  3. analyze and stratify the data
  4. make comparisons

42
Q

systems analysis

A

based on efficiency; cost-benefit or cost-effectiveness analysis is used to quantify the effects of a program

43
Q

ratio scores

A

represent data with common measurements between each score and a true zero (e.g., number of staff)

44
Q

limitations

A

phenomena the evaluator or researcher cannot control that place restrictions on methods and, ultimately, conclusions

45
Q

data screening

A

may include assessing the accuracy of data entry, deciding how outliers and missing values will be handled, and checking whether statistical assumptions are met
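
One common screening step, flagging univariate outliers by z-score, can be sketched in Python; the data values and the 2.5 cutoff are invented for illustration (cutoffs vary by convention).

```python
# Data screening sketch: flag values whose z-score exceeds a cutoff.
import statistics

values = [4, 5, 5, 6, 4, 5, 6, 5, 4, 40]  # 40 is a deliberate entry error

mean = statistics.mean(values)
sd = statistics.stdev(values)
outliers = [v for v in values if abs(v - mean) / sd > 2.5]
print(outliers)  # [40]
```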
46
Q

analytic analysis

A

explanatory; both descriptive statistics and inferential statistics may be used to explain the phenomenon

47
Q

inferential statistics

A

used when researchers or evaluators wish to draw conclusions about a population from a sample
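
One inferential tool, a rough 95% confidence interval for a population mean, can be sketched with the standard library; the sample data are invented, and the normal approximation (z = 1.96) is assumed rather than a t distribution.

```python
# Rough 95% confidence interval for a population mean, estimated
# from an invented sample (normal approximation, z = 1.96).
import math
import statistics

sample = [21, 25, 19, 30, 22, 28, 24, 26, 23, 27]

mean = statistics.mean(sample)
se = statistics.stdev(sample) / math.sqrt(len(sample))
low, high = mean - 1.96 * se, mean + 1.96 * se
print(f"95% CI for the population mean: ({low:.2f}, {high:.2f})")
```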
48
Q

accuracy

A

reveal and convey technically accurate information

49
Q

naturalistic

A

focused on qualitative data; responsive information from participants in a program is used; most concerned with narrative explaining "why" behavior did or did not change

50
Q

multistage cluster sampling

A

in several steps, groups are selected using cluster sampling

51
Q

delimitations

A

decisions made by an evaluator or researcher that ought to be mentioned because they help identify the parameters and boundaries set for a study; often involve narrowing a study by geographic location, time, and population traits

52
Q

evaluation

A

a series of steps that evaluators use to assess a process or program to provide evidence and feedback about the program

53
Q

cluster sampling

A

when naturally occurring groups are selected instead of individuals

54
Q

ordinal scores

A

do not have a standard unit of measurement between them but are hierarchical (e.g., grades)

55
Q

data management plan

A

a set of procedures for determining how the data will be transferred from the instruments used in the research to the data analysis software

56
Q

process evaluation

A

any combination of measures that occurs as the program is implemented to assure or improve the quality of performance or delivery

57
Q

simple random sampling

A

an inclusive list of the priority population is used to randomly select a certain number of potential participants from the list
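
This is the most direct method to sketch in Python; the population list and sample size are invented for illustration.

```python
# Simple random sampling: draw k participants at random, without
# replacement, from an inclusive list of the priority population.
import random

frame = [f"participant_{i}" for i in range(1, 201)]  # invented list of 200
sample = random.sample(frame, k=20)                  # 20 random selections

print(len(sample), len(set(sample)))  # 20 unique picks, no repeats
```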
58
Q

interval scores

A

have common units of measurement between scores, but no true zero (e.g., temperature)

59
Q

content/face validity

A

a concept that involves whether the instrument's items measure the relevant areas of interest

60
Q

vulnerability

A

a weakness in a system design resulting from inadequate risk management and testing

61
Q

Kirkpatrick's four-level training evaluation method

A

reaction, learning, behavior, results

62
Q

construct validity

A

ensures that the concepts of an instrument relate to the concepts of a particular theory