Evaluation Flashcards

1
Q

Evaluation

A

Describing the object of interest so its worth and merit can be judged

2
Q

Worth

A

Whether or not a program is needed

3
Q

Merit

A

Whether or not a program is good

4
Q

5 things that evaluations help planners with

A

Make decisions based on systematically collected information
See if goals and objectives have been met
See if it is being implemented as intended
See if changes need to be made along the way
Develop cost effective strategies

5
Q

2 critical purposes of evaluation

A

Assessing and improving quality

Determining program effectiveness

6
Q

6 reasons stakeholders want evaluation

A
Determine achievement of objectives
Improve implementation
Make people accountable for their roles
Raise community support
Add to scientific knowledge and literature
Inform policy decisions
7
Q

When should the evaluation begin

A

When goals and objectives are developed

8
Q

Who does the evaluation

A

Collaborative effort among stakeholders

9
Q

5 threats to good evaluations

A
Fear of cuts if results aren't good
Political skewing
Not reporting appropriate data
Political ramifications of truthful results
Implications for agencies
10
Q

Evaluation during initial planning

A

Needs assessment with market evaluation

11
Q

Evaluation during development

A

Formative evaluation with market testing

12
Q

Formative evaluation

A

Saves money by determining what people will actually use

13
Q

Evaluation during early implementation

A

Implementation evaluation

14
Q

Implementation evaluation

A

Extent to which the program conforms to its original plan

15
Q

Evaluation during routine operation

A

Process evaluation

16
Q

Process evaluation

A

Appraisal of program delivery and usage under normal operation

17
Q

Evaluation during stable operation

A

Outcome evaluation

18
Q

Outcome evaluation

A

Appraisal of the program's impact on clients, in relation to their level of participation and baseline characteristics, to see whether long-term objectives were met

19
Q

10 steps in evaluation

A
Clarify your program
Engage stakeholders
Assess resources
Design evaluation
Determine methods of measurement and procedures
Develop workplan, budget and timeline
Data collection
Data analysis
Interpret results
Take action
20
Q

3 issues with asking participants if they benefited from the program

A

Retrospective reporting bias
Assumption that change could only occur with intervention
Social desirability bias

21
Q

Standards of acceptability

A

Minimum levels of effectiveness and benefits used to judge value

22
Q

Formative Evaluation

A

Relates to assessing and improving quality

23
Q

Summative Evaluation

A

Determining effectiveness

24
Q

Difference between formative and process evaluation

A

Formative focuses on improving quality during implementation, while process measures the degree to which the program was successfully implemented

25
Q

2 parts of summative evaluation

A

Impact evaluation

Outcome evaluation

26
Q

Impact evaluation

A

Focus on intermediary measures such as change in behaviour or attitude

27
Q

6 steps in CDC framework

A
Engage stakeholders
Describe program
Focus evaluation design
Gather credible evidence
Justify conclusions
Ensure use and share lessons learned
28
Q

4 CDC standards of evaluation

A

Utility- Information needs of users are satisfied
Feasibility- Realistic and affordable
Propriety- Ethical
Accuracy- Findings are technically accurate

29
Q

Baseline data

A

Reflecting initial status of population

30
Q

Designs

A

Used in summative evaluation

31
Q

Focus of formative evaluations

A

Quality of program content and implementation

32
Q

When does formative evaluation occur

A

From inception of program through implementation

33
Q

Cost identification analysis

A

Compares the costs of different interventions available for a program

34
Q

Cost effectiveness analysis

A

Quantify effects of a program in monetary terms

35
Q

Multiplicity

A

Multiple component programs cater more effectively to varied needs

36
Q

Adjustment

A

Planners make necessary changes based on feedback

37
Q

6 components of process evaluation

A
Fidelity
Dose
Recruitment
Reach
Response
Context
38
Q

Fidelity

A

Programs are implemented as intended

39
Q

Dose

A

Number of program units delivered

40
Q

Recruitment

A

Degree that population is recruited for participation

41
Q

Reach

A

Proportion of population given opportunity to participate

42
Q

Response

A

Proportion of the population actually participating

43
Q

Context

A

External factors that may influence results

44
Q

Pretesting

A

Testing components of a program and collecting baseline data

45
Q

Quantitative method

A

Deductive in nature and produces numeric data

46
Q

Qualitative method

A

Inductive in nature and produces narrative data

47
Q

McLeroy model 1

A

Qualitative methods are used to help develop quantitative methods

48
Q

McLeroy model 2

A

Qualitative results are used to help interpret a quantitative evaluation

49
Q

McLeroy model 3

A

Quantitative results are used to help interpret qualitative results

50
Q

McLeroy model 4

A

Qualitative and quantitative are used equally

51
Q

Posttest

A

Measurement after completion of the program

52
Q

Quasi experimental design

A

Interpretable and supportive evidence of program effectiveness, but cannot control confounding factors

53
Q

Non experimental design

A

Does not use comparison or control groups and has little control over confounding factors

54
Q

Internal validity

A

Degree to which change that was measured can be attributed to the program

55
Q

External validity

A

Extent to which program can be expected to produce similar effects in other populations