Exam 2 Chapter 4 Flashcards

(27 cards)

1
Q

Program Process Theory (process evaluation builds on this)

A

Identifies the critical components, functions, and relationships assumed necessary for the program to be effective

2
Q

Criteria for assessing program process performance may include:

A

Stipulations from the program theory
Administrative standards
Applicable legal, ethical, or professional standards
After-the-fact judgment calls

3
Q

Process evaluations can be stand-alone or ongoing; the ongoing form is called:

A

Program Process Monitoring

4
Q

Process evaluation is usually carried out along with this other evaluation:

A

Impact Evaluation

5
Q

Process Evaluation (slide show definition)

A

Information about what the program or program staff have to do to create change

Activities
Attendance
Satisfaction
Fidelity

6
Q

Outcome Evaluation (slide show definition):

A

Information about the changes the program seeks to create in or for clients

Knowledge
Attitudes
Behaviors
Changes in health status

7
Q

What are Process and Outcome measuring?

A

Change = Outcome

Activities that Cause Change = Process

8
Q

Types of Process Evaluation

A

Snapshot Approach
Program Process Monitoring

9
Q

Type of Process Eval: Snapshot approach

A

Time-limited

Done primarily for evaluation purposes

10
Q

Type of Process Eval: Program process monitoring

A

Ongoing

Done for both management and evaluation purposes

11
Q

Why monitor program process?

A

Confirm that the program is reaching its intended targets

Understand outcome/impact data

Improve the program even without outcome data

Facilitate program management – are administrative standards being met?

12
Q

Components of Comprehensive Process Evaluation

A

Fidelity
Dose Delivered
Dose Received
Satisfaction
Reach
Recruitment

13
Q

Components of Comprehensive Process Evaluation: Fidelity

A

Fidelity

Delivered as designed

14
Q

Components of Comprehensive Process Evaluation: Dose Delivered

A

Dose Delivered

Amount of program delivered

15
Q

Components of Comprehensive Process Evaluation: Dose Received

A

Dose Received

Actual use of program

16
Q

Components of Comprehensive Process Evaluation: Reach

A

Reach

Proportion of intended targets served

17
Q

Components of Comprehensive Process Evaluation: Satisfaction

A

Satisfaction

Participant satisfaction with the program and staff

18
Q

Components of Comprehensive Process Evaluation: Recruitment

A

Recruitment

Procedures used to approach and attract participants

19
Q

Main components of process evaluation

A

Service Utilization
Organizational Function

20
Q

Main components of process evaluation:
Service Utilization

A

Service Utilization
Who is the program serving?

21
Q

Other Aspects of Service Utilization

A

Coverage

Is program participation reaching intended levels?
How many people are we serving?

Bias

Is the program serving the types of people it was designed to serve?

22
Q

Skimming (related to Service Utilization)

A

Systematically eliminating clients who are difficult to serve

Must be distinguished from legitimate efforts to refer clients who cannot be successfully served by the program

23
Q

Creaming

A

Systematically including clients who would be easy to serve

24
Q

Monitoring Organizational Functioning

A

Is the program actually delivering the intended services? Are the services being delivered according to plan?

25
Q

Implementation Failure (related to monitoring organizational functioning)

A

Intervention not delivered
Wrong intervention is delivered
Intervention is completely unstandardized

26
Q

How Do We Collect Process Data?

A

Use existing program records
Develop simple record-keeping instruments or databases
MIS systems
Client or Community Surveys (satisfaction, awareness, utilization)

27
Q

Process Evaluation or Monitoring Plans must be designed to:

A

Avoid being overly punitive to staff
Avoid encouraging dishonesty or unwanted changes in staff behavior