Evaluation Flashcards

1
Q

Most commonly cited reasons not to evaluate

A

Lack of budget and lack of time

2
Q

How to evaluate with limited budget

A

Piggyback studies, secondary analysis, quick-tab polls, internet surveys, or intercept interviews

3
Q

Practitioners’ readiness list for the evaluation process?

A
  1. Understand communication, media effects theory, and audience effects
  2. Understand the difference between outputs and outcomes
  3. Articulate SMART objectives
  4. Be numerate as well as rhetorical
4
Q

What are the basic questions of evaluation?

A
  1. What is the extent and distribution of the target problem?
  2. Does the program conform with intended goals?
  3. What are projected or existing costs?
  4. Is the program reaching target populations?
  5. Are intervention efforts being conducted?
  6. Is the program effective in achieving goals?
  7. Can the results be explained by an alternate process?
  8. Is the program having unintended impacts?
  9. What are the costs?
  10. Is the program using resources efficiently?
5
Q

Evaluation Research Steps

A
  1. Establish agreement on the uses and purposes of evaluation
  2. Secure commitment to evaluate
  3. Develop consensus on using evaluation research
  4. Write objectives in measurable terms
  5. Select the most appropriate criteria for evaluation
  6. Determine the best way to gather evidence
  7. Keep program records
  8. Use evaluation findings to manage the program
  9. Report results to management
  10. Add to professional knowledge
6
Q

Watson’s Unified Evaluation Model

A

Four stages - inputs, outputs, impact and effects

7
Q

Elements of evaluation

A
  1. Preparation/inputs
  2. Implementation/outputs
  3. Impact/outcomes/effects
8
Q

Preparation evaluation

A

Assesses the quality and adequacy of the information used to develop strategy and tactics

9
Q

Implementation evaluation

A

Monitors effort and progress as the program unfolds

10
Q

Impact evaluation

A

Documents the consequences of the program and provides feedback on the extent to which objectives and goals were achieved

11
Q

Preparation Criteria and Methods

A
  1. Information Base - adequacy of the background information
  2. Program Content - organization and appropriateness of program and message content
  3. Presentation Quality - packaging of information; technical and production values
12
Q

Readability and listenability

A

Readability is the approximate ease with which printed material can be read and comprehended; listenability is the equivalent for spoken material.
ELF (Easy Listening Formula) scores measure listenability and correlate with readability scores.
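
A minimal Python sketch of the ELF (Fang's formula scores each sentence by the number of syllables above one per word; sentences scoring roughly under 12 are commonly treated as easy listening). The syllable counter here is a crude vowel-group heuristic, so scores are approximate:

    import re

    def count_syllables(word: str) -> int:
        # Rough approximation: count groups of consecutive vowels
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def elf_score(sentence: str) -> int:
        # ELF = syllables in excess of one per word in the sentence
        words = re.findall(r"[A-Za-z']+", sentence)
        return sum(count_syllables(w) - 1 for w in words)

    print(elf_score("The cat sat on the mat."))  # 0: all one-syllable words
    print(elf_score("Communication necessitates systematic evaluation."))  # much higher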

13
Q

Gunning Fog Index

A

A method for measuring readability. It estimates difficulty from average sentence length and the percentage of words with three or more syllables
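
The standard formula is Fog index = 0.4 × (average sentence length + percentage of complex words), where "complex" means three or more syllables; the result approximates the U.S. school grade needed to read the text. A minimal sketch, using a crude syllable heuristic and skipping the formula's usual exclusions (proper nouns, familiar jargon, common suffixes):

    import re

    def count_syllables(word: str) -> int:
        # Rough approximation: count groups of consecutive vowels
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def fog_index(text: str) -> float:
        # 0.4 * (average words per sentence + % of 3+ syllable words)
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        complex_words = [w for w in words if count_syllables(w) >= 3]
        avg_sentence_length = len(words) / len(sentences)
        pct_complex = 100 * len(complex_words) / len(words)
        return 0.4 * (avg_sentence_length + pct_complex)

    print(round(fog_index("The cat sat. The dog ran."), 1))  # 1.2 - very easy reading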

14
Q

Implementation Criteria and Models

A

Implementation measures cannot be substituted for program impact:
1. Distribution - number of messages distributed
2. Placement - number of messages placed in the media; content analysis of coverage
3. Potential audience - number of people exposed to messages
4. Attentive audience - number of people who attend to messages or attend events; readership, listenership, viewership

15
Q

Equivalent Advertising Value

A

Calculates how much money an organization would have had to pay to secure the same space or time in the media

The calculation is flawed and misleading
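
The underlying arithmetic is simple, which is part of why AVE persists despite its flaws; a sketch with hypothetical figures:

    # Hypothetical example: a 12-column-inch article in a paper whose
    # rate card charges $250 per column inch of advertising space
    column_inches = 12
    rate_per_inch = 250  # casual (one-off) rate; few advertisers pay it

    ave = column_inches * rate_per_inch
    print(f"AVE = ${ave:,}")  # AVE = $3,000

    # Note: this prices the space occupied, not the value of the coverage;
    # tone, relevance, and audience quality are ignored entirely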

16
Q

Fallacy of AVE

A
  1. Publicity can be irrelevant or in low-priority media
  2. Publicity can be neutral or negative
  3. Publicity can contain coverage of competitors
  4. Publicity can be poorly positioned or poorly presented
  5. Calculations are based on casual advertising rates, which few advertisers actually pay
  6. Calculations measure cost, not value
17
Q

Content Analysis elements

A
  1. Place or position
  2. Prominence
  3. Share of voice
  4. Issues or topics
  5. Messages
  6. Visuals
18
Q

Impact criteria and models

A

Document the extent to which outcomes were achieved
Formative - research findings at the start, used to shape the program
Intermediate impact - measured during the program
Summative impact - measured after the program

19
Q

What is knowledge gain?

A

The number of people who learn message content, measured as gains in knowledge, awareness, and understanding

20
Q

What is opinion change?

A

The number of people who change or form opinions, for example:
Criticizing to praising
Negative to positive mentions
Arguing to agreeing

21
Q

What is attitude change?

A

The number of people who change or form attitudes
Higher-order program impact
Less subject to short-term change

22
Q

What is behavior change?

A

The number of people who act in the desired fashion
Assessments include surveys, direct observation (attendance at meetings, events), and indirect observation (agency records, library checkout records, other by-products of behavior)

Social media measures are no different; the same assessment criteria apply

Repeated behavior - the number of people who continue or sustain the desired behavior

23
Q

What is ethnography?

A

Observing people in their natural settings

24
Q

What is social and cultural change?

A

The ultimate summative evaluation is a program’s contribution to positive social and cultural change

25
Q

What is data reduction?

A

Distilling data for meaning.

Necessary whenever large amounts of data are collected

26
Q

What is data display?

A

Displaying data by main categories, groupings, and statistics

Vital for helping researchers and practitioners interpret data
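
As a small illustration with hypothetical data, a few lines of Python can reduce raw coded mentions to category counts (data reduction) and print them as a frequency table (data display):

    from collections import Counter

    # Hypothetical coded mentions from a content analysis
    mentions = ["positive", "neutral", "positive", "negative",
                "positive", "neutral", "positive", "negative", "positive"]

    # Data reduction: distill raw records into counts per category
    counts = Counter(mentions)

    # Data display: a simple frequency table with percentages
    total = sum(counts.values())
    for category, n in counts.most_common():
        print(f"{category:<10} {n:>3}  {100 * n / total:5.1f}%")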

27
Q

End point of evaluation

A

Learning what worked, what did not, and why not.

NOT DATA

28
Q

When evaluation fails…

A
  1. The theory behind the strategy was faulty
  2. Errors were made when preparing the program
  3. The evaluation failed to detect program impact
29
Q

The benchmark model

A

Evaluation research can tell practitioners where they started, where they want to end up, and how best to get there.