Process Evaluation Flashcards

(30 cards)

1
Q

What is process evaluation?

A

A method of assessing how and why an intervention works, not just whether it works.

2
Q

Name the three core components of process evaluation according to Moore et al. (2015).

A
  • Implementation (how the intervention is delivered)
  • Mechanisms of impact (how and why it produces change)
  • Context (external factors that shape delivery and outcomes)
3
Q

What does “implementation” refer to in process evaluation?

A

How the intervention was actually delivered versus what was planned.

4
Q

What is “context” in process evaluation?

A

External factors that influence the intervention’s:
- Theory
- Implementation
- Outcomes.

5
Q

What are “mechanisms of impact”?

A
  • How and why the intervention produces change.
  • Factors that explain the relationship between the treatment and the outcome.
6
Q

What is intervention fidelity? (faithfulness)

A

The degree to which an intervention is delivered as planned.

7
Q

Name two ways to measure intervention fidelity.

A
  • Recording therapy sessions
  • Comparing delivery to intervention manuals.
8
Q

What is meant by the “dose” of an intervention?

A

The amount (e.g., number and length of sessions) and intensity of intervention delivered.

9
Q

Why is fidelity important in evaluating interventions?

A

It helps distinguish between intervention failure and implementation failure.

10
Q

What is an example of a mechanism of impact in a health behaviour intervention?

A
  • Change in attitudes
  • Increase in knowledge
  • Change in behaviours
11
Q

What are two characteristics of complex interventions?

A
  • They are multi-component.
  • They are tailored to individual needs.
12
Q

Why can’t we assume outcomes alone tell us if an intervention was successful?

A
  • Outcomes alone can’t tell us whether results are due to the intervention itself or to external factors.
13
Q

What is the risk if we don’t assess context in an intervention study?

A

We might wrongly attribute success or failure to the intervention without recognising external influences.

14
Q

According to Craig et al. (2008), what framework provides guidance for evaluating complex interventions?

A

The MRC Framework.

15
Q

Why is stakeholder involvement important in intervention design?

A

It ensures the intervention is relevant, practical, and more likely to be accepted by users.

16
Q

In the CHErIsH study (Toomey et al., 2020), who were the two key stakeholder groups?

A
  • Healthcare professionals (HCPs)
  • Parents
17
Q

Why is using mixed methods important in process evaluation?

A

It provides a richer, more complete understanding of how the intervention worked.

18
Q

Give an example of a contextual factor that might affect an intervention’s success.

A
  • Socioeconomic conditions
  • Organisational support
19
Q

How does measuring mechanisms of impact help improve future interventions?

A

It identifies which processes are most important in driving behaviour change.

20
Q

What might indicate that an intervention failed due to POOR IMPLEMENTATION rather than poor design?

A
  • Fidelity and dose are low
  • The intervention theory remains strong.
21
Q

Explain how LOW FIDELITY in intervention DELIVERY could falsely suggest an intervention is ineffective.

A
  • If fidelity is low, the intervention was NOT delivered as designed
  • This means poor outcomes may be due to DELIVERY ISSUES rather than the intervention itself.
22
Q

Why might high intervention fidelity not always guarantee positive outcomes?

A

Even if delivered perfectly, the intervention’s core theory may be flawed or not applicable to the context.

23
Q

Why do we measure “mechanisms of impact”?

A

By examining what changed during the intervention, we can detect expected benefits as well as unexpected side effects or harms.

24
Q

Why is it important to measure dose and fidelity separately in process evaluation?

A
  • A high number of sessions (dose) does not guarantee content quality (fidelity), and vice versa
  • Both aspects influence outcomes differently.
25
Q

What specific risks arise if context is ignored in a process evaluation?

A
  • Findings may lack generalisability
  • Interventions might fail when scaled to different settings or populations.
26
Q

Give a practical example of how “opportunity” (context) could moderate an intervention’s success.

A

A smoking cessation programme might be less effective in a workplace that tolerates smoking breaks, compared to one with strict no-smoking policies.

27
Q

How could process evaluation findings lead to the adaptation of an intervention before scaling it up nationally?

A

By identifying weak points in delivery or mechanisms, developers can modify content, delivery method, or targeting to enhance effectiveness before rollout.

28
Q

Describe a situation where process evaluation might show STRONG intervention MECHANISMS but poor overall outcomes.

A
  • Participants may gain knowledge and motivation (the mechanisms work)
  • If opportunity barriers such as cost or access persist, behaviour change still won’t occur.
29
Q

In what ways can engaging stakeholders early in intervention design improve later process evaluation results?

A

Early stakeholder involvement ensures the intervention fits its real-world context, improving fidelity and acceptability and making its mechanisms easier to understand during evaluation.

30
Q

Why is it problematic to interpret positive outcome data without accompanying process evaluation data?

A

We can’t determine whether the positive outcomes were due to the intervention itself, to external factors, or to participant selection bias.