Lecture 4: Evaluating a Surveillance System Flashcards

1
Q

Research vs. Evaluation (Research) → "research seeks to prove and evaluation seeks to improve"

A
  • Production of generalizable knowledge
  • research-derived questions
  • paradigm stance
  • more controlled setting
  • clearer role
  • published
  • clearer allegiance
2
Q

Research vs. Evaluation (Evaluation) → "research seeks to prove and evaluation seeks to improve"

A
  • Knowledge intended for use
  • program or funder derived questions
  • judgemental quality
  • role conflicts
  • often not published
  • multiple allegiances
3
Q

Definition: Goal

A

general, big-picture statement of desired results

  • specifies expected program effect
  • identifies target population
  • in the form of a declarative statement
  • free of jargon
  • short and concise
  • easily understood
  • stated in positive terms
  • provides framework for strategies and objectives
4
Q

Definitions: Objectives

A

specific activities you will engage in to achieve the goal (e.g., SMART objectives)

  • Specific
  • Measurable
  • Achievable
  • Realistic
  • Time-based
5
Q

SMART objectives

A
  • Specific
  • Measurable
  • Achievable
  • Realistic
  • Time-based
6
Q

Definitions: Surveillance

A

tracks disease or risk behaviors

7
Q

Definitions: monitoring

A

tracks changes in program outcomes over time

8
Q

Definitions: Evaluation

A

seeks to understand specifically why these changes occur

9
Q

Why Evaluate Surveillance Systems?

A
  • ensures efficient monitoring (of PH issues)
  • assess effects (to see how well the surveillance systems are meeting their objectives)
  • improve practice (to modify or adapt the surveillance systems to enhance their quality, efficiency and usefulness)
10
Q

Types of Evaluations

A
  • Needs/assets evaluation
  • Process evaluation
  • Outcome evaluation
  • Impact evaluation
11
Q

Four standards for effective evaluation

A
  • Utility (ensures that the information needs of intended users are met)
  • Feasibility (ensures that evaluation is realistic, prudent, diplomatic and frugal)
  • Propriety (ensures evaluation is conducted legally, ethically and with regard for welfare of all involved)
  • Accuracy (ensures that the evaluation reveals and conveys technically accurate information)
12
Q

Steps in Evaluating a Surveillance System

A
  • Engage stakeholders
  • Describe the surveillance system to be evaluated
  • Focus the evaluation design
  • Gather credible evidence regarding the performance of the surveillance system
  • Justify and state conclusions and make recommendations
  • Ensure use of evaluation findings and share lessons learned
13
Q

Engage stakeholders

A
  • people who use the data to promote healthy lifestyles and the prevention and control of disease
  • PH practitioners
  • Health care providers
  • Government
  • NGOs
14
Q

Describe the surveillance system to be evaluated

A
  • describe PH importance (use indices of frequency, severity, disparities, costs, public interest, preventability)
  • describe purpose and operation of the surveillance system (draw a flow chart, describe components of the system)
  • describe the resources used to operate the surveillance system (direct costs, indirect costs, or costs from a societal perspective → use SurvCost to calculate)
15
Q

Logic Model

A

a systematic and visual way to present and share your understanding of the relationships among the resources you have to operate your program, the activities you plan and the changes or results you hope to achieve.

16
Q

Components of logic Model

A
  • Input
  • Activities
  • Output
  • Outcomes (short-term, intermediate or long-term)
  • helps with evaluation
17
Q

Focus the evaluation design

A
  • Purpose (determine specific purpose of evaluation)
  • Users (identify stakeholders who will receive findings)
  • Uses (consider what will be done with the information generated)
18
Q

Gathering credible evidence regarding s.s. performance

A
  • Indicate level of usefulness
  • Describe s.s. attributes (simplicity, flexibility [best evaluated retrospectively], data quality, acceptability, sensitivity [a/(a+c)], PPV [a/(a+b)], representativeness, timeliness, stability)
  • Describe informatics characteristics of the s.s.
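The sensitivity and PPV formulas above come from a 2×2 table comparing the surveillance system's reports against the true case status. A minimal worked sketch (all counts hypothetical):

```python
# 2x2 table of surveillance reports vs. true case status (hypothetical counts):
#   a = true cases detected by the system (true positives)
#   b = reported persons who were not true cases (false positives)
#   c = true cases the system missed (false negatives)
a, b, c = 80, 20, 40

sensitivity = a / (a + c)  # proportion of all true cases the system detects
ppv = a / (a + b)          # proportion of reported cases that are true cases

print(f"sensitivity = {sensitivity:.2f}")  # 80/120 = 0.67
print(f"PPV = {ppv:.2f}")                  # 80/100 = 0.80
```

Note that sensitivity penalizes missed cases (c), while PPV penalizes false reports (b), so a system can score high on one and low on the other.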
19
Q

Capture-recapture

A
  • uses info from overlapping lists of cases from distinct sources
  • estimates the extent of missing data and the size of the complete population
  • two-source vs. multiple source
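For the two-source case, the overlap between the lists can be used to estimate the total case count; a minimal sketch using the Lincoln-Petersen estimator (a standard two-source capture-recapture formula, counts hypothetical):

```python
# Two-source capture-recapture: estimate total cases from two
# overlapping case lists via the Lincoln-Petersen estimator.
n1 = 150  # cases on list 1 (e.g., laboratory reports)
n2 = 120  # cases on list 2 (e.g., physician reports)
m = 60    # cases appearing on both lists (the overlap)

n_total = n1 * n2 / m             # estimated size of the complete population
missed = n_total - (n1 + n2 - m)  # cases estimated to be missed by both sources

print(f"estimated total cases: {n_total:.0f}")    # 150*120/60 = 300
print(f"missed by both sources: {missed:.0f}")    # 300 - 210 = 90
```

The estimator assumes the two sources are independent and the population is closed; multiple-source methods relax these assumptions.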
20
Q

Case detection

A

proportion of reported persons who actually had the health-related event under surveillance

21
Q

Factors influencing representativeness

A
  • Choice of an appropriate denominator for the rate calculation
  • selection of a standard population for rate adjustment
  • identifying subgroups that might be systematically excluded from the reporting system
  • identifying and targeting appropriate times to accurately describe health-related events over time
22
Q

Biases in surveillance systems

A
  • case-ascertainment bias
  • information bias
23
Q

Information quality

A
  • Accuracy
  • Completeness
  • Relevance
  • Consistency
24
Q

System Quality

A
  • Usability
  • availability
  • adaptability
  • response time
  • functionality
  • data quality
  • portability
  • improved data capture
  • error reduction
  • use standard data codes
  • security
25
Q

User experience and service quality

A
  • reliability
  • responsiveness
  • assurance
  • empathy
26
Q

Justify Conclusions

A
  • standards
  • analysis and synthesis
  • interpretation
  • judgements
  • recommendations
27
Q

Ensure Use and Share lessons learned

A

  • design
  • follow-up
  • dissemination