SUPPORT SUPERVISION (SS), MONITORING & EVALUATION Flashcards

1
Q

What is support supervision?

What is the importance of support supervision?

A
Support Supervision (SS) – a process of guiding, helping, teaching and learning from staff in order to improve performance.
To deliver services of good and acceptable quality.
To work as a team to meet common goals and objectives.

The need to move away from the traditional model of:
•Isolated, superficial supervision – spending very little time in the facility (a quick "hello" visit).
•Inspection and fault-finding.
•Fire-fighting activities.
•Focus on individuals rather than on processes.

Towards a problem-solving orientation or climate where the focus is on using information gathered during supervisory sessions to improve the quality of services.

2
Q

What are the key features of support supervision?

A

Full participation of the supervisee(s) and supervisor(s) in a two-way communication to resolve existing problems.
•Mutual trust
•Lines of accountability, professional responsibilities and boundaries of confidentiality are clearly defined.
•Fosters ownership and team work.

3
Q

What are the benefits of SS? (Name four.)

State and explain the types of SS.

A

Fewer problems to solve, as staff learn to solve their own problems.
•Less need for technical assistance.
•Respect and confidence of staff.
•Increased satisfaction.

Integrated SS
•Technical SS
•Emergency SS

4
Q

In the supervisory system, what things are put in, and what process is undertaken to produce an outcome?

A
Input:
•Supervisors
•Supervisees
•Guidelines
•Tools – checklist
•Transport
•Stationery
•Allowance
•Etc.

Process:

Planning/budgeting
•Observing
•Interviews
•Problem-solving
•Coaching
•Teaching
•Decision-making
•Reporting
•Feedback/follow-up

Outcome:

Improved compliance with standards
•Improved effectiveness of care
•Improved efficiency
•Improved patient satisfaction
•Increased utilization
•Improved staff motivation and satisfaction, etc.
5
Q

What are the skills required in support supervision?

A
Technical – good knowledge and skills on the subject/job –e.g. IPC
•Human relations skills:
●able to work with other people
●includes good communication skills
●ability to inspire others
●establish trust
●empower others
●provide opportunities for growth
●flexible
●open to new ideas
●promote teamwork.
6
Q

What are the qualities of a good supervisor?

State four resources required in supervision.

A
Smart/presentable
•Punctual
•Approachable
•Exemplary
•Flexible, etc.

Time – proper scheduling, adherence to schedules
•Human resources – supervisors with necessary knowledge, skills and attitudes.
•Funds – allowances, fuel, etc
•Others – transport, stationery, etc.

7
Q

What is monitoring?

What does it involve?

Monitoring answers what questions, and where does it get the data from?

A

Collection and analysis of data, interpreting, reporting and providing feedback in order to assess whether progress is being made towards achieving set objectives.

Consists of operational and administrative activities that track
resource acquisition and allocation, cost, production of goods and
services, delivery of goods, services, and intermediate outcomes

  • Tracking progress in accordance with previously identified targets,
    objectives, or indicators (plan vs. reality) by collecting, analyzing
    information about program or project being implemented
  • Day-to-day follow up of activities during implementation to
    measure progress, identify deviations
  • Routine follow-up to ensure activities are proceeding as planned,
    and are on schedule

Routine assessment of activities and results; answers the
question, "What are we doing?"

- Periodic, using data routinely gathered or readily obtainable,
generally from internal sources.
8
Q

What are the benefits and limitations of monitoring

A

Usually quantitative, but cannot indicate causality

  • Assumes appropriateness of programs, activities, objectives and
    indicators
  • Tracks progress against small number of targets, indicators by
    focusing on inputs and outputs, alerts managers about problems
  • Difficult to use for impact assessment
9
Q

Explain the types of monitoring.

A

Financial Monitoring:
- Aimed at tracking correct use of program funds, disbursements,
internal cash flows, assessment of cost effectiveness and efficiency
in the achievement of needs

Diagnostic Monitoring:
- Examines methods used to implement projects, identify problems
causing delays in achieving objective, or affecting quality, and
provide solutions

Operations Monitoring:
- Involves assessing the capacity of projects to continue delivering
intended services and benefits through its planned life

Midterm Assessment:
- The review and analysis of project performance to provide overall
progress in order to identify key issues and required changes

10
Q

Explain performance monitoring and how it is done: Inputs → Activities → Outputs → Outcomes → Goal (Impacts). State what is involved at each level.

A

Performance Monitoring:
- Tracks the use of project/program inputs and production of outputs
and to identify delays and problems

  • Consists of common ways to keep abreast of project progress,
    such as inspections, interim progress reviews, testing, and auditing
  • Also includes regular collection and analysis of actual results
    relating to outcomes and impacts

Implementation → Results:

Inputs:
- Financial, Human, and Material Resources

Activities:
- Tasks Personnel Undertake to Transform Inputs into Outputs

Outputs:
- Products and Services Produced

Outcomes:
- Intermediate Effects of Outputs on Target Group

Goal (Impacts):
- Long-Term, Widespread Improvement in Society
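The results chain above can be sketched as a simple data structure. A minimal illustration (the class name and all example values are hypothetical, not from the source):

```python
from dataclasses import dataclass

@dataclass
class ResultsChain:
    """Inputs -> Activities -> Outputs -> Outcomes -> Goal (Impacts)."""
    inputs: list       # financial, human, and material resources
    activities: list   # tasks that transform inputs into outputs
    outputs: list      # products and services produced
    outcomes: list     # intermediate effects on the target group
    impacts: list      # long-term, widespread improvement in society

# Hypothetical example for a training program:
chain = ResultsChain(
    inputs=["funds", "trainers", "materials"],
    activities=["run training sessions"],
    outputs=["health workers trained"],
    outcomes=["improved adherence to clinical guidelines"],
    impacts=["reduced morbidity in the catchment area"],
)
print(chain.outputs[0])  # health workers trained
```

Laying the levels out explicitly like this makes it easy to check that every input has at least one activity, output, and intended outcome attached to it.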

11
Q

Under the outcomes in performance monitoring, state the chain of outcomes and give examples at each level

A

Short-term outcome:

Learning

Changes in

  • Awareness
  • Knowledge
  • Attitudes
  • Skills
  • Opinion
  • Aspirations
  • Motivation
  • Behavioral intent

Medium-term outcome:

Action

Changes in

  • Behavior
  • Decision-making
  • Policies
  • Social action

Long-term outcome:

Conditions

Changes in:

  • Social conditions (well-being)
  • Health
  • Economic
  • Civic
  • Environmental
12
Q

Give an example that clearly explains outcome and output

State six reasons why you should monitor activities and programs

A

Example:
Number of patients discharged from a state mental hospital is an output.
Percentage of those discharged who are capable of living independently is an outcome.
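Because both measures in the example are simple counts, the outcome indicator can be computed directly. A minimal sketch; the function name and figures are illustrative, not from the source:

```python
def outcome_rate(discharged, living_independently):
    """Output: number of patients discharged.
    Outcome indicator: percentage of those discharged who are
    capable of living independently."""
    if discharged == 0:
        raise ValueError("no discharges recorded")
    return 100.0 * living_independently / discharged

# Hypothetical figures: 200 discharged (the output),
# 150 of them living independently (the outcome indicator).
print(outcome_rate(200, 150))  # 75.0
```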

Provide information to project management, staff, stakeholders on
whether progress is being made towards achieving objectives

  • Provide regular feedback to enhance learning; to improve planning
    of intervention programs
  • Ensure effective use of resources (material, human), and increase
    accountability with donors and other stakeholders
  • Ensure quality and learning to improve activities and services
  • Provide managers with opportunity to make timely adjustments &
    corrective actions to improve program/project design, and
    implementation
13
Q

State the key questions monitoring seeks to answer

A

To what extent are planned activities actually realized?

  • Are we making progress toward achieving set targets and
    objectives?
  • What services are provided, to whom, how often, for how long,
    and in what context?
  • How well are the services and products being provided, and is the
    program reaching the target group?
  • What is the quality and the cost per unit of the service or product?
  • To what extent is the program being implemented consistent with
    the design or plan?
14
Q

State four principles for good monitoring

A

Focus on results and follow-up; to look for what is going on well
and what is not in terms of progress towards intended results

  • Be based on regular visits to focus on results and follow-up for
    verification and validation of progress
  • Be participatory to ensure commitment, ownership, follow-up and
    feedback on performance
  • Participatory monitoring mechanisms include stakeholder meetings,
    steering committees, and focus group discussions
15
Q

State six common methods of monitoring

A
Client satisfaction surveys (patient and staff)
•Patient complaint systems.
•Record review.
•Clinical audits.
•Mortality audits.
•Review of adverse incidents.
•Facilitative supervision.
•Mystery client.
16
Q

What is evaluation?
What is a vital outcome of program evaluation?
Evaluation is designed specifically with what intention?

A

Process of making judgments about a project or program on-going
or completed based on systematic and objective collection, and
analysis of information for stakeholders

  • Episodic assessment of overall achievement, important milestones,
    and impacts, and has usually been conducted by external agencies
  • Vital outcome of program evaluation is a set of recommendations
    to address issues relating to a program design and implementation
  • Systematic way of learning from experience to improve current
    activities, and promote better planning for future action
  • Designed specifically with intention to attribute changes to an
    intervention itself; questions the rationale, relevance of a program
17
Q

Evaluation answers what question

Evaluation is the use of social science research procedures to do what?

What kind of data does it use?
Evaluation can address what questions?

A

Answers the question, “what have we achieved and what impact
have we made?”

  • The use of social science research procedures to systematically
    investigate the effectiveness of social intervention programs
    designed to improve social condition
  • In-depth analysis of achievements, and can identify unintended
    and planned impacts and effects
  • Can address “how” and “why” questions; provide guidance for
    future directions
  • Can use data from different sources, and from wide variety of
    methods (quantitative and qualitative methods)
18
Q

Why conduct a program evaluation?

A

Evaluation is for sense-making and social betterment

  • To determine the effectiveness of the program
    • Did it achieve its objectives?
    • Were effects similar across subgroups?
  • To identify ways of improving on existing program design, policy,
    services and thinking
  • To improve learning and decision-making at all levels in the
    organization or with a program/project
  • To satisfy donor requirements; assuring accountability; instilling
    evaluative/questioning culture
  • For “Political” reasons; for PR; for Advocacy
19
Q

In an ideal situation:

Monitoring and Evaluation can be complementary. How can they be complementary?

A

Monitoring
•Clarifies program objectives
•Links activities and their resources to objectives
•Translates objectives into performance indicators and sets targets
•Routinely collects data on these indicators
•Reports progress to managers and alerts them of problems

Evaluation
•Analyzes why intended results were or were not achieved
•Assesses specific causal contributions of activities to results
•Examines implementation process
•Explores unintended results
•Provides lessons, highlights significant accomplishments or program potential, and offers recommendations for improvement
20
Q

What are the forms of evaluation and state their purpose

A

Formative:
Initial assessment of target population and contextual environment. Determines concept and design

Process:
Seeks to identify the extent to which planned activities have been achieved and assesses the quality of the activities/services

Outcome:
Examines specific program outcomes and accomplishments. What changes were observed, what do they mean, and are the changes the result of the intervention?

Impact:
Gauges the program’s overall impact and effectiveness. Aims to strengthen design and replication of effective programs, and strategies

21
Q

What are formative evaluation questions? Name the kind of questions the evaluator may pose

A

Formative Evaluation Questions
•Questions on minds of stakeholders
–How can the program be improved?
–How can it become more efficient or effective?

•Kinds of questions the evaluator might pose
–What are the program’s goals and objectives?
–How are the program activities supposed to lead to the attainment of these objectives?
–Which activities best contribute to the achievement of objectives?
–What adjustments in the program might lead to better attainment of goals?
–What adjustments in program management and support are needed?
–What measures and designs could be recommended for use during summative evaluation of the program?

22
Q

What are process evaluation questions? Name the kind of questions the evaluator may pose?

A

Process Evaluation Questions
•Questions on minds of stakeholders
●What is happening in program C?
●To what extent has the program been implemented as designed?
●How much does the program vary from site to site?
•Kinds of questions the evaluator might pose
●What are the critical activities, staffing, administrative arrangements in the program?
●How many staff and participants are taking part? When? How often? Where?
●Is the program running as planned?
●What is the typical schedule of activities and/or of services?
●How are time, money, and personnel allocated?
●What activities do participants in the program become involved in?
●How does the program vary from one site to another?

23
Q

What are summative evaluation questions? Name the kind of questions the evaluator may pose?

A

Summative Evaluation Questions
•Questions on minds of stakeholders
●Is Program X worth continuing?
●How effective is it?
●What conclusions can be made about the effects of Program X on its various components?
•Kinds of questions the evaluator might pose
●What are the program’s goals and objectives?
●What are the program’s most important characteristics, activities, services, etc.?
●Did the planned program occur?
●Does the program lead to goal achievement?
●How effective is Program X in comparison with alternative programs?
●Is the program differentially effective with particular types of participants and/or in particular locales?
●How costly is the program?

24
Q

What are outcome based evaluation questions? Name the kind of questions the evaluator may pose?

A

Outcome-Based Evaluation Questions

•Questions on minds of stakeholders
–To what extent is program X meeting its goals?
–Is there steady progress toward the attainment of objectives?

•Kinds of questions the evaluator might pose
–What are the goals of the program?
–How can they be measured or otherwise assessed?
–What do measures show about the degree of goal attainment?
–What other outcomes are associated with the program?
–What objectives and sub-objectives are essential to the attainment of program goals?
–What gaps exist in the attainment of objectives or sub-objectives?

25
Q

Under monitoring and evaluation framework state the level ,the description and the frequency

A

Inputs (level):
Description: Resources that are put into the project; they lead to the achievement of the outputs.
Frequency: Continuous

Outputs (level):
Description: Activities and services that the project is providing. Outputs lead to outcomes.
Frequency: Quarterly

Outcomes (level):
Description: Changes in behaviors or skills as a result of the implemented project. Outcomes are expected to generate impacts.
Frequency: 2-3 years (short to medium term)

Impacts (level):
Description: Measurable changes in behaviors, skills, or health status, e.g. reduced STI/HIV transmission and reduced AIDS impacts. Impacts are the effects of interventions.
Frequency: 3-5 years (long term)
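The four levels and their reporting frequencies above can be captured in a small lookup table, e.g. when drafting a monitoring plan. The structure and frequencies mirror the card; the dictionary itself is an illustrative sketch:

```python
# M&E framework levels with measurement frequency (from the card above).
ME_FRAMEWORK = {
    "Inputs":   {"description": "Resources put into the project",   "frequency": "Continuous"},
    "Outputs":  {"description": "Activities and services provided", "frequency": "Quarterly"},
    "Outcomes": {"description": "Changes in behaviors or skills",   "frequency": "2-3 years"},
    "Impacts":  {"description": "Measurable long-term changes",     "frequency": "3-5 years"},
}

for level, info in ME_FRAMEWORK.items():
    print(f"{level}: reviewed {info['frequency']}")
```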

26
Q

In the accountability era, what is done?

A

Accountability era
•What gets measured gets done

  • If you don’t measure results, you can’t tell success from failure
  • If you can’t see success, you can’t reward it
  • If you can’t reward success, you’re probably rewarding failure
  • If you can’t see success, you can’t learn from it
  • If you can’t recognize failure, you can’t correct it.
  • If you can demonstrate results, you can win public support.
27
Q

When should you evaluate

A

Before the program or project implementation (Ex-ante) to
improve its design

  • During implementation to improve program implementation
  • At the end of implementation (Terminal) for accountability and
    impact purposes
  • Long after implementation (Ex-post) to assess impact
28
Q

How do you plan an evaluation, in order?
Output gives immediate results – true or false?
Support supervision is the best form of supervision – true or false?

A

Identify stakeholders

Arrange preliminary meetings

Assess evaluability and negotiate with stakeholders

Examine the literature

Clarify evaluation values and theories

Determine the methodology

Present your evaluation plan