Service evaluation and surveys Flashcards

(25 cards)

1
Q

What is service evaluation

A

It comes under the umbrella of quality improvement.

It measures current practice within a service. The results of the service evaluation help towards producing internal recommendations for improvements that are not intended to be generalised beyond the service area.

2
Q

What does service evaluation play an important role in

A

Planning and developing services

Service improvement

Providing a quality service

Ensuring intervention is evidence based

3
Q

What are the three areas of interest for an evaluation?

A

Project monitoring: looking at the routine functioning of your improvement work. Is it doing what you wanted it to?

Process evaluation: looking at the way in which your improvement work is implemented and runs. Can you learn from the process?

Impact evaluation: looking at whether or not your improvement work is delivering the objectives set. Are you getting the outcomes you planned for?

4
Q

(Providing a quality service) What is quality? The SIX DIMENSIONS OF HEALTHCARE QUALITY

A

STEEEP
Safe (avoiding harm to patients)
Timely (reducing waits/harmful delays)
Effective (evidence based services)
Efficient (avoiding waste)
Equitable (care doesn’t vary in quality because of a person’s characteristics)
Person-centred (establishing a partnership between practitioners and patients)

5
Q

How does a service evaluation differ from an audit

A

It does not measure performance against a standard. It provides practical information about whether a development or service should continue or not and what needs to change/improve

6
Q

How do service evaluations and audits differ from research?

A

Service evaluation looks at intervention/care that is routine; research may involve a new treatment.
Service evaluation uses analysis of existing data (sometimes supplemented by interviews/questionnaires).
For audits/service evaluations, results are only relevant locally; research results are intended to be more widely generalisable.
Research requires REC (Research Ethics Committee) approval.

7
Q

What are the key differences between service evaluation and research?

A

Key differences: Service evaluation looks at intervention that is in routine use – with selection of intervention based on choice of health care professional.

Service evaluation involves analysis of existing data but may involve additional interview/questionnaire.

Research can look at novel treatments; the choice of treatment is governed by the research and may involve randomisation.

Research involves collection and analysis of data not normally part of routine care; hence research needs ethical approval.

8
Q

What designs and methods can you use?

A

Surveys
Semi-structured interviews/focus groups

Objective outcome measures e.g. assessment results, goal attainment

Subjective outcome measures e.g. therapist reported outcomes, patient reported outcomes

9
Q

5 steps of service evaluation

A
  1. Develop an outline plan to evaluate the work. (question, design, data to be collected, way it will be analysed, who will do evaluation)
  2. Work with your stakeholders (patients, staff, organisation leadership, commissioners of the service, other parts of public sector)
  3. Be clear about the data needed. Where possible data that are routinely available should be used. Specific data may be required for the evaluation which is not already collected routinely. It is critical that a practical approach to collecting the data is developed, and that those collecting the data are able to collect it in a way that does not impact on their day-to-day work.
  4. Develop a plan for the evaluation (key milestones, who is responsible for what, timescales)
  5. Plan the dissemination
    * who is the principal audience for the evaluation?
    * how do you intend to feed back the findings of the evaluation to them?
    * have you asked them what they want to see, and in what format?
    * is there anyone else with whom you should be sharing the findings?
    * will you be generating important learning that should be shared more widely?
10
Q

What are surveys

A

Surveys are the use of a systematised proforma to elicit the views of a particular constituent group through their responses to questions. They are used to find out about a topic of interest.
Commonly used in SLT practice to find out about people’s views, experiences, and satisfaction levels regarding services or a specific aspect of practice.

Surveys may be:
* Interviewer administered or
* Self-administered
carried out:
* Face to face
* Video call
* By telephone
* Online survey
* Through the post
* By email
Or a combination.
You can ask closed or open-ended questions. Closed questions will give you quantitative data; open-ended questions will give you qualitative data.

11
Q

What are the pros and cons of closed-ended questions?

A

Pros
Quicker and easier to answer
Higher response rates
Easier to compare responses
Fewer irrelevant answers

Cons
Cannot provide all possible answers
Don’t allow respondents to expand on answers/offer alternative views
Can be frustrating
Participants may select any response at random.

12
Q

Pros and cons of open-ended questions

A

Pros
Allow for unlimited responses
Provide more detail
Offer richer qualitative data
Deliver new insights that the researcher may not have thought of.

Cons
Time-consuming to answer
Lower response rates
Potentially irrelevant information
Trickier to interpret and analyse

13
Q

How is the survey response rate calculated?

A

The number of people who completed the survey divided by the number of people it was sent to, usually expressed as a percentage.
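
As a quick illustration (hypothetical numbers, not from the source), a minimal Python sketch of the calculation:

```python
# Hypothetical figures for illustration only.
completed = 64   # people who completed the survey
sent = 200       # people the survey was sent to

response_rate = completed / sent * 100
print(f"Response rate: {response_rate:.1f}%")  # -> Response rate: 32.0%
```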

14
Q

What does having a good response rate help to do?

A

Produce a more representative sample
Increase sample size and statistical power
Reduce wasted time and materials

15
Q

How to maximise survey response rate

A

Pilot and revise survey
Advance and cover letters (saying WHY you’re doing it)
Incentives
Follow up reminders
Keep survey concise and clear
Optimise survey for all devices
Be flexible
Assure confidentiality
Translators and interpreters.

16
Q

Why do we pilot surveys

A

Ensuring the questions are right can help maximise the response rate. Work closely with stakeholders to get this right. Consider translation and interpreting services if needed.
Things to watch out for:
Simplicity – is the language used accessible to the sample subjects? Have confusing acronyms/jargon been avoided?

Clarity – have ambiguous questions and answers been avoided? E.g., double-barreled, double-negatives

Length – are the questions and answers concise? Could length be distracting from the key issue being asked?

Wording – are the questions thoughtfully designed to elicit the desired information from the respondents?

Order effects – are there any potential sources of bias due to the order in which questions are asked or the way response options are presented?

Question structure – are conceptually similar questions grouped together?

Pros & cons of using open-ended/closed-ended questions

Response choices – are they mutually exclusive and exhaustive? Are questions with too few choices forcing respondents to answer imprecisely?

Should questions be compulsory? What happens if they are/are not?

Lie detectors/attention checks

Non-response – could a certain question not be answered because it is confusing/accidentally missed?
Beware of question order and sampling bias with surveys.

17
Q

Must be aware of question order and sampling bias with surveys

A

Question order bias occurs when the order in which questions are asked in a survey or study influences the answers that are given.

Sampling: the sample needs to be representative of the target population. Consider who needed to be seen and who was actually seen, what proportion of the target group was reached, survey attrition, and sampling bias.

18
Q

What to watch out for: Data analysis.

A

Cleaning and preparing data
Cross-tabulations – comparing different groups
Descriptive stats, e.g. percentages
Dealing with missing data
Statistical analysis (see the sketch below)
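
A minimal pandas sketch of these steps, assuming a hypothetical file survey_responses.csv with hypothetical columns age_group, satisfied and rating:

```python
import pandas as pd

# Hypothetical file and column names, for illustration only.
df = pd.read_csv("survey_responses.csv")

# Cleaning and preparing data: tidy free-text answers, drop duplicate submissions
df["satisfied"] = df["satisfied"].str.strip().str.lower()
df = df.drop_duplicates()

# Dealing with missing data: report it, then drop rows with no rating
print(df.isna().sum())
df = df.dropna(subset=["rating"])

# Descriptive stats, e.g. percentages
print(df["satisfied"].value_counts(normalize=True) * 100)

# Cross-tabulation: comparing satisfaction across groups
print(pd.crosstab(df["age_group"], df["satisfied"], normalize="index") * 100)
```

This is only a sketch of the kinds of operations involved; the actual columns and analysis will depend on the survey.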

19
Q

What to watch out for: Reporting

A

Should be clear and transparent for readers to assess the strengths and limitations of the study

Use of visuals when presenting data aids understanding of the information conveyed

Link your findings to your research aims and to knowledge in the field

Be careful with wording – be cautious not to over-interpret findings

Consider implications for practice and future research.

20
Q

Evaluating survey reports: what should you consider?

A

Coverage
Sampling
Non-response
Measurement
Other factors

21
Q

Evaluating – Coverage

A

Did most members of the target population that the sample is meant to represent have a chance to be selected? If not, are those who did not have a chance to be selected different in important ways from those who did?

If the sample did not come from a traditional sampling frame, how were potential respondents identified and recruited?

22
Q

Evaluating – Sampling

A

How was the sample selected?
What steps were taken as part of the sampling and/or data collection process to ensure that the sample is representative of the target population?
How can I tell if these steps were effective?
What about sampling error?

23
Q

Evaluating – Non-response

A
  • What was the response rate (for a probability sample) or the participation rate (for a non-probability sample)?
  • How concerned should I be that not everyone who was selected actually responded?
  • How can I tell if nonresponse is a problem? Might it be leading to bias in the survey results?
  • What steps, if any, were taken to adjust for nonresponse?
  • What impact did these adjustments have on the survey results?
24
Q

Evaluating – Measurement

A

How was the survey administered (e.g. in person, by telephone, online, multiple modes, etc.)?
Were the questions well constructed, clear, and not leading or otherwise biasing?
What steps, if any, were taken to ensure that respondents were providing truthful answers to the questions, and were any respondents removed from the final dataset (e.g., speeders, multiple completions)?

25
Q

Evaluating – Other factors

A

* How long was the survey in the field and how much effort was put into ensuring a good response?
* What incentives, if any, were respondents offered to encourage participation?
* What is the track record of the organization that conducted the survey?