Weeks 9 & 10: Mixed Method Designs & Program Evaluation Flashcards

Also includes week 10 guest lecture

1
Q

What is a mixed-method research design?

A

A form of triangulation in which two sets of findings from different strategies lead to a third, more meaningful interpretation; the combination of more than one research method or design (quantitative and/or qualitative)

2
Q

Mixed method research can be either “within” or “between” methods. How is “within” different from “between” in mixed-method research, and which is more common?

A

-“Within” means that the mixed methods come from within the same type of design (either quantitative or qualitative)

-“Between” means that methods from two different types of research design are used (ex. one quantitative and one qualitative); this is the most common type

3
Q

Why might someone choose to use mixed-method research design? (4)

A

-To counterbalance the limitations of each method
-To answer variations of the research question, either to generalize or to gain depth
-To add more breadth and depth to the overall results
-To use one method as a preliminary to the main method (ex. assessing the appropriateness of a questionnaire using focus groups before administering it)

4
Q

What are the types of mixed-method design? (4)

A

DEEEm+:
-Descriptive
-Exploratory
-Explanatory
-Exposing marginalization
-Sequential
-Divergent

5
Q

What does the “Creswell two-phase” mixed method research design entail?

A

The quantitative and qualitative designs occur in two separate phases, one after the other

6
Q

What does “Creswell’s dominant/less dominant” mixed method research design entail?

A

One methodology is the primary design and a second methodology simply complements the first; both occur within the same timeframe
-Equivalent to “Engel & Schutt’s embedded”

7
Q

What does “Creswell’s mixed methodology” research design entail?

A

Two designs are integrated throughout the study within the same timeframe; not often used, as it would compromise the strength of each design alone
-Equivalent to “Engel & Schutt’s convergent/parallel”

8
Q

What does “Creswell’s convergent design” entail?

A

The quantitative and qualitative designs occur within the same study timeframe, and the convergences and divergences between their findings inform the results

9
Q

What does “Creswell’s explanatory sequential” mixed-method research design entail?

A

Two methods in sequence, with the second (usually qualitative) used to explain the results of the first (usually quantitative)
-Equivalent to “Engel & Schutt’s explanatory sequential”

10
Q

What does “Creswell’s exploratory sequential” mixed-method research design entail?

A

Two methods in sequence where the qualitative occurs first to create measures and then the quantitative method uses those measures

11
Q

What are some examples of “Creswell’s complex options” of mixed-method research design? (4)

A

-Experimental (intervention) designs: experiments with a qualitative component
-Participatory action research designs: more than one method as part of PAR, decided on by participants
-Multiple case study designs: mixed methods used to examine case studies
-Evaluation designs: mixed methods used in program evaluation

12
Q

What does “Engel & Schutt’s convergent/parallel” mixed-method research design entail?

A

Quantitative and qualitative methods each have equal weight in contributing to the answer to the research question and are implemented at the same time; findings are integrated and interpreted together
-Equivalent to “Creswell’s mixed methodology”
-Also referred to as “integrated” designs

13
Q

What does “Engel & Schutt’s explanatory sequential” mixed-method research design entail?

A

The quantitative method is implemented first, then a qualitative method is used afterwards to provide more depth
-Equivalent to “Creswell’s explanatory sequential”

14
Q

What does “Engel & Schutt’s embedded” mixed-method research design entail?

A

The main method (qualitative or quantitative) is only complemented by the secondary design, which is a small addition during the study’s timeframe used to obtain further depth (not sequential)
-Equivalent to “Creswell’s dominant/less dominant”

15
Q

What does “Engel & Schutt’s transformative” mixed-method research design entail?

A

Has a social justice focus, similar to participatory action research

16
Q

What does “Engel & Schutt’s multiphase” mixed-method research design entail?

A

A long-term design that likely occurs over a period of several years; each design’s results inform the next design
-Also referred to as “staged design”

17
Q

“Two-eyed seeing” can use any mixed-method research design, but what sets it apart from purely eurowestern research design? (2)

A

-Combines the strengths of Indigenous and eurowestern worldviews
-Incorporates reciprocity, Indigenous rituals, and accountability

18
Q

What are the four types of possible difference between mixed-method research designs?

A

-Timing (concurrent vs. sequential)
-Weight (dominant vs. less dominant)
-Purpose (descriptive, explanatory, exploratory, exposure of marginalization, or a combination)
-Degree of integration (fully integrated vs. presented separately)

19
Q

What are some disadvantages to mixed-method research designs? (4)

A

-Time consuming
-Require expertise in both paradigms
-Scope of study becomes less focused/more complicated
-Lengthy reporting requirements

20
Q

What would “purists” say about mixed-method research designs?

A

They cannot be done/are impossible

21
Q

What would “situationalists” say about mixed-method research designs?

A

Their feasibility/potential depends on the situation in which they are used

22
Q

What would “pragmatists” say about mixed-method research designs?

A

The method that is used must depend on the research question being asked

23
Q

What would “pluralists” say about mixed-method research designs?

A

That both quantitative and qualitative methods must be incorporated in order to gain knowledge about both subjective and objective realities

24
Q

What is the most common type of research conducted by social work practitioners?

A

Program evaluation

25
Q

When conducting program evaluation from a structural/anti-oppressive perspective, what must be included?

A

An advisory committee: the perspectives/participation of service users in the development and implementation of the design

26
Q

What is program evaluation?

A

Research, usually for a specific program/initiative, used to reach the following goals:
-assessing the strengths and needs for change/additional programming
-showing client satisfaction
-reporting on output (services provided, completion rates)
-demonstrating outcomes, including effectiveness

27
Q

Why are program evaluations conducted? (5)

A

-Evidence of effectiveness for service users
-Evidence of cost-effectiveness for funders
-Evidence for accreditors that interventions are not harmful
-To help decision-makers (board of directors, managers, etc.) make decisions about change
-Information for frontline workers about what is working/not working/harmful

28
Q

What are the three types of program evaluation?

A

-Needs/strengths assessment
-Formative/process
-Outcome

29
Q

What type of evaluation is most appropriate for the “determining need” stage of a program?

A

Needs/strengths assessment

30
Q

What type of evaluation is most appropriate for the “planning” stage of a program?

A

Formative/process

31
Q

What type of evaluation is most appropriate for the “implementation” stage of a program?

A

Formative/process

32
Q

What type of evaluation is most appropriate for the “completed” stage of a program?

A

Outcome

33
Q

What is meant by “descriptive” research when it comes to mixed-methods, and what kind of statistics does it use?

A

Describes the population/group in question; univariate statistics

34
Q

What is meant by “exploratory” research when it comes to mixed-methods, and what kind of statistics does it use?

A

Obtaining depth or investigating a new topic area; in-depth qualitative data rather than statistics

35
Q

What is meant by “explanatory” research when it comes to mixed-methods, and what kind of statistics does it use?

A

Prediction of cause/effect relationship between variables and determining whether interventions are effective; requires inferential statistical analysis
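
The following minimal Python sketch uses hypothetical scores (not course data) to illustrate the contrast between descriptive and explanatory analysis: univariate statistics summarize one variable at a time, while inferential statistics test whether groups differ. The choice of SciPy's t-test here is an assumption for illustration only.

```python
import statistics
from scipy import stats  # assumes SciPy is available

# Hypothetical depression scores for two groups (illustration only)
group_a = [12, 15, 14, 10, 13, 16]
group_b = [18, 20, 17, 21, 19, 22]

# Descriptive/univariate: summarize one variable at a time
print("mean:", statistics.mean(group_a), "sd:", statistics.stdev(group_a))

# Explanatory/inferential: test whether the two groups differ
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```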

36
Q

What is the second most common type of research for social workers, and what is it used for?

A

Needs/strengths assessment; used to examine the need for/availability of resources, identify barriers to accessing resources, and develop a program plan

37
Q

What are the most common designs for a needs/strengths assessment? (2)

A

-Survey
-Focus groups, sharing circles, interviews

38
Q

Do needs/strengths assessments tend to be exploratory, explanatory, descriptive, or exposing of marginalization?

A

-Usually descriptive
-Increasingly exposing of marginalization
-May be exploratory or explanatory, but not as common

39
Q

What is a program logic model, and what is it used for?

A

A flow chart representing the connection of various components of a program

Used in program planning to develop ideas and decide on vision, mission, goals, objectives, activities, and intended outcomes

Also used to guide program evaluation questions about inputs, process, outputs, and outcomes

40
Q

What is an environmental scan?

A

A method of monitoring a program/service’s external environment for relevant present factors (demographics, waitlists, etc.) and future trends (drawn from past evaluation results, census data, literature, etc.)

41
Q

What is a formative evaluation?

A

AKA “process evaluation”; conducted through expert consultation and focused on the planning stage of program development; investigates…
-Organizational structure
-Supports available to workers
-How practice decisions are made
-Whether a program is being implemented the way it was designed

42
Q

What is the difference between inputs, program processes, outputs, and outcomes in program evaluation?

A

Inputs: client characteristics and number entering the service

Program processes: activities, services offered

Outputs: number of “graduates”, amount of training

Outcomes: impacts on clients (ex. feelings of depression)

43
Q

What is meant by “target population” in program evaluation?

A

Includes not just those who attend a program, but all people who ought to be able to attend

44
Q

What is meant by “stakeholders” in program evaluation?

A

Everyone who has a stake in the program: service users, funders, accreditors, staff, and the public

45
Q

What is a summative evaluation?

A

AKA “outcome evaluation”; a process that focuses on the effectiveness of the program through explanatory research designs; requires consideration of internal and external validity to determine the strength of the design

46
Q

What is the best way to objectively show effectiveness of an intervention?

A

Establishing cause and effect through the use of a pre-test/post-test control group
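
A minimal sketch of one way a pre-test/post-test control group comparison could be analysed: compute each participant's change score and compare the groups with an independent-samples t-test. The data and the choice of test are assumptions for illustration, not a prescribed procedure.

```python
from scipy import stats  # assumes SciPy is available

# Hypothetical pre/post scores (lower = better); illustration only
treatment_pre, treatment_post = [20, 22, 19, 24, 21], [12, 14, 11, 15, 13]
control_pre, control_post = [21, 20, 23, 22, 19], [20, 19, 22, 21, 18]

# Change score for each participant (post minus pre)
treat_change = [post - pre for pre, post in zip(treatment_pre, treatment_post)]
ctrl_change = [post - pre for pre, post in zip(control_pre, control_post)]

# Did the (randomly assigned) treatment group change more than the control group?
t_stat, p_value = stats.ttest_ind(treat_change, ctrl_change)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```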

47
Q

What is a pre-test/post-test comparison group?

A

Used when a control group is not possible but two groups of participants are available; the groups experience similar (but non-equivalent) conditions. This cannot establish cause and effect, but it does show how much service users change in comparison to another condition (better than no measurement at all)

48
Q

What is meant by “one group pre-test/post-test”?

A

There is no option to compare groups, as all participants must be part of the same group; limited external and internal validity, but does show whether service users changed between the start and end of the program

49
Q

What is a “one group post-test” in program evaluation?

A

A measurement of how service users functioned (or their satisfaction) at the end of a program when we cannot compare groups or have a pre-test (often a survey); limited validity and low response rate, but better than no measurement at all

50
Q

What makes satisfaction surveys less favourable than randomized controlled trials in program evaluation? (3)

A

-Low response rate
-Mostly positively skewed
-Satisfaction does not mean effectiveness

51
Q

What is a “single subject/system design” in program evaluation?

A

Measuring one unit of analysis (a single subject, a single system/group of people, etc.) rather than comparing groups like you would do in an experimental design
-You still measure an outcome/dependent variable and an independent/intervention variable, but measurements occur repeatedly throughout all phases of the research
-Analysis is done through graphs/trends rather than through statistics

52
Q

What notation is used in single-subject/system design program evaluation?

A

A = baseline
B = intervention

So B = measurement during intervention only, AB = measurement at baseline and intervention, ABA = measurement at baseline, intervention, and withdrawal, etc.
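
A minimal sketch of an AB design using hypothetical weekly ratings; as noted in the previous card, the analysis compares level and trend across phases (graphs/trends) rather than relying on inferential statistics.

```python
import statistics

# Hypothetical weekly anxiety ratings (0-10) for one client; illustration only
phase_a = [8, 7, 8, 9, 8]   # A = baseline, before the intervention
phase_b = [7, 6, 5, 4, 4]   # B = during the intervention

# Compare level (phase means) and trend (direction within the phase)
print("A mean:", statistics.mean(phase_a))
print("B mean:", statistics.mean(phase_b))
print("B trend:", phase_b[-1] - phase_b[0])  # negative = ratings improving
```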

53
Q

What are some strengths of single-subject/system designs in program evaluation? (4)

A

-Easy to comprehend
-Focuses on the individual, respecting culture and dignity
-Almost instant feedback
-Can be incorporated directly into therapeutic sessions

54
Q

What are some weaknesses of single-subject/system designs in program evaluation? (4)

A

-Inability to generalize
-Questionable accuracy (self-reporting; reactive effects, testing effects)
-Issues of worker fidelity, incompetence, and/or dual relationship
-Requires cooperation during administration

55
Q

What are the two types of efficiency evaluation/analysis in program evaluation?

A

-Cost/benefit analysis: determining if a program is worth its cost

-Cost effectiveness: determining which program is most cost effective through comparing several against one another
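
A tiny worked sketch of the two efficiency calculations; the dollar figures and helper functions are hypothetical and only illustrate the arithmetic.

```python
# Cost/benefit: is the program worth its cost? A ratio > 1 means benefits exceed costs
def cost_benefit_ratio(total_benefits: float, total_costs: float) -> float:
    return total_benefits / total_costs

# Cost-effectiveness: cost per outcome achieved, used to compare programs
def cost_per_outcome(total_costs: float, outcomes_achieved: int) -> float:
    return total_costs / outcomes_achieved

print(cost_benefit_ratio(120_000, 80_000))  # 1.5 -> benefits exceed costs
print(cost_per_outcome(80_000, 50))         # Program 1: $1,600 per completion
print(cost_per_outcome(50_000, 40))         # Program 2: $1,250 per completion
```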