Research Methods (Week 4) Flashcards

(32 cards)

1
Q

5 Ps of Conceptualization

A
  • guide the process of developing a clear and structured concept
  • help ensure that all relevant aspects of an idea or intervention are considered before moving forward
    1. Predisposing Factors
    2. Precipitating Factors
    3. Presenting Factors
    4. Perpetuating Factors
    5. Protective Factors
2
Q

Predisposing Factors

A

What factors make the individual vulnerable? (social, biological, psychological, environmental)

3
Q

Precipitating Factors

A

What acute event(s) brought the individual to this point?

4
Q

Presenting Factors

A

What is the primary presenting concern?

5
Q

Perpetuating Factors

A

What’s making the situation/disorder worse?

6
Q

Protective Factors

A

What factors suggest that therapy/intervention is likely to succeed?
ex. the person has strong social support or is motivated

7
Q

Reliability

A
  • consistency of a measurement
  • measured by correlation
  • the stronger the correlation, the better the reliability
8
Q

Internal consistency reliability

A
  • degree to which items on a test relate to each other
  • looks at 1 test or scale
  • way of checking if the items in a scale are all assessing the same concept
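
The card doesn't name a statistic, but internal consistency is commonly summarized with Cronbach's alpha; a minimal Python sketch with made-up item scores:

import numpy as np

# rows = respondents, columns = items on one scale (made-up data)
scores = np.array([
    [4, 5, 4, 5],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
    [3, 3, 4, 3],
])

k = scores.shape[1]                          # number of items
item_vars = scores.var(axis=0, ddof=1)       # variance of each item
total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(round(alpha, 2))                       # closer to 1 = items hang together better
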
9
Q

Test-retest reliability

A
  • People who take the same test twice - are their scores consistent?
  • whether scores should stay consistent depends on what you are trying to measure (whether it is a stable trait or not)
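
As an illustration (not from the card), test-retest reliability is often reported as the correlation between scores from the two administrations; the scores below are made up:

import numpy as np

time1 = np.array([10, 14, 9, 20, 15, 12])   # scores at first administration
time2 = np.array([11, 13, 10, 19, 16, 12])  # same people retested later

r = np.corrcoef(time1, time2)[0, 1]         # Pearson correlation
print(round(r, 2))                          # near 1.0 = consistent scores over time
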
10
Q

Inter-Rater Reliability

A

Degree to which two raters/observers agree
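
The card names no specific statistic, but one common index of agreement for categorical ratings is Cohen's kappa, which corrects raw agreement for chance; the ratings below are made up (1 = diagnosis present, 0 = absent):

import numpy as np

rater_a = np.array([1, 0, 1, 1, 0, 0, 1, 0])
rater_b = np.array([1, 0, 1, 0, 0, 0, 1, 1])

p_o = np.mean(rater_a == rater_b)                      # observed agreement
p_e = (rater_a.mean() * rater_b.mean()
       + (1 - rater_a.mean()) * (1 - rater_b.mean()))  # agreement expected by chance
kappa = (p_o - p_e) / (1 - p_e)
print(round(kappa, 2))                                 # 1.0 = perfect agreement beyond chance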

11
Q

Alternate form reliability

A

Degree to which scores on two forms/versions of a test are the
same

12
Q

What are the 2 components of reliability?

A
  1. Sensitivity
  2. Specificity
13
Q

Sensitivity

A
  • Agreement regarding the presence of a particular diagnosis
  • How good is a test at correctly identifying that a person has a
    disorder? (Correct positive)
14
Q

Specificity

A
  • Agreement concerning the absence of a particular diagnosis
  • How good is a test at not misidentifying that a person has a
    disorder when they do not? (Correct negative)
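
A quick worked example (made-up counts) showing how sensitivity and specificity both fall out of a 2x2 table of test result vs. true status:

true_pos  = 40   # test flags the disorder and the person has it (correct positive)
false_neg = 10   # test misses a disorder the person has
true_neg  = 85   # test says no disorder and the person has none (correct negative)
false_pos = 15   # test flags a disorder the person does not have

sensitivity = true_pos / (true_pos + false_neg)   # 40 / 50  = 0.80
specificity = true_neg / (true_neg + false_pos)   # 85 / 100 = 0.85
print(sensitivity, specificity)
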
15
Q

Validity

A
  • extent to which a test fulfills its intended purpose
  • measures what it's supposed to measure
16
Q

Relationship between validity and reliability

A
  • validity is related to reliability
  • unreliable measures will not have good validity
  • reliability is a necessary precursor to validity
17
Q

Content validity

A
  • Degree to which a test appropriately samples from the domain of interest
  • “Does this test include all the important elements of the concept?”
18
Q

Construct validity

A
  • Degree to which the test measures an abstract characteristic or construct that is
    not simply defined
  • “Are we really measuring what we think we’re measuring?”
19
Q

Criterion validity

A
  • Degree to which a test is associated with a related measure/variable in the
    expected way
  • “Does this test predict or reflect what it should in the real world?”
    2 types:
    1. Concurrent - degree to which a test is associated with a related
       measure/variable that is measured at the same time
    2. Predictive - degree to which a test is associated with a related
       measure/variable that is measured at a future time
20
Q

Ecological Validity

A
  • Degree to which a test measures ‘real-world’ functioning/characteristics/behaviour
  • “Does this reflect what happens in everyday life?”
21
Q

What is a Case Study?

A
  • collection of historical or biographical information on a single individual, often including experiences in therapy
  • Providing detailed descriptions of all aspects of their history, symptoms, concerns,
    behaviours, etc
  • keep in mind the role of the clinician’s paradigm in determining the kinds of information actually collected and reported in a case study
22
Q

Pros and Cons of Case Studies

A

Pros
- can include much more detail than is typically included with other research methods
- Helpful in generating hypotheses & making discoveries
- can be used as evidence to negate an assumed universal relationship or law
Cons
- limited by size (n = 1)
- lack control and objectivity

23
Q

What is Epidemiological Research?

A
  • study of the frequency and distribution of a disorder in a population
  • provides a general picture of a disorder
  • important for planning health care facilities and services and for allocating provincial and federal grants for the study of disorders
  • focuses on 3 features of a disorder:
    1. Prevalence
    2. Incidence
    3. Risk Factors
24
Q

Prevalence

A

the proportion of a population that has the disorder at a given point or period of time (often lifetime)

25
Q

Incidence

A

the number of new cases of the disorder that occur in some period, usually a year
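
A small arithmetic sketch (made-up numbers, not from the cards) contrasting prevalence and incidence in Python:

population = 10_000
cases_during_period = 800        # everyone who has the disorder during the period
new_cases_during_period = 120    # cases that first appeared during the period

prevalence = cases_during_period / population      # 0.08 -> 8% period prevalence
incidence = new_cases_during_period / population   # 0.012 -> 12 new cases per 1,000 per year
print(prevalence, incidence)
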
26
Q

What is Correlational Research?

A
  • measuring the relationship or association between two or more variables
  • variables being studied are measured as they exist in nature (no manipulation; purely observational)
27
Q

Directionality problem

A
  • a difficulty that arises in the correlational method of research when it is known that two variables are related but it is unclear which is causing the other
  • “Correlation does not imply causation”
28
Q

ABAB design

A
  • used to evaluate the effectiveness of an intervention
  • Structure: A (baseline), B (intervention), A (withdrawal), B (reintroduction)
29
Q

Internal validity

A
  • degree to which a study can show that changes in the dependent variable are directly caused by the independent variable, and not by other factors
  • Strong internal validity:
    - study controls confounding variables
    - a clear cause-and-effect relationship
    - results are not due to bias, errors, or external influences
30
Q

External validity

A
  • extent to which the results of a study can be generalized to other people, settings, times, or situations
  • Strong external validity:
    - findings apply beyond the study sample
    - context of the study mirrors real-world conditions
31
Q

Analogue experiments

A
  • designed to simulate or approximate real-life conditions in a controlled setting
  • often used when studying a phenomenon directly in the real world would be difficult, unethical, or impractical (e.g., complex, sensitive, or hard-to-observe phenomena)
  • researchers create a simplified or artificial version of a real-world situation or behaviour
  • may lack ecological validity
32
Q

Meta-analysis

A
  • combines or pools the results of multiple studies that all have the same/similar research question
  • increased robustness of evidence
  • increased confidence in evidence
  • may be subject to bias
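
The cards don't name a pooling method, but one common choice is fixed-effect, inverse-variance weighting; a minimal Python sketch with made-up effect sizes:

import numpy as np

effects = np.array([0.30, 0.45, 0.25, 0.50])      # per-study effect sizes
variances = np.array([0.02, 0.05, 0.01, 0.04])    # per-study sampling variances

weights = 1 / variances                           # more precise studies count more
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1 / np.sum(weights))
print(round(pooled, 3), round(pooled_se, 3))      # pooled effect and its standard error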