Exam 2 pt 2 Flashcards

(49 cards)

1
Q

translating concepts of interest in a study into something observable & measurable

A

Operationalizing a Concept

2
Q

a method to measure (quantify) a concept or variable(s) of interest

A

instrument

3
Q

(survey) via mail, in-person, email, phone

A

questionnaire

4
Q

systematic coding, specified time

A

observation

5
Q

researcher is present

A

interview

6
Q

existing records

A

document review

7
Q

instruments used in phenomenology

A

in-depth interviews
diaries
artwork

8
Q

instruments used in grounded theory

A

observations
open-ended questions (interview)
individuals or small groups

9
Q

instruments used in ethnography

A

observation
open-ended questions (interview)
diagrams
documents
photographs

10
Q

instruments used in historical

A

open-ended questions
interviews
documents
photographs
artifacts

11
Q

examines causes of certain effects

A

experimental/clinical trial

12
Q

examines why certain effects occur

A

quasi experimental

13
Q

examines relationships among variables

A

correlational

14
Q

answers "what" questions

describes frequency of occurrence

A

exploratory/descriptive

15
Q

unsystematic error such as a transient state in the subject, the context of a study, or in the administration of the instrument (reliability)

A

random error

16
Q

consistently altering the measurement of true responses in some way (validity)

A

systematic error

17
Q

Measurement Error theoretical formula

A

observed score = true score + error
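The formula can be illustrated with a small simulation (all values below are made up): random error varies unpredictably and averages out over many measurements, while systematic error shifts every observation by the same amount.

```python
import random

random.seed(0)

TRUE_SCORE = 50.0  # a subject's hypothetical true score

# Random (unsystematic) error: fluctuates around zero, so it reduces
# consistency (reliability) but not the long-run average.
random_errors = [random.gauss(0, 2) for _ in range(10_000)]
observed_random = [TRUE_SCORE + e for e in random_errors]

# Systematic error: a constant bias (e.g., a scale reading 3 points high),
# which distorts every measurement and threatens validity.
BIAS = 3.0
observed_systematic = [TRUE_SCORE + BIAS + e for e in random_errors]

mean_random = sum(observed_random) / len(observed_random)
mean_systematic = sum(observed_systematic) / len(observed_systematic)

print(round(mean_random, 1))      # close to the true score of 50
print(round(mean_systematic, 1))  # shifted upward by the constant bias
```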

18
Q

what is reliability concerned with

A

the repeatability or consistency with which an instrument measures the concept of interest

19
Q

three types of reliability

A

stability, equivalence, internal consistency

20
Q

(test/retest or intra-rater)

A

stability

21
Q

(inter-rater) – alternate form

A

equivalence

22
Q

homogeneity – split-half reliability, item-to-total correlation, Kuder-Richardson coefficient, or Cronbach's coefficient alpha

A

internal consistency

23
Q

how is reliability reported

A

as a reliability coefficient

24
Q

how are reliability coefficients (r) expressed

A

are expressed as positive correlation coefficients ranging from 0 to +1

25
Q

how do you interpret r

A

r = 0.80 or higher is acceptable for existing instruments
r = 0.70 or higher is acceptable for a newly developed instrument

26
Q

is a necessary, though insufficient, condition for validity

A

high reliability

27
Q

Instruments may have good reliability even if they

A

are not valid (they don't measure what they are supposed to measure)

28
Q

is concerned with the consistency of repeated measures under the same circumstances

A

stability

29
Q

what is stability also called

A

test-retest reliability or intra-rater reliability

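A test-retest coefficient is simply the Pearson correlation between two administrations of the same instrument. A minimal sketch with hypothetical scores for six subjects:

```python
# Hypothetical scores from two administrations of the same instrument
time1 = [12, 15, 11, 18, 14, 16]
time2 = [13, 15, 10, 17, 15, 16]

n = len(time1)
mean1 = sum(time1) / n
mean2 = sum(time2) / n
cov = sum((x - mean1) * (y - mean2) for x, y in zip(time1, time2))
var1 = sum((x - mean1) ** 2 for x in time1)
var2 = sum((y - mean2) ** 2 for y in time2)

r = cov / (var1 * var2) ** 0.5
print(round(r, 2))  # 0.94 — acceptable for an existing instrument (>= 0.80)
```
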
30
Q

what is equivalence focused on comparing

A

two versions of the same instrument (alternate form reliability)
two observers (inter-rater reliability) measuring the same event; consistency in raters using discrete categories

31
Q

a reliability coefficient of what indicates good agreement

A

.75 or greater

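One basic index of inter-rater agreement is the proportion of observations both raters coded the same way (hypothetical data below; Cohen's kappa, which corrects for chance agreement, is another common choice):

```python
# Two raters coding the same 10 observations into discrete categories
rater_a = ["yes", "no", "yes", "yes", "no", "yes", "no", "no", "yes", "yes"]
rater_b = ["yes", "no", "yes", "no", "no", "yes", "no", "yes", "yes", "yes"]

# Proportion of observations on which the raters agree
agreements = sum(a == b for a, b in zip(rater_a, rater_b))
proportion_agreement = agreements / len(rater_a)
print(proportion_agreement)  # 0.8 — at or above the .75 benchmark
```
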
32
Q

addresses the correlation of various items within a single instrument (homogeneity); all items are measuring the same concept

A

internal consistency

33
Q

internal consistency is also called what

A

split-half, item-to-total correlation, Kuder-Richardson coefficient, or Cronbach's coefficient alpha

34
Q

divide items on instrument in half to make two versions & use Spearman-Brown formula to compare halves

A

split-half reliability

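A sketch of the procedure with made-up item scores: correlate the two halves, then apply the Spearman-Brown prophecy formula to estimate reliability at the full test length.

```python
# Hypothetical item scores: rows = 4 subjects, columns = 6 items
scores = [
    [3, 4, 3, 4, 3, 4],
    [1, 2, 1, 2, 2, 1],
    [4, 5, 4, 5, 5, 4],
    [2, 2, 3, 2, 2, 3],
]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Split the items into odd and even halves and total each half per subject
half1 = [sum(row[0::2]) for row in scores]  # items 1, 3, 5
half2 = [sum(row[1::2]) for row in scores]  # items 2, 4, 6

r_half = pearson(half1, half2)

# Spearman-Brown correction: the half-length correlation underestimates
# the reliability of the full-length instrument
r_full = (2 * r_half) / (1 + r_half)
print(round(r_full, 2))
```
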
35
Q

each item on instrument is correlated with the total score; strong items have high correlations with the total score

A

item-to-total correlation

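A minimal sketch with made-up data: each item's scores are correlated with the subjects' total scores, and items with low correlations would be candidates for revision or removal.

```python
# Hypothetical item scores: rows = 4 subjects, columns = 3 items
scores = [
    [1, 2, 1],
    [3, 3, 4],
    [2, 2, 2],
    [4, 5, 4],
]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

totals = [sum(row) for row in scores]

# Correlate each item's column with the total score
item_total_r = [
    pearson([row[j] for row in scores], totals)
    for j in range(len(scores[0]))
]
for j, r in enumerate(item_total_r, start=1):
    print(f"item {j}: r = {r:.2f}")  # strong items correlate highly
```
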
36
Q

divide the instrument with dichotomous (yes/no) or ordinal responses in half every possible way

A

Kuder-Richardson or KR-20

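In practice KR-20 is computed with a closed-form formula rather than by literally scoring every split: KR-20 = (k / (k-1)) * (1 - sum(p*q) / variance of total scores), where p is the proportion answering each item correctly (or "yes") and q = 1 - p. A sketch with made-up dichotomous data:

```python
# Hypothetical 0/1 item responses: rows = 5 subjects, columns = 4 items
items = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]

k = len(items[0])   # number of items
n = len(items)      # number of subjects

# Population variance of the total scores
totals = [sum(row) for row in items]
mean_total = sum(totals) / n
var_total = sum((t - mean_total) ** 2 for t in totals) / n

# Sum of p*q across items (p = proportion scoring 1 on the item)
pq_sum = 0.0
for j in range(k):
    p = sum(row[j] for row in items) / n
    pq_sum += p * (1 - p)

kr20 = (k / (k - 1)) * (1 - pq_sum / var_total)
print(round(kr20, 2))
```
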
37
Q

divide the instrument with interval or ratio responses in half every possible way

A

Cronbach's alpha (coefficient alpha)

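Like KR-20, Cronbach's alpha is computed with a standard closed-form formula: alpha = (k / (k-1)) * (1 - sum of item variances / variance of total scores). A sketch with made-up Likert-type responses:

```python
# Hypothetical interval-scale responses (1-5): rows = 5 subjects, 3 items
items = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 5, 5],
    [3, 4, 3],
    [1, 2, 2],
]

n = len(items)      # number of subjects
k = len(items[0])   # number of items

def pop_var(values):
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

totals = [sum(row) for row in items]
item_var_sum = sum(pop_var([row[j] for row in items]) for j in range(k))

alpha = (k / (k - 1)) * (1 - item_var_sum / pop_var(totals))
print(round(alpha, 2))
```
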
38
Q

the extent to which an instrument accurately measures what it is supposed to measure

A

validity

39
Q

types of validity

A

content (or face) validity
construct validity
criterion-related validity

40
Q

how well the instrument compares to an older instrument (concurrent validity) or is able to predict future events, behaviors, or outcomes (predictive validity)

A

criterion-related validity

41
Q

how representative is the instrument of the concept(s)

A

content validity
determined by a panel of experts

42
Q

extent to which the instrument performs theoretically; how well does the instrument measure a concept

A

construct validity

43
Q

ways to determine construct validity

A

hypothesis testing
convergent, divergent, or multi-trait-multi-method testing
known group(s) testing
factor analysis

44
Q

use two or more instruments to measure the same concept (example: two pain scales)

A

convergent testing

45
Q

compare scores from two or more instruments that measure opposite concepts (example: depression vs. happiness)

A

divergent testing

46
Q

administer instrument to subjects known to be high or low on the characteristic being measured

A

known groups (construct validity)

47
Q

use complex statistical analysis to identify multiple dimensions of a concept

A

factor analysis (construct validity)

48
Q

what are the data collection methods

A

interviews
observation
text sampling

49
Q

what is crucial for qualitative data collection

A

trustworthiness