STRATEGIES IN EPIDEMIOLOGY: Measurements and Measurement Errors; Assessing Evidence of Disease Causation Flashcards

(40 cards)

1
Q

A number or label assigned to empirical properties of a variable according to rules

A

MEASUREMENT

2
Q

numerals that have quantitative meaning

A

NUMBERS

3
Q

attributes that have qualitative meaning

A

LABELS

4
Q

Categorizing each subject into 2 or more mutually exclusive groups

A

CLASSIFICATION (LABELS)

5
Q

examples of CLASSIFICATION (LABELS)

A
  • NUTRITIONAL STATUS
  • SEVERITY OF PAIN
  • DISEASE STATUS
6
Q

The fewer / smaller the errors, the better the measurements

A

QUALITY OF MEASUREMENTS

7
Q

types of errors affecting QUALITY OF MEASUREMENTS

A

  • MISCLASSIFICATION
  • DEVIATION
8
Q

Pre-requisite for making measurements

A

OPERATIONAL DEFINITION

9
Q

example of OPERATIONAL DEFINITION

A
  • WEIGHT
10
Q

[OPERATIONAL DEFINITION]

measurement of gravitational force acting on an object

A

CONTEXTUAL

11
Q

[OPERATIONAL DEFINITION]

result of placing an object on a Newton spring scale

A

OPERATIONAL

12
Q

give the FOUR ‘SOURCES OF ERROR’

A
  1. OBSERVER: examiner, interviewer
  2. SYSTEM: coding and classifying systems
  3. SUBJECTS
  4. INSTRUMENT
13
Q

[SOURCES OF ERROR]

  • Differences or changes in the diagnostic criteria used by most clinicians
  • Differences or changes in the application of diagnostic criteria by individual clinicians
  • Prior knowledge
A

OBSERVER: examiner, interviewer

14
Q

[SOURCES OF ERROR]

  • Defects or changes in:
      – Classification of diseases / causes of death
      – Coding of diseases / causes of death
A

SYSTEM: coding and classifying systems

15
Q

[SOURCES OF ERROR]

  • Behavioral
  • Interactive responses
  • Biologic variability
A

SUBJECTS

16
Q

[SOURCES OF ERROR]

‘SUBJECTS’

  • Recall problems
  • Unwillingness to disclose information
A

BEHAVIORAL

17
Q

[SOURCES OF ERROR]

‘SUBJECTS’

  • Response modified by the behavior of interviewer
  • Response modified by knowledge that one is being observed or studied
A

INTERACTIVE RESPONSES

18
Q

[SOURCES OF ERROR]

‘SUBJECTS’

  • Random or short-term fluctuations in some biological factors
A

BIOLOGIC VARIABILITY

19
Q

[SOURCES OF ERROR]

  • Equipment / mechanical instrument
  • Single index instrument
  • Abstraction form
  • Observation Checklist
A

INSTRUMENT
20
Q

[SOURCES OF ERROR]

‘INSTRUMENT’

  • Analytic or scaling problems of combining information from 2 or more items to form an overall index or indicator of the factor/disease
A

SINGLE INDEX INSTRUMENT

21
Q

[SOURCES OF ERROR]

‘INSTRUMENT’

  • not properly labeled
  • incomplete / unclear
A

ABSTRACTION FORM

22
Q

[SOURCES OF ERROR]

‘INSTRUMENT’

  • incomplete / unclear
A

OBSERVATION CHECKLIST

23
Q

[ABSTRACT and CONCRETE VARIABLES]

  • not measured directly
  • not easily defined
  • measured by combining the results of 2 or more item scores into single index
A

ABSTRACT VARIABLES
24
Q

[ABSTRACT and CONCRETE VARIABLES]

  • measured directly
  • easily defined
  • closely related to observed variables
A

CONCRETE VARIABLES
25
Q

examples of INDICATORS OF A VARIABLE

A

  • SINGLE
  • COMPOSITE
  • PROXY INDICATOR

26
Q

[INDICATORS OF A VARIABLE]

obvious and has one indicator

A

SINGLE

27
Q

[INDICATORS OF A VARIABLE]

not so obvious and can have multiple indicators

A

COMPOSITE

28
Q

human errors involving processing, data abstraction, transcription, improper use of software, and software viruses/bugs

A

DATA PROCESSING

29
Q

two examples of CRITERIA FOR ASSESSING QUALITY OF MEASUREMENT

A

  • RELIABILITY
  • VALIDITY

30
Q

[CRITERIA FOR ASSESSING QUALITY OF MEASUREMENT]

‘RELIABILITY’ examples

A

  • INTER-OBSERVER RELIABILITY
  • INTRA-OBSERVER RELIABILITY
  • INTERNAL CONSISTENCY

31
Q

[CRITERIA FOR ASSESSING QUALITY OF MEASUREMENT]

‘VALIDITY’ examples

A

  • SENSITIVITY
  • SPECIFICITY
  • PREDICTIVE VALUE (+)
  • PREDICTIVE VALUE (-)

32
Q

[CRITERIA FOR ASSESSING QUALITY OF MEASUREMENT]

the extent to which the measurements obtained are reproducible or repeatable

A

RELIABILITY

33
Q

[CRITERIA FOR ASSESSING QUALITY OF MEASUREMENT]

‘RELIABILITY’

agreement across 2 or more observers

A

INTER-OBSERVER RELIABILITY

34
Q

[CRITERIA FOR ASSESSING QUALITY OF MEASUREMENT]

‘RELIABILITY’

the same person looking at the same data gives the same results

A

INTRA-OBSERVER RELIABILITY

35
Q

[CRITERIA FOR ASSESSING QUALITY OF MEASUREMENT]

‘RELIABILITY’

similarity among items of a composite measure

A

INTERNAL CONSISTENCY

36
Q

[CRITERIA FOR ASSESSING QUALITY OF MEASUREMENT]

the extent to which measurements reflect the true values of the theoretical factors that the observed variable is supposed to measure

A

VALIDITY

37
Q

[CRITERIA FOR ASSESSING QUALITY OF MEASUREMENT]

‘VALIDITY’

proportion of people labelled positive by the test among those with the disease

A

SENSITIVITY

38
Q

[CRITERIA FOR ASSESSING QUALITY OF MEASUREMENT]

‘VALIDITY’

proportion of people labelled negative by the test among those without the disease

A

SPECIFICITY

39
Q

[CRITERIA FOR ASSESSING QUALITY OF MEASUREMENT]

‘VALIDITY’

proportion of people who tested positive who truly have the disease

A

PREDICTIVE VALUE (+)

40
Q

[CRITERIA FOR ASSESSING QUALITY OF MEASUREMENT]

‘VALIDITY’

proportion of people who tested negative who truly do not have the disease

A

PREDICTIVE VALUE (-)
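The four validity measures in cards 37–40 all come from the same 2×2 table of test result versus true disease status. A minimal sketch in Python, using hypothetical screening counts (the numbers below are made up for illustration):

```python
def validity_measures(tp, fp, fn, tn):
    """Compute the four validity measures from a 2x2 test-vs-disease table.

    tp: test positive, disease present    fp: test positive, disease absent
    fn: test negative, disease present    tn: test negative, disease absent
    """
    return {
        "sensitivity": tp / (tp + fn),  # positives among the diseased
        "specificity": tn / (tn + fp),  # negatives among the non-diseased
        "ppv": tp / (tp + fp),          # diseased among test positives
        "npv": tn / (tn + fn),          # non-diseased among test negatives
    }

# Hypothetical screening results: 90 TP, 40 FP, 10 FN, 860 TN
m = validity_measures(tp=90, fp=40, fn=10, tn=860)
print(m["sensitivity"])  # 0.9
```

Note that sensitivity and specificity are properties of the test itself, while the predictive values also depend on how common the disease is in the population screened.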