Measures of Reliability and Validity Flashcards
(39 cards)
requires constant collection, evaluation, analysis, and use of quantitative and qualitative data.
Clinical Medicine
Error
- Mistakes in the diagnosis and treatment of patients
- Mistakes due to clear negligence
Goal
minimize error in data so as to guide, not mislead
PROMOTING
Accuracy and Precision
What errors do we need to reduce?
Differential and Nondifferential Errors
We need to reduce what variability?
intraobserver and interobserver
closer to the true value
Accuracy
Also known as “reproducibility” or “reliability”
• Ability of a test to give the same or a similar result with repeated measurement of the same factor
Precision
Differential error
information errors differ between groups
information is incorrect, but is the same across groups
Nondifferential error
refers to any systematic error that may occur during the collection of baseline or follow-up data
Measurement bias
Examples of Measurement Bias
• Blood pressure values
• measuring height with shoes on
• laboratories and the use of different methods
• Variability and unpredictability
• results in lack of precision
• some observations are too high and some are too low
Random error
one observer examining the same results more than once
Intraobserver variability (within the observer)
Interobserver variability (between observers)
2 or more observers examining the same material
measure of the consistency of a metric or a method
Reliability
• Overall percent agreement
• Paired observation
• Multiple variables
• Kappa test ratio
MEASURES OF RELIABILITY
Common way to measure agreement
Overall Percent Agreement (OPA)
- does not include prevalence
- does not show how disagreement occurred
- Agreement might be due to chance alone
Drawbacks of OPA
Percent Agreement Formula
PA = [(a + d) / (a + b + c + d)] × 100
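A minimal sketch of this calculation in Python (the function name and the worked numbers are illustrative; a and d are the agreeing cells of a 2×2 table of paired observations, b and c the disagreeing cells):

```python
def percent_agreement(a, b, c, d):
    """Overall percent agreement: PA = (a + d) / (a + b + c + d) * 100."""
    return (a + d) / (a + b + c + d) * 100

# Hypothetical 2x2 table: 40 pairs agree-positive, 45 agree-negative,
# 10 + 5 pairs disagree
print(percent_agreement(40, 10, 5, 45))  # 85.0
```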
Measures the extent to which agreement exceeds that expected by chance
KAPPA TEST RATIO
Kappa = [(Percent agreement observed) − (Percent agreement expected by chance alone)] / [100% − (Percent agreement expected by chance alone)]
FORMULA FOR KAPPA TEST RATIO
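As a sketch, assuming the standard Cohen's kappa convention in which the chance-expected agreement comes from the row and column marginals of the same 2×2 table (cell names a–d as in the percent-agreement card):

```python
def kappa(a, b, c, d):
    """Cohen's kappa from a 2x2 table of paired observations."""
    n = a + b + c + d
    observed = (a + d) / n * 100  # percent agreement observed
    # Percent agreement expected by chance alone, from the marginals
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2 * 100
    return (observed - expected) / (100 - expected)

# Same hypothetical table as above: observed = 85%, expected = 50%
print(kappa(40, 10, 5, 45))  # 0.7
```

Note how a raw 85% agreement shrinks to kappa = 0.70 once the 50% agreement expected by chance alone is removed.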
INTERPRETATION OF KAPPA
<0 = Less than Chance agreement
0.01-0.20 = Slight
0.21-0.40 = Fair
0.41-0.60 = Moderate
0.61-0.80 = Substantial
0.81-0.99 = Almost perfect agreement
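A small helper that maps a kappa value to these bands (band edges follow this card; the handling of exactly 0 is an assumption):

```python
def interpret_kappa(k):
    """Label a kappa value using the bands on this card."""
    if k < 0:
        return "Less than chance agreement"
    if k <= 0.20:
        return "Slight"
    if k <= 0.40:
        return "Fair"
    if k <= 0.60:
        return "Moderate"
    if k <= 0.80:
        return "Substantial"
    return "Almost perfect agreement"

print(interpret_kappa(0.70))  # Substantial
```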
Ability of a test to distinguish between
WHO HAS a disease and WHO DOES NOT
Validity