Reliability and Validity Statistics Flashcards
Extent to which a measurement is consistent and free from error
A reliable measure can be expected to yield the same score on two different occasions, provided that the characteristic of interest does not change
Reliability
Ratio of the variability between subjects (true score variance) to the total variability of the scores (measurements)
Measured as a unitless coefficient
Relative Reliability
Indicates how much of a measured value, expressed in the original units, is likely due to error
Absolute Reliability
Absolute measure of reliability
In same units of measurement as variable
SD of the theoretical distribution of scores that would result from repeated measurements of the same subject
Standard Error of Measurement
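The SEM is commonly computed from the sample SD and a reliability coefficient (such as an ICC). A minimal sketch, assuming that formula; the function name and example values are illustrative:

```python
import math

def standard_error_of_measurement(sd, reliability):
    # SEM = SD * sqrt(1 - r): expected measurement error,
    # expressed in the same units as the original variable
    # (an absolute measure of reliability).
    return sd * math.sqrt(1 - reliability)

# Example: SD = 10 points, reliability coefficient r = 0.91 -> SEM = 3 points
sem = standard_error_of_measurement(10, 0.91)
```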
Often used to evaluate scales and questionnaires
Reflects how well the items measure the SAME construct, not whether they actually measure that construct
Internal Consistency
Each subject is assessed by the same set of raters, who are treated as a random sample of possible raters
Used for test-retest and inter-rater reliability
We can generalize findings to other raters
ICC Model 2
Each subject is assessed by the same set of raters, but the raters represent the only raters of interest
Used for inter-rater reliability or when you do not wish to generalize the scores to other raters
ICC Model 3
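Both single-measure forms can be computed from a two-way ANOVA on a subjects-by-raters table. A sketch under those standard formulas (the data matrix below is illustrative, 6 subjects rated by 4 raters):

```python
def icc(data, model=2):
    # data: n subjects x k raters. Returns single-measure ICC(2,1) or ICC(3,1).
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(row[j] for row in data) / n for j in range(k)]
    ssr = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ssc = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    sst = sum((x - grand) ** 2 for row in data for x in row)
    sse = sst - ssr - ssc                                # residual
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    if model == 3:  # raters fixed: no generalization beyond these raters
        return (msr - mse) / (msr + (k - 1) * mse)
    # model 2: rater variance kept in the denominator, so results generalize
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

ratings = [[9, 2, 5, 8],
           [6, 1, 3, 2],
           [8, 4, 6, 8],
           [7, 1, 2, 6],
           [10, 5, 6, 9],
           [6, 2, 4, 7]]
```

Note that Model 3 gives a higher value on the same data because systematic rater differences are excluded from the error term.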
Represents correlation among items and correlation of each individual item with the total score
Cronbach’s Alpha
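One common computing formula is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch assuming that formula, with illustrative data (3 items answered by 4 respondents):

```python
def cronbach_alpha(items):
    # items: one list of scores per questionnaire item (same respondents in order)
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Each respondent's total score across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

scores = [[3, 4, 5, 2],
          [3, 5, 4, 2],
          [4, 4, 5, 3]]
alpha = cronbach_alpha(scores)
```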
Proportion of agreement between raters after chance agreement has been removed
Can be used on both nominal and ordinal data
Can be interpreted like ICC
Kappa Coefficient
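For two raters, Cohen's kappa is kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and p_e is chance agreement from the raters' marginal proportions. A sketch assuming that two-rater form, with illustrative nominal ratings:

```python
def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal proportions
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

# 50 cases: both say '+' on 20, both say '-' on 15, disagree on 15
rater_a = ['+'] * 25 + ['-'] * 25
rater_b = ['+'] * 20 + ['-'] * 5 + ['+'] * 10 + ['-'] * 15
kappa = cohens_kappa(rater_a, rater_b)
```

Here observed agreement is 0.70 but chance agreement is 0.50, so kappa is well below the raw agreement rate.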