Interpreting Measures Flashcards

1
Q

Kendall’s tau

A

Correlation for ordinal (ranked) data
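A minimal Python sketch, assuming SciPy is available, that computes Kendall's tau for two hypothetical sets of ordinal ratings (the scores are invented for illustration):

    from scipy.stats import kendalltau

    # Hypothetical ordinal ratings (1-5 scale) from two raters
    rater_a = [1, 2, 2, 3, 4, 5, 5, 3]
    rater_b = [1, 3, 2, 3, 5, 5, 4, 2]

    # kendalltau returns the rank correlation and a p-value
    tau, p_value = kendalltau(rater_a, rater_b)
    print(f"Kendall's tau = {tau:.2f} (p = {p_value:.3f})")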

2
Q

Spearman’s rho

A

Correlation for ordinal data
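A matching sketch for Spearman's rho, again assuming SciPy and using the same hypothetical ordinal ratings:

    from scipy.stats import spearmanr

    # Hypothetical ordinal ratings (1-5 scale) from two raters
    rater_a = [1, 2, 2, 3, 4, 5, 5, 3]
    rater_b = [1, 3, 2, 3, 5, 5, 4, 2]

    # spearmanr returns the rank correlation and a p-value
    rho, p_value = spearmanr(rater_a, rater_b)
    print(f"Spearman's rho = {rho:.2f} (p = {p_value:.3f})")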

3
Q

Correlation

A

Degree of association between two sets of data

Does not:
Tell us the extent of agreement
Give a sufficient measure of reliability
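A small Python sketch (assuming SciPy; the scores are made up) of why correlation is not agreement: one rater scores every subject exactly 10 points higher, so the association is perfect while exact agreement is zero.

    from scipy.stats import pearsonr

    rater_a = [50, 55, 60, 65, 70]
    rater_b = [60, 65, 70, 75, 80]   # rater B is always 10 points higher

    r, _ = pearsonr(rater_a, rater_b)
    exact_agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

    print(f"Pearson r = {r:.2f}")                       # 1.00 -> perfect association
    print(f"Exact agreement = {exact_agreement:.0%}")   # 0% -> no agreement at all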

4
Q

Percent agreement

A

The extent to which observers agree in their ratings

Does not account for agreement that would occur by chance, so it is only interpretable when the frequencies in each category are similar
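A minimal sketch of percent agreement for two hypothetical raters making yes/no judgments (the data are invented for illustration):

    # Percent agreement: proportion of cases on which the two observers match exactly
    rater_a = ["yes", "yes", "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
    rater_b = ["yes", "yes", "no", "no",  "no", "yes", "yes", "yes", "yes", "yes"]

    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    percent_agreement = 100 * matches / len(rater_a)
    print(f"Percent agreement = {percent_agreement:.0f}%")   # 80%

Because both raters say "yes" most of the time, much of that 80% could occur by chance, which is what kappa corrects for (next card).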

5
Q

Kappa Statistic

A

Indicates the proportion of agreement beyond that expected by chance (it does not tell you whether the remaining differences are random or systematic)

Used to express reliability for nominal or ordinal data

Only agreement beyond that expected by chance can be considered true agreement

Represents an "average" rate of agreement for the entire set of scores (doesn't tell you where the discrepancies lie)

Examples: inter-rater, intra-rater, and test-retest reliability
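A sketch of Cohen's kappa computed by hand from the same hypothetical yes/no ratings, using kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and p_e is the agreement expected by chance; if scikit-learn is available, sklearn.metrics.cohen_kappa_score gives the same result.

    from collections import Counter

    rater_a = ["yes", "yes", "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]
    rater_b = ["yes", "yes", "no", "no",  "no", "yes", "yes", "yes", "yes", "yes"]
    n = len(rater_a)

    # Observed agreement
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Agreement expected by chance, from each rater's category frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in set(rater_a) | set(rater_b))

    kappa = (p_o - p_e) / (1 - p_e)
    print(f"p_o = {p_o:.2f}, p_e = {p_e:.2f}, kappa = {kappa:.2f}")   # 0.80, 0.58, 0.52

Here 80% raw agreement drops to kappa of about 0.52 once the 58% chance agreement is removed.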

6
Q

Kappa Scores

A

Range from -1 to +1

1: perfect agreement
0: agreement no better than if the raters had guessed
Negative kappa: agreement worse than expected by chance

Interpretation always depends on the situation; common benchmarks:
>0.80 is "excellent"
0.60–0.80 is "substantial"
0.40–0.60 is "moderate"
<0.40 is "poor to fair"
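As a quick reference, a tiny helper that maps a kappa value to the benchmark labels above (thresholds taken from this card; remember that interpretation is always context-dependent):

    def interpret_kappa(kappa: float) -> str:
        """Benchmark labels from the card; always interpret in context."""
        if kappa > 0.80:
            return "excellent"
        if kappa > 0.60:
            return "substantial"
        if kappa >= 0.40:
            return "moderate"
        return "poor to fair"

    for k in (0.92, 0.71, 0.52, 0.18, -0.10):
        print(f"kappa = {k:+.2f} -> {interpret_kappa(k)}")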
