Reliability (Part 2) Flashcards

1
Q

Reliability means:
OBSERVED SCORE = true score (what we are interested in) + measurement error (other influences)

A

Consistency over time

2
Q

How many types of reliability are there?

A

6

3
Q

Test-retest reliability

A

Participants repeat the same scale
after a time delay,
then run a correlation between the first and second rounds

4
Q

Test-retest (add-on)

A

Strong positive correlation = reliable.
Consider mean effects and the stability of the construct
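The test-retest procedure above can be sketched in Python. This is a minimal illustration with made-up scores; the `pearson` helper is written from the standard formula, not taken from the cards.

```python
# Test-retest reliability sketch: correlate scores from two
# administrations of the same scale (made-up data).
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

time1 = [12, 15, 11, 18, 14, 16]   # first administration
time2 = [13, 14, 12, 17, 15, 16]   # same participants, after a delay
r = pearson(time1, time2)
print(round(r, 2))  # → 0.95; a strong positive r suggests reliability
```

A weak or negative correlation would instead suggest the scale (or an unstable construct) is not giving consistent scores over time.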

5
Q

Alternate-forms reliability (parallel forms)

A

The construct is measured using a different scale

6
Q

Alternate forms (add-on)

A

Strong positive correlation = reliable.
The difficulty lies in constructing equivalent scales

7
Q

Split-half reliability

A

Items within the scale measure the same construct;
split the items into two halves and run a correlation between them
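The split-half procedure can be sketched as follows. The item scores are made up, and the first-half/second-half split is just one possible way to divide the items (the odd-even variant on a later card splits by item position instead).

```python
# Split-half reliability sketch: sum each participant's scores on the
# first and second halves of the scale, then correlate the half totals.
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Each inner list = one participant's answers to a 6-item scale (made up).
scores = [
    [4, 5, 3, 5, 4, 4],
    [2, 1, 3, 2, 1, 2],
    [3, 4, 4, 3, 3, 5],
    [5, 4, 5, 4, 5, 4],
    [1, 2, 2, 1, 3, 1],
]
first_half  = [sum(p[:3]) for p in scores]   # items 1-3
second_half = [sum(p[3:]) for p in scores]   # items 4-6
print(round(pearson(first_half, second_half), 2))  # → 0.98
```

A strong positive correlation between the two halves suggests the items are measuring the same construct (internal consistency).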

8
Q

Split-half (add-on)

A

Strong positive correlation = internal consistency.
Relies on having enough items

9
Q

Odd-even reliability

A

Similar to split-half:
items are split based on odd or even position

10
Q

Odd-even (add-on)

A

Strong positive correlation = internal consistency

11
Q

Cronbach's alpha

A

The most common measure of
internal consistency:
correlates every possible split-half and calculates the average correlation

12
Q

Cronbach's alpha (add-on)

A

Values range from 0 to 1; a stronger correlation = more reliable.
Minimum acceptable value .7, preferably .8
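Cronbach's alpha can be sketched numerically. Note this sketch uses the standard variance-based formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores), which is equivalent in spirit to averaging the split-half correlations described on the card; the data are made up.

```python
# Cronbach's alpha sketch using the variance formula
# (population variance throughout; made-up data).

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(scores):
    """scores: list of participants, each a list of item scores."""
    k = len(scores[0])                    # number of items
    items = list(zip(*scores))            # transpose to item columns
    item_var = sum(variance(col) for col in items)
    total_var = variance([sum(p) for p in scores])
    return (k / (k - 1)) * (1 - item_var / total_var)

scores = [
    [4, 5, 4, 5],
    [2, 1, 2, 2],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [1, 2, 1, 1],
]
print(round(cronbach_alpha(scores), 2))  # → 0.97, above the .8 preferred cutoff
```

Against the card's thresholds, an alpha below .7 would flag the scale as unreliable; .8 or above is the preferred level.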

13
Q

Inter-rater reliability

A

Used for observational data.
Degree of agreement between raters: higher agreement = higher reliability

14
Q

Inter-rater (add-on)

A

Calculated with the kappa coefficient (κ): closer to 1 = higher agreement; 0 = no agreement.

0.00-0.20 slight
0.21-0.40 fair
0.41-0.60 moderate
0.61-0.80 substantial
0.81-0.99 near perfect
1.00 perfect
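The kappa coefficient above can be sketched for the two-rater case (Cohen's kappa). The rater codes are made-up observational labels; kappa compares observed agreement with the agreement expected by chance.

```python
# Cohen's kappa sketch for two raters coding the same observations
# (made-up category labels).
from collections import Counter

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    # Proportion of observations the two raters coded identically.
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Agreement expected by chance from each rater's category frequencies.
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[cat] * c2[cat] for cat in c1) / (n * n)
    return (observed - expected) / (1 - expected)

r1 = ["on", "off", "on", "on", "off", "on", "off", "on"]
r2 = ["on", "off", "on", "off", "off", "on", "on", "on"]
print(round(cohens_kappa(r1, r2), 2))  # → 0.47, "moderate" on the scale above
```

Identical codings give κ = 1 (perfect agreement); codings that match only as often as chance predicts give κ = 0.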

15
Q

What mean inter-rater value do you want?

A

.15 to .50

16
Q

Inter-rater reliability and Cronbach's alpha

A

Both measure internal reliability.
- Use inter-rater if there are fewer than 10