Lecture 2 - Validity and Reliability Flashcards

1
Q

What are 5 test selection considerations?

A
  1. Population
  2. Ease and feasibility of test administration
  3. Ease of normative data comparison
  4. Ethics and fairness
  5. Validity and reliability
2
Q

______ = the ability of a test to measure accurately, and ________ = the consistency or repeatability of an observation.

A

validity; reliability

3
Q

What 3 things does validity depend on?

A
  1. reliability
  2. relevance
  3. appropriateness of scores
4
Q

What are 4 types of validity evidence?

A
  1. Construct validity
  2. Logical (face) validity
  3. Criterion validity
  4. Convergent validity
5
Q

All common types of validity evidence can be estimated either _______ or ________.

A

logically; statistically

6
Q

_________ validity evidence = the test effectively measures the desired construct.

A

construct

7
Q

_______ validity evidence = the measure obviously involves the performance being measured.

A

logical/face

8
Q

Is statistical evidence required for logical/face validity evidence?

A

NO

9
Q

What is criterion validity also called?

A

statistical or correlation validity

10
Q

______ validity = degree to which scores on a test are related to a recognized standard or criterion.

A

criterion (or statistical or correlation)

11
Q

How is criterion validity obtained?

A

By determining the correlation/validity coefficient (r) between scores for a test and the criterion measure.
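
Note (illustration, not from the lecture): since the validity coefficient here is simply a correlation, a minimal Python sketch with made-up test and criterion scores could look like this.

# Minimal sketch: estimating a criterion validity coefficient (r) as the
# Pearson correlation between scores on a field test and scores on a
# criterion ("gold standard") measure. All numbers are hypothetical.
from scipy.stats import pearsonr

test_scores = [52, 48, 61, 57, 45, 66, 59, 50]       # hypothetical field-test scores
criterion_scores = [49, 47, 63, 55, 44, 68, 58, 52]  # hypothetical criterion scores

r, p_value = pearsonr(test_scores, criterion_scores)
print(f"validity coefficient r = {r:.2f} (p = {p_value:.3f})")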

12
Q

What are 2 types of criterion-related evidence?

A
  1. Concurrent validity
  2. Predictive validity

13
Q

_______ validity = the criterion is measured at approximately the same time as the alternate measure and the scores are compared.

A

Concurrent

14
Q

_______ validity = the criterion is measured in the future (weeks, months, or years later).

A

predictive

15
Q

________ validity evidence = two or more measurements are conducted to collect data and establish that a test battery is measuring what it purports to measure.

A

convergent

16
Q

________ = the degree to which repeated measurements of a trait are reproducible under the same conditions.

A

reliability

17
Q

Use ____-____ scores to calculate a stability reliability coefficient.

A

test-retest
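
Note (illustration, not from the lecture): assuming the stability coefficient is computed as a simple Pearson correlation between day-1 and day-2 scores (a course might instead use an intraclass correlation), a sketch with made-up data:

# Hypothetical sketch: stability (test-retest) reliability as the correlation
# between scores on the same test given on two different days. Data are made up.
import numpy as np

day1 = np.array([12.1, 10.4, 15.0, 9.8, 13.3, 11.7])   # day-1 scores
day2 = np.array([11.8, 10.9, 14.6, 10.1, 13.0, 12.0])  # day-2 scores

stability_r = np.corrcoef(day1, day2)[0, 1]
print(f"stability reliability coefficient = {stability_r:.2f}")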

18
Q

What are the three types of reliability?

A
  1. Stability
  2. Internal-consistency
  3. Objectivity
19
Q

_______ reliability = scores do not change across days.

A

stability

20
Q

What are 3 factors that contribute to low stability?

A
  1. The people tested may perform differently
  2. The measuring instrument may operate or be applied differently
  3. The person administering the measurement may change
21
Q

______-_______ reliability = the evaluator gives at least two trials of the test within a single day. A change in scores across trials indicates ____ reliability.

A

internal-consistency; poor
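
Note (illustration, not from the lecture): one common internal-consistency estimate is Cronbach's alpha computed across the same-day trials; the lecture may specify a different coefficient (e.g., intraclass R). A sketch with made-up data:

# Hypothetical sketch: Cronbach's alpha from several trials of the same test
# given within a single day. Rows are participants, columns are trials;
# all numbers are made up.
import numpy as np

trials = np.array([
    [18, 19, 18],
    [22, 21, 23],
    [15, 16, 15],
    [20, 20, 19],
    [17, 18, 18],
])

k = trials.shape[1]                              # number of trials
trial_variances = trials.var(axis=0, ddof=1)     # variance of each trial
total_variance = trials.sum(axis=1).var(ddof=1)  # variance of participants' summed scores
alpha = (k / (k - 1)) * (1 - trial_variances.sum() / total_variance)
print(f"internal-consistency (Cronbach's alpha) = {alpha:.2f}")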

22
Q

What is a benefit of internal consistency reliability?

A

All measurements are taken within the same day.

23
Q

The internal-consistency reliability coefficient (is/is not) comparable to the stability reliability coefficient.

A

IS NOT; I-C coefficient is almost always higher

24
Q

________ = rater/judge reliability; aka ____-tester reliability.

A

objectivity; inter

25
Q

What are 2 factors affecting objectivity?

A
  1. The clarity of the scoring system
  2. The degree to which the "judge" can assign a score accurately
26
Q

What are 5 considerations for reducing measurement error?

A
  1. Valid and reliable tests
  2. Instructions
  3. Test complexity
  4. Warm-up and test trials
  5. Equipment quality and preparation
27
Q

What does calibration require?

A

comparison between measurements

28
Q

Under what 3 conditions can reliability be expected?

A
  1. The testing environment is favourable to good performance
  2. People are motivated, ready to test, informed, and familiar
  3. The person administering the test is trained and competent