Chapter 12: Principles of Test Selection and Administration Flashcards

1
Q

Concurrent Validity

A

Test scores are associated with those of other accepted tests that measure the same ability.

2
Q

Construct Validity

A

The ability of a test to represent the underlying construct; the overall degree to which the test measures what it was designed to measure.

3
Q

Content Validity

A

The assessment by experts that the test covers all relevant subtopics or component abilities in appropriate proportions.

4
Q

Convergent Validity

A

Evidenced by a high positive correlation between the results of the test being assessed and those of a recognized measure of the same construct.

5
Q

Criterion-Referenced Validity

A

The extent to which test scores are associated with some other measure of the same ability.

There are three subtypes of this: Concurrent, Predictive, and Discriminant.

6
Q

Discriminant Validity

A

The ability of a test to distinguish between two different constructs, evidenced by a low correlation between the results of the test and those of tests of a different construct.

A test with good discriminant validity is not influenced by constructs irrelevant to the ability being measured.

7
Q

Evaluation

A

The process of analyzing test results for the purpose of making decisions. For example, a coach examines the results of physical performance tests to determine whether the athlete’s training program is effective in helping achieve the training goals or whether modifications in the program are needed.

8
Q

Face Validity

A

The appearance, to the athlete and other casual observers, that the test measures what it is purported to measure.

An athlete is more likely to respond positively to a test with high face validity.

9
Q

Field Test

A

A test used to assess ability that is performed away from the laboratory and does not require extensive training or expensive equipment.

10
Q

Formative Evaluation

A

Periodic reevaluation based on midtests administered during training, usually at regular intervals.

11
Q

Interrater Agreement

A

Also referred to as objectivity: the degree to which different raters agree in their test scores (see Objectivity).

12
Q

Interrater Reliability

A

Also referred to as objectivity: the degree to which different raters agree in their test scores on repeated occasions (see Objectivity).

13
Q

Intrarater Variability

A

A lack of consistent scores by a given tester.

14
Q

Intrasubject Variability

A

A lack of consistent performance by the person being tested.

15
Q

Measurement

A

The process of collecting test data.

16
Q

Midtest

A

A test administered one or more times during the training period to assess progress and modify the program as needed to maximize benefit.

17
Q

Objectivity

A

Also referred to as interrater reliability or interrater agreement.

Degree to which different raters agree in their test results over time or on repeated occasions; it is a measure of consistency.

18
Q

Posttest

A

Test administered after the training period to determine the success of the training program in achieving the training objectives.

19
Q

Predictive Validity

A

The extent to which the test score corresponds with future behavior or performance.

20
Q

Pretest

A

A test administered before the beginning of training to determine the athlete’s initial basic ability levels. A pretest allows the coach to design the training program in keeping with the athlete’s initial training level and the overall program objectives.

21
Q

Reliability

A

Measure of the degree of consistency or repeatability of a test.

Note that a test can be reliable without being valid, because it may consistently measure something other than what it is intended to measure.

Test-retest reliability is one common way to quantify reliability.

22
Q

Test

A

A procedure for assessing ability in a particular endeavor.

23
Q

Test Battery

A

A group of tests administered in succession to assess several different abilities.

24
Q

Test-Retest Reliability

A

Statistical correlation of the scores from two administrations of the same test provides a measure of test-retest reliability.
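
The correlation described above can be sketched in a few lines of plain Python. This is a minimal illustration, not part of the chapter: the vertical-jump scores and the `pearson_r` helper are hypothetical.

```python
# Sketch: test-retest reliability as the Pearson correlation
# between two administrations of the same test.

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical vertical-jump scores (cm) for five athletes.
test1 = [55.0, 48.5, 60.2, 52.1, 57.3]  # first administration
test2 = [54.2, 49.1, 59.8, 53.0, 56.9]  # second administration

r = pearson_r(test1, test2)
print(round(r, 3))
```

A correlation near 1.0, as here, indicates highly repeatable scores; a low correlation would flag intrarater or intrasubject variability.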

25
Q

Typical Error of Measurement (TE)

A

Like test-retest reliability, a measure of a test's consistency; it includes equipment error and the biological variation of athletes in the results.
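
A common estimate of the typical error divides the standard deviation of the test-retest difference scores by the square root of 2. A minimal sketch, assuming that estimate and hypothetical scores:

```python
# Sketch: typical error of measurement (TE) from test-retest data,
# assuming the common estimate TE = SD(difference scores) / sqrt(2).
import math

# Hypothetical vertical-jump scores (cm) for five athletes.
test1 = [55.0, 48.5, 60.2, 52.1, 57.3]  # first administration
test2 = [54.2, 49.1, 59.8, 53.0, 56.9]  # second administration

diffs = [b - a for a, b in zip(test1, test2)]
mean_d = sum(diffs) / len(diffs)
# Sample standard deviation of the difference scores (n - 1 denominator).
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (len(diffs) - 1))

te = sd_d / math.sqrt(2)
print(round(te, 2))
```

The TE is in the same units as the test (cm here), so it gives a direct sense of how much of a score change reflects noise rather than real improvement.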

26
Q

Validity

A

Degree to which a test or test item measures what it is supposed to measure.