111 TESTING Flashcards

Flashcards in 111 TESTING Deck (46)
1
Q

State the purpose of a testing program.

A

NAVEDTRA 132 pg 1-2
To ensure a quality testing process is implemented to effectively assess the trainee’s achievement of learning objectives

2
Q

State the roles and responsibilities of the following for an effective testing program: Naval Education Training Command (NETC)

A

Testing policy and guidance

3
Q

State the roles and responsibilities of the following for an effective testing program: NETC N7

A

Provides testing policy and guidance oversight, and monitors compliance by the Learning Centers.

4
Q

State the roles and responsibilities of the following for an effective testing program: Learning Center Commanding Officer

A

Serves as the Curriculum Control Authority (CCA); manages Learning Sites and Detachments (DETs); resolves differences; and incorporates TYCOM test banks, as appropriate.

5
Q

State the roles and responsibilities of the following for an effective testing program: Learning Center Director of Training

A

Ensures testing program(s) are conducted, oversees development of testing plans

6
Q

State the roles and responsibilities of the following for an effective testing program: Learning Center Learning Standards Officer

A

Provides guidance to curriculum developers on testing, monitors Total Quality Indicators (TQI)
and test item analysis and remediation programs.

7
Q

State the roles and responsibilities of the following for an effective testing program: Course Curriculum Model Manager (CCMM)

A

Approves test design, maintains master test item bank

8
Q

State the roles and responsibilities of the following for an effective testing program: Curriculum Developer

A

Designs and develops the testing plan, admin guides, and the tests

9
Q

State the roles and responsibilities of the following for an effective testing program: Learning Site Commanding Officer/Officer-in-Charge

A

Implements testing plan, designates Testing Officer(s), designates the course supervisor

10
Q

State the roles and responsibilities of the following for an effective testing program: Learning Site Testing Officer

A

Test administration, oversees grading, secures tests, maintains test bank(s),
coordinates/manages revisions, conducts IS training

11
Q

State the roles and responsibilities of the following for an effective testing program: Course Supervisor

A

Ensures, monitors, and validates test administration, test security, and test item analysis.

12
Q

State the roles and responsibilities of the following for an effective testing program: Participating Activities

A

Provides comments, feedback, and new test items; maintains test and test item analysis data.

13
Q

State the primary course source data for creating test items

A

NAVEDTRA 132 pg 3-4

JDTA, OCCSTDS, CTTL/PPP Table, COI.

14
Q

List usable course source data to be used when the primary course source data is not available or has not been created

A

NAVEDTRA 132 pg 3-4
If JDTA data is not available, curriculum developers will bridge the gap using data elements from a combination of: Occupational Standards (OCCSTDs), CTTL, PPP Table, and a COI.

15
Q

Define the following tests: Formal

A

Test is graded and is used in the calculation of the trainee’s final grade

16
Q

Define the following tests: Informal

A

May or may not be graded – regardless, the grade will not be used in the calculation of the
trainee’s final grade

17
Q

For the below items, define the three proficiency levels contained within each: Skill

A

Level 1: Imitation
Level 2: Repetition
Level 3: Habit

18
Q

For the below items, define the three proficiency levels contained within each: Knowledge

A

Level 1: Knowledge/Comprehension
Level 2: Application/Analysis
Level 3: Synthesis/Evaluation

19
Q

List the five categories for performance and knowledge tests.

A

Pre-test: for validation of material, acceleration, prerequisite, or advanced organizer
Progress test: covers blocks of instruction
Comprehensive test: within-course or final exam
Oral test: normally by board (panel of evaluators); assesses trainee comprehension
Quiz: short test to assess achievement of recently taught material

20
Q

Discuss the process of piloting a test.

A

NAVEDTRA 132 pg 4-8
It is a review process to assess test reliability and validity and to make corrective adjustments before actually collecting data from the target population. It includes:
Review by SMEs
Piloting by the CCMM, then forwarding to the LSO for approval
Testing trainees who are in the end stages of the course (test results not to count)
Surveying trainee test results
Using test item analysis and the survey to improve the test instrument

21
Q

Describe the use of each test instrument as they relate to knowledge and performance tests: Job sheet

A

Job sheets direct the trainees in the step-by-step performance of a practical task they will
encounter in their job assignment.

22
Q

Describe the use of each test instrument as they relate to knowledge and performance tests: Problem sheet

A

Problem sheets present practical problems requiring analysis and decision making similar to
those encountered on the job.

23
Q

Describe the use of each test instrument as they relate to knowledge and performance tests: Assignment sheet

A

Assignment sheets are designed to direct the study or homework efforts of trainees.

24
Q

Describe the use of each test instrument as they relate to knowledge and performance tests: Multiple-choice

A

The multiple-choice test item is the most versatile of all knowledge test item formats.

25
Q

Describe the use of each test instrument as they relate to knowledge and performance tests: True or false

A

True or false test items provide only two answers

26
Q

Describe the use of each test instrument as they relate to knowledge and performance tests: Matching

A

Matching test items are defined as two lists of connected words, phrases, pictures, or symbols.

27
Q

Describe the use of each test instrument as they relate to knowledge and performance tests: Completion

A

Completion test items are free response test items in which the trainees must supply the
missing information from memory

28
Q

Describe the use of each test instrument as they relate to knowledge and performance tests: Labeling

A

Labeling or identification test items are used to measure the trainee’s ability to recall facts and
label parts in pictures, schematics, diagrams, or drawings.

29
Q

Describe the use of each test instrument as they relate to knowledge and performance tests: Essay

A

Essay test items require trainees to answer a question with a written response

30
Q

Describe the use of each test instrument as they relate to knowledge and performance tests: Case study

A

Case studies should be used when posing a complex issue or when a comprehensive understanding of the material is required.

31
Q

Describe the use of each test instrument as they relate to knowledge and performance tests: Validation of Test Instruments

A

After test instruments have been constructed, and before they are actually assembled into a test, the content must be validated.

32
Q

What are the two types of testing methods used in testing?

A

NAVEDTRA 132 pg 4-9
Criterion-Referenced Test: Assesses whether required level of skill or knowledge is met.
Norm-Referenced Test: Estimates individual skill or knowledge in relation to a group norm (e.g., Navy Advancement Exams).
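
A minimal sketch contrasting the two methods in Python; the cutoff value, sample scores, and function names are illustrative assumptions, not taken from NAVEDTRA 132:

from statistics import mean, pstdev

def criterion_referenced_pass(score: float, cutoff: float = 80.0) -> bool:
    """Criterion-referenced: compare the score against a fixed mastery cutoff."""
    return score >= cutoff

def norm_referenced_standing(score: float, group_scores: list) -> float:
    """Norm-referenced: express the score relative to the group norm (z-score)."""
    return (score - mean(group_scores)) / pstdev(group_scores)

group = [62.0, 71.0, 75.0, 80.0, 88.0, 94.0]
print(criterion_referenced_pass(83.0))                  # True: meets the fixed criterion
print(round(norm_referenced_standing(83.0, group), 2))  # standing relative to the group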

33
Q

Discuss test failure policies and associated grading criteria within your learning
environment

A

If a test is failed: retrain, then retest. If the retest is passed, the highest score the trainee can receive is 80 percent.
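
A minimal sketch of that retest policy in Python; the function name and the assumption that the minimum passing grade equals the 80-percent cap are illustrative:

def retest_grade(raw_score: float, passing_score: float = 80.0):
    """Apply the retest policy: a passed retest is capped at 80 percent."""
    if raw_score < passing_score:
        return None  # retest failed; further remediation or an ARB may follow
    return min(raw_score, 80.0)

print(retest_grade(93.0))  # 80.0: passed, but recorded at the cap
print(retest_grade(72.0))  # None: retest failed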

34
Q

Discuss how skill learning objective criticality is determined during performance test design.

A

Performance tests will be developed using job sheets. Problem sheets are normally not used as a means of performance assessment, but may be used to evaluate achievement of less critical learning objectives. Criticality of performance points to the need for selecting tasks for training that are essential to job performance, even though the tasks may not be performed frequently. The following levels of criticality will be used when determining criticality of performance:
High (value of 3): skill is used during job performance.
Moderate (value of 2): skill influences job performance.
Low (value of 1): skill has little influence on job performance.

35
Q

Discuss how the criticality of the knowledge learning objectives needed to perform a task is determined during knowledge test design.

A

Knowledge tests will be developed using test items. Knowledge test design begins with determining the criticality of each learning objective. This process determines which learning objectives to assess through formal testing and which should be assessed by informal testing. At the completion of this step, the assessment of each learning objective is determined. Analysis of task data (discussed in Chapter 3) provides the information for determining learning objective criticality. To determine criticality, refer to the following elements of course source data, at a minimum: criticality of performance and frequency of performance. Additional fields may be considered if deemed necessary by curriculum developers. The factors used to determine the criticality of each learning objective will be listed in the testing plan.
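
A minimal sketch in Python of combining the two source-data elements above into a learning objective criticality score; the 3/2/1 values follow card 34's scale, while the summing rule and the formal/informal cutoff are illustrative assumptions:

# Hypothetical criticality scoring; the high/moderate/low values mirror
# card 34, but the summing rule and threshold are assumptions.
CRITICALITY = {"high": 3, "moderate": 2, "low": 1}
FREQUENCY = {"high": 3, "moderate": 2, "low": 1}

def objective_criticality(perf_criticality: str, perf_frequency: str) -> int:
    """Combine criticality and frequency of performance into one score."""
    return CRITICALITY[perf_criticality] + FREQUENCY[perf_frequency]

def assessment_type(score: int, formal_threshold: int = 4) -> str:
    """Map the combined score to formal vs. informal testing (assumed cutoff)."""
    return "formal" if score >= formal_threshold else "informal"

score = objective_criticality("high", "low")  # 3 + 1 = 4
print(score, assessment_type(score))          # 4 formal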

36
Q

Identify the ten sections of a testing plan.

A

Course Data
Course Roles and Responsibilities
Course Waivers
Test Development
Test Administration
Course Tests and Test Types
Grading Criteria
Remediation
Test and Test Item Analysis
Documentation

37
Q

State the purpose of test and test item analysis

A

To determine statistical validity, test and test item analysis techniques are required. The three types of analysis discussed and required for use are: difficulty index, index of discrimination, and effectiveness of alternatives. Test item analysis will be documented in the course’s testing plan.
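
A minimal sketch of the first two analyses in Python, using the standard classical test-theory formulas; the 27-percent upper/lower grouping convention and the function names are illustrative assumptions, not taken from NAVEDTRA 132:

def difficulty_index(correct: int, total: int) -> float:
    """Difficulty index: fraction of trainees answering the item correctly."""
    return correct / total

def discrimination_index(upper_correct: int, lower_correct: int, group_size: int) -> float:
    """Index of discrimination: how well an item separates high from low scorers.
    The upper/lower groups are conventionally the top and bottom ~27 percent."""
    return (upper_correct - lower_correct) / group_size

# Example: 30 of 40 trainees answer an item correctly; in upper and lower
# groups of 11 trainees each, 10 and 4 answer correctly, respectively.
print(difficulty_index(30, 40))                   # 0.75: a fairly easy item
print(round(discrimination_index(10, 4, 11), 2))  # 0.55: discriminates well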

38
Q

In a remediation program, discuss what the primary and secondary goal is.

A

A remediation program's primary goal is to motivate and assist trainees in achieving the critical learning objectives of a course by providing additional instructional study time. A secondary goal of remediation is to remove barriers to learning. Because trainees learn in different ways, it may be necessary to use different methods of remediation to realize the most effective results.

39
Q

Discuss the three methods of remediation available to instructors: Targeted

A

Targeted remediation is designed to assist the trainee who is having difficulty in accomplishing an objective(s) and/or understanding the material during normal classroom time. It involves limited one-on-one mentorship or SME engagement on the objective area(s) the trainee is having difficulty with, using text and/or lab material.

40
Q

Discuss the three methods of remediation available to instructors: Scalable

A

Scalable remediation is designed to assist the trainee who is having difficulty in accomplishing objectives or understanding the material for a major portion of a course, during normal classroom time. It involves one-on-one mentorship or SME engagement on each major objective area the trainee is having difficulty with, using a total recall approach with one or a combination of: text, lab material, flashcards, and mentor question and answer sessions.

41
Q

Discuss the three methods of remediation available to instructors: Iterative

A

Iterative remediation involves one-on-one mentorship or SME engagement on each major objective area the trainee is having difficulty with, using a total recall approach with one or a combination of: text, lab material, flashcards, and mentor question and answer sessions. To complete iterative remediation, the trainee must complete a minimum of 20 questions per objective area with a minimum score of 80 percent, and/or successfully complete two practice exercises or scenarios per objective area.
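
A minimal sketch of that completion rule in Python; the function shape and the reading of "and/or" as either path satisfying completion are assumptions for illustration:

def iterative_remediation_complete(questions_answered: int,
                                   score_pct: float,
                                   practice_exercises_done: int) -> bool:
    """Check the per-objective rule: >=20 questions at >=80%, and/or two exercises."""
    by_questions = questions_answered >= 20 and score_pct >= 80.0
    by_exercises = practice_exercises_done >= 2
    return by_questions or by_exercises

print(iterative_remediation_complete(24, 85.0, 0))  # True via questions
print(iterative_remediation_complete(10, 90.0, 2))  # True via exercises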

42
Q

Define the following sections of a remediation program: Retest

A

When the trainee does not achieve a test’s minimum passing grade, the retest may cover the
portion of the test the trainee had difficulty with or the entire test. This decision should be based
on the degree of difficulty the trainee had with the test

43
Q

Define the following sections of a remediation program: Setback

A

A setback places the trainee into a later class convening to repeat the portion of the course not mastered, when counseling, remediation, and retesting have not brought the trainee up to the minimum passing grade.

44
Q

Define the following sections of a remediation program: Drop from training and attrites

A

Every effort will be made to help trainees succeed. However, there are times when the trainee
is clearly unsuited, unable, and/or unwilling to complete the course. If this occurs, the trainee is
dropped from training. Trainees dropped from training may be classified as an academic drop,
nonacademic drop, or disenrollment. Trainees who are discharged from the Navy will be
classified as attrites.

45
Q

Define the following sections of a remediation program: Counseling

A

Preventive counseling will be instituted in "A" and "C" schools and should include counseling for performance and personal problems.

46
Q

Define the following sections of a remediation program: Academic Review Boards (ARBs)

A

ARBs will be convened when other means of academic counseling, remediation, and an initial academic setback have failed to improve trainee performance. The initial academic setback may result from an academic counseling session and be directed by the CS. Additional academic setbacks must be recommended by the ARB and approved by the DOT. Examples of when an ARB may be necessary include the following:
• Trainee’s course average falls below minimum passing grade.
• Trainee is unable to achieve the objectives after counseling, remediation, retesting, and an
initial academic setback.
• Trainee’s performance is below expected academic progress.
• Trainee fails to achieve the objectives after an academic setback on those same objectives.