What makes a good test?

Characteristics of a good test (11)

1) Reliable
2) Valid
3) Causing no negative effects
4) Showing the effectiveness of the teaching/learning items
5) Giving the students a sense of self-accomplishment
6) Putting stress on what students are able to do
7) Reflecting students' actual level
8) Clear and direct instructions
9) Varied
10) Time allotted matches the test's length
11) Need students' feedback


Don't forget... (8)

1) Assessing students’ performance
2) Form a positive attitude towards tests in students
3) Sometimes not reflecting reality in your class
4) Not the only resource you have to see students’ progress
5) Prepare a portfolio of each student collecting writings and homework
6) Help students become aware of their strengths and weaknesses
7) Encourage them to keep a sort of diary with all their doubts and problems
8) Written and oral tests can be discouraging and cause loads of anxiety


Validity Definition

Test what is in the program (no more / no less)


Validity Types

1) Content
2) Construct
3) Face


Content validity

1) Does a test test what it is supposed to test?
2) Choose a representative selection of items to test what we think reflects students' ability in the language


Construct validity

1) Establishes whether the test reflects the underlying ability (construct) it is meant to measure
2) Used to determine how well a test measures what it is supposed to measure


Face validity

1) Transparency or relevance of a test as it appears to test participants
2) A test can be said to have face validity if it looks like it is going to measure what it is supposed to measure


Reliability Definition

Its marking should be consistent, coherent, and objective


Reliability Types

1) Test
2) Scorer


Test Reliability

If it were possible to give the same person the same test at the same time, would the result be the same?


Suggestions for test reliability

1) Make tests varied
2) Make tests familiar
3) Make instructions clear and at the appropriate level of language
4) Restrict the task


Scorer reliability

If you gave the same test to two different people to mark, would they give the same score?


Suggestions for scorer reliability

1) Multiple-choice tests are the most objective
2) Provide an answer key or marking guide for objective tests



Practicality Definition

How practical a test is to administer


Factors of Practicality

1) Time
2) Personnel
3) Space and equipment
4) Money



Washback Definition

The effect that a final test has on the teaching programme that leads to it


Forms of evaluation

1) Norm-referenced testing
2) Criterion-referenced testing
3) Direct test items
4) Indirect test items
5) Discrete point testing
6) Integrative testing


Norm-referenced testing purposes

1) Classify students
2) Highlight achievement differences between and among students to produce a dependable rank order of students across a continuum of achievement from high achievers to low achievers
3) Place students in remedial or gifted programs
4) Select students from different ability levels


Norm-referenced testing: how content is selected

A representative group of students is given the test prior to its availability to the public.


Criterion-referenced testing purpose

1) Determine what test takers can do and what they know, not how they compare to others
2) Report how well students are doing relative to a predetermined performance level on a specified set of educational goals or outcomes included in the school, district, or state curriculum.


Direct test items definition

When the learner’s response involves actually performing the communicative skill or language recognition/production task that is being assessed


Direct test items characteristics

1) Associated with the productive skills
2) There is an observable output that can be heard/seen.


Indirect test items

1) Measure student knowledge and ability by getting at what lies beneath their receptive and productive skills
2) Procedures designed to tap into the enabling skills that underpin the macro skills act as indirect assessments of the skill in question


Example of Indirect testing

Structure and Written Expression section in the TOEFL


Discrete point testing definition

Assumes that language knowledge can be divided into a number of independent facts: elements of grammar, vocabulary, spelling and punctuation, pronunciation, intonation, and stress


Discrete point testing resources

Tested by pure items (multiple choice / fill in the blanks)


Integrative testing definition

Argues that any realistic language use requires the coordination of many kinds of knowledge in one linguistic event, and so uses items which combine those kinds of knowledge, such as comprehension tasks, dictation, speaking, and listening


Integrative testing risks

Ignoring the systematic relationship between language elements and accuracy of linguistic detail


Integrative testing resources

Tested by exams with open-ended questions


Types of tests

1) Proficiency
2) Achievement
3) Diagnostic
4) Placement