What makes a good test? Flashcards

1
Q

Characteristics of a good test (11)

A

1) Reliable
2) Valid
3) Causing no negative effects
4) Showing the effectiveness of the teaching/learning items
5) Giving the students a sense of self-accomplishment
6) Putting stress on what students are able to do
7) Reflecting students' actual level
8) Giving clear and direct instructions
9) Varied
10) Matching the time allowed to the length of the test
11) Seeking students' feedback

2
Q

Don’t forget… (8)

A

1) Assessing students’ performance
2) Form a positive attitude towards tests in the students
3) Tests sometimes do not reflect the reality of your class
4) Tests are not the only resource you have to see students' progress
5) Prepare a portfolio of each student collecting writings and homework
6) Help students become aware of their strengths and weaknesses
7) Encourage them to keep a sort of diary with all their doubts and problems
8) Written and oral tests can be discouraging and cause loads of anxiety

3
Q

Validity Definition

A

Test what is taught in the program (no more / no less)
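
A minimal Python sketch of the "no more / no less" idea, using invented topic lists: comparing what the program covers with what the test samples shows whether anything untaught is being tested or anything taught is being left out.

# Invented topic lists for illustration only.
program_topics = {"present simple", "past simple", "food vocabulary", "giving directions"}
tested_topics = {"present simple", "past simple", "giving directions", "reported speech"}

# "No more": items on the test that were never in the program.
untaught_but_tested = tested_topics - program_topics
# "No less": items in the program that the test never samples.
taught_but_untested = program_topics - tested_topics

print("Tested but not taught:", untaught_but_tested or "none")
print("Taught but not tested:", taught_but_untested or "none")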

4
Q

Validity Types

A

1) Content
2) Construct
3) Face

5
Q

Content validity

A

1) Does a test test what it is supposed to test?

2) Choose a selection of things to test that we think are representative of students' ability to use the language

6
Q

Construct validity

A

1) Establishes what counts as valid (is the construct behind the test right or wrong?)

2) Used to determine how well a test measures what it is supposed to measure

7
Q

Face validity

A

1) Transparency or relevance of a test as it appears to test participants
2) A test can be said to have face validity if it looks like it is going to measure what it is supposed to measure

8
Q

Reliability Definition

A

Its marking should be consistent, coherent, and objective

9
Q

Reliability Types

A

1) Test

2) Scorer

10
Q

Test Reliability

A

If it were possible to give the same person the same test under the same conditions, would the result be the same?
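
A minimal Python sketch of this test-retest idea, using invented scores from two sittings of the same test by the same five students: a correlation close to 1.0 suggests the test gives consistent results.

from statistics import correlation  # available in Python 3.10+

# Invented scores for the same five students on two sittings of the same test.
first_sitting = [72, 85, 60, 90, 78]
second_sitting = [70, 88, 62, 91, 75]

# Pearson's r close to 1.0 suggests consistent results; values well below
# 1.0 suggest the test is not reliable.
print(f"Test-retest reliability (Pearson r): {correlation(first_sitting, second_sitting):.2f}")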

11
Q

Suggestions for test reliability

A

1) Make tests varied
2) Make tests familiar
3) Make instructions clear and at the appropriate level of language
4) Restrict the task

12
Q

Scorer reliability

A

If you gave the same test to two different people to mark, would they give the same score?

13
Q

Suggestions for scorer reliability

A

1) A multiple-choice test is the most objective format

2) Provide an answer key or marking guide for an objective test (see the sketch below)
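
A minimal Python sketch of the answer-key point, using an invented five-item key and one invented response sheet: because the key fully determines the score, any two markers who apply it reach the same result.

# Invented answer key and student responses for a five-item multiple-choice test.
answer_key = {1: "b", 2: "d", 3: "a", 4: "c", 5: "b"}
responses = {1: "b", 2: "d", 3: "c", 4: "c", 5: "b"}

def mark(key: dict[int, str], answers: dict[int, str]) -> int:
    """Count the items where the response matches the key."""
    return sum(1 for item, correct in key.items() if answers.get(item) == correct)

# Two markers applying the same key necessarily agree, which is the point
# of using an answer key for scorer reliability.
score_marker_a = mark(answer_key, responses)
score_marker_b = mark(answer_key, responses)
assert score_marker_a == score_marker_b
print(f"Score: {score_marker_a}/{len(answer_key)}")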

14
Q

Practicality

A

How practical a test is to administer

15
Q

Factors of Practicality

A

1) Time
2) Personnel
3) Space and equipment
4) Money

16
Q

Backwash

A

The effect that a final test has on the teaching programme that leads up to it

17
Q

Forms of evaluation

A

1) Norm-referenced testing
2) Criterion-referenced testing
3) Direct test items
4) Indirect test items
5) Discrete point testing
6) Integrative testing

18
Q

Norm-referenced testing purposes

A

1) Classify students
2) Highlight achievement differences between and among students to produce a dependable rank order of students across a continuum of achievement from high achievers to low achievers
3) Place students in remedial or gifted programs
4) Select students from different ability levels

19
Q

Norm-referenced testing: how is content selected?

A

A representative group of students is given the test prior to its availability to the public.

20
Q

Criterion-referenced testing purpose

A

1) Determine what test takers can do and what they know, not how they compare to others
2) Report how well students are doing relative to a predetermined performance level on a specified set of educational goals or outcomes included in the school, district, or state curriculum (see the sketch below)
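
A minimal Python sketch contrasting the two kinds of report, using invented class scores and an invented 70-point cutoff: the norm-referenced view ranks one student against the class, while the criterion-referenced view checks the same score against the predetermined performance level.

# Invented scores on a 100-point test for a class of eight students.
class_scores = [45, 58, 62, 70, 74, 81, 88, 93]
student_score = 74
cutoff = 70  # invented predetermined performance level

# Norm-referenced report: where does this student stand relative to peers?
below = sum(1 for score in class_scores if score < student_score)
print(f"Norm-referenced: scored higher than {100 * below / len(class_scores):.0f}% of the class")

# Criterion-referenced report: has the student met the criterion,
# regardless of how anyone else performed?
verdict = "met" if student_score >= cutoff else "did not meet"
print(f"Criterion-referenced: {verdict} the {cutoff}-point criterion")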

21
Q

Direct test items definition

A

When the learner’s response involves actually performing the communicative skill or language recognition/production task that is being assessed

22
Q

Direct test items characteristics

A

1) Associated with the productive skills

2) There is an observable output that can be heard/seen.

23
Q

Indirect test items

A

1) Measure student knowledge and ability by getting at what lies beneath their receptive and productive skills
2) Procedures designed to tap into the enabling skills that underpin the macro skills act as indirect assessment devices for the skill in question

24
Q

Example of Indirect testing

A

Structure and Written Expression section in the TOEFL

25
Q

Discrete point testing definition

A

Assumes that language knowledge can be divided into a number of independent facts: elements of grammar, vocabulary, spelling and punctuation, pronunciation, intonation, and stress

26
Q

Discrete point testing resources

A

Tested by pure items (multiple choice / fill in the blanks)

27
Q

Integrative testing definition

A

Argues that any realistic language use requires the coordination of many kinds of knowledge in one linguistic event, and so uses items which combine those kinds of knowledge, such as comprehension tasks, dictation, speaking, and listening

28
Q

Integrative testing risks

A

Ignoring the systematic relationship between language elements and accuracy of linguistic detail

29
Q

Integrative testing resources

A

Tested by exams with open-ended questions

30
Q

Types of tests

A

1) Proficiency
2) Achievement
3) Diagnostic
4) Placement

31
Q

Proficiency tests

A

Designed to measure people's ability in a language regardless of any formal training they may have had in that language

32
Q

Achievement tests (Progress / Final)

A

Directly linked to language courses. Their purpose is to establish how successful students have been in achieving course objectives

33
Q

Diagnostic tests

A

Used to identify students' strengths and weaknesses

34
Q

Placement tests

A

Provide information which helps to place students at a certain stage in a language program

35
Q

Hybrid approach to test design

A

Make balanced exams based on the topics that you are studying

36
Q

Language operation in naturalistic contexts

A

Lexical, grammatical, sociolinguistic as well as discourse features have to be tested as they would operate in naturalistic contexts

37
Q

Occurrence of natural language

A

Natural language occurs in discourse and in extralinguistic situational contexts, so tests that embed these second-language features should be given

38
Q

Motivation of students (Materials)

A

Students are more motivated if the materials seem relevant to their communicative needs

39
Q

Teachers should be concerned that...

A

Their instruments enable them to analyze students' performance in terms of course objectives

40
Q

Convergent items

A

One right answer required

41
Q

Open-ended items

A

Many possible answers

42
Q

Dichotomy of testing items

A

1) Sequences of single sentences or phrases unrelated to one another
2) Sequential, naturalistic discourse