ASSESSMENT AND EVALUATION Flashcards

(76 cards)

1
Q

defined as a process of appraising something or someone.

A

Assessment

2
Q

made to identify the level of performance of an individual.

A

Assessment

3
Q

Assessment:________________ Evaluation:______________

A

Quantitative; Qualitative

4
Q

Making judgment about the value or worth of objects or events based on some measurement to arrive at a decision.

A

Evaluation

5
Q

Performed to determine the degree to which goals are attained.

A

Evaluation

6
Q

a process of collecting, reviewing, and using data for the purpose of improving current performance.

A

Assessment

7
Q

described as an act of passing judgment on the basis of a set of standards

A

Evaluation

8
Q

Assessment by nature is…

A

Diagnostic

9
Q

Evaluation by nature is…

A

Judgmental

10
Q

Provides feedback on performance and areas of improvement

A

Assessment

11
Q

Determines the extent to which objectives are achieved

A

Evaluation

12
Q

Purpose of Assessment is…

A

Formative

13
Q

Purpose of Evaluation is…

A

Summative

14
Q

Orientation of Assessment…

A

Process Oriented

15
Q

Orientation of Evaluation…

A

Product Oriented

16
Q

a chart that spells out the content and level of knowledge to be tested; it can be general or specific, based on the teacher's preference.

A

Examination blueprint
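
A blueprint can be pictured as a small table of content areas by cognitive levels. A minimal sketch in Python; the area names, levels, and item counts are illustrative placeholders, not from the source.

# Item counts per (content area, cognitive level) cell of the blueprint
blueprint = {
    "Content area A": (4, 3, 1),
    "Content area B": (3, 4, 2),
    "Content area C": (2, 2, 1),
}
levels = ("Recall", "Application", "Analysis")

print(f"{'Content area':<16}" + "".join(f"{lvl:>12}" for lvl in levels) + f"{'Total':>7}")
for area, counts in blueprint.items():
    print(f"{area:<16}" + "".join(f"{c:>12}" for c in counts) + f"{sum(counts):>7}")
print(f"Total items: {sum(sum(c) for c in blueprint.values())}")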

17
Q

Factors that determine the number of items in the examination:

A

a. amount of material taught
b. type of test question used
c. amount of time available for testing

18
Q

learner is compared with a reference group of learners

A

Relative terms (norm reference)

19
Q

learner is compared to well-defined performance criteria

A

Absolute terms (criterion reference)
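
A short illustration of how the two reference systems interpret the same raw score; the reference-group scores and the 75-point mastery cutoff are made up.

scores = [62, 70, 74, 78, 81, 85, 88, 90, 93, 96]  # reference group
learner = 78

# Relative (norm-referenced): standing within the reference group
percentile = 100 * sum(s < learner for s in scores) / len(scores)
print(f"percentile rank: {percentile:.0f}")

# Absolute (criterion-referenced): comparison against a fixed criterion
CUTOFF = 75  # assumed mastery criterion
print("meets criterion" if learner >= CUTOFF else "does not meet criterion")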

20
Q

CRITERIA FOR SELECTION OF EVALUATIVE DEVICES/TOOLS:

A

Validity
Reliability
Objectivity
Relevance
Practicality

21
Q

the degree of accuracy with which a test measures what it intends to measure.

A

Validity

22
Q

the adequacy with which the test items measure the content areas.

A

Content validity

23
Q

the extent to which a relationship exists between the test scores and later success.

A

Predictive validity

24
Q

the extent to which a relationship exists between test scores and an accepted contemporary criterion of performance on the variable the test is supposed to assess.

A

Concurrent validity

25
Q

the consistency with which the test measures what it intends to measure.

A

Reliability

26
Q

Dimensions of reliability:

A

a. Examinee/students
b. Examiner/scorer
c. Test content
d. Time
e. Situation

27
Q

a test is administered twice; the correlation between the scores is an estimate of temporal reliability.

A

Test-retest method
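
A minimal sketch of the calculation in Python (statistics.correlation needs Python 3.10+); the two score lists are hypothetical.

from statistics import correlation  # Pearson's r (Python 3.10+)

# Hypothetical scores from two administrations of the same test
first_administration = [72, 85, 90, 64, 78, 88, 70, 95]
second_administration = [75, 82, 91, 60, 80, 85, 73, 94]

# The correlation between the two score sets estimates temporal reliability
r = correlation(first_administration, second_administration)
print(f"Test-retest reliability estimate: r = {r:.2f}")
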
28
Q

two alternate or equivalent forms of a test are constructed and administered at a 1-3 week interval.

A

Alternate or equivalent forms method

29
Q

odd-numbered and even-numbered items are scored as separate tests.

A

Split-half method
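
A minimal sketch of the split-half calculation, with the usual Spearman-Brown correction applied to the half-test correlation; the item-level scores are hypothetical (Python 3.10+).

from statistics import correlation  # Pearson's r (Python 3.10+)

# Hypothetical item scores (1 = correct, 0 = incorrect); rows are examinees
items = [
    [1, 0, 1, 1, 0, 1, 1, 0],
    [1, 1, 1, 1, 1, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 0, 0],
    [1, 1, 1, 0, 1, 1, 1, 1],
    [0, 1, 0, 1, 0, 0, 1, 0],
]

# Score odd-numbered (1st, 3rd, ...) and even-numbered items as separate half-tests
odd_half = [sum(row[0::2]) for row in items]
even_half = [sum(row[1::2]) for row in items]

r_half = correlation(odd_half, even_half)
# Spearman-Brown correction projects the half-test r to full-test length
r_full = 2 * r_half / (1 + r_half)
print(f"half-test r = {r_half:.2f}, corrected full-test estimate = {r_full:.2f}")
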
30
Q

the degree of agreement between the judgment made by independent and competent observers as to whether or not a learner's test performance meets the criteria stated in a learning objective.

A

Objectivity

31
Q

the degree to which the criteria established for selection of questions conform with the aims of the measuring instrument.

A

Relevance

32
Q

* the convenience in using the test instrument;
* refers to the development of an evaluative device capable of being administered and scored with reasonable ease within the limits of time and of the resources imposed by circumstances.

A

Practicality

33
Q

Consider practicality as to:

A

a. Test construction
b. Administration
c. Scoring
d. No. of examinees

34
Q

KNOWLEDGE - Formal Evaluation Instruments:

A

Objective Examination - Selection Type, Supply Type, Interpretative Exercises
Subjective Examination - Free response, restricted response, project assignment

35
Q

KNOWLEDGE - Non-Formal Evaluation Instruments:

A

Practical Exam
Oral Exam
Observational reports

36
Q

ATTITUDES - Direct Methods:

A

Questionnaires
Semantic Differential
Attitude Goals
Observation Rating Scale

37
Q

ATTITUDES - Indirect Methods:

A

Test of judgment
Test of memory and perception
Information test

38
Q

contains the stem, the key, and the distractors

A

Multiple choice

39
Q

contains the premises/hypotheses and the responses/alternatives, which contain the jokers.

A

Matching type

40
Q

responses which do not match any premise

A

Jokers

41
Q

there are only two possible answers, that is, true or false, correct or incorrect, right or wrong.

A

Alternate response

42
Q

examinee is presented with a direct question or an incomplete statement

A

Supply Type

43
Q

Examples of Supply Type:

A

* Fill in the blanks
* Definition of terms

44
Q

Interpretation of graphs, tables, pictures; situational analysis

A

Interpretative Exercises

45
Q

the learners perform the skills to be evaluated; simulated or actual

A

Practical Exam

46
Q

assigning a learner a task or project to complete and then evaluating on the basis of the product of the completed performance.

A

Project assignment

47
Q

usually a six-step bipolar adjective scale indicating direction and intensity

A

Semantic Differential

48
Q

Likert Scale

A

Attitude Goals

49
Q

observation at work of cognitive and affective behavior, using a scale of observable behavior.

A

Observational Rating Scale

50
Q

Free answer testing

A

Test of judgment

51
Q

based on the assumption that what is perceived and remembered is influenced by one's attitude.

A

Test of memory and perception

52
Q

based on the assumption that in cases of uncertainty, people tend to guess in the direction of their attitude.

A

Information test

53
Q

the assignment of numbers to objects or events according to logically accepted rules

A

Measurement

54
Q

Five Basic Components of Evaluation:

A

Audience
Purpose
Questions
Scope
Resources

55
Q

the persons or groups for whom the evaluation is being conducted

A

Audience

56
Q

to decide whether to continue a particular education program or to determine the effectiveness of the teaching process

A

Purpose

57
Q

directly related to the purpose, are specific, and measurable

A

Questions

58
Q

determined in part by the purpose for conducting the evaluation and in part by available resources

A

Scope

59
Q

include time, expertise, personnel, materials, equipment, and facilities

A

Resources

60
Q

Evaluation Models: to make adjustments in an educational activity as soon as they are needed, whether those adjustments be in personnel, materials, facilities, learning objectives, or even the health professional educator's attitude.

A

Process (Formative) evaluation

61
Q

Evaluation Models: to determine whether learners have acquired the knowledge or skills taught during the learning experience

A

Content evaluation

62
Q

Evaluation Models: to determine the effects or outcomes of teaching efforts; its intent is to summarize what happened as a result of education

A

Outcome (Summative) evaluation

63
Q

Evaluation Models: to determine the relative effects of education on the institution and the community; the purpose is to obtain information that will help decide whether continuing an educational activity is worth its cost.

A

Impact evaluation

64
Q

Evaluation Models: designed and conducted to assist an audience to judge and improve the worth of some object/educational program

A

Program evaluation

65
Q

Components of Evaluation Design:

A

Evaluation Structure
Evaluation Methods
Evaluation Instruments

66
Q

all evaluations should be systematic and carefully and thoroughly planned before they are conducted

A

Evaluation Structure

67
Q

include those actions that are undertaken to carry out the evaluation according to the design structure; all evaluation methods deal with data and data collection

A

Evaluation Methods

68
Q

use existing instruments where possible, because instrument development requires expertise, time, and expenditure of resources; a new instrument requires rigorous testing for reliability and validity.

A

Evaluation Instruments

69
Q

Three methods to minimize the effects of unexpected events:

A

1. Conduct a pilot test first
2. Include extra time
3. Keep a sense of humor

70
Q

basis for interpreting the results obtained from a test

A

Reference system

71
Q

may measure the acquisition of skills and knowledge from multiple sources such as notes, texts, and syllabi.

A

Norm-reference system/relative standard

72
Q

measure performance on specific concepts and are often used in a pre-test/post-test format

A

Criterion-reference system/absolute standard

73
Q

refers to the process of adjusting student grades in order to ensure that a test or assignment has the proper distribution throughout the class

A

Grading on a curve (SD)
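
One common way to grade on a curve rescales raw scores onto a target mean and standard deviation; a minimal sketch, where the class scores and the mean-75/SD-10 policy are illustrative assumptions.

from statistics import mean, stdev

raw = [55, 61, 64, 70, 72, 75, 80, 88]  # hypothetical class scores
TARGET_MEAN, TARGET_SD = 75, 10         # assumed grading policy

m, sd = mean(raw), stdev(raw)
# Map each score's z-score onto the target distribution
curved = [TARGET_MEAN + TARGET_SD * (score - m) / sd for score in raw]
for before, after in zip(raw, curved):
    print(f"raw {before:>3} -> curved {after:5.1f}")
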
74
Q

the scores are added up and divided by the number of scores. The mean is sensitive to extreme scores when population samples are small.

A

Combining scores to obtain a mean score
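
A two-line illustration of that sensitivity; the scores are made up.

from statistics import mean

scores = [78, 80, 82, 84]      # small sample: mean = 81.0
print(mean(scores))
print(mean(scores + [20]))     # one extreme score drags the mean down to 68.8
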
75
Q

calculated based on the number of questions correct out of the MPL required, not the total number of questions

A

Calculating minimum pass level (MPL)
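
A minimal sketch assuming an Angoff-style procedure, in which each item carries an estimated probability that a borderline examinee answers it correctly and the test MPL is the sum of those estimates; all numbers are illustrative.

# Estimated probability that a borderline examinee gets each item right
item_estimates = [0.6, 0.8, 0.5, 0.7, 0.9, 0.4, 0.6, 0.5]

mpl = sum(item_estimates)  # 5.0 -> minimum number of correct answers to pass
print(f"MPL = {mpl:.1f} out of {len(item_estimates)} items")

student_correct = 6
# The pass decision compares correct answers against the MPL, not the item total
print("Pass" if student_correct >= mpl else "Fail")
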
76
Q

an important tool to increase test effectiveness. It provides statistics on overall performance, test quality, and individual questions. Each item's contribution is analyzed and assessed. To write effective items, it is necessary to examine whether they are measuring the fact, idea, or concept for which they were intended.

A

Item analysis
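
A short sketch of classical item analysis, computing a difficulty index (p) and a simple upper/lower-group discrimination index (D) per item; the response matrix is hypothetical.

# Rows are examinees sorted from highest to lowest total score;
# columns are items (1 = correct, 0 = incorrect)
responses = [
    [1, 1, 1, 1],  # highest scorer
    [1, 1, 1, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 1, 0],  # lowest scorer
]

n = len(responses)
group = n // 2
upper, lower = responses[:group], responses[-group:]

for i in range(len(responses[0])):
    p = sum(row[i] for row in responses) / n  # difficulty: proportion correct
    d = (sum(r[i] for r in upper) - sum(r[i] for r in lower)) / group  # discrimination
    print(f"item {i + 1}: p = {p:.2f}, D = {d:.2f}")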