PSYC4022 Testing and Assessment Week One Gathering Information Flashcards

(96 cards)

1
Q

Trait

A

Any distinguishable, relatively enduring way in which one individual varies from another

2
Q

States

A

Any distinguishable way in which one individual varies from another, but relatively less enduring than a trait.

3
Q

Sensation Seeking

A

the need for varied, novel, and complex sensations and experiences and the willingness to take physical and social risks for the sake of such experiences.

4
Q

Cumulative Scoring

A

Inherent in cumulative scoring is the assumption that the more the testtaker responds in a particular direction as keyed by the test manual as correct or consistent with a particular trait, the higher that testtaker is presumed to be on the targeted ability or trait.
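As an illustrative sketch only (the items, keys and responses below are invented, not from the course materials), the cumulative scoring assumption can be expressed in a few lines of Python: each response that matches the keyed direction adds a point, and a higher total is taken to indicate more of the targeted trait or ability.

# Minimal sketch of cumulative scoring with invented items and keys.
# Each response matching the keyed direction adds one point; a higher
# total is presumed to indicate more of the targeted trait or ability.
keyed = {"item1": True, "item2": False, "item3": True, "item4": True}
responses = {"item1": True, "item2": False, "item3": False, "item4": True}
score = sum(responses[item] == key for item, key in keyed.items())
print(f"Cumulative score: {score} / {len(keyed)}")  # Cumulative score: 3 / 4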

5
Q

Error

A

Long-standing assumption that factors other than what a test attempts to measure will influence performance on a test.

6
Q

Error Variance

A

The component of a test score attributable to sources other than the trait or ability measured.

7
Q

Classical Test Theory (CTT) or true score theory

A

The assumption is made that each testtaker has a true score on a test that would be obtained but for the action of measurement error.
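A minimal numerical sketch of the classical test theory model (observed score = true score + error), using invented numbers: each simulated administration adds random error to a fixed true score, and the errors average out over many administrations.

# CTT sketch: X = T + E, with an invented true score and random error.
# Across many simulated administrations the mean observed score
# approaches the true score because random errors cancel out.
import random

true_score = 50
observed = [true_score + random.gauss(0, 4) for _ in range(1000)]  # error SD = 4
print(round(sum(observed) / len(observed), 1))  # close to the true score of 50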

8
Q

Psychometric Soundness

A

Reliability and Validity.

9
Q

Reliability

A

The consistency of the measuring tool.

10
Q

Validity

A

A test is considered valid for a particular purpose if it does, in fact, measure what it purports to measure.

11
Q

Norms

A

Also referred to as normative data, norms provide a standard with which the results of measurement can be compared.

12
Q

Norm

A

behaviour that is usual, average, normal, standard, expected or typical.

13
Q

Normative Sample

A

The group of people whose performance on a particular test is analyzed for reference in evaluating the performance of individual testtakers.

14
Q

to norm or norming

A

The process of deriving norms.

15
Q

Standardisation or Test Standardisation

A

The process of administering a test to a representative sample of testtakers for the purpose of establishing norms.

16
Q

Sampling

A

The process of selecting the portion of the universe deemed to be representative of the whole population.

17
Q

Sample

A

A portion of the universe of people deemed to be representative of the whole population.

18
Q

Stratified Sampling

A

Sampling that includes every relevant subgroup (stratum) of the target population, e.g. all religions, races, etc. in the Manhattan area.

19
Q

Stratified Random Sampling

A

Stratified sampling in which every member of the population has an equal chance of being included in the sample.

20
Q

Standard Error of measurement

A

A statistic used to estimate the extent to which an observed score deviates from a true score.

21
Q

Standard Error of Estimate

A

In regression, an estimate of the degree of error involved in predicting the value of one variable from another

22
Q

Standard Error of the mean

A

A measure of Sampling Error

23
Q

Standard Error of Difference

A

A statistic used to estimate how large a difference between two scores should be before the difference is considered statistically significant.

24
Q

Purposive Sampling

A

A sample arbitrarily selected because it is believed to be representative of the population.

25
Incidental Sampling/ Convenience Sampling
A sample based on the greatest level of convenience.
26
Percentile
The percentage of people who fall below a particular raw score.
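A small Python sketch of the percentile-rank idea, using an invented score distribution: count how many scores fall below the raw score of interest and express that count as a percentage of the sample.

# Percentile rank: percentage of testtakers whose raw score falls below
# a given raw score (the distribution of scores is invented).
def percentile_rank(raw_score, all_scores):
    below = sum(score < raw_score for score in all_scores)
    return 100 * below / len(all_scores)

scores = [10, 12, 15, 15, 18, 20, 22, 25, 28, 30]
print(percentile_rank(22, scores))  # 60.0 -> 60% of the sample scored below 22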
27
Percentage Correct
The distribution of raw scores - the number of items answered correctly multiplied by 100 and divided by the total number of items.
28
Age Norms/ Age-Equivalent Scores
Indicate the average performance of different samples of testtakers who were at various ages at the time the test was administered
29
Grade Norms
Are developed by administering the test to representative samples of children over a range of consecutive grade levels.
30
Developmental Norms
Norms, such as age or grade norms, based on any characteristic presumed to develop, deteriorate or otherwise be affected by chronological age, school grade or stage of life.
31
National Norms
are derived from a normative sample that was nationally representative of the population at the time the norming study was conducted.
32
National Anchor Norms
Norms that allow the scores on one test to be anchored against (equated with) the scores on another test.
33
Equipercentile Method
the equivalency of scores on different tests is calculated with reference to corresponding percentile scores.
34
Subgroup Norms
Segmentation of the normative sample.
35
Local Norms
Provide normative information with respect to the local population's performance on some test.
36
Fixed Reference Group Scoring System
The distribution of scores from one group of testtakers (the fixed reference group) is used as the basis for the calculation of test scores on future administrations of the test.
37
Norm Referenced
When you compare scores on a test to other scores on that test.
38
Criterion Referenced Evaluation
When you compare scores to some other criterion.
39
Psychological Testing
involves measuring psychological variables by means of devices or procedures designed to obtain a sample of behaviour.
40
Psychological Measurement
is the integration of psychological data gathered via psychological tests, interviews, case studies and behavioural observation, for the purpose of making a psychological evaluation.
41
Testing
The administration of a psychological test; an integral part of assessment.
42
Assessment
Involves much more than testing.
43
Distal
Furthest From
44
Proximal
Closest to
45
Reliability Coefficient
The proportion that indicates the ratio between the true score variance on a test and the total variance.
46
Variance
A useful statistic for describing test score variability; equal to the standard deviation squared.
47
True Variance
Variance attributable to true differences; the reliable component of test score variance.
48
Error Variance
Variance from irrelevant, random sources.
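To connect the variance cards above, a minimal numerical sketch with invented values: total test-score variance is the sum of true variance and error variance, and the reliability coefficient is the proportion of total variance that is true variance.

# Invented variance components illustrating the relationship between
# true variance, error variance, total variance and reliability.
true_variance = 80.0
error_variance = 20.0
total_variance = true_variance + error_variance  # 100.0
reliability = true_variance / total_variance
print(reliability)  # 0.8 -> 80% of score variance reflects true differences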
49
Random Error
is a source of error in measuring a targeted variable caused by unpredictable fluctuations and inconsistencies of other variables in the measurement process. Sometimes referred to as "noise"
50
Systematic Error
Error that is typically constant or proportionate to what is presumed to be the true value being measured.
51
Test-Retest Reliability
Test-Retest Reliability is an estimate of reliability obtained by correlating pairs of scores from the same people on two different administrations of the same test.
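As a sketch of how a test-retest estimate is obtained (the score pairs below are invented), the same people's scores from two administrations are correlated; Python 3.10+ provides a Pearson correlation in the standard library.

# Test-retest reliability sketch: correlate the same testtakers' scores
# from two administrations of the same test (scores are invented).
from statistics import correlation  # requires Python 3.10+

time1 = [10, 12, 15, 20, 22, 25]
time2 = [11, 13, 14, 21, 20, 26]
print(round(correlation(time1, time2), 2))  # a high r suggests stable scores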
52
Coefficient of Stability
When the interval between testings is > 6 months in a test-retest design, the estimate of test-retest reliability is often referred to as the coefficient of stability.
53
Coefficient of Equivalence
The degree of the relationship between various forms of a test can be evaluated by means of an alternate-forms or parallel-forms coefficient of reliability, which is often termed the coefficient of equivalence.
54
Split-Half Reliability
An estimate of split-half reliability is obtained by correlating two pairs of scores obtained from equivalent halves of a single test administered once.
55
Spearman Brown Formula
Allows a test developer or user to estimate internal consistency reliability from a correlation of two halves of a test.
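A short sketch of the Spearman-Brown correction (the half-test correlation used here is an invented example): the general form is r_sb = n*r / (1 + (n - 1)*r), with n = 2 when stepping up from a split-half correlation to an estimate for the whole test.

# Spearman-Brown correction: estimate whole-test reliability from the
# correlation between two half-tests (r_half is an invented value).
def spearman_brown(r, n=2):
    # General form: n*r / (1 + (n - 1)*r); n = 2 for a split-half estimate.
    return n * r / (1 + (n - 1) * r)

r_half = 0.70
print(round(spearman_brown(r_half), 2))  # 0.82 -> estimated whole-test reliability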
56
SEM Formula
SEM = SD × √(1 − r), where r is the reliability coefficient of the test.
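A worked example of the SEM formula with invented numbers (an IQ-style scale with SD = 15 and a reliability coefficient of .91):

# Standard error of measurement: SEM = SD * sqrt(1 - r),
# using an invented SD and reliability coefficient.
import math

sd = 15
reliability = 0.91
sem = sd * math.sqrt(1 - reliability)
print(round(sem, 1))  # 4.5 -> observed scores typically fall within ~4.5 points of the true score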
57
Z-T Score Conversion
z = (X − M) / SD, then T = (z × 10) + 50; more generally, converted score = z × (new SD) + (new mean).
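A worked example of the conversion with invented values: the raw score is first expressed as a z-score, then rescaled to a T-score (mean 50, SD 10).

# Raw score -> z-score -> T-score (T-scores have mean 50 and SD 10).
# The raw score, mean and SD below are invented example values.
raw, mean, sd = 60, 50, 8
z = (raw - mean) / sd   # z = 1.25
t = z * 10 + 50         # T = 62.5
print(z, t)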
58
SD Formula
SD = √( Σ(X − M)² / n )
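A worked example of the standard deviation with invented scores, using the population form (dividing the sum of squared deviations by n):

# Standard deviation: square root of the mean squared deviation from the
# mean (population form, dividing by n). Scores are invented.
import math

scores = [4, 8, 6, 5, 7]
mean = sum(scores) / len(scores)                               # 6.0
variance = sum((x - mean) ** 2 for x in scores) / len(scores)  # 2.0
sd = math.sqrt(variance)
print(round(sd, 2))  # 1.41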
59
What is the Objective of Testing?
To obtain some gauge, usually numerical in nature, with regard to an ability or attribute.
60
What is the Objective of Assessment?
Typically, to answer a referral question, solve a problem, or arrive at a decision through the use of tools of evaluation.
61
What is the process of Testing?
Testing may be individual or group in nature. After test administration, the tester will typically add up the number of correct answers or the number of certain types of responses… with little if any regard for the how or the mechanics of such content.
62
What is the Process of Assessment?
Assessment is typically individualised. In contrast to testing, assessment more typically focuses on how an individual processes rather than simply the result of that processing.
63
What is the Role of Evaluator in Testing?
The tester is not key to the process; practically speaking, one tester may be substituted for another tester without appreciably affecting the evaluation.
64
What is the Role of the Evaluator in Assessment?
The assessor is key to the process of selecting tests and/or other tools of evaluation as well as in drawing conclusions from the entire evaluation.
65
What is the Skill of the Evaluator in Testing?
Testing typically requires technician-like skills in terms of administering and scoring a test as well as interpreting a test result.
66
What is the Skill of the Evaluator in Assessment?
Assessment typically requires an educated selection of tools of evaluation, skill in evaluation, and thoughtful organisation and integration of data.
67
What is the Outcome of Testing?
Typically, testing yields a test score or series of test scores
68
What is the Outcome of Assessment?
Typically, assessment entails a logical problem-solving approach that brings to bear many sources of data designed to shed light on a referral question.
69
What are the 5 Micro-Skills of Interviewing?
Interview Micro-Skills 1. Squarely face the client. 2. Open Posture 3. Lean Toward the Client 4. Eye Contact 5. Relax
70
What the things to avoid in an Interview?
Interview Micro-Skills - Things to Avoid 1. Non-Listening 2. Partial Listening 3. Tape-recorder Listening 4. Rehearsing 5. Interruptions 6. Question threat
71
Name 8 Tools of Assessment
Tools of Assessment 1. Tests 2. Portfolio Assessment 3. Performance-based assessment 4. the case history 5. behavioural observation 6. role-play tests 7. computerised assessment 8. assessment using simulations or video
72
Name 16 sources of information for an Assessment
1. Referral 2. Consent/Limitations 3. Procedures/Documents 4. Mental Status Examination 5. Psychosocial History 6. Mental Health History 7. History of Present Problem 8. Past Intervention/Responses 9. Response Style/Psychometric Testing 10. Psychological Formulation 11. Diagnosis 12. Client Goals/Proposals 13. Risk 14. Recommendations/Intervention Plans 15. Report & Technical Addendum 16. Informing Interview
73
Give a brief History of the Clinical Interview
1. Snyder (1945) - non-directive approach encouraged self-exploration 2. Strupp (1958) - importance of interviewer experience 3. Rogers (1961) - therapeutic alliance and client-centred approaches 4. 1960s fracturing of approaches 5. 1980s greater granularity in disorders paved the way for very specific diagnostic criteria 6. 1980s hybrid of structured and non-structured approaches 7. 1990s managed health care's impact on practice 8. 1990s computer-assisted interviewing 9. 1994 - single session therapy 10. 1990s repressed memories 11. 2000s cultural awareness
74
Name 4 Biases for Assessment
1. Halo 2. Confirmatory 3. Physical Attractiveness (Gilmore et al., 1986) 4. Interviewee distortions
75
Mehrabian (1972) broke down information received into verbal and non-verbal information. What % of information is gathered through facial expressions, tone and content of what is being said?
1. 55% facial expression. 2. 38% tone 3. 7% content of what is being said
76
What are the phases of clinical assessment? (According to Maloney & Ward (1976)
1. Phase 1 – Initial Data Collection 2. Phase 2 – Development of Inferences 3. Phase 3 – Reject, Modify or Accept Inferences 4. Phase 4 – Develop and Integrate Hypotheses 5. Phase 5 – Dynamic Model of the Person 6. Phase 6 – Situational Variables 7. Phase 7 – Prediction of Behaviour
77
What are 3 sources of error variance?
Sources of Error Variance: 1. Assessees are sources of error variance. 2. Assessors are also sources of error variance. 3. Measuring instruments are sources of error variance.
78
There are 11 cues from which you can take information for an assessment. What are they?
1. Personal Information Cues 2. Medical Cues 3. Immediacy Cues 4. Speech Cues 5. Language Cues 6. Physical Cues 7. Cognitive Cues 8. Risk Assessment Cues 9. Collateral Information Cues 10. Overt Behavioural Cues 11. Personal History Cues
79
Give some examples of a Personal Information Cue
Gender, Occupation, race, religious affiliations, socioeconomic status, appearance, lifestyle factors.
80
Give some examples of a Medical Cue
Medication prescribed, compliance, blood serology, previous diagnosis, current diagnosis, family history of diagnosis.
81
Give some examples of an Immediacy Cue
Engagement, affect, communication style, facial expressions, emotional expression, personality traits/ temperament, transferences.
82
Give an example of a Speech Cue
Tone, flow, perseverative, slurred, volume, pitch, pace.
83
Give an example of a Language Cue
Descriptors, words used, developmentally appropriate, use of humour
84
Give an example of a Physical Cue
Breathing, eye contact, voice, body movements.
85
Give an example of a Cognitive Cue
Attention, memory, intelligence, intellectual disability, judgement, decision making, perceptions.
86
Give an example of a Risk Assessment Cue
Risk of harm to self or others; intent, means and plan.
87
Give an example of a Collateral Information Cue
Congruency between verbal and non-verbal, consistency between collateral, psychometrics and narrative. Referral Source and Question.
88
Give an example of an Overt Behavioural Cue
Behaviour in Waiting Room, occupation of space in therapeutic environment, feedback from client.
89
Give an example of a Personal History Cue
Psychosocial History, relationship status, conflicts, support networks.
90
What are 10 things you want from an interview?
1. Standardisation (see Groth-Marnat, 2009) 2. Intake Interview 3. Planning 4. Rapport-Building 5. Open-ended questions (TED) 6. Active Listening/Attending behaviours (beware negative attending) 7. Open mind-set - avoid biases like? 8. Accurate (and discreet) note taking. 9. How would you approach note taking with a client? "I'm going to jot down a few notes to make sure I'm remembering everything correctly. Is that alright with you?" (Shea, 1998) 10. Empathy -> Active listening, paraphrasing.
91
What are the 5 P's of a Clinical Interview?
1. Presenting 2. Precipitating 3. Perpetuating 4. Pre-morbid 5. Protective
92
Describe a Presenting Issue
Exactly what thoughts, behaviours and feelings are associated with the client's concerns.
93
Describe a Precipitating Issue
(Distal) "If these experiences started 6 months ago, tell me about the month or so before that." (Proximal) "So these experiences come on in waves; talk me through the last time it happened, starting 10 minutes before you noticed the feelings."
94
Describe a Perpetuating Issue
What makes things worse for you? What things cause these feelings to continue (for example, panic attacks)?
95
Describe a Pre-morbid Issue
Previous physical and mental health status, as well as risk factors (e.g. homelessness, history of abuse). You wouldn't just plough in with "Have you ever been abused?"; that would need to come a long way down the track, once true trust and rapport have developed.
96
Describe a Protective Issue
What helps this person continue to function (e.g. they are working, have a close family, have a good education).