Test 2 Chapter 5 Flashcards

1
Q

Measuring concepts

A

Reliability and validity assessments are often carried out on self-report measures such as:
•NEO-PI: measures the five-factor model of personality
•MMPI-2: used to make clinical diagnoses
•Vocational Interest Inventory: assesses career choices
It is best to use existing measures because they have already been evaluated for reliability and validity

2
Q

Reliability

A

The consistency/stability of a measure of behaviour
•important because we often cannot measure an operational definition more than once

Any measure you take can be thought of as comprising:

  1. True score: the person’s real score on the variable
  2. Measurement error: the unreliable part of the measure (a reliable measure contains little measurement error)
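The true-score model above can be sketched as a toy simulation (the scores, error SDs, and function name below are illustrative, not from the chapter):

```python
import random

random.seed(0)  # reproducible draws

def observed_score(true_score, error_sd):
    """Observed score = true score + random measurement error."""
    return true_score + random.gauss(0, error_sd)

# Hypothetical: the same person (true score 100) measured five times
# with a reliable measure (little error) and an unreliable one (lots of error).
reliable = [observed_score(100, 1) for _ in range(5)]
unreliable = [observed_score(100, 15) for _ in range(5)]
```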

Best way to assess reliability:
•correlation coefficient: a number that tells us how strongly two variables are related to each other
*Pearson product-moment correlation coefficient: the most common correlation measure (ranges from −1.00 to +1.00); used this way it yields the reliability coefficient (r)
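As a sketch of how the reliability coefficient is computed, here is the Pearson product-moment correlation written out by hand (the test-retest scores are invented for illustration):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical test-retest data: the same five people measured twice.
time1 = [10, 12, 15, 18, 20]
time2 = [11, 13, 14, 19, 21]

r = pearson_r(time1, time2)  # reliability coefficient, between -1.0 and +1.0
```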

3
Q

Types of reliability

A

1) Test-retest
•how consistent the measure is across time (should be approximately .80)
•take the test at two different times
•alternate forms: administering two different forms of the same test at two different times
2) Internal consistency
•how well a certain set of items measure the same intended concept
•Cronbach’s alpha: the researcher calculates the correlation of each item on the test with every other item (producing inter-item correlations)
*items that don’t correlate with the rest of the measure can be removed
3) Inter-rater
•extent to which raters agree in their observations
•assessed with Cohen’s kappa
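The two statistics above can be sketched directly from their textbook formulas — Cronbach’s alpha from item and total-score variances, Cohen’s kappa as chance-corrected agreement (all data below are invented for illustration):

```python
def variance(xs):
    """Population variance of a list of scores."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(items):
    """items: one list of scores per questionnaire item (same respondents)."""
    k = len(items)
    n = len(items[0])
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(variance(it) for it in items) / variance(totals))

def cohen_kappa(rater1, rater2):
    """Agreement between two raters' category labels, corrected for chance."""
    n = len(rater1)
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    cats = set(rater1) | set(rater2)
    p_exp = sum((rater1.count(c) / n) * (rater2.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# Invented example: three items answered by four respondents,
# and two raters classifying the same four observations.
alpha = cronbach_alpha([[2, 4, 3, 5], [1, 4, 3, 5], [2, 5, 3, 4]])
kappa = cohen_kappa(["yes", "no", "yes", "yes"], ["yes", "no", "no", "yes"])
```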

4
Q

Validity

A
Construct validity 
•the content of the operational definition/measure captures all the necessary aspects of the construct and nothing more
•is the measure actually measuring the intended construct
•ex: is PSYC 217 measuring my ability in the class rather than my extraversion level

Internal validity
•can we infer causality from this study

External validity
•can we generalize results beyond this group/setting

Assessing validity
•researcher should use as many of the validity types as possible
•valid in one population does not mean valid in all populations (ex: the construct of psychopathy can mean one thing to criminals and another to university students)

5
Q

Types of validity

A

1) Face validity
•the measure appears on its face to assess the intended variable
•not required (face-valid items can produce demand characteristics) and not sufficient on its own to claim construct validity
*alternative: administer a large pool of items and see empirically which ones work
2) Content validity
•comparing the content of the measure with the theoretical definition of the construct
•is it capturing all the important parts of the construct
•driven by the theoretical definition of the construct
3) Predictive validity
•the measure predicts future behaviour
•ex: higher GPA predicts better job outcomes later on
4) Concurrent validity
•scores relate to a criterion behaviour measured at the same time; the measure can distinguish between theoretically relevant behaviours
•ex: someone who scores high on entitlement takes more candy when offered
5) Convergent validity
•extent to which scores on the target measure are related to scores on measures of theoretically similar constructs
•ex: entitlement scores are positively correlated with narcissism scores
6) Discriminant validity
•when a measure is not related to variables that it shouldn’t be related to

6
Q

Reactivity of measures

A

Tells us what a person is like when they know they are being observed, but not how they would behave under natural circumstances
•can occur behaviourally and physiologically
•ways to combat the problem: allowing people to get used to the observer/equipment
•a variable’s levels can be conceptualized on four kinds of measurement scales:
nominal, ordinal, interval, and ratio
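The four scales can be summarized as a small lookup of which comparisons each level of measurement supports (the examples are my own, not from the chapter):

```python
# Which comparisons each measurement scale supports (illustrative summary).
scales = {
    "nominal": {"example": "favourite colour", "supports": {"=", "!="}},
    "ordinal": {"example": "finishing place in a race", "supports": {"=", "!=", "<", ">"}},
    "interval": {"example": "temperature in Celsius", "supports": {"=", "!=", "<", ">", "+", "-"}},
    "ratio": {"example": "reaction time in ms", "supports": {"=", "!=", "<", ">", "+", "-", "*", "/"}},
}

# Each scale supports everything the previous one does, plus more.
assert (scales["nominal"]["supports"] <= scales["ordinal"]["supports"]
        <= scales["interval"]["supports"] <= scales["ratio"]["supports"])
```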

7
Q

Overview of the scale development and validation process

A

1) Develop potential questionnaire items
2) Assess reliability: must be reliable before proceeding
3) Assess relationship to other constructs
•convergent and discriminant validity
4) Assess relationship to behaviours
•concurrent (measure now), and predictive (measure later) validity
5) Write a psychometric validation paper and submit it for peer review
