Exam 1: Ch. 1-5 Flashcards

(92 cards)

0
Q

What makes psychology a science?

A

Mental processes and behavior are intertwined and can be studied empirically

(The invention of computers helped make cognition scientifically tractable)

1
Q

What defines science?

A

Knowledge in the form of testable predictions and explanations

2
Q

What differentiates science from pseudoscience? Ex?

A

Pseudoscience lacks reliance on empiricism and skepticism

Ex: phrenology

3
Q

Empiricism

A

Claims based on evidence/data

4
Q

Skepticism

A

Not accepting a claim without evidence

5
Q

Confirmation bias

A

Selectively accepting evidence that confirms an existing belief while dismissing evidence that contradicts it

6
Q

What are the 4 goals of the scientific method?

A
  1. Description
  2. Prediction
  3. Explanation
  4. Application
7
Q

SM: Description

A

Describes the events and relationships between variables

8
Q

SM: Prediction

A

Predict when, and under what conditions, an event will occur

9
Q

SM: Explanation

A

Why does it occur?

10
Q

SM: Application

A

Apply knowledge to improve lives

11
Q

Difference between correlation and causation?

A

Correlation shows a relationship between two variables but does not tell you WHY they are related. Causation identifies the cause

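The correlation/causation distinction can be made concrete in code. Below is a minimal sketch with made-up data (the ice-cream/drowning example and all numbers are illustrative, not from the course): Pearson's r, computed by hand with the standard library, shows a perfect relationship that is still not causal.

```python
# Pearson's r computed by hand (made-up data for illustration).
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Ice-cream sales and drowning deaths rise together (hypothetical numbers),
# but neither causes the other -- a third variable (summer heat) drives both.
sales  = [100, 150, 200, 250, 300]
deaths = [1, 2, 3, 4, 5]
print(pearson_r(sales, deaths))  # perfectly correlated, yet not causal
```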
12
Q

Empirical Approach

A

Basing a theory or conclusion on a collection of data (evidence)

13
Q

General Research Process Steps (7)

A
  1. Develop question
  2. Generate hypothesis
  3. Form operational definitions
  4. Choose a design
  5. Evaluate ethical issues
  6. Analyze and interpret data
  7. Report results
14
Q

Why do you need literature review during the hypothesis development process?

A

To learn what is already known about the topic, avoid duplicating prior work, and refine the hypothesis

15
Q

What is a construct? Give an example

A

The abstract concept being studied and tested; it must be clearly defined

Ex: emotion, memory, mood

16
Q

IV

A

The variable that is altered or manipulated by the researcher
Has at least 2 levels
Ex: experimental vs. control

17
Q

What is an operational definition? Example?

A

How the construct will be measured

Ex: by their reading ability

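As a concrete illustration (the word lists and scoring rule here are hypothetical), an operational definition turns an abstract construct like "memory" into something countable:

```python
# Construct: "memory". One possible operational definition:
# the number of studied words a participant correctly recalls.
studied  = {"apple", "chair", "river", "cloud", "spoon"}
recalled = {"apple", "river", "spoon", "table"}  # "table" is an intrusion

# Only words that were actually studied count toward the score.
memory_score = len(studied & recalled)
print(memory_score)  # 3 studied words were correctly recalled
```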
18
Q

What is the difference between basic and applied research?

A

Basic: seeks fundamental knowledge, typically in a lab setting

Applied: seeks to solve real-world problems, typically in real-world settings

19
Q

Selecting a sample: inclusion

A

Inclusion criteria: the characteristics participants must have to be eligible for the study

20
Q

Selecting a sample: exclusion

A

Exclusion criteria: characteristics that disqualify otherwise-eligible people from participating

21
Q

Selecting a sample: power

A

Power: the probability of detecting an effect if one truly exists; larger samples give greater power

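A toy simulation (all numbers made up) illustrates statistical power: when a real effect is present, larger samples detect it more often.

```python
# Toy power simulation: how often does a study detect a real effect?
import random
from math import sqrt

random.seed(1)  # reproducible illustration

def power(n, true_p=0.6, trials=2000):
    # Null hypothesis: fair coin (p = .5). Reject when the observed
    # proportion falls outside an approximate 95% normal band.
    crit = 1.96 * sqrt(0.25 / n)
    hits = 0
    for _ in range(trials):
        heads = sum(random.random() < true_p for _ in range(n))
        if abs(heads / n - 0.5) > crit:
            hits += 1
    return hits / trials

# The coin really is biased (p = .6); the small study usually misses it.
print(power(25), power(200))  # small sample: low power; large sample: high
```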
22
Q

Selecting a sample: representative

A

A representative sample shares the characteristics of the population, so results generalize

23
Q

Reliability
What is it?
What are the three types?

A

How consistent a measure is; if you measure many times, will the results be the same?

  1. Internal consistency
  2. Test-retest
  3. Inter-rater
24
Validity | What is it? 4 types?
Whether it measures what it's supposed to measure
  1. Face
  2. Convergent
  3. Discriminant
  4. Criterion-prediction
25
Difference between reliability and validity?
Reliability has to do with consistency while validity has to do with whether or not it measures what it is supposed to measure
26
What is external (or ecological) validity?
Results applicable to the real world
27
What is quantitative data?
Data that are translated into numbers and analyzed numerically
28
What is qualitative data? Ex?
Subjective, non-numerical data. Ex: a case study (e.g., of a single patient's memory)
29
What are confounds? | How do experiments try to eliminate them? (2 ways)
Other variables that may be causing an effect on the outcome
  1. Manipulate only one factor at a time
  2. Measure the outcome variable
30
Converging evidence
The best method for confirming evidence | Evidence from various sources that leads to the same conclusion
31
Replication
Repeating a study the exact same way to see whether its results hold, further supporting the theory
32
Multi-method approach
Using several different methods or measures to study the same question, so each method's weaknesses are offset by the others' strengths
33
Components of informed consent What does IC ensure? What does it cover?
IC ensures competence, knowledge, and volition. It covers: who you are; what you're doing and why; benefits and risks; what participants will be asked to do and for how long; that participation is voluntary; and that there is no penalty for withdrawal
34
When is informed consent required?
Whenever the research involves more than minimal risk or could cause distress
35
When is informed consent not required?
Research will not cause any distress | Observational research
36
Minimal risk
No more risk than what daily life involves
37
Deception and its concerns
Concerns: How important is the study? Are alternatives available? How harmful is the deception?
38
IRB- what is it; what is its purpose?
Institutional Review Board | A governing board that approves all research involving human participants and protects them
39
IACUC: what is it? What is its purpose?
Institutional Animal Care and Use Committee | Protects the welfare of animals and approves their use in research
40
Observational Designs
Designs that sample behavior so that it represents the population of behaviors of interest
41
OD: Naturalistic | Is this with or without intervention?
Without intervention | Natural setting | Observer is a passive recorder
42
Observation with intervention | Three methods?
Most psych research | 1. Participant observation 2. Structured observation 3. Field experiments
43
Open participant observation
The observer actively participates in the situation, and those being observed know an observer is present (undisguised participation)
44
Structured Observation | Examples?
Observer intervenes to cause or set up an event | Ex: observing mother-child interaction in a lab; Piaget observing children's problem solving
45
Field experiment | What is it? Where is it done? Example?
Researcher manipulates one or more IVs in a natural setting | Outside the lab | Ex: bystander-effect studies
46
Time sampling What is it? 3 types?
Choose time intervals for making observations | Systematic, random, event sampling
47
Event sampling Type of? What is it?
A type of time sampling | Observer records each occurrence of a specified event of interest
48
Situation sampling What is it? What does it increase?
Observe behavior in different locations and conditions | Increases external validity
49
Subject sampling What is it? Ex?
Observe a subset of people | Ex: every 10th member of an audience
50
Coding of observational data
The process of converting observed behavior into quantitative data | Ex: coding children's behaviors into ratings
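A minimal sketch of coding (the behavior categories and numeric codes below are invented for illustration): observed behaviors are mapped to ratings that can then be analyzed numerically.

```python
# Hypothetical coding scheme: higher numbers = more aggressive behavior.
codes = {"hits peer": 3, "grabs toy": 2, "plays alone": 1, "shares": 0}

# A sequence of observed behaviors from one play session (made up).
observed = ["shares", "grabs toy", "hits peer", "shares", "plays alone"]

# Convert the qualitative observations into quantitative data.
ratings = [codes[b] for b in observed]
print(sum(ratings) / len(ratings))  # mean aggression rating for the session
```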
51
Inter-rater reliability
Do different people rate the same behaviors in the same way?
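One common way to quantify inter-rater reliability is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance. A sketch with made-up ratings from two hypothetical observers:

```python
# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance).
from collections import Counter

def cohens_kappa(r1, r2):
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    # Chance agreement: probability both raters pick the same category
    # if each rated independently at their own base rates.
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

rater1 = ["aggressive", "calm", "calm", "aggressive", "calm", "calm"]
rater2 = ["aggressive", "calm", "calm", "calm", "calm", "calm"]
print(round(cohens_kappa(rater1, rater2), 2))
```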
52
Nominal scale | Ex?
Names or mutually exclusive categories; no mathematical meaning | Ex: blood group, eye color
53
Ordinal scale | Ex?
Rank, order, greater/less than Ex: letter grades, rank from best to worst
54
Interval scale | Ex?
Rank order with equidistant values; differences are meaningful but ratios are not (no true zero) | Ex: temperature in °F or °C
55
Ratio scale | Ex?
Rank order, equidistant, meaningful zero Ex: response time, age, weight
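The practical difference between the interval and ratio scales is which arithmetic is meaningful; a tiny illustration (the numbers are arbitrary):

```python
# Interval scale (temperature in °F): differences are meaningful,
# ratios are not -- 0°F does not mean "no heat".
diff = 80 - 40            # a 40-degree difference: meaningful
ratio = 80 / 40           # "twice as hot"? NOT meaningful on this scale

# Ratio scale (weight in kg): a true zero makes ratios meaningful.
print(160 / 80)           # 2.0 -- genuinely "twice the weight"
```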
56
How to control/prevent bias
Recognize its presence | Have uninformed or blind observer
57
Advantages of unobtrusive/nonreactive data
Behavior stays natural because people cannot react to the observer's presence
58
Disadvantages of non reactive/unobtrusive data
Validity harder to obtain | Bias may be present
59
Disadvantages of reactive/obtrusive design
Individuals react to the observer's presence, so behavior may not be typical of them, threatening the external validity of the findings
60
Physical trace
Remnants, fragments, products of past behavior
61
Archival data
Public records or private documents describing the activities of individuals, groups, etc.
62
Why is a multi-method approach important?
Every method has weaknesses; when different methods converge on the same conclusion, confidence in the finding increases
63
Content analysis
Coding archival records to allow researchers to make inferences
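A minimal sketch of content analysis (the archival documents below are invented): coding text into counts that support quantitative inference.

```python
# Count theme words across hypothetical archival documents (diary excerpts).
from collections import Counter

documents = [
    "war shortages rationing fear",
    "rationing victory gardens hope",
    "fear shortages war war",
]

# Code the records into word frequencies researchers can analyze.
counts = Counter(word for doc in documents for word in doc.split())
print(counts.most_common(2))
```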
64
Selective deposit Problem with what? What is it?
A problem with archival records | Some info is saved and some is not, so archives may be incomplete or inaccurate
65
Selective survival A problem with what? What is it?
A problem with archival records | Some archives/traces have survived while others have not.
66
Simple probability
Every member of the population has an equal chance of being selected
67
Stratified
The population is split into subgroups (strata), and a random sample is drawn from each
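Both probability-sampling schemes can be sketched in a few lines; the population and strata below (class years) are hypothetical.

```python
# Simple random vs. stratified sampling with the standard library.
import random

random.seed(0)  # reproducible illustration

population = [{"id": i, "year": year}
              for i, year in enumerate(["fresh", "soph", "junior", "senior"] * 25)]

# Simple random sampling: every member has an equal chance of selection.
simple = random.sample(population, k=10)

# Stratified sampling: split into groups (strata), then sample within each.
strata = {}
for person in population:
    strata.setdefault(person["year"], []).append(person)
stratified = [p for group in strata.values() for p in random.sample(group, k=2)]

print(len(simple), len(stratified))  # 10 8
```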
68
Selection bias
Specific group within population is under or over represented
69
Response rate bias
Some people are more likely to respond to surveys than others
70
Advantages/disadvantages of convenience sampling
Advantages: quick, easy, and inexpensive | Disadvantage: the sample may not represent the population (selection bias)
71
Advantages/disadvantages of probability sampling
Advantage: produces representative samples, so results generalize | Disadvantages: more time-consuming and costly
72
Cross-sectional survey design
Done all at once; a snapshot in time
73
Longitudinal survey design What type of sample? Good for what? Problems?
Same sample measured at multiple time points | Good for tracking changes in individuals | Problem: sample attrition
74
Successive independent samples | When is it done? What is it good for? What type of samples used? Consistency?
Done over multiple time points | Good for describing changes in public opinion | Uses different samples from the same population | Questions and sampling are kept consistent
75
What is attrition? Why is it a problem with longitudinal designs?
Attrition is the loss of participants over time; it is a problem because dropouts may differ systematically from those who remain, biasing the sample
76
Internal consistency Type of what? What does it mean?
Type of reliability | Do all questions/items measure the same thing
77
Test-retest Type of what? What does it mean?
Reliability | Does the measure give the same results each time it is administered?
78
Face validity
Is it obvious as to what the items are intended to measure?
79
Discriminant validity
Does it distinguish between groups?
80
Criterion-prediction
Is the measure associated with real-world examples of the construct? Ex: do people who score high on a helping measure pursue helping careers?
81
Do all experiments need a control?
No
82
DV
Affected by the manipulation of the IV
83
Does the DV depend on the level of the IV?
Yes
84
Reliability: internal consistency
Do all the questions/items measure the same thing?
85
What is used to measure reliability? | What is considered "good"?
Cronbach's alpha (a measure of internal consistency) | >.70
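Cronbach's alpha can be computed directly from its formula, α = k/(k−1) · (1 − Σ item variances / total variance). A sketch with made-up questionnaire data (rows are respondents, columns are items):

```python
# Cronbach's alpha for internal-consistency reliability (stdlib only).
from statistics import variance

def cronbach_alpha(scores):
    k = len(scores[0])                              # number of items
    item_vars = [variance(col) for col in zip(*scores)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Four respondents answering a 4-item questionnaire (hypothetical data).
data = [
    [4, 4, 5, 4],
    [2, 3, 2, 2],
    [5, 4, 5, 5],
    [3, 3, 4, 3],
]
print(round(cronbach_alpha(data), 2))  # conventionally "good" if > .70
```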
86
Reliability: test-retest
Do items measure the same thing each time?
87
Reliability: Inter-rater
Do different people rate the same behaviors in the same way?
88
Steps to informed consent (9)
1. Explain the purpose
2. Right to decline/stop at any time
3. Potential consequences of stopping midstream
4. Potential risks
5. Potential benefits
6. Limits of confidentiality
7. Incentives
8. Contact info
9. Answer questions
89
Who is unable to give informed consent?
Children | Adults with mental disabilities
90
External validity concerning observation
The extent to which a study's findings can be used to describe people, settings, or conditions beyond those used in the study
122
Types of probability sampling? (2)
Simple random | Stratified