AP Psychology: Chapter 6 Vocab Flashcards

Flashcards in Chapter 6 Vocab Deck (45)
1
Q

learning

A

a relatively permanent change in an organism’s behavior due to experience.

2
Q

habituation

A

an organism’s decreasing response to a stimulus with repeated exposure to it.

3
Q

associative learning

A

learning that certain events occur together. The events may be two stimuli (as in classical conditioning) or a response and its consequences (as in operant conditioning). (p. 216)

4
Q

classical conditioning

A

a type of learning in which one learns to link two or more stimuli and anticipate events. (p. 218)

5
Q

behaviorism

A

the view that psychology (1) should be an objective science that (2) studies behavior without reference to mental processes. Most research psychologists today agree with (1) but not with (2). (pp. 6, 218)

6
Q

UR

A

in classical conditioning, the unlearned, naturally occurring response to the unconditioned stimulus (US), such as salivation when food is in the mouth. (p. 219)

7
Q

US

A

in classical conditioning, a stimulus that unconditionally—naturally and automatically—triggers a response. (p. 219)

8
Q

CR

A

in classical conditioning, the learned response to a previously neutral (but now conditioned) stimulus (CS). (p. 219)

9
Q

CS

A

in classical conditioning, an originally irrelevant stimulus that, after association with an unconditioned stimulus (US), comes to trigger a conditioned response. (p. 219)

10
Q

acquisition

A

in classical conditioning, the initial stage, when one links a neutral stimulus and an unconditioned stimulus so that the neutral stimulus begins triggering the conditioned response. In operant conditioning, the strengthening of a reinforced response. (p. 220)

11
Q

higher-order conditioning

A

a procedure in which the conditioned stimulus in one conditioning experience is paired with a new neutral stimulus, creating a second (often weaker) conditioned stimulus. For example, an animal that has learned that a tone predicts food might then learn that a light predicts the tone and begin responding to the light alone. (Also called second-order conditioning.) (p. 220)

12
Q

extinction

A

the diminishing of a conditioned response; occurs in classical conditioning when an unconditioned stimulus (US) does not follow a conditioned stimulus (CS); occurs in operant conditioning when a response is no longer reinforced. (p. 221)

13
Q

spontaneous recovery

A

the reappearance, after a pause, of an extinguished conditioned response. (p. 221)

14
Q

generalization

A

the tendency, once a response has been conditioned, for stimuli similar to the conditioned stimulus to elicit similar responses. (p. 222)

15
Q

discrimination

A

(1) in classical conditioning, the learned ability to distinguish between a conditioned stimulus and stimuli that do not signal an unconditioned stimulus. (2) unjustifiable negative behavior toward a group and its members. (pp. 222, 664)

16
Q

learned helplessness

A

the hopelessness and passive resignation an animal or human learns when unable to avoid repeated aversive events. (p. 223)

17
Q

respondent behavior

A

behavior that occurs as an automatic response to some stimulus. (p. 228)

18
Q

operant conditioning

A

a type of learning in which behavior is strengthened if followed by a reinforcer or diminished if followed by a punisher. (p. 228)

19
Q

operant behavior

A

behavior that operates on the environment, producing consequences. (p. 228)

20
Q

law of effect

A

Thorndike’s principle that behaviors followed by favorable consequences become more likely, and that behaviors followed by unfavorable consequences become less likely. (p. 229)

21
Q

operant chamber

A

in operant conditioning research, a chamber (also known as a Skinner box) containing a bar or key that an animal can manipulate to obtain a food or water reinforcer; attached devices record the animal’s rate of bar pressing or key pecking. (p. 229)

22
Q

shaping

A

an operant conditioning procedure in which reinforcers guide behavior toward closer and closer approximations of the desired behavior. (p. 229)

23
Q

discriminative stimulus

A

in operant conditioning, a stimulus that elicits a response after association with reinforcement (in contrast to related stimuli not associated with reinforcement).

24
Q

reinforcer

A

in operant conditioning, any event that strengthens the behavior it follows. (p. 230)

25
Q

positive reinforcement

A

increasing behaviors by presenting positive stimuli, such as food. A positive reinforcer is any stimulus that, when presented after a response, strengthens the response. (p. 231)

26
Q

negative reinforcement

A

increasing behaviors by stopping or reducing negative stimuli, such as shock. A negative reinforcer is any stimulus that, when removed after a response, strengthens the response. (Note: negative reinforcement is not punishment.) (p. 231)

27
Q

primary reinforcer

A

an innately reinforcing stimulus, such as one that satisfies a biological need. (p. 231)

28
Q

conditioned reinforcer

A

a stimulus that gains its reinforcing power through its association with a primary reinforcer; also known as a secondary reinforcer. (p. 231)

29
Q

continuous reinforcement

A

reinforcing the desired response every time it occurs. (p. 232)

30
Q

partial (intermittent) reinforcement

A

reinforcing a response only part of the time; results in slower acquisition of a response but much greater resistance to extinction than does continuous reinforcement. (p. 232)

31
Q

fixed-ratio schedule

A

in operant conditioning, a reinforcement schedule that reinforces a response only after a specified number of responses. (p. 232)

32
Q

variable-ratio schedule

A

in operant conditioning, a reinforcement schedule that reinforces a response after an unpredictable number of responses. (p. 233)

33
Q

fixed-interval schedule

A

in operant conditioning, a reinforcement schedule that reinforces a response only after a specified time has elapsed. (p. 233)

34
Q

variable-interval schedule

A

in operant conditioning, a reinforcement schedule that reinforces a response at unpredictable time intervals. (p. 233)

35
Q

punishment

A

an event that decreases the behavior that it follows. (p. 234)

36
Q

cognitive map

A

a mental representation of the layout of one’s environment. For example, after exploring a maze, rats act as if they have learned a cognitive map of it. (p. 236)

37
Q

latent learning

A

learning that occurs but is not apparent until there is an incentive to demonstrate it. (p. 236)

38
Q

insight

A

a sudden and often novel realization of the solution to a problem; it contrasts with strategy-based solutions. (pp. 236, 300)

39
Q

intrinsic motivation

A

a desire to perform a behavior effectively for its own sake. (p. 237)

40
Q

extrinsic motivation

A

a desire to perform a behavior to receive promised rewards or avoid threatened punishment. (p. 237)

41
Q

biofeedback

A

a system for electronically recording, amplifying, and feeding back information regarding a subtle physiological state, such as blood pressure or muscle tension. (pp. 240, C-8)

42
Q

observational learning

A

learning by observing others. Also called social learning. (p. 242)

43
Q

modeling

A

the process of observing and imitating a specific behavior. (p. 242)

44
Q

mirror neurons

A

frontal lobe neurons that fire when an organism performs certain actions or when it observes another doing so. The brain’s mirroring of another’s action may enable imitation and empathy. (p. 243)

45
Q

prosocial behavior

A

positive, constructive, helpful behavior. The opposite of antisocial behavior. (p. 246)