Chapter 7 Flashcards

1
Q

Schedules of Reinforcement

A

the rules that determine how often an organism is reinforced for a particular behavior

2
Q

Types of Schedules of Reinforcement

A

-Continuous Schedules
-Partial Schedules: Ratio & Interval

3
Q

Schedule Effects

A

the distinctive rate and pattern of responding associated with a particular schedule of reinforcement

4
Q

Continuous Reinforcement

A

the desired behavior is reinforced every time it occurs
-great for training new behavior

5
Q

Partial Reinforcement

A

the response is reinforced only part of the time

6
Q

4 Types of Partial Reinforcement Schedules

A

-Fixed-Ratio (FR)
-Variable-Ratio (VR)
-Fixed-Interval (FI)
-Variable-Interval (VI)

7
Q

“Stretching the Ratio”

A

gradually modifying the schedule of reinforcement so as to progressively increase the number of responses required for reinforcement
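
As a rough sketch of my own (not part of the deck), the procedure can be pictured in Python, with a hypothetical response counter and a requirement that is raised by one after each reinforcer:

    # Illustrative sketch: stretching the ratio from FR-1 upward.
    # Raising `requirement` too quickly or too far would risk ratio strain.
    requirement = 1              # responses currently required per reinforcer
    responses_since_reward = 0

    def record_response():
        """Count one response; deliver reinforcement once the ratio is met."""
        global requirement, responses_since_reward
        responses_since_reward += 1
        if responses_since_reward >= requirement:
            print(f"Reinforcer delivered (schedule is now FR-{requirement})")
            responses_since_reward = 0
            requirement += 1     # stretch the ratio gradually

    for _ in range(15):
        record_response()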

8
Q

Ratio Strain

A

a breakdown in the pattern of responding due to stretching the ratio of reinforcement too abruptly or too far

9
Q

Fixed Ratio Schedule

A

A response is reinforced only after a specified number of responses

10
Q

What kind of rate of behavior does a fixed ratio schedule create?

A

produces a high, steady rate of responding, with only a brief pause after delivery of the reinforcer

11
Q

Postreinforcement Pauses

A

a drop in the target behavior after reinforcement has been delivered
-seen on FR schedules
-seen on FI schedules

12
Q

Examples of a Fixed Ratio Schedule

A

-After every 5th visit to Rita’s, you get a free dessert
-Delivering a food pellet to a rat after it presses a bar five times
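
A minimal Python sketch of the lab example above (my own illustration, not from the chapter): a pellet is delivered after every fifth bar press.

    # Fixed-ratio 5 (FR-5): reinforce after every 5th response.
    RATIO = 5
    presses_since_reward = 0

    for press in range(1, 16):           # simulate 15 bar presses
        presses_since_reward += 1
        if presses_since_reward == RATIO:
            print(f"Press {press}: food pellet delivered")
            presses_since_reward = 0     # the count starts over
        else:
            print(f"Press {press}: no reinforcement")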

13
Q

Variable Ratio Schedules

A

the number of responses required to earn reinforcement varies around an average
-occurs when a response is reinforced after an unpredictable number of responses
-common in natural environments

14
Q

What kind of rate of behavior does a variable ratio schedule create?

A

A high, steady rate of responding
-produces the greatest activity of all the schedules

15
Q

Examples of Variable Ratio Schedule

A

-Gambling and Lottery games
-Lab Setting: delivering food pellets to a rat after one bar press, again after four bar presses, and then again after two bar presses
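
A hedged sketch of the variable-ratio idea (the specific numbers are my own, chosen to average out near VR-3): the required number of presses is unpredictable but varies around an average.

    import random

    # Variable-ratio schedule: the number of responses required for each
    # reinforcer is unpredictable, varying around an average (about 3 here).
    required = random.randint(1, 5)      # next unpredictable requirement
    presses_since_reward = 0

    for press in range(1, 21):           # simulate 20 bar presses
        presses_since_reward += 1
        if presses_since_reward >= required:
            print(f"Press {press}: pellet after {presses_since_reward} presses")
            presses_since_reward = 0
            required = random.randint(1, 5)   # draw a new requirement
        else:
            print(f"Press {press}: no reinforcement")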

16
Q

Fixed Interval Schedule

A

the target behavior is reinforced the first time it occurs after a specific interval of time

17
Q

What kind of rate of behavior does a Fixed Interval Schedule create?

A

-Scallop pattern: a rest period, followed by a rapid increase in behavior so as not to miss the reward
-Activity increases as the deadline nears

18
Q

Examples of Fixed Interval Schedules

A

-Paycheck
-Lab Setting: reinforcing a rat with a food pellet for the first bar press after a 30-second interval has elapsed
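
One way to sketch the 30-second lab example in Python (an illustration under my own assumptions, not the chapter's code): only the first press after the interval has elapsed earns the pellet, and earlier presses go unreinforced.

    import time

    # Fixed-interval 30 s (FI-30): reinforce the first response made after
    # 30 seconds have elapsed since the previous reinforcer.
    INTERVAL = 30.0
    last_reinforcer = time.monotonic()

    def bar_press():
        """Return True if this press earns reinforcement under FI-30."""
        global last_reinforcer
        now = time.monotonic()
        if now - last_reinforcer >= INTERVAL:
            last_reinforcer = now        # the interval timer restarts
            return True                  # pellet delivered
        return False                     # pressed too early: no pellet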

19
Q

Variable Interval Schedule

A

-the length of the interval during which performance is not reinforced varies around some average
-occurs when a response is rewarded after an unpredictable amount of time has passed

20
Q

Examples of Variable Interval Schedules

A

-Pop quiz
-Lab Setting: delivering a food pellet to a rat for the first bar press following a one-minute interval; a second pellet for the first response following a five-minute interval; a third pellet for the first response following a three-minute interval
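
Along the same lines, a sketch of a variable-interval schedule (the 1-5 minute range is my own assumption, averaging about three minutes): the wait is unpredictable, and the first response after it elapses is reinforced.

    import random
    import time

    # Variable-interval schedule: the first response after an unpredictable
    # interval (averaging about 3 minutes here) is reinforced.
    def next_interval():
        return random.uniform(60, 300)   # 1-5 minutes, in seconds

    available_at = time.monotonic() + next_interval()

    def bar_press():
        """Return True if the current unpredictable interval has elapsed."""
        global available_at
        if time.monotonic() >= available_at:
            available_at = time.monotonic() + next_interval()  # new interval
            return True                  # pellet delivered
        return False                     # too early: no pellet yet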

21
Q

Fixed Duration

A

-reinforcement contingent on the continuous performance of a behavior for some constant period of time
-requires the behavior to be performed for a period of time

22
Q

Examples of Fixed Durations

A

-running on a treadmill for 30 minutes and then getting to watch TV
-playing the piano for 20 minutes and then getting the iPad
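
A short illustrative sketch of the fixed-duration contingency (the 30-minute figure comes from the treadmill example; the checking logic is my own assumption): the reinforcer is earned only once the behavior has been sustained for the full, constant period.

    # Fixed-duration schedule: reinforce only after the behavior has been
    # performed continuously for a constant period (30 minutes here).
    REQUIRED_MINUTES = 30

    def check_reinforcement(minutes_sustained):
        """Deliver the reinforcer once the behavior has lasted the full duration."""
        if minutes_sustained >= REQUIRED_MINUTES:
            return "reinforcer delivered (e.g., TV time)"
        return "keep going: no reinforcement yet"

    print(check_reinforcement(20))       # short of the required duration
    print(check_reinforcement(30))       # duration requirement met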

23
Q

Variable Duration Schedule

A

-the required period of sustained performance varies around some average amount of time
-schedule works around some average

24
Q

Differential Reinforcement of Low Rate (DRL)

A

-a schedule in which a minimum amount of time must elapse between responses in order for reinforcement to occur
-the clock resets if the behavior occurs before the time period has elapsed
-produces a low rate of behavior
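
As an illustration of my own (the 10-second minimum is an assumed value), DRL can be sketched as a timer that resets with every response and pays off only when responses are spaced far enough apart:

    import time

    # Differential reinforcement of low rate (DRL): a response is reinforced
    # only if at least MIN_GAP seconds have passed since the previous response.
    MIN_GAP = 10.0
    last_response = time.monotonic()

    def respond():
        """Reinforce only responses spaced at least MIN_GAP seconds apart."""
        global last_response
        now = time.monotonic()
        spaced_enough = (now - last_response) >= MIN_GAP
        last_response = now              # the clock resets on every response
        if spaced_enough:
            return "reinforced"
        return "too soon: clock reset, no reinforcement"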

25
Q

Differential Reinforcement of High Rate (DRH)

A

reinforcement is contingent on the behavior occurring at a rate above a predetermined minimum

26
Q

Noncontingent Reinforcement (NCR)

A

schedules in which reinforcement is delivered independently of target behaviors (random rewards)

27
Q

Fixed Time Noncontingent Reinforcement

A

the reinforcer is delivered after a fixed period of time, whether the behavior occurs or not

28
Q

Variable Time Noncontingent Reinforcement

A

reinforcement is delivered periodically at irregular intervals regardless of what behavior occurs
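
A brief sketch contrasting fixed-time and variable-time noncontingent delivery (the 5-minute period is an assumed value): reinforcers arrive on the clock, independent of what the organism does.

    import random

    # Noncontingent reinforcement: reinforcers are scheduled by the clock,
    # independent of the organism's behavior.
    def fixed_time_deliveries(session_minutes, period=5):
        """FT schedule: one reinforcer every `period` minutes, behavior or not."""
        return list(range(period, session_minutes + 1, period))

    def variable_time_deliveries(session_minutes, mean_period=5):
        """VT schedule: reinforcers at irregular times averaging `mean_period` minutes."""
        times, t = [], 0.0
        while True:
            t += random.uniform(1, 2 * mean_period - 1)   # averages ~mean_period
            if t > session_minutes:
                return times
            times.append(round(t, 1))

    print(fixed_time_deliveries(30))      # e.g., [5, 10, 15, 20, 25, 30]
    print(variable_time_deliveries(30))   # irregular times, e.g., [3.2, 9.7, ...]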

29
Q

Partial Reinforcement Effect (PRE)

A

increased resistance to extinction after intermittent reinforcement rather than after continuous reinforcement

30
Q

Discrimination Hypothesis

A

extinction is harder to discriminate from intermittent reinforcement than from continuous reinforcement, so responding persists longer after intermittent reinforcement
-extinction is harder to discriminate from VR than from FR

31
Q

Frustration Hypothesis

A

-nonreinforcement of once-reinforced behavior is frustrating (an aversive emotional state)
-reduction of frustration is negatively reinforcing
-continuous = no frustration
-intermittent = frustration

32
Q

Sequential Hypothesis

A

PRE occurs because of the differences in the order of cues during training
-during training, a behavior is followed by either reinforcement or nonreinforcement

33
Q

Compound Schedules

A

Various combinations of simple schedules

34
Q

Multiple Schedules

A

2 or more simple schedules in effect, each cued by a particular stimulus

35
Q

Cooperative Schedules

A

reinforcement depends on the behavior of 2 or more individuals

36
Q

Concurrent Schedules

A

2 or more schedules are available to the individual at once

37
Q

T/F. After testing out different options, organisms will select the option offering the highest rate of reinforcement

A

True