Chapter 7: Schedules and Theories of Reinforcement Flashcards

(69 cards)

1
Q

Schedule of Reinforcement

A

The response requirement that must be met to obtain reinforcement.

2
Q

Continuous Reinforcement Schedule (or CRF)

A

Each specified response is reinforced.

3
Q

Which reinforcement schedule is helpful when a behavior is first being shaped?

A

Continuous or CRF

4
Q

Intermittent (or Partial) Reinforcement Schedule

A

Only some responses are reinforced.

5
Q

How many types of Intermittent (or Partial) reinforcement schedules are there? What are the different types?

A
Four:
  1. Fixed Ratio
  2. Variable Ratio
  3. Fixed Interval
  4. Variable Interval
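As a study aid, the four intermittent schedules above can be sketched as simple decision rules. Below is a minimal Python simulation of the two ratio schedules; it is not part of the deck, and all function names are illustrative:

```python
import random

def fixed_ratio(n):
    """FR n: reinforce every nth response (fixed, predictable requirement)."""
    count = 0
    def respond():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True   # reinforcer delivered
        return False
    return respond

def variable_ratio(n):
    """VR n: reinforce after an unpredictable number of responses averaging n."""
    count = 0
    target = random.randint(1, 2 * n - 1)  # uniform draw with mean n
    def respond():
        nonlocal count, target
        count += 1
        if count >= target:
            count = 0
            target = random.randint(1, 2 * n - 1)
            return True
        return False
    return respond

# On FR 3, exactly every third response is reinforced:
fr3 = fixed_ratio(3)
print([fr3() for _ in range(6)])  # [False, False, True, False, False, True]
```

Interval schedules (FI/VI) differ in that the reinforcer depends on elapsed time rather than response count, as the later cards explain.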
6
Q

Steady-State Behaviors

A

Stable response patterns that emerge once the organism has had considerable exposure to the schedule.

7
Q

Schedule Effects

A

The different effects on behavior produced by different response requirements.

8
Q

Fixed Ratio (FR) Schedule

A

Reinforcement is contingent upon a fixed, predictable number of responses.

9
Q

FR1 is the same as…

A

Continuous Reinforcement

10
Q

Fixed ratio schedules typically produce a (high/low) rate of response along with a (long/short) pause following the attainment of each reinforcer.

A

high

short

11
Q

The short pause following reinforcement is also known as…

A

Post-reinforcement Pause

12
Q

Why are FR Schedules sometimes referred to as “Break-and-Run”?

A

Because each time a reinforcer is obtained, the organism takes a short break (the post-reinforcement pause) before resuming responding toward the next reinforcer.

13
Q

In Fixed Ratio schedules, _____ ratio requirements result in ____ breaks.

A

high

long

14
Q

An easy FR schedule can be defined as…

A

dense or rich

15
Q

A difficult FR schedule can be defined as…

A

lean

16
Q

Stretching the Ratio

A

Moving from a low ratio requirement (dense) to a high ratio requirement (lean).

17
Q

Should “stretching the ratio” be done gradually or quickly? Why?

A

Gradually; if the change is made too quickly, the behavior may become erratic or die out.

18
Q

Ratio Strain

A

A disruption in responding due to an overly demanding response requirement.

19
Q

Ratio strain is more commonly known as…

A

burnout

20
Q

Variable Ratio (VR) Schedule

A

Reinforcement is contingent upon a varying, unpredictable number of responses.

21
Q

FR# means

A

The number (#) of responses needed to obtain reinforcement.

22
Q

VR# means

A

The average number (#) of responses needed to obtain reinforcement.

23
Q

Variable ratio schedules typically produce a (low/high) and (steady/disrupted) rate of response, often with (little or no/long) post-reinforcement pauses.

A

high

steady

little or no

24
Q

What makes gamblers a perfect example for variable ratio schedules?

A

Although gamblers lose significant amounts of money, they are reinforced by their intermittent, unpredictable winnings.

25
Q

Why do VR schedules produce a high rate of behavior?

A

Their unpredictability.

26
Q

As with an FR schedule, an extremely lean VR schedule can result in _______ ________.

A

ratio strain
27
Q

Fixed Interval (FI) Schedule

A

Reinforcement is contingent upon the first response after a fixed, predictable period of time.

28
Q

FI# means

A

The specific amount of time (#) that must pass before a response is reinforced.

29
Q

Fixed interval schedules often produce a _________ pattern of responding, consisting of a post-reinforcement pause followed by a gradually (increasing/decreasing) rate of response as the interval draws to a close.

A

"scalloped"

increasing
30
Q

Variable Interval (VI) Schedule

A

Reinforcement is contingent upon the first response after a varying, unpredictable period of time.

31
Q

VI# means

A

The average amount of time (#) that must pass before a response is reinforced.

32
Q

Variable interval schedules usually produce a (low/moderate/high), steady rate of response, often with little or no post-reinforcement pause.

A

moderate
33
Q

(Ratio/Interval) schedules produce higher rates of response because the schedule is entirely ______ contingent.

A

Ratio

response

34
Q

(Fixed/Variable) schedules produce little to no post-reinforcement pause because such schedules often provide the possibility of relatively _______ reinforcement.

A

Variable

immediate
35
Q

Duration Schedule

A

Reinforcement is contingent on performing a behavior continuously throughout a period of time.

36
Q

Fixed Duration (FD) Schedule

A

The behavior must be performed continuously for a fixed, predictable period of time.

37
Q

FD# means

A

The amount of time (#) a behavior must be performed continuously before reinforcement is delivered.

38
Q

Variable Duration (VD) Schedule

A

The behavior must be performed continuously for a varying, unpredictable period of time.

39
Q

VD# means

A

The average amount of time (#) a behavior must be performed continuously before reinforcement is delivered.
40
Q

How are duration schedules different from interval schedules?

A

Duration schedules require continuous behavior throughout a period of time, whereas interval schedules require only a response after a certain amount of time has passed.
41
Q

On a pure FI schedule, any response that occurs (during/following) the interval is irrelevant.

A

during

42
Q

On _______ schedules, the reinforcer is largely time contingent, meaning that the rapidity with which responses are emitted has (little/considerable) effect on how soon the reinforcer is obtained.

A

interval

little

43
Q

In general, ______ schedules produce post-reinforcement pauses because obtaining one reinforcer means that the next reinforcer is necessarily quite (distant/near).

A

fixed

distant
44
Q

What are the three types of response-rate schedules?

A
  1. Differential reinforcement of high rates (DRH)
  2. Differential reinforcement of low rates (DRL)
  3. Differential reinforcement of paced responding (DRP)
45
Q

What is a response-rate schedule?

A

A schedule in which reinforcement is directly contingent upon the organism's rate of response.

46
Q

What is a "con" of a duration schedule?

A

Duration schedules can undermine the quality of the behavior, much like rewarding participation rather than quality of performance.
47
Q

Differential Reinforcement of High Rates (DRH)

A

Reinforcement is contingent upon emitting at least a certain number of responses in a certain period of time. OR Reinforcement is provided for responding at a fast rate.

48
Q

Differential Reinforcement of Low Rates (DRL)

A

A minimum amount of time must pass between each response before the reinforcer will be delivered. OR Reinforcement is provided for responding at a slow rate.
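The DRH/DRL contrast above can be made concrete with a small sketch. This is a minimal Python illustration, not from the text; the function names and the rule that the first response counts as reinforced under DRL are simplifying assumptions:

```python
def drl_reinforced(response_times, min_gap):
    """DRL: a response is reinforced only if at least `min_gap` time units
    have elapsed since the previous response (slow responding pays off).
    For simplicity, the first response of the session is treated as reinforced."""
    reinforced = []
    prev = None
    for t in sorted(response_times):
        if prev is None or t - prev >= min_gap:
            reinforced.append(t)
        prev = t
    return reinforced

def drh_reinforced(response_times, n, window):
    """DRH: reinforce when at least `n` responses fall within `window`
    time units (fast responding pays off)."""
    times = sorted(response_times)
    reinforced = []
    for i in range(n - 1, len(times)):
        if times[i] - times[i - n + 1] <= window:
            reinforced.append(times[i])
    return reinforced

# The same slow response pattern earns reinforcement on DRL 5
# but nothing on DRH (3 responses within 5 time units):
print(drl_reinforced([0, 2, 10, 16], min_gap=5))      # [0, 10, 16]
print(drh_reinforced([0, 2, 10, 16], n=3, window=5))  # []
```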
49
Q

Differential Reinforcement

A

One type of response is reinforced while another is not.

50
Q

How do FI schedules differ from DRL schedules?

A

On an FI schedule, responses made during the interval have no effect on reinforcement; on a DRL schedule, responding before the minimum time has elapsed prevents the reinforcer from being delivered.
51
Q

What is an example of a DRL?

A

Teaching a kid to brush their teeth slowly so they learn the correct technique and to prevent sloppiness.

52
Q

What is an example of a DRH?

A

Racing to the finish line.

53
Q

Differential Reinforcement of Paced Responding (DRP)

A

Reinforcement is contingent upon emitting a series of responses at a set rate. OR Reinforcement is provided for responding neither too fast nor too slow.

54
Q

What is an example of DRP?

A

Non-competitive running.
55
Q

On a (VD/VI) schedule, reinforcement is contingent upon responding continuously for a varying period of time; on an (FI/FD) schedule, reinforcement is contingent upon the first response after a fixed period of time.

A

VD

FI

56
Q

As Tessa sits quietly in the doctor's office, her mother occasionally gives her a hug as a reward. As a result, Tessa is more likely to sit quietly on future visits to the doctor. This is an example of a(n) ________ ________ schedule of reinforcement.

A

variable duration

57
Q

In practicing the slow-motion exercise known as tai chi, Tung noticed that the more slowly he moved, the more thoroughly his muscles relaxed. This is an example of (DRL/DRH/DRP).

A

DRL, or Differential Reinforcement of Low Rates
58
Q

Non-Contingent Schedule of Reinforcement

A

A response is not required for the reinforcer to be obtained.

59
Q

Non-Contingent Schedule of Reinforcement is also known as…

A

Response-Independent Schedule

60
Q

What are the two types of non-contingent reinforcement schedules?

A
  1. Fixed Time (FT) Schedule
  2. Variable Time (VT) Schedule
61
Q

Fixed Time (FT) Schedule

A

The reinforcer is delivered following a fixed, predictable period of time, regardless of the organism's behavior.

62
Q

FT schedules deliver ______ reinforcers.

A

"free"

63
Q

FT# means…

A

After # time units pass, the reinforcer is delivered.

64
Q

Variable Time (VT) Schedule

A

The reinforcer is delivered following a varying, unpredictable period of time, regardless of the organism's behavior.

65
Q

VT# means…

A

After an average of # time units pass, the reinforcer is delivered.
66
Q

How do FT/VT schedules differ from FI/VI schedules?

A

FT and VT schedules do not require the organism to emit a specific response, as FI and VI schedules do.
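The FT-versus-FI distinction in the card above can be sketched in code: on FI a response is still required after the interval, while on FT reinforcers arrive on the clock alone. A minimal Python illustration (function names are mine, not the text's):

```python
def fi_reinforcers(response_times, interval):
    """FI: the FIRST response after `interval` has elapsed since the last
    reinforcer is reinforced; a response is required."""
    delivered = []
    last = 0.0
    for t in sorted(response_times):
        if t - last >= interval:
            delivered.append(t)
            last = t
    return delivered

def ft_reinforcers(session_length, interval):
    """FT: a "free" reinforcer arrives every `interval` time units,
    regardless of whether the organism responds at all."""
    return [k * interval for k in range(1, int(session_length // interval) + 1)]

# With responses at t = 1, 3, 12, 14 during a 25-unit session:
print(fi_reinforcers([1, 3, 12, 14], interval=10))  # [12] -- a response was needed
print(ft_reinforcers(25, interval=10))              # [10, 20] -- responses irrelevant
```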
67
Q

Non-contingent schedules can accidentally reinforce a behavior performed (before/after) the reinforcer is provided.

A

before

68
Q

A superstition can arise when a behavior is reinforced on a _____________ schedule.

A

non-contingent

69
Q

What are two groups especially prone to developing superstitions? Why?

A

Gamblers and professional athletes, because they associate specific behaviors with big wins and do not want to jeopardize their success.