Operant Conditioning: Schedules of Reinforcement (23 flashcards)
1

Define: schedule of reinforcement

The frequency (fixed or variable) and manner (ratio or interval) in which a desired response is reinforced.

2

What does the type of schedule of reinforcement affect?

The response acquisition rate (the speed of learning, shown by the steepness of the response curve), and the strength of the response (though all responses are strengthened to some degree, because they are reinforced).

3

Define: continuous reinforcement

When every correct response is reinforced after it occurs.

4

When is continuous reinforcement best used and why?

Acquiring responses: the organism will most strongly associate a response with its consequences.

5

Why do ratio schedules of reinforcement produce higher response acquisition rates than interval schedules?

Ratio: organisms must produce a certain number of responses to be reinforced.
Interval: organisms are not required to produce a high number of responses to be reinforced, as long as there is at least one.

6

Define: partial reinforcement

When some correct responses are reinforced but others are not

7

When is partial reinforcement most effectively used?

Maintaining already-acquired responses: the organism continues to reproduce the response because it has learnt that reinforcement will eventually come (so responses are stronger and less likely to undergo extinction).

8

Why do variable schedules have a higher response acquisition rate than fixed schedules?

Variable: the uncertainty of when the reinforcement will come keeps the organism responding steadily.

9

4 schedules of partial reinforcement?

Fixed-ratio
Variable-ratio
Fixed-interval
Variable-interval
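As an illustration only (not part of the deck), the four partial schedules can be sketched as simple reinforcement rules. The function names and parameters here are hypothetical; each function takes a stream of responses (or response times) and marks which ones would be reinforced.

```python
import random

random.seed(42)  # make the variable schedules reproducible in this sketch

def fixed_ratio(responses, n):
    """Reinforce every n-th response (predictable number of responses)."""
    return [(i + 1) % n == 0 for i in range(responses)]

def variable_ratio(responses, mean_n):
    """Reinforce after an unpredictable number of responses,
    averaging mean_n responses per reinforcer."""
    out, count = [], 0
    target = random.randint(1, 2 * mean_n - 1)
    for _ in range(responses):
        count += 1
        if count >= target:
            out.append(True)
            count = 0
            target = random.randint(1, 2 * mean_n - 1)
        else:
            out.append(False)
    return out

def fixed_interval(response_times, period):
    """Reinforce the first correct response made after a predictable
    period has elapsed since the previous reinforcer."""
    out, last = [], 0.0
    for t in response_times:
        if t - last >= period:
            out.append(True)
            last = t
        else:
            out.append(False)
    return out

def variable_interval(response_times, mean_period):
    """Reinforce the first correct response made after an unpredictable
    period (averaging mean_period) has elapsed."""
    out, last = [], 0.0
    wait = random.uniform(0, 2 * mean_period)
    for t in response_times:
        if t - last >= wait:
            out.append(True)
            last = t
            wait = random.uniform(0, 2 * mean_period)
        else:
            out.append(False)
    return out

# Ten responses, one per second, on two of the schedules:
times = list(range(1, 11))
print(sum(fixed_ratio(10, 5)))        # 2 reinforcers (after the 5th and 10th responses)
print(sum(fixed_interval(times, 3)))  # 3 reinforcers (at t = 3, 6, 9)
```

Note how the fixed schedules reinforce at predictable points, while the variable schedules draw a new target count or wait time after every reinforcer, which is what makes the next reinforcement unpredictable.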

10

Define: fixed-ratio schedule

when the reinforcer is given after a predictable number of desired responses.

11

Describe the response acquisition rate for fixed-ratio schedule

High, but with a brief drop in responding after each reinforcement.

12

Skinner's example of fixed-ratio?

For every 10 times the rat pressed the lever, one food pellet was dispensed.
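The FR-10 example above reduces to simple integer division (a hypothetical sketch; the function name is mine, not Skinner's):

```python
def pellets_dispensed(lever_presses, ratio=10):
    # Fixed ratio: one reinforcer per `ratio` responses,
    # so the reinforcement points are entirely predictable.
    return lever_presses // ratio

print(pellets_dispensed(35))  # 3 pellets after 35 presses
```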

13

Define: variable-ratio schedule

when the reinforcer is given after an unpredictable number of desired responses.

14

Why do variable-ratio and variable-interval schedules involve a mean number of responses (or a mean time period)?

Because the actual number of responses (or the actual time elapsed) varies each time the reinforcer is given.

15

Describe the response acquisition rate for variable-ratio schedule

High and steady (the uncertainty of when reinforcement will occur keeps the organism responding steadily).

16

Which schedule is most resistant to extinction? Why?

Variable-ratio schedule (the organism knows it will have to produce some number of responses, and that reinforcement will come as a result of responding).

17

Gambling example of variable-ratio schedule?

The payout occurs on a variable-ratio schedule, so the person believes they will eventually win.
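A variable-ratio payout is often approximated as a "random ratio": each pull wins with a fixed small probability, so wins arrive after an unpredictable number of pulls. This is a hypothetical sketch, not a model of any real machine:

```python
import random

def pull_lever(mean_ratio=50):
    """One slot-machine pull. Each pull pays out with probability
    1/mean_ratio, so the number of pulls between wins is
    unpredictable but averages mean_ratio."""
    return random.random() < 1 / mean_ratio

random.seed(1)
wins = sum(pull_lever() for _ in range(5000))
print(wins)  # roughly 100 wins expected over 5000 pulls
```

Because any individual pull might win, there is never a "safe" point to stop responding, which is the deck's explanation for why this schedule resists extinction.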

18

Define: fixed-interval schedule

when the reinforcer is given for a correct response after a predictable time period has elapsed since the previous reinforcer was given.

19

Describe the response acquisition rate for fixed-interval schedule

Moderate yet erratic (once the organism realises that time is the key factor, it may respond rapidly only just before the reinforcer is due).

20

Example of fixed-interval schedule?

Monthly performance review

21

Define: variable-interval schedule

when the reinforcer is given for a correct response after an unpredictable time period has elapsed since the previous reinforcer was given.

22

Describe the response acquisition rate for variable-interval schedule

Low but steady (the uncertainty of when the reinforcer will occur keeps the organism responding steadily).

23

Provide an example of variable-interval schedule

Whilst fishing, the time interval between catching one fish and the next is unpredictable.