Flashcards in Operant Conditioning: schedules of reinforcement Deck (23):
Define: schedule of reinforcement
The frequency (fixed/variable) and manner (ratio/interval) in which a desired response is reinforced.
What does the type of schedule of reinforcement affect?
The response acquisition rate (the speed of learning, shown by the steepness of the graph) and the strength of the response (though all responses are strengthened, because all are reinforced).
Define: continuous reinforcement
When every correct response is reinforced after it occurs.
When is continuous reinforcement best used and why?
Acquiring responses: the organism will most strongly associate a response with its consequences.
Why do ratio schedules of reinforcement produce higher response acquisition rates than interval schedules?
Ratio: organisms must produce a certain number of responses to be reinforced.
Interval: organisms are not required to produce a high number of responses to be reinforced, as long as there is at least one correct response after the interval elapses.
Define: partial reinforcement
When some correct responses are reinforced but others are not
When is partial reinforcement most effectively used?
Maintaining already acquired responses: the organism continues to reproduce the response because it has learned the reinforcement will eventually come (so responses are stronger and less likely to undergo extinction).
Why do variable schedules have a higher response acquisition rate than fixed schedules?
Variable: the uncertainty of when the reinforcement will come keeps the organism responding steadily.
4 schedules of partial reinforcement?
Fixed-ratio, variable-ratio, fixed-interval, variable-interval.
Define: fixed-ratio schedule
when the reinforcer is given after a predictable number of desired responses.
Describe the response acquisition rate for fixed-ratio schedule
High, but with a brief drop in responding after each reinforcement.
Skinner's example of fixed-ratio?
for every 10 times the rat pushed the lever, one food pellet was dispensed
Define: variable-ratio schedule
when the reinforcer is given after an unpredictable number of desired responses.
Why do variable-ratio and variable-interval schedules involve a mean number of responses (or mean time period)?
Because the actual number of responses (or time elapsed) varies each time the reinforcer is given.
Describe the response acquisition rate for variable-ratio schedule
High and steady (uncertainty of when the reinforcement will occur keeps them responding steadily)
Which schedule is most resistant to extinction? Why?
Variable-ratio schedule (the organism has learned that reinforcement comes as a result of responding, but cannot predict after how many responses, so it keeps responding even when reinforcement stops).
Gambling example of variable-ratio schedule?
The payout occurs on a variable-ratio schedule, so the gambler believes they will eventually win and keeps playing.
Define: fixed-interval schedule
when the reinforcer is given for a correct response after a predictable time period has elapsed since the previous reinforcer was given.
Describe the response acquisition rate for fixed-interval schedule
Moderate yet erratic (once the organism realises that time is the key factor, it may respond rapidly only right before the reinforcer is due).
Example of fixed-interval schedule?
Monthly performance review
Define: variable-interval schedule
when the reinforcer is given for a correct response after an unpredictable time period has elapsed since the previous reinforcer was given.
Describe the response acquisition rate for variable-interval schedule
Low but steady (the uncertainty of when the reinforcer will occur keeps the organism responding steadily).
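The four partial reinforcement schedules above can be sketched as simple reinforcement-decision functions. This is a hypothetical illustration, not part of the deck: the function names and parameter values are mine, except the 10-press fixed ratio, which comes from Skinner's rat example.

```python
import random

def fixed_ratio(n_responses, ratio=10):
    """Reinforce after every `ratio`-th response (e.g. Skinner's rat:
    one food pellet per 10 lever presses)."""
    return n_responses > 0 and n_responses % ratio == 0

def variable_ratio(mean_ratio=10, rng=random):
    """Reinforce each response with probability 1/mean_ratio, so the
    number of responses per reinforcer varies around the mean."""
    return rng.random() < 1 / mean_ratio

def fixed_interval(seconds_since_last, interval=60):
    """Reinforce the first correct response once `interval` seconds
    have elapsed since the previous reinforcer."""
    return seconds_since_last >= interval

def variable_interval(seconds_since_last, available_after):
    """Reinforce the first correct response after an unpredictable wait;
    `available_after` is redrawn for each reinforcer (e.g. uniformly
    between 30 and 90 s for a 60 s mean)."""
    return seconds_since_last >= available_after
```

Note how the ratio functions depend only on responding while the interval functions depend only on elapsed time, which is why interval schedules do not reward producing a high number of responses.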