Chapter 8 - Unit 2 Flashcards

1
Q
Define and give an example of intermittent reinforcement.
A

An arrangement in which a behaviour is positively reinforced only occasionally rather than every time it occurs.
- ex. Jan's problem-solving behaviour was not reinforced after each math problem that she solved; instead, she received reinforcement only after a fixed number of problem-solving responses had occurred (she worked at a very steady rate).

2
Q
Define and give an example of response rate.
A

The number of instances of a behaviour that occur in a given period of time.
- ex. A child in a classroom is constantly raising her hand and snapping her fingers to gain the teacher's attention. A teacher who keeps track of the frequency of finger snapping for a while and then introduces operant extinction would probably observe an increase in finger snapping during the first few minutes of extinction before the behaviour gradually began to taper off.

3
Q
Define and give an example of a schedule of reinforcement.
A

A rule specifying which occurrences of a given behaviour, if any, will be reinforced.
- ex. The set of rules that a teacher will follow when delivering reinforcers (e.g., tokens). The "rules" might state that reinforcement is to be given after every correct response to a question, or after a certain amount of time has elapsed.

4
Q
Define CRF (continuous reinforcement) and give an example that is not in this chapter.
A

(The simplest schedule of reinforcement.)
An arrangement in which each instance of a particular response is reinforced.
- ex. Each time you turn the key in the ignition of your car, your car starts.

5
Q
Describe 4 advantages of intermittent reinforcement over CRF for maintaining behaviour.
A
  1. The reinforcer remains effective longer because satiation takes place more slowly;
  2. Behaviour that has been reinforced intermittently tends to take longer to extinguish;
  3. Individuals work more consistently on certain intermittent schedules;
  4. Behaviour that has been reinforced intermittently is more likely to persist after being transferred to reinforcement in the natural environment.

6
Q

Explain what an FR schedule is. Illustrate with 2 examples of FR schedules in everyday life.

A

A reinforcer occurs each time a fixed number of responses of a particular type has been emitted (a code sketch follows the example below).
- ex. Jan had to complete 2 problems correctly (FR 2) before the behaviour modifier responded with "Good work." Later she had to solve 4 problems (FR 4), and finally she had to make 16 correct responses (FR 16) to receive praise.
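
Not from the chapter: a minimal Python sketch of the FR rule, assuming responses are simply counted one at a time; the helper name simulate_fixed_ratio and the toy numbers are illustrative only.

# Minimal sketch (not from the chapter) of the FR rule: a reinforcer is
# delivered each time a fixed number of responses has been emitted.
def simulate_fixed_ratio(total_responses, ratio):
    """Return the response numbers on which a reinforcer would be delivered."""
    reinforced_on = []
    responses_since_reinforcer = 0
    for response_number in range(1, total_responses + 1):
        responses_since_reinforcer += 1
        if responses_since_reinforcer == ratio:   # the fixed ratio has been met
            reinforced_on.append(response_number)
            responses_since_reinforcer = 0        # count restarts after each reinforcer
    return reinforced_on

# Jan's FR 4 phase: praise would follow her 4th, 8th, 12th and 16th correct responses.
print(simulate_fixed_ratio(total_responses=16, ratio=4))   # [4, 8, 12, 16]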

7
Q
What is a free-operant procedure? Give an example.
A

One in which the individual is "free" to respond at various rates in the sense that there are no constraints on successive responses.
- ex. If Jan had been given a worksheet containing 12 math problems to solve, she could have worked at a rate of one problem per minute, three problems per minute, or some other rate.

8
Q
What is a discrete-trials procedure? Give an example.
A

The individual is "not free" to respond at whatever rate he chooses because the environment places limits on the availability of response opportunities.
- ex. A parent told a teenager, "You can use the family car after you have helped do the dishes following 3 evening meals."

9
Q
What are 3 characteristic effects of an FR schedule?
A
  1. FR schedules produce a consistent response rate
  2. The rate of responding increases with higher FR schedules
  3. Post-reinforcement pause: following reinforcement, responding will temporarily stop. After the pause, the rate of responding returns to pre-reinforcement levels.

10
Q
What is a VR schedule? Illustrate with examples of VR schedules in everyday life. Do your examples involve a free-operant procedure or a discrete-trials procedure?
A

A reinforcer occurs after a certain number of responses of a particular type have been emitted, and the number of responses required for each reinforcer changes unpredictably from one reinforcer to the next (a code sketch follows the examples below).

  • ex. Over a period of several months, a door-to-door salesperson averages one sale every 10 houses. Sometimes the salesperson makes a sale after calling on 5 houses, sometimes 15, sometimes 10, etc. Over several months a mean of 10 house calls is required to produce reinforcement, abbreviated VR 10. This is a free-operant procedure.
  • ex. A gambler using a slot machine has no way of predicting how many times they must put money in the machine before they will win. It could be 10 times, it could be 20, etc. If over the span of a day the gambler averaged about 13 plays per win, it would be abbreviated VR 13. This is a free-operant procedure.
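
Not from the chapter: a minimal Python sketch of the VR rule; drawing each requirement uniformly between 1 and one less than twice the mean is just one illustrative way to make the requirements average out to the stated value, and simulate_variable_ratio is a hypothetical name.

import random

# Minimal sketch (not from the chapter) of the VR rule: the number of responses
# required for each reinforcer varies unpredictably but averages a set value.
def simulate_variable_ratio(total_responses, mean_ratio, seed=0):
    """Return the response numbers on which a reinforcer would be delivered."""
    rng = random.Random(seed)

    def next_requirement():
        # Uniform between 1 and 2*mean - 1, so requirements average mean_ratio.
        return rng.randint(1, 2 * mean_ratio - 1)

    reinforced_on, count, required = [], 0, next_requirement()
    for response_number in range(1, total_responses + 1):
        count += 1
        if count >= required:                   # unpredictable requirement met
            reinforced_on.append(response_number)
            count, required = 0, next_requirement()
    return reinforced_on

# Roughly the VR 10 salesperson: sales arrive after varying numbers of house calls.
print(simulate_variable_ratio(total_responses=100, mean_ratio=10))
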
11
Q

Explain what an FI/LH schedule is and illustrate with an example that isn't in this chapter.

A

A fixed-interval schedule (a reinforcer is presented following the first instance of a specific response after a fixed period of time) with a limited hold (a deadline for meeting the response requirement of a schedule of reinforcement). A code sketch follows the example below.
- ex. An online store has shirts on sale each day at 1 pm, and the sale only lasts for 15 minutes.
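
Not from the chapter: a minimal Python sketch of one reasonable reading of the FI/LH rule; fi_lh_reinforced and the toy times are illustrative assumptions.

# Minimal sketch (not from the chapter) of an FI/LH rule: a reinforcer becomes
# available a fixed interval after the previous one and stays available only
# for the limited-hold window; responses outside that window go unreinforced.
def fi_lh_reinforced(response_times, interval, limited_hold):
    """Return the times (same units as the inputs) of the responses that are reinforced."""
    available_at = interval                       # first reinforcer sets up after one interval
    reinforced = []
    for t in sorted(response_times):
        while t > available_at + limited_hold:    # holds that expired unclaimed are missed
            available_at += interval
        if available_at <= t <= available_at + limited_hold:
            reinforced.append(t)                  # responded within the hold window
            available_at = t + interval           # next interval timed from reinforcement
    return reinforced

# Shirt-sale style example in minutes: available every 60 minutes, for 15 minutes each time.
print(fi_lh_reinforced(response_times=[10, 65, 130, 190], interval=60, limited_hold=15))  # [65, 130, 190]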

12
Q

Explain what a VI/LH schedule is. Illustrate with two examples from everyday life, at least one not in this chapter.

A

A variable-interval schedule (a reinforcer is presented following the first instance of a specific response after an interval of time, and the length of the interval changes unpredictably from one reinforcer to the next) with a limited hold (a deadline for meeting the response requirement of a schedule of reinforcement). A code sketch of the timer game follows the examples below.

  • ex. The timer game: a timer was purchased that could be set to make a ding at any time interval between one and thirty minutes. Every time the timer dinged, if the children were playing nicely they would get 5 extra minutes of TV. Since they had to be cooperative the instant the alarm went off, the limited hold was 0 seconds, so it was VI 30 minutes/LH 0 seconds.
  • ex. Pancakes need to be flipped at some point between 3 and 8 minutes, with the average time being 5 minutes. Once they are ready to be flipped, you must do so within 20 seconds, or they will burn. This is VI 5 minutes/LH 20 seconds.
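
Not from the chapter: a minimal Python sketch of the timer game read as a VI schedule with a 0-second limited hold; play_timer_game and the uniform 1-to-30-minute draw are illustrative assumptions rather than the original procedure.

import random

# Minimal sketch (not from the chapter) of the timer game: the ding arrives
# after an unpredictable interval, and with a limited hold of 0 seconds the
# children must be playing nicely at that exact instant to earn the TV time.
def play_timer_game(playing_nicely_at, number_of_dings, seed=0):
    """playing_nicely_at: function from clock time in minutes to True/False.
    Returns how many 5-minute TV bonuses are earned."""
    rng = random.Random(seed)
    clock, bonuses = 0.0, 0
    for _ in range(number_of_dings):
        clock += rng.uniform(1, 30)            # unpredictable interval of 1 to 30 minutes
        if playing_nicely_at(clock):           # LH 0 s: must be cooperative right now
            bonuses += 1
    return bonuses

# Children who cooperate only during the first two hours of the evening:
print(play_timer_game(lambda minute: minute <= 120, number_of_dings=8))
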
13
Q
Give two examples of how VI/LH might be applied in training programs.
A

Example 1: The timer game in classrooms. If children are working quietly when the timer goes off, they get extra free time. VI 30 minutes/LH 0 seconds.

Example 2: To train children to pay attention during lectures, a teacher holds up a green cue card at random times during the 30-minute class lecture, on average once every 10 minutes. When she does so, the children have 5 minutes to write down the word she said as she held up the card. This is VI 10 minutes/LH 5 minutes.

14
Q
Explain what an FD schedule is. Illustrate with 2 examples of FD schedules that occur in everyday life (with at least one not in this chapter).
A

Fixed-duration schedule: a reinforcer is presented only if a behaviour occurs continuously for a fixed period of time; the value of the schedule is the amount of time that the behaviour must be engaged in continuously before reinforcement occurs (a code sketch follows the examples below).

Example 1: Melting solder. Ken must hold the tip of the soldering iron against the solder for a continuous amount of time; if it is removed too quickly, the solder will cool too quickly.

Example 2: John must stay in the plank position for 1 full minute to pass his gym class.
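
Not from the chapter: a minimal Python sketch of the FD rule using John's plank; fd_reinforcer_earned and the one-value-per-second representation are illustrative assumptions.

# Minimal sketch (not from the chapter) of the FD rule: the reinforcer is
# delivered only after the behaviour has occurred continuously for the fixed
# duration; any interruption restarts the required time.
def fd_reinforcer_earned(engaged_each_second, required_seconds):
    """engaged_each_second: one True/False per second, True while the behaviour occurs.
    Returns True if the fixed duration was ever met continuously."""
    consecutive = 0
    for engaged in engaged_each_second:
        consecutive = consecutive + 1 if engaged else 0   # a break resets the clock
        if consecutive >= required_seconds:
            return True
    return False

# John holds the plank for a full minute with no breaks: reinforcer earned.
print(fd_reinforcer_earned([True] * 60, required_seconds=60))                      # True
# A 10-second rest at the 30-second mark resets the count: no reinforcer.
print(fd_reinforcer_earned([True] * 30 + [False] * 10 + [True] * 30, 60))          # False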

15
Q
What are concurrent schedules of reinforcement? Give an example.
A

When each of two or more behaviours is reinforced on a different schedule at the same time, the schedules of reinforcement that are in effect are called concurrent schedules of reinforcement.
Example: A person at home in the evening may have the choice of studying or watching TV.
