Exam 2 Review Flashcards

1
Q

Secondary reinforcers are also called ______ reinforcers.
a. developed
b. conditioned
c. transient
d. generalized
e. second-order

A

b. conditioned

2
Q

In general, humans are more productive if they are praised for the quality of their work. Praise serves as a _______.
a. secondary negative reinforcer
b. primary negative reinforcer
c. primary positive reinforcer
d. secondary positive reinforcer
e. primary positive punisher

A

d. secondary positive reinforcer

3
Q

The slope of the line on a cumulative record indicates the _______.
a. schedule of reinforcement
b. time between behavior and reinforcer
c. number of reinforcers
d. subject’s response rate
e. length of the training session

A

d. subject’s response rate
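Because a cumulative record plots total responses against time, the slope over any window is the response rate. A minimal Python sketch of that relationship (the timestamps are hypothetical, not from the question):

```python
# Estimate response rate (the slope of the cumulative record) from a
# list of hypothetical response timestamps, in responses per second.
def response_rate(timestamps, start, end):
    """Slope of the cumulative record over [start, end]."""
    responses = sum(1 for t in timestamps if start <= t <= end)
    return responses / (end - start)

# 10 responses spread evenly over 5 seconds -> slope of 2 responses/sec
times = [0.5 * i for i in range(10)]  # responses at 0.0, 0.5, ..., 4.5 s
rate = response_rate(times, 0.0, 5.0)
print(rate)  # 2.0
```

A steeper line on the record means a higher rate; a flat line means no responding.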

4
Q

Instrumental conditioning can be defined as the process in which the _______.
a. reward is given only after many behaviors
b. environment signals when the reward will be available
c. opportunity for reward following behavior is not limited by the environment
d. instrumental response has no effect on when the reward is delivered.
e. environment limits the opportunity for reward for a behavior

A

e. environment limits the opportunity for reward for a behavior

5
Q

Thorndike’s Law of Effect essentially says that ______.
a. behavior occurrence depends on the stimulus consequence that follows it
b. how a subject perceives the stimulus consequences is more important than the stimulus itself
c. satisfying consequences are more powerful than unpleasant consequences in producing behavior change
d. in general, positive behaviors occur more often than negative behaviors
e. classical conditioning is an important component of instrumental conditioning

A

a. behavior occurrence depends on the stimulus consequence that follows it

6
Q

B.F. Skinner proposed that a stimulus is reinforcing if it _____.
a. increases the probability that the behavior will occur again
b. increases the likelihood that the reinforcement will occur
c. decreases the frequency of a classically conditioned stimulus
d. changes the perception of the stimulus consequence
e. decreases the chances that the behavior will extinguish

A

a. increases the probability that the behavior will occur again

7
Q

Food and water function as primary reinforcers because ______.
a. the reinforcing properties were developed through classical conditioning
b. their reinforcing properties were learned through experience
c. they are more effective in producing behavioral change than other reinforcers
d. they possess innate reinforcing properties
e. they are abundant in our environment and easily accessed

A

d. they possess innate reinforcing properties

8
Q

Operant responses are defined by B.F. Skinner in terms of the _____.
a. effect that they have on the environment
b. motivation required to elicit the response
c. goals that are achieved by the behavior
d. reinforcement and punishment that occur following the response
e. body movement used to make the response

A

a. effect that they have on the environment

9
Q

In the video depicting Thorndike’s cat in the puzzle box experiment, the door of the puzzle box opened after a cat moved the door latch and pushed on a rope. As a result, the cat was able to step out of the box and reach the food. After dozens of training trials, the cat moved the door latch and pushed on a rope as soon as it was put into the box. Thorndike concluded that ______.
a. the rope is a conditioned stimulus for the pushing behavior
b. the cat was motivated to reach the food dish outside the puzzle box
c. cats do not like small spaces and the cat was motivated to escape into the open
d. these behaviors increased because of a satisfying consequence
e. learning occurs when the cat associates the puzzle box with food

A

d. these behaviors increased because of a satisfying consequence

10
Q

A child intrinsically enjoys reading and finishes one book per week. Their parents, wanting to encourage more reading, begin giving the child five dollars for each book read. What is likely to happen to the reading behavior?
a. it will change based on the dollar amount
b. it will increase
c. it will decrease
d. nothing will happen
e. it will stop completely

A

c. it will decrease

11
Q

Resting for a period of time following reinforcement, before responding again, is known as _______.
a. ratio rest and run
b. post reinforcement pause
c. break point pause
d. spontaneous recovery
e. reinforcement induced rest

A

b. post reinforcement pause

12
Q

Which reinforcement schedule is best to use if you want to produce the fastest learning of a new behavior?
a. FI 1 min
b. PR 2
c. FR 1
d. VI 10 sec
e. VR 5

A

c. FR 1

13
Q

A fixed interval schedule of reinforcement results in less behavior early in the interval and more behavior late in the interval. The behavior pattern that occurs is a(n) _______.
a. steady rate
b. omission
c. progressive burst
d. scallop
e. ratio run

A

d. scallop
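The scallop appears on a cumulative record as a slope that starts shallow and steepens as the end of each interval approaches. A toy Python sketch of that pattern on an FI 60 s schedule (the linear rate function is a hypothetical choice, used only to illustrate the shape):

```python
# Toy illustration of an FI scallop: responding is sparse early in the
# interval and accelerates as reinforcement availability approaches.
# The linearly growing rate is an assumption made for illustration.
def responses_in_window(t0, t1, interval=60, step=1.0):
    """Simulated response count between t0 and t1 within an FI schedule."""
    count = 0.0
    t = t0
    while t < t1:
        elapsed_fraction = (t % interval) / interval
        count += elapsed_fraction  # response rate grows across the interval
        t += step
    return count

early = responses_in_window(0, 30)   # first half of the interval
late = responses_in_window(30, 60)   # second half of the interval
print(early < late)  # True: more responding late in the interval
```

Contrast this with the steady rates typical of variable schedules, where the reinforcer's arrival is unpredictable.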

14
Q

Aggressive behaviors due to _____ are often observed during extinction.
a. disappointment
b. frustration
c. sadness
d. anger
e. disgust

A

b. frustration

15
Q

We see a relatively high and steady rate of behavior on a variable ratio schedule because ______.
a. of the unpredictable amount of reinforcement for a set number of behaviors
b. of the unpredictable occurrence of a reinforcer
c. more reinforcement is available for each behavior
d. the behavior requirement is low and easy to complete
e. animals prefer to have choices to perform different behaviors

A

b. of the unpredictable occurrence of a reinforcer

16
Q

People who play slot machines in a casino are reinforced on a _____ schedule of reinforcement.
a. variable ratio
b. fixed ratio
c. fixed interval
d. progressive ratio
e. variable interval

A

a. variable ratio

17
Q

In shaping an operant response, the trainer _____.
a. uses intermittent schedules of reinforcement to make the desired response resistant to extinction
b. waits for the desired response to occur and then provides primary positive reinforcement (such as food)
c. reinforces successive approximations of the target behavior until that behavior occurs
d. demonstrates the desired behavior and waits for the subject to repeat it
e. provides secondary reinforcement for the appropriate response along with primary reinforcement

A

c. reinforces successive approximations of the target behavior until that behavior occurs

18
Q

Gradually increasing the number of behaviors required for reinforcement is called _____.
a. progressive reinforcement
b. extinction
c. stretching the ratio
d. limited hold
e. intermittent reinforcement

A

c. stretching the ratio

19
Q

If your instructor always gives a short quiz at the beginning of class on Mondays, studying for the quizzes will be reinforced on a ______ schedule of reinforcement.
a. progressive ratio
b. variable interval
c. fixed ratio
d. variable ratio
e. fixed interval

A

e. fixed interval

20
Q

On a fixed interval schedule the subject ______.
a. always receives reinforcement after a fixed amount of time has passed
b. does not stop responding until the reinforcer occurs
c. must wait a fixed amount of time before responding again to receive reinforcement
d. increases responding as time of reinforcement availability approaches
e. waits patiently next to the food delivery location for reinforcement to appear at a specific time

A

d. increases responding as time of reinforcement availability approaches

21
Q

Which of the following best illustrates a negative contingency?
a. smile at a person on the street and they return a smile to you
b. getting stung by a bee after swatting it
c. put on sunscreen before going out on a sunny day to prevent sunburn
d. farmer rings the dinner bell when food is put on the table
e. people who work out at the gym feel energized

A

c. put on sunscreen before going out on a sunny day to prevent sunburn

22
Q

Food deprivation can help to motivate rats to lever press for food. In this case, deprivation would be a(n) _____.
a. positive contingency
b. abolishing operation
c. extinguishing operation
d. positive contrast
e. establishing operation

A

e. establishing operation

23
Q

Skinner observed that pigeons would repeat behaviors that were not contingent on reinforcement but happened to occur before the reinforcer. Skinner called these behaviors ________.
a. intrinsic
b. superstitious
c. establishing operations
d. generalized responses
e. contrived

A

b. superstitious

24
Q

Studies have shown that increasing the delay between the instrumental/operant response and reinforcement _______.
a. does not affect the learning of the response
b. strengthens the association between response and reinforcement
c. makes the response less resistant to extinction
d. slows the rate of response learning
e. later weakens the effects of punishment

A

d. slows the rate of response learning

25
Q

Coworkers who do not want you to work too fast or too hard and make them look bad may use the _______ schedule to ensure that you maintain a slower work pace.
a. fixed ratio
b. variable interval
c. two-task discrimination
d. differential reinforcement of low rate
e. differential reinforcement of high rate

A

d. differential reinforcement of low rate

26
Q

The closeness in time between the response and the stimulus consequence is referred to as ____.
a. contingency
b. contiguity
c. consistency
d. motivating operation
e. neutral operant

A

b. contiguity

27
Q

When we want to decrease or stop a target behavior, but do not want to use punishment, we can use a(n) ____ schedule.
a. differential reinforcement of low rate
b. continuous reinforcement
c. two-task discrimination
d. differential reinforcement of other behavior
e. extinction ratio

A

d. differential reinforcement of other behavior

28
Q

All of the following can increase responding in an operant procedure except _____.
a. increasing the contiguity of behavior and reinforcer
b. providing a higher quality reinforcer
c. increasing the amount of the reinforcer
d. removing the reinforcer following a behavior
e. making the reinforcer contingent on the behavior

A

d. removing the reinforcer following a behavior

29
Q

Paulo received a nice raise for doing the same work at his job. As a result, Paulo is now working harder than his coworkers. Paulo is demonstrating the _____.
a. positive behavioral contrast effect
b. positive reinforcement effect
c. primary motivation effect
d. incentive motivation effect
e. negative behavioral contrast effect

A

a. positive behavioral contrast effect

30
Q

Negative and positive contrast effects show that the effect a reinforcer has on a behavior depends on the ________.
a. type of apparatus used in training behavior
b. magnitude and quality of the reinforcer
c. response and reinforcement ratio
d. subject’s previous experience with reinforcement
e. type of reinforcer offered

A

d. subject’s previous experience with reinforcement

31
Q

A negative reinforcer is a stimulus _______.
a. given after a response that increases the probability of the response occurring
b. given after a response that decreases the probability of the response occurring
c. taken away after a response and decreases the probability of the response
d. taken away after a response and increases the probability of the response occurring
e. that occurs after an inappropriate behavior has not occurred for a certain amount of time

A

d. taken away after a response and increases the probability of the response occurring

32
Q

According to research by Jane Piliavin and colleagues (1975), the more intense an emergency situation is perceived to be, bystanders are ______.
a. less likely to escape and more likely to help
b. more likely to help if the victim is a child
c. likely to help regardless of the type of victim
d. less likely to help if the victim was an adult male
e. less likely to help and more likely to escape

A

e. less likely to help and more likely to escape

33
Q

In a typical avoidance learning procedure _____.
a. early trials are mostly avoidance trials and escape trials occur throughout the training session
b. punishment occurs during the early trials and avoidance behavior rarely develops in the training session
c. early trials are mostly escape trials and avoidance trials occur later in the training session
d. avoidance behavior occurs immediately and remains at full strength throughout the training session
e. early trials are mostly avoidance trials and punishment trials start to occur later in the training session

A

c. early trials are mostly escape trials and avoidance trials occur later in the training session

34
Q

Your alarm clock makes a loud clanging sound as it rings. When you hit the snooze button the sound stops. This is an example of _____.
a. positive reinforcement
b. an escape trial
c. an extinction trial
d. one-way avoidance
e. negative punishment

A

b. an escape trial

35
Q

You do not like talking to telemarketers, so you do not answer your phone when an unknown number appears on the screen. This behavior is an example of ______.
a. active avoidance
b. passive avoidance
c. escape
d. discrimination
e. omission

A

b. passive avoidance

36
Q

In order to extinguish an escape response, you should ______.
a. signal that the aversive stimulus is about to occur
b. drug the subject so that it cannot respond before the aversive stimulus
c. present the aversive stimulus more often
d. continue presenting the aversive stimulus after the response occurs
e. make the aversive stimulus more intense each time

A

d. continue presenting the aversive stimulus after the response occurs

37
Q

In a signaled avoidance procedure, the aversive stimulus will be ____.
a. postponed if the subject makes the avoidance response during the aversive stimulus
b. presented if the subject makes the avoidance response during the signal
c. presented if the subject decreases the rate of responding during the aversive stimulus
d. postponed if the subject makes the avoidance response before the signal is presented
e. postponed if the subject makes the avoidance response during the signal

A

e. postponed if the subject makes the avoidance response during the signal

38
Q

There may actually be a detectable cue in Sidman’s unsignaled avoidance procedure. The signaling cue is likely ______.
a. fatigue
b. drive reduction
c. negative reinforcement
d. the aversive stimulus
e. time

A

e. time

38
Q

In active avoidance training, the subject is trained to ______.
a. alternate making two different responses in order to remove an aversive event
b. make a response in order to prevent the occurrence of an aversive event
c. make a response in order to terminate an aversive event
d. respond after the aversive stimulus has occurred
e. hold back making a response in order to prevent the occurrence of an aversive event

A

b. make a response in order to prevent the occurrence of an aversive event

39
Q

Studies have shown that increasing the delay between the operant response and the negative reinforcer _______.
a. does not affect the learning of the response
b. weakens the learning of the response
c. later weakens the effects of any punishment
d. makes the response less resistant to extinction
e. speeds up the learning of the response

A

b. weakens the learning of the response

40
Q

What is punishment (positive punishment)?

A

It is a positive contingency that leads to a decrease in the probability of a behavior when that behavior produces an aversive stimulus.

41
Q

What is omission (negative punishment)?

A

It is a negative contingency that leads to a decrease in the probability of a behavior when that behavior results in the removal of a stimulus. Omission can be temporary or permanent. Temporary omission removes an appetitive stimulus for a period of time, after which the appetitive stimulus can occur again; e.g., taking away a child’s toy and putting the child in timeout. Permanent omission involves a response cost, a situation in which the appetitive stimulus cannot be recovered; e.g., a girl gets grounded for 30 days and misses her senior prom.

42
Q

What type of schedule is best for punishment to be effective?

A

A continuous schedule (FR 1) works best to suppress behavior.

43
Q

Does punishment work well?

A

Punishment is not a good way to control behavior. We should focus on rewarding good behaviors to replace bad behaviors. Reinforcing strategies are more effective than punishing strategies.

44
Q

How is intensity related to punishment?

A

The greater the intensity of the punisher, the faster the response will decrease

45
Q

What are all the factors that affect punishment?

A

-Contingency
-Contiguity (time)
-Consistency
-Intensity
-Minimization of Positive Reinforcement
-Availability of an Alternative Response

46
Q

What is a conditioned punisher?

A

Yelling “bad dog!” at your dog is an example of a conditioned punisher (only if it is effective at reducing behavior). This conditioned punisher has been paired with primary punishers (e.g., swatting the dog’s backside, putting the dog in its crate). The same rules apply for establishing conditioned punishers as for conditioned reinforcers.

47
Q

What are the problems with punishment?

A

-Reluctance of caregivers to use it
-Does not always work
-Imitation of the punisher (modeling)
-Punishment prompts escape and avoidance behavior (e.g., lying, running away, avoiding contact, etc.)
-Punishment prompts negative emotional responses (e.g., aggression, apathy, etc.)
-Punishment does not address the underlying reason the problem behavior occurred in the first place
-Effects of punishment are often context specific and short-lived

48
Q

What is the difference between the Two Factor Theory of Punishment and Conditioned Emotional Response (CER) Theory?

A

These theories share the same first factor: classical conditioning of fear to the stimulus present prior to and during punishment. For the Two Factor Theory of Punishment, the second factor states “instrumental conditioning of an avoidance response that is incompatible with the punishable behavior.” For the Conditioned Emotional Response (CER) Theory, the second factor states “fear as a motivational state is incompatible with the motivation for the punishable behavior.”

49
Q

What is Skinner’s definition of Punishment?

A

Punishment refers to an operation that is contingent upon, and designed to reduce, some undesirable behavior.