Exam 2 Review Flashcards
(50 cards)
Secondary reinforcers are also called ______ reinforcers.
a. developed
b. conditioned
c. transient
d. generalized
e. second-order
b. conditioned
In general, humans are more productive if they are praised for the quality of their work. Praise serves as a _______.
a. secondary negative reinforcer
b. primary negative reinforcer
c. primary positive reinforcer
d. secondary positive reinforcer
e. primary positive punisher
d. secondary positive reinforcer
The slope of the line on a cumulative record indicates the _______.
a. schedule of reinforcement
b. time between behavior and reinforcer
c. number of reinforcers
d. subject’s response rate
e. length of the training session
d. subject’s response rate
Instrumental conditioning can be defined as the process in which the _______.
a. reward is given only after many behaviors
b. environment signals when the reward will be available
c. opportunity for reward following behavior is not limited by the environment
d. instrumental response has no effect on when the reward is delivered
e. environment limits the opportunity for reward for a behavior
e. environment limits the opportunity for reward for a behavior
Thorndike’s Law of Effect essentially says that ______.
a. behavior occurrence depends on the stimulus consequence that follows it
b. how a subject perceives the stimulus consequences is more important than the stimulus itself
c. satisfying consequences are more powerful than unpleasant consequences in producing behavior change
d. in general, positive behaviors occur more often than negative behaviors
e. classical conditioning is an important component of instrumental conditioning
a. behavior occurrence depends on the stimulus consequence that follows it
B.F. Skinner proposed that a stimulus is reinforcing if it _____.
a. increases the probability that the behavior will occur again
b. increases the likelihood that the reinforcement will occur
c. decreases the frequency of a classically conditioned stimulus
d. changes the perception of the stimulus consequence
e. decreases the chances that the behavior will extinguish
a. increases the probability that the behavior will occur again
Food and water function as primary reinforcers because ______.
a. the reinforcing properties were developed through classical conditioning
b. their reinforcing properties were learned through experience
c. they are more effective in producing behavioral change than other reinforcers
d. they possess innate reinforcing properties
e. they are abundant in our environment and easily accessed
d. they possess innate reinforcing properties
Operant responses are defined by B.F. Skinner in terms of the _____.
a. effect that they have on the environment
b. motivation required to elicit the response
c. goals that are achieved by the behavior
d. reinforcement and punishment that occur following the response
e. body movement used to make the response
a. effect that they have on the environment
In the video depicting Thorndike’s puzzle box experiment, the door of the puzzle box opened after the cat moved the door latch and pushed on a rope. As a result, the cat was able to step out of the box and reach the food. After dozens of training trials, the cat moved the door latch and pushed on the rope as soon as it was put into the box. Thorndike concluded that ______.
a. the rope is a conditioned stimulus for the pushing behavior
b. the cat was motivated to reach the food dish outside the puzzle box
c. cats do not like small spaces, so the cat was motivated to escape into the open
d. these behaviors increased because of a satisfying consequence
e. learning occurs when the cat associates the puzzle box with food
d. these behaviors increased because of a satisfying consequence
A child intrinsically enjoys reading and finishes one book per week. Their parents, wanting to encourage more reading, begin giving the child five dollars for each book read. What is likely to happen to the reading behavior?
a. it will change based on the dollar amount
b. it will increase
c. it will decrease
d. nothing will happen
e. it will stop completely
c. it will decrease
Resting for a period of time following reinforcement, before responding again, is known as _______.
a. ratio rest and run
b. post reinforcement pause
c. break point pause
d. spontaneous recovery
e. reinforcement induced rest
b. post reinforcement pause
Which reinforcement schedule is best to use if you want to produce the fastest learning of a new behavior?
a. FI 1 min
b. PR 2
c. FR 1
d. VI 10 sec
e. VR 5
c. FR 1
A fixed interval schedule of reinforcement results in less behavior early in the interval and more behavior late in the interval. The behavior pattern that occurs is a(n) _______.
a. steady rate
b. omission
c. progressive burst
d. scallop
e. ratio run
d. scallop
Aggressive behaviors due to _____ are often observed during extinction.
a. disappointment
b. frustration
c. sadness
d. anger
e. disgust
b. frustration
We see a relatively high and steady rate of behavior on a variable ratio schedule because ______.
a. of the unpredictable amount of reinforcement for a set number of behaviors
b. of the unpredictable occurrence of a reinforcer
c. more reinforcement is available for each behavior
d. the behavior requirement is low and easy to complete
e. animals prefer to have choices to perform different behaviors
b. of the unpredictable occurrence of a reinforcer
People who play slot machines in a casino are reinforced on a _____ schedule of reinforcement.
a. variable ratio
b. fixed ratio
c. fixed interval
d. progressive ratio
e. variable interval
a. variable ratio
In shaping an operant response, the trainer _____.
a. uses intermittent schedules of reinforcement to make the desired response resistant to extinction
b. waits for the desired response to occur and then provides primary positive reinforcement (such as food)
c. reinforces successive approximations of the target behavior until that behavior occurs
d. demonstrates the desired behavior and waits for the subject to repeat it
e. provides secondary reinforcement for the appropriate response along with primary reinforcement
c. reinforces successive approximations of the target behavior until that behavior occurs
Gradually increasing the number of behaviors required for reinforcement is called _____.
a. progressive reinforcement
b. extinction
c. stretching the ratio
d. limited hold
e. intermittent reinforcement
c. stretching the ratio
If your instructor always gives a short quiz at the beginning of class on Mondays, studying for the quizzes will be reinforced on a ______ schedule of reinforcement.
a. progressive ratio
b. variable interval
c. fixed ratio
d. variable ratio
e. fixed interval
e. fixed interval
On a fixed interval schedule the subject ______.
a. always receives reinforcement after a fixed amount of time has passed
b. does not stop responding until the reinforcer occurs
c. must wait a fixed amount of time before responding again to receive reinforcement
d. increases responding as time of reinforcement availability approaches
e. waits patiently next to the food delivery location for reinforcement to appear at a specific time
d. increases responding as time of reinforcement availability approaches
Which of the following best illustrates a negative contingency?
a. smile at a person on the street and they return a smile to you
b. getting stung by a bee after swatting it
c. put on sunscreen before going out on a sunny day to prevent sunburn
d. farmer rings the dinner bell when food is put on the table
e. people who work out at the gym feel energized
c. put on sunscreen before going out on a sunny day to prevent sunburn
Food deprivation can help to motivate rats to lever press for food. In this case, deprivation would be a(n) _______.
a. positive contingency
b. abolishing operation
c. extinguishing operation
d. positive contrast
e. establishing operation
e. establishing operation
Skinner observed that pigeons would repeat behaviors that were not contingent on reinforcement but happened to occur just before the reinforcer was delivered. Skinner called these behaviors ________.
a. intrinsic
b. superstitious
c. establishing operations
d. generalized responses
e. contrived
b. superstitious
Studies have shown that increasing the delay between the instrumental/operant response and reinforcement _______.
a. does not affect the learning of the response
b. strengthens the association between response and reinforcement
c. makes the response less resistant to extinction
d. slows the rate of response learning
e. later weakens the effects of punishment
d. slows the rate of response learning