Learning 4-5-6 Flashcards

(153 cards)

1
Q

What is instrumental conditioning?

A

In instrumental conditioning, the behaviour causes the presence or absence of a stimulus.

The stimuli encountered are a result of the organism’s behaviour, contrasting classical conditioning where the stimulus is presented regardless of the organism’s actions.

2
Q

How does instrumental conditioning differ from classical conditioning?

A

In classical conditioning, the stimulus is presented regardless of the organism’s actions, while in instrumental conditioning, responses produce a desired outcome.

Instrumental conditioning is often termed ‘operant conditioning.’

3
Q

Who conducted early investigations into instrumental conditioning?

A

Edward Thorndike.

Thorndike was interested in animal intelligence and used puzzle boxes to study this.

4
Q

What is the Law of Effect?

A

If a response in the presence of a stimulus is followed by a positive event, the association between the stimulus and response becomes strengthened.

Conversely, if followed by a negative event, the association weakens.

5
Q

What are discrete-trial procedures?

A

Discrete-trial procedures involve conducting trials where the subject has limited opportunities to respond, scheduled by the experimenter.

Examples include maze trials where a rat must reach a goal box.

6
Q

What is the main difference between discrete-trial and free-operant procedures?

A

In free-operant procedures, the subject is not removed after each trial, allowing for continuous behaviour.

This contrasts with discrete-trial procedures where the subject can only respond during specific trials.

7
Q

What is a Skinner box?

A

A Skinner box is a device that allows the study of free-operant behaviour, containing a lever or key that can be pressed or pecked for food.

It helps in measuring operant responses continuously.

8
Q

What is response shaping?

A

Response shaping is the reinforcement of successive approximations toward a desired behaviour.

For example, gradually rewarding more complex actions in a training scenario.

9
Q

What are the four basic instrumental conditioning procedures?

A

The four procedures are:
* Positive reinforcement
* Negative reinforcement
* Positive punishment
* Negative punishment

Each procedure affects the rate of responding differently.
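As a minimal sketch (not part of the original cards; the function name and wording are illustrative only), the 2 × 2 logic behind these procedures can be written as:

```python
# Illustrative sketch: classify the four instrumental procedures by whether the
# outcome stimulus is appetitive or aversive, and whether the response produces
# or removes/withholds that stimulus.
def classify_procedure(stimulus_is_appetitive: bool, response_produces_stimulus: bool) -> str:
    if stimulus_is_appetitive and response_produces_stimulus:
        return "positive reinforcement (responding increases)"
    if stimulus_is_appetitive and not response_produces_stimulus:
        return "negative punishment (responding decreases)"
    if not stimulus_is_appetitive and response_produces_stimulus:
        return "positive punishment (responding decreases)"
    return "negative reinforcement (responding increases)"

print(classify_procedure(True, True))    # e.g. giving a dog a treat for sitting
print(classify_procedure(False, False))  # e.g. banging on the wall stops the noise
```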

10
Q

What is positive reinforcement?

A

Positive reinforcement occurs when instrumental behaviour produces an appetitive stimulus, increasing the rate of responding.

For example, giving a dog a treat for sitting.

11
Q

What is negative reinforcement?

A

Negative reinforcement occurs when instrumental behaviour results in the absence of an aversive stimulus, increasing the rate of responding.

An example is banging on a wall to make a noisy neighbour quieter.

12
Q

What is positive punishment?

A

Positive punishment occurs when instrumental behaviour produces an aversive stimulus, decreasing the rate of responding.

An example is being criticized by a boss for lateness.

13
Q

What is negative punishment?

A

Negative punishment occurs when instrumental behaviour produces the absence of an appetitive stimulus, decreasing the rate of responding.

An example is putting a child on time-out, removing fun activities.

14
Q

What does it mean that a behaviour cannot be reinforced?

A

A behaviour cannot be reinforced if it is not naturally linked to that reinforcer.

Thorndike’s experiments showed that yawning could not be conditioned as it was not linked to escaping a puzzle box.

15
Q

What is necessary for a behaviour to be reinforced?

A

The behaviour must be NATURALLY LINKED TO THE REINFORCER; a behaviour with no natural link to the reinforcer cannot be reinforced.

16
Q

What did Thorndike’s experiments with cats demonstrate?

A

He was unable to condition yawning as an instrumental behaviour to open the puzzle box.

17
Q

What activates the behaviour system related to a stimulus?

A

The presence of a stimulus (e.g. food) activates the behaviour system related to that stimulus (e.g. foraging and feeding).

18
Q

What happens to self-care responses during food deprivation in hamsters?

A

They decrease in probability, while environment-directed activities, such as digging and scrabbling, increase.

19
Q

What determines the success of instrumental conditioning?

A

The nature of the REINFORCER, including its quality and quantity.

20
Q

What did Trosclair-Lasserre et al. (2008) find regarding social attention as a reinforcer?

A

Social attention was an effective reinforcer for a 5-year-old boy with autism.

21
Q

What was the pattern of responses required by the boy in the study?

A

The boy had to make an increasing number of responses to receive the reinforcer.

22
Q

What effect did the magnitude of the reinforcer have on instrumental responses?

A

Instrumental responding was strongest when the reinforcer was of large magnitude.

23
Q

What is the BEHAVIOURAL CONTRAST EFFECT?

A

A large reward is treated as especially good after prior reinforcement with a small reward, and a small reward is treated as especially poor after prior reinforcement with a large reward.

24
Q

What are the two types of relationships between a response and a reinforcer?

A
  • Temporal relation
  • Causal relation
25
What is temporal contiguity?
The delivery of the reinforcer immediately after the response.
26
What is the implication of a long delay between response and reinforcer?
Conditioning doesn’t occur if the delay is too long.
27
What does CREDIT ASSIGNMENT refer to in reinforcement?
The difficulty in attributing the reinforcer to the instrumental response when many behaviours occur in the delay period.
28
What did Skinner's superstition experiment with pigeons illustrate?
The role of contiguity in reinforcement.
29
How did pigeons behave in Skinner's experiment?
Pigeons developed random behaviours that they mistakenly associated with food delivery.
30
What is LEARNED HELPLESSNESS?
A feeling that occurs after experiencing uncontrollable stressful events, leading to a belief that one cannot control or change the situation.
31
What are the three groups in learned helplessness experiments?
* Group E (escape)
* Group Y (yoked/coupled)
* Group R (restricted)
32
What was the result for Group Y in the learned helplessness experiment?
They failed to learn the instrumental escape-avoidance response compared to other groups.
33
What does the exposure phase of the learned helplessness experiment involve?
Group E is exposed to shocks that can be escaped by performing an instrumental response.
34
What is the implication of predicting bad events in learned helplessness?
Learning to predict when a bad thing will happen may help reduce stress.
35
What is a schedule of reinforcement?
A program or rule that determines how and when a reinforcer follows a response.
36
What do schedules of reinforcement influence?
They influence how an instrumental response is learned and maintained over time.
37
What is a RATIO SCHEDULE?
A procedure in which the reinforcer is delivered after a set number of responses.
38
What is continuous reinforcement?
Reinforcers are delivered after every response.
39
What is a token economy?
A reward system where tokens can be exchanged for bigger rewards.
40
What is the difference between continuous reinforcement and fixed-ratio schedules?
Continuous reinforcement delivers a reinforcer after every response, while fixed-ratio schedules deliver after a set number of responses.
41
What is the result of partial or intermittent reinforcement?
The reinforcer doesn’t occur after every response.
42
What does the cumulative record show?
The total number of responses during a period of time.
43
What happens in fixed-ratio schedules?
The number of responses required for each reinforcer is fixed.
44
What is the definition of a ratio schedule?
A schedule where the number of responses required to obtain reinforcement is specified.
45
What does the slope of the cumulative record represent?
The rate of responding (number of responses per unit of time).
46
What is a fixed-ratio (FR) schedule?
A schedule that requires a fixed number of responses to receive reinforcement.
47
What occurs in a fixed-ratio schedule after each reinforcement?
A brief pause (the post-reinforcement pause) occurs before responding starts again.
48
What is ratio strain?
When increases in the fixed-ratio requirement cause significant pauses in responding or lead to complete cessation of responses.
49
How does a variable-ratio schedule differ from a fixed-ratio schedule?
The number of responses required for reinforcement varies between occasions.
50
True or False: Variable-ratio schedules are more likely to produce post-reinforcement pauses compared to fixed-ratio schedules.
False
51
What is the effect of unpredictable response requirements in variable-ratio schedules?
They maintain a stable rate of responding without delays.
52
What is the difference between variable reinforcement and intermittent reinforcement?
In variable reinforcement, the response requirement changes unpredictably from one reinforcer to the next; in intermittent (partial) reinforcement, the reinforcer simply does not follow every response.
53
What is an example of a variable reinforcement?
Gambling, where reinforcement occurs unpredictably.
54
How does a fixed-interval schedule operate?
Reinforcement is delivered for the first response made after a fixed amount of time has passed since the previous reinforcer.
55
In a fixed-interval schedule, how do subjects typically respond as the reinforcement time approaches?
The response rate increases.
56
What is a variable-interval schedule?
A schedule where the time that passes between reinforcements varies.
57
What is the primary difference between ratio and interval schedules?
Ratio schedules make reinforcement depend only on the number of responses made, while interval schedules make reinforcement depend on how much time has passed since the last reinforcer (a response is still required to collect it).
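As a hedged illustration (not from the cards; the function names and parameter values are invented), the two kinds of schedules can be written as simple decision rules:

```python
# Illustrative sketch: when is a reinforcer delivered?

def fixed_ratio_earned(responses_since_last_reinforcer: int, ratio: int = 10) -> bool:
    # Ratio schedule: only the COUNT of responses matters.
    return responses_since_last_reinforcer >= ratio

def fixed_interval_earned(seconds_since_last_reinforcer: float, interval: float = 30.0) -> bool:
    # Interval schedule: a response pays off only after enough TIME has elapsed
    # since the last reinforcer; responding faster does not bring it sooner.
    return seconds_since_last_reinforcer >= interval

print(fixed_ratio_earned(10))        # True: the 10th response earns food on FR 10
print(fixed_interval_earned(12.0))   # False: on FI 30 s, responses before 30 s earn nothing
```

This is why ratio schedules reward fast responding while interval schedules do not.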
58
What did Reynolds (1975) demonstrate about pigeons on different schedules?
Pigeons on a variable-ratio schedule responded more vigorously than those on a variable-interval schedule.
59
What implication does the difference between ratio and interval schedules have for human behavior?
Employees may be more productive when paid based on work done (ratio) rather than based on time intervals (interval).
60
What is a concurrent schedule?
A schedule where different responses are associated with different reinforcers and schedules simultaneously.
61
What does the rate of responding indicate in concurrent schedules?
The distribution of responses across response alternatives.
62
How is the rate of reinforcement calculated?
Total reinforcers for an option divided by total reinforcers for all options.
63
What does the Matching Law describe?
The relationship between the rate of responding and the rate of reinforcement across concurrent schedules.
64
What is the rate of reinforcement?
The frequency with which a reward is provided after a desired behavior occurs.

Rate of reinforcement for each alternative is calculated as the total reinforcers for that alternative divided by the total reinforcers for all alternatives.
65
How is the rate of reinforcement for the left option calculated?
Total reinforcers for left responses divided by (total reinforcers for left responses + total reinforcers for right responses).

An example calculation might yield a rate of reinforcement of 0.5.
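As an illustration with made-up numbers (not from the cards), if 20 reinforcers were earned on the left lever and 20 on the right:

\[
\text{rate of reinforcement (left)} = \frac{\text{reinforcers}_{\text{left}}}{\text{reinforcers}_{\text{left}} + \text{reinforcers}_{\text{right}}} = \frac{20}{20 + 20} = 0.5
\]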
66
What does the Matching Law describe?
What happens when two alternatives are not reinforced according to the same schedule.

The relative rate of responding on an alternative matches the relative rate of reinforcement earned on that alternative.
67
What is the relationship between the rate of responding and the rate of reinforcement in concurrent schedules?
The relative rate of responding on each alternative matches the relative rate of reinforcement it provides.

For instance, if 40% of all reinforcers come from alternative A, then about 40% of responses (or time) will be directed at A.
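In the usual notation (a standard formulation, not quoted from the cards), with B_A and B_B standing for responses on alternatives A and B, and r_A and r_B for the reinforcers earned on them, the Matching Law states:

\[
\frac{B_A}{B_A + B_B} = \frac{r_A}{r_A + r_B}
\]

So if alternative A yields 40% of all reinforcers, about 40% of responding will be allocated to A.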
68
What are the two perspectives on what motivates instrumental behavior?
* Associative structure of instrumental conditioning (molecular perspective)
* Response-allocation approach (molar perspective)

These perspectives focus on how stimuli, responses, and outcomes relate and how instrumental behavior relates to long-term goals.
69
What are the three events involved in instrumental responding?
* Contextual stimuli (S)
* Instrumental response (R)
* Response outcome (O)

The response outcome serves to strengthen or weaken the S-R association.
70
What percentage of human behavior is estimated to be habits?
About 45%.

Habits are defined as actions performed automatically without thinking.
71
What motivates instrumental responding according to the Two-Process Theory?
The S-O association established through classical conditioning activates an emotional state/reward expectancy.

This theory suggests that positive or negative emotions influence instrumental behavior.
72
What happens in the Pavlovian-instrumental transfer experiment?
Associations are established between a tone (CS) and food (O), and between a lever press and food.

When the tone predicts a positive outcome, rats respond more actively; when it predicts a negative outcome, responding decreases.
73
What is the Response-Deprivation Hypothesis?
Restricting access to reinforcers is critical for motivating instrumental responding.

Low-probability behaviors can motivate instrumental responding if access to them is restricted.
74
What is the Behavioral Bliss Point?
The ideal combination of activities that maximizes well-being, where animals distribute behavior among alternatives.

It reflects the balance of activities an individual finds optimal.
75
How does consumer demand relate to instrumental responding?
The number of responses (or time spent responding) corresponds to money, the reinforcer to the product, and the response requirement (or time interval) to the price.

Elasticity of demand indicates how consumption is affected by price changes.
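Under this consumer-demand analogy, elasticity of demand can be expressed with the standard textbook ratio (not quoted from the cards):

\[
\text{elasticity of demand} = \frac{\%\,\text{change in consumption}}{\%\,\text{change in price}}
\]

A reinforcer whose consumption falls sharply as the response requirement (price) rises has elastic demand; one that is defended despite price increases has inelastic demand.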
76
What is the effect of availability of substitutes on demand elasticity?
More available options lead to higher demand elasticity.

For example, newspapers have high demand elasticity due to alternatives like 24-hour news channels.
77
How does income level affect demand elasticity?
Higher income means less sensitivity to price increases.

In instrumental contingencies, more available time reduces the effect of cost on behavior.
78
What is differential responding?
A subject responds differently to two or more stimuli.
79
Define stimulus discrimination.
An organism responds differently to two or more stimuli.
80
What is stimulus generalization?
The degree to which responding is the same to two or more stimuli.
81
Who first observed stimulus generalization?
Pavlov.
82
What is the opposite of differential responding?
Stimulus generalization.
83
What did Reynolds (1961) demonstrate about pigeons?
Pigeons showed differential responding controlled by different stimuli.
84
Fill in the blank: A stimulus can control behavior only if the subject has the _______.
[sensory capacity].
85
What is overshadowing in the context of stimulus control?
Learning about one stimulus is disrupted by the presence of another stimulus.
86
True or False: Visual cues are more likely to signal danger.
False.
87
What are the two main approaches to understanding compound stimuli?
* Stimulus-element approach
* Configural-cue approach
88
What is stimulus discrimination training?
A procedure for bringing behavior under the control of a stimulus.
89
What are the two types of conditioning that can be used in stimulus discrimination training?
* Classical conditioning
* Instrumental conditioning
90
What is the significance of generalization gradients?
They measure how much control a stimulus has over instrumental behavior.
91
Fill in the blank: Extinction is an _______ process.
[active].
92
What are the two behavioral effects of extinction?
* Reduced responding
* Increased response variability
93
What does exposure therapy aim to achieve?
To help a patient face feared situations in the absence of aversive outcomes.
94
What is systematic desensitization?
An extinction procedure involving gradual exposure to feared stimuli while using relaxation techniques.
95
What happens initially when a key fails to open a door?
Increased response variability as different actions are attempted to solve the problem.
96
What is the role of reinforcement in stimulus control?
The type of reinforcement used can affect how strongly a stimulus controls behavior.
97
True or False: Extinction is the same as forgetting.
False.
98
What is the role of context in behavioral responses?
Contextual stimuli can significantly influence the effectiveness of learned behaviors.
99
What is the main conclusion from the study by Campolattaro et al. (2008)?
Stimulus control can be established through discrimination training.
100
What do children often receive praise for that high school students do not?
Drawing basic pictures.
101
Fill in the blank: The loss of conditioned behavior due to extinction is _______ from forgetting.
[different].
102
What type of cues may signal food more effectively?
Visual cues.
103
What is increased response variability?
Increased response variability refers to the range of different actions tried in response to a situation, for example the varied actions attempted when a key fails to open a door.
104
What did Neuringer et al. (2001) demonstrate about response variability?
Both groups in the study showed increased response variability and decreased response rate in extinction
105
What emotion is most frequently caused by extinction?
Frustration
106
How can frustration affect behavior during extinction?
Frustration can cause an increase in responding and even aggression
107
What is spontaneous recovery?
Spontaneous recovery is the reappearance of a conditioned response after a rest period following extinction
108
What is renewal in the context of conditioned responses?
Renewal is the recovery of a conditioned response when contextual cues present during extinction are changed
109
What is reinstatement?
Reinstatement is the recovery of a conditioned response when the unconditioned stimulus is encountered again after extinction
110
Does extinction cause a permanent loss of a conditioned response?
No, extinction does not cause a permanent loss of a conditioned response
111
What are the three main types of evidence of recovery?
* Spontaneous recovery
* Renewal
* Reinstatement
112
What is the relationship between extinction training and context?
Conducting extinction training in multiple contexts reduces the renewal of conditioned responses
113
What defines avoidance procedures?
Avoidance procedures involve preventing an aversive event from occurring through instrumental responses
114
What is the difference between avoidance and punishment procedures?
* Avoidance: response prevents an aversive event
* Punishment: response produces the aversive event
115
What is the Escape from Fear (EFF) procedure?
In EFF, subjects learn to escape fear-producing stimuli by making an instrumental response to move to another compartment
116
What happens to fear levels with increased instrumental responding in avoidance situations?
Increased instrumental responding leads to reduced fear of the conditioned stimulus
117
What is the vicious cycle of avoidance and fear?
Avoiding a fear-inducing situation reinforces the belief that it should be avoided, increasing future fear
118
What is an effective strategy for reducing avoidance behavior?
Exposure to the fear-inducing situation allows the brain to learn that the situation is not dangerous
119
What occurs during the extinction of avoidance behavior?
The presentation of the conditioned stimulus without the unconditioned stimulus while blocking avoidance responses
120
What is flooding in the context of behavioral therapy?
Flooding involves prolonged exposure to the conditioned stimulus to produce a stronger extinction effect
121
What is the significance of conducting extinction training in multiple contexts?
It may increase the generalization of extinction effects and reduce the renewal of conditioned responses
122
What was the main finding from the study by Kelamangalath et al. (2009) regarding reinstatement?
Rats resumed lever pressing for cocaine after being given a free injection, indicating reinstatement of the conditioned response
123
What is the strategy of behavioural therapy for anxiety disorders mentioned?
Flooding therapy.

Flooding therapy involves exposing a person to the source of their fear in a controlled environment until the fear response is extinguished.
124
What is the goal of extinction of avoidance behavior?
To reduce or eliminate avoidance responses through exposure to feared stimuli.

This process involves gradually increasing exposure to the feared situation or object.
125
What types of phobias can be selected for the therapy exercise?
* Arachnophobia
* Acrophobia
* Claustrophobia
* Agoraphobia

These phobias represent common anxiety disorders that can be targeted through flooding therapy.
126
What are the components involved in designing a flooding therapy?
* Specific steps of therapy
* Duration of therapy
* Intensity of exposure
* Frequency of sessions
* Measurement of progress

These components ensure a structured approach to therapy and effective treatment.
127
Describe the experimental setup for participants experiencing claustrophobia.
Participants are placed in a small, enclosed room designed to trigger fear, with gradual exposure to the room over time.

The therapist may provide support during this exposure.
128
What did Thorndike and Skinner conclude about punishment?
Punishment was initially seen as ineffective for controlling behavior and as producing only temporary effects.

Their conclusions were later challenged by research showing punishment can be effective in certain contexts.
129
What is the effect of inappropriate punishment on behavior?
It can lead to recovery of the punished behavior.

For example, a child may avoid a dangerous action after a severe punishment but may repeat it if the punishment is not perceived as significant.
130
What are the two phases of punishment procedures?
* Establishment of the instrumental response
* Punishment of some responses

The first phase involves reinforcing a behavior, while the second introduces punishment for specific responses.
131
How does the intensity of punishment affect instrumental responses?
Low-intensity punishment causes moderate suppression, while high-intensity punishment leads to complete suppression.

This relationship highlights the importance of punishment severity in behavior modification.
132
How does the introduction of punishment affect its effectiveness?
Severe initial punishment leads to high suppression, while mild punishment followed by increased severity results in less suppression.

This suggests a potential desensitization to punishment over time.
133
What was the finding of Azrin et al. (1963) regarding punishment schedules?
Continuous punishment schedules result in total suppression of instrumental responding.

Higher fixed-ratio punishment schedules still show some suppression, indicating varying effects of different punishment schedules.
134
Define spontaneous recovery in the context of extinction phenomena.
Reappearance of a conditioned response, after it seems to have been extinguished, following a rest period.

Recovery can occur simply with the passage of time since extinction, without any further training.
135
What is reinstatement in the context of conditioned responses?
Reappearance of an extinguished conditioned response when the unconditioned stimulus is encountered again on its own.

Re-exposure to the unconditioned stimulus alone, after extinction, is enough to restore responding to the conditioned stimulus.
136
What does renewal refer to in extinction phenomena?
Reappearance of a conditioned response in a different context than the one where extinction occurred.

This highlights the context-dependence of conditioned responses.
137
What are the three possible extinction phenomena to identify in personal experiences?
* Spontaneous recovery
* Reinstatement
* Renewal

These phenomena can manifest in various real-life situations and are essential to understanding behavior modification.
138
What does Social Learning Theory propose about human social behavior?
Human social behavior is not innate but learnt.

Proposed by Bandura (1963, 1977).
139
What are the two types of experiences in Social Learning Theory?
Direct experience and vicarious experience.

Direct experience involves personal reinforcement, while vicarious experience involves observing others.
140
How does direct experience influence learning according to Social Learning Theory?
A behavior is maintained by rewards and punishments experienced by the child.
141
What is vicarious experience?
Learning occurs through the processes of modelling and imitation of other people.
142
What is the difference between direct and vicarious experience?
Direct experience involves acquiring a behavior because we are rewarded for it, while vicarious experience involves acquiring a behavior after observing that another person is rewarded for it.
143
What did the Bobo Doll experiment by Bandura, Ross & Ross (1963) demonstrate?
Simply observing a model perform a behavior can produce an imitation of behavior in children.
144
What were the results of the Bobo Doll experiment?
Children in groups observing aggressive behavior behaved more aggressively than the control group.
145
In Bandura's 1965 study, what were the three ways the model's behavior was reinforced?
* Positive reinforcement
* Punishment
* No consequence
146
What was the conclusion regarding aggressive behavior in the punishment group from Bandura's 1965 study?
Much less aggressive behavior compared to other groups.
147
How can vicarious reinforcement promote prosocial behavior?
By observing a model being rewarded for helping others.
148
What were the findings of Hornstein et al. (1970) regarding prosocial behavior?
Group observing a pleased model was most likely to return a lost wallet; group observing an annoyed model was least likely.
149
True or False: According to Social Learning Theory, antisocial behavior cannot be learnt.
False
150
Fill in the blank: Bandura's theory emphasizes the role of _______ in learning.
[experience]
151
What is the significance of vicarious reinforcement in learning?
It affects how much children imitate a model's behavior.
152
What concept suggests that observing a behavior being punished can decrease its imitation?
Vicarious punishment
153
What role do unobservable constructs play in social perspectives on learning?
They affect learning in ways that go beyond simple cause-and-effect relationships.