Cognitive: Learning Flashcards

You will be able to compare major learning theories and conditioning processes, including how behaviors are acquired and modified through observation and reinforcement.

1
Q

Define:

learning

A

A relatively permanent change in behavior as a result of experience.

2
Q

What are the three theories for learning?

A
  1. Classical conditioning
  2. Operant conditioning
  3. Cognitive learning
3
Q

Define:

classical conditioning

A

Learning that takes place when a neutral stimulus is repeatedly paired with an unconditioned stimulus until the neutral stimulus alone elicits the same response.

For example, Pavlov rang a bell when he was going to feed his dogs. The dogs would naturally salivate when food was presented to them, but over time when Pavlov rang his bell his dogs would salivate even without the presence of food.

4
Q

Define acquisition as it relates to classical conditioning.

A

Passively learning to give a known response to a new stimulus.

5
Q

Define stimulus as it relates to classical conditioning.

A

A change in the environment that brings about a response.

6
Q

Define response as it relates to classical conditioning.

A

It is how an organism reacts to a stimulus, either instinctively or through learned associations.

7
Q

Fill in the blank.

Stimuli that increase the likelihood of a behavior are called __________.

A

reinforcers

8
Q

Identify the stimulus and the response:

When you put food in your mouth, you salivate.

A
  • stimulus: food
  • response: salivation
9
Q

What was the premise of Ivan Pavlov’s classical conditioning experiment?

A

Dogs salivate at the sight of food because they form associations between food and the events that precede eating it. Pavlov sounded a bell right before presenting food, so the dogs ultimately salivated at the sound of the bell alone.

10
Q

Define in terms of Pavlov’s experiment:

neutral stimulus

(NS)

A

A stimulus that does not initially elicit a response; through pairing with the UCS it becomes the CS.

Pavlov’s example: The NS is the bell because it does not produce salivation until it is paired with the food.

11
Q

Define in terms of Pavlov’s experiment:

unconditioned stimulus

(UCS or US)

A

A stimulus that reflexively and automatically brings about a response.

Pavlov’s example: Food is the UCS because it automatically brings about salivation.

12
Q

Define in terms of Pavlov’s experiment:

unconditioned response (UCR or UR)

A

Automatic, involuntary reaction to the unconditioned stimulus.

Pavlov’s example: The UCR is salivation because the dogs automatically salivate when they eat food.

13
Q

Define in terms of Pavlov’s experiment:

conditioned stimulus (CS)

A

It starts as a neutral stimulus but, when paired with the UCS, eventually brings about the conditioned response.

Pavlov’s example: The CS is the bell because, when paired with the food, it brought about salivation.

14
Q

Define in terms of Pavlov’s experiment:

conditioned response (CR)

A

A learned response to a previously neutral stimulus.

Pavlov’s example: Salivation is the CR because the dog learned to salivate in response to the bell.

15
Q

How is delayed conditioning timed?

A

A conditioned stimulus is presented just before the unconditioned stimulus. The greater the delay, the less likely conditioning is to occur.

Pavlov’s example: The bell rings just before food is presented.

16
Q

How is trace conditioning timed?

A

A neutral stimulus is presented and then taken away before the unconditioned stimulus appears.

Pavlov’s example: Bell rings, followed by a long time lapse, then food is presented.

17
Q

How is simultaneous conditioning timed?

A

A neutral stimulus and unconditioned stimulus are presented together at the same time.

Pavlov’s example: The bell rings and food is presented at the same time.

18
Q

How is backward conditioning timed?

A

An unconditioned stimulus is presented before the neutral stimulus.

Pavlov’s example: Food is presented before the bell rings.

19
Q

Who were the researchers behind the Little Albert experiment?

A

John B. Watson and Rosalie Rayner

20
Q

Explain the Little Albert classical conditioning experiment.

A
  • Conditioned a nine-month-old baby named Albert to fear a rat.
  • Albert would not cry at the sight of the rat, but he cried at loud noises.
  • A loud noise was played when Albert reached for the rat.
  • Albert eventually cried at the sight of the rat alone.
21
Q

Identify the UCS, UCR, CS, and CR in the Little Albert experiment.

A
  • UCS: loud noise
  • UCR: fear
  • CS: white rat
  • CR: fear
22
Q

Define in terms of classical conditioning:

extinction

A

The elimination of the CR by repeatedly presenting the CS without the UCS.

Pavlov's example: ring the bell without food, and the dog eventually stops salivating to the bell.

Little Albert: present the rat without the loud noise, and the baby eventually stops crying at the rat.

23
Q

Define in terms of classical conditioning:

spontaneous recovery

A

After extinction, a previously extinguished CR reappears when the CS is presented again later on.

Pavlov’s example: salivation from bell stops and then returns

Little Albert: baby stops crying from presence of rat and then begins again

24
Q

Define in terms of classical conditioning:

generalization

A

Stimuli similar to the CS elicit the CR without any new conditioning.

Pavlov’s example: dog salivates from bells with different tones, pitches, or lengths

Little Albert: baby cries from other white fluffy stimuli, such as white bunnies or cotton balls

25
# Define in terms of classical conditioning: discrimination
The CR is produced only in the presence of the CS because other stimuli are too **dissimilar**.
**Pavlov's example**: the dog will not salivate to a doorbell or a telephone ring.
**Little Albert**: the baby will not cry at the presence of a black rat.
26
What is higher-order (a.k.a. second-order) conditioning?
Learning that occurs when a previously learned CS is used as the US to **produce** a CR to a new stimulus.
**Example**: flashing a light before Pavlov's bell would train the dogs to salivate to the light alone.
27
# Define: operant conditioning
Learning that occurs when a subject performs certain **voluntary behavior**, and the consequences of the behavior determine the likelihood of its recurrence.
28
How did Edward Thorndike contribute to research on operant conditioning?
* Put cats in puzzle boxes to demonstrate trial and error in obtaining a fish.
* Coined the terms "instrumental learning" and "Law of Effect".
Through operant conditioning, behavior that is rewarded is likely to be repeated, and behavior that is punished will rarely occur.
29
What is instrumental learning?
Thorndike's term for a type of **associative learning** in which a behavior becomes more or less **probable** depending on its consequences.
30
Explain the Law of Effect.
* Behaviors followed by a positive consequence are strengthened and more likely to occur.
* Behaviors followed by a negative consequence are weakened and less likely to occur.
* The law was formulated by Edward Thorndike.
31
What is a Skinner box?
Operant conditioning chamber for research animals, designed by B.F. Skinner, that contained levers, food dispensers, lights, and an electrified grid.
32
What are the four **training procedures** of B.F. Skinner's operant conditioning?
1. Positive reinforcement
2. Negative reinforcement
3. Positive punishment
4. Negative punishment
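The following is not part of the original deck; it is a minimal Python sketch (names and wording invented for illustration) showing how the four procedures differ along two dimensions: whether a stimulus is added or removed, and whether the target behavior increases or decreases.

```python
# Illustrative mapping (invented for demonstration): Skinner's four training
# procedures classified by what happens to the stimulus and to the behavior.
PROCEDURES = {
    ("stimulus added", "behavior increases"): "positive reinforcement",
    ("stimulus removed", "behavior increases"): "negative reinforcement",
    ("stimulus added", "behavior decreases"): "positive punishment",
    ("stimulus removed", "behavior decreases"): "negative punishment (omission training)",
}

# Example lookup: a kid does chores to stop the yelling -> an unpleasant
# stimulus is removed and the chore-doing increases -> negative reinforcement.
print(PROCEDURES[("stimulus removed", "behavior increases")])
```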
33
# Define in terms of operant conditioning: positive reinforcement
**Reward** training in which a behavior is followed by a reinforcer that increases the probability that the behavior will occur again.
**Example**: praise after participating in class.
34
What is the **Premack** principle?
A type of **positive reinforcement** where a more probable behavior is used as a reinforcer for a less probable one.
**Example**: treating yourself to an hour of TV after spending three hours studying for an exam.
35
# Define in terms of operant conditioning: negative reinforcement
The process of increasing a behavior by **removing** an unpleasant stimulus following the behavior.
**Example**: A kid does his chores to avoid getting yelled at.
36
What are avoidance and escape behaviors?
1. **Avoidance behavior**: takes away the aversive stimulus before it begins.
2. **Escape behavior**: takes away the aversive stimulus after it has already started.
37
# Define in terms of operant conditioning: punishment
An **unpleasant consequence** that follows a voluntary behavior, decreasing the probability the behavior will be repeated; a.k.a. positive punishment.
**Example**: spanking a child for misbehaving.
38
# Define in terms of operant conditioning: omission training
It is **removing a rewarding consequence** following a voluntary behavior, decreasing the probability the behavior will be repeated.
**Example**: taking away a child's toy after misbehaving.
39
What is aversive conditioning?
Learning that involves an **unpleasant stimulus** or reinforcer, such as negative reinforcement and punishment.
40
# Define: learned helplessness
A state of feeling **powerless** to change yourself or your situation because of a prior inability to avoid an aversive event.
41
What are the three types of **reinforcers**?
1. Primary
2. Secondary
3. Generalized
42
# Define and give an example of: primary reinforcers
Something that is **biologically**, naturally important and therefore rewarding.
**Example**: food and drink.
43
# Define and give an example of: secondary reinforcers
Something **neutral** that can become rewarding when associated with a primary reinforcer.
**Example**: gold stars, tokens, points, money.
44
# Define and give an example of: generalized reinforcers
Secondary reinforcer that can be associated with **several** primary reinforcers.
**Example**: Money can be used to buy food and also other enjoyable items.
45
How does a token economy work?
* Operant conditioning system
* Secondary reinforcers are used to increase acceptable behaviors
* Tokens can be exchanged for privileges and prizes
* Used in mental hospitals and jails
46
Define **behavior modification** in terms of operant conditioning.
* Small steps are rewarded until the intended goal is achieved.
* Uses the behavioral approach to solve individual, institutional, and societal problems.
47
How is **shaping** used to teach a new behavior?
Positively reinforcing closer and closer approximations of the desired behavior.
48
Define **chaining** as it relates to operant conditioning.
* Initially, each behavior in a certain order is positively reinforced.
* Later on, rewards are given only for completing the whole sequence, in order to establish a specific sequence of behaviors.
49
What is the **purpose** of reinforcement schedules?
To determine how and when reinforcers will be given to the learner.
50
What is a **continuous** reinforcement schedule?
Provides reinforcement every time the behavior is exhibited by human or animal.
51
What is a **partial** reinforcement schedule?
Reinforcing behavior only some of the time.
A.k.a. an intermittent schedule.
52
What is a **ratio schedule**, and what are the four partial reinforcement schedules?
A ratio schedule is based on the number of desired responses (an interval schedule is based on elapsed time). The four partial reinforcement schedules are:
1. fixed ratio
2. variable ratio
3. fixed interval
4. variable interval
53
# Define: fixed ratio schedule
Reinforcement comes after a specific number of behavior responses.
**Example**: Every three times you get a question right, you get a piece of candy.
54
# Define: fixed interval schedule
Reinforcement comes after a fixed amount of time has passed.
**Example**: having a review at a job at a specific time each year to determine compensation.
55
# Define: variable ratio schedule
The number of behavior responses needed for reinforcement changes.
**Example**: You sit at a slot machine pulling the lever hundreds of times because you don't know how many pulls are needed before the jackpot.
56
# Define: variable interval schedule
The amount of time before reinforcement of the behavior changes.
**Example**: You study every night in preparation for a pop quiz because you don't know when it is coming.
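The four schedules above lend themselves to a quick simulation. The sketch below is not from the deck; it is a minimal, illustrative Python version in which every function name, parameter, and number is invented for demonstration, returning which responses would earn a reinforcer under each schedule.

```python
# Illustrative sketch (not from the deck): simulate which responses earn a
# reinforcer under the four partial reinforcement schedules.
import random


def fixed_ratio(num_responses, ratio=3):
    """Reinforce every `ratio`-th response (e.g., every 3rd correct answer)."""
    return [(i + 1) % ratio == 0 for i in range(num_responses)]


def variable_ratio(num_responses, mean_ratio=3):
    """Reinforce after a varying number of responses, averaging `mean_ratio`."""
    reinforced, needed = [], random.randint(1, 2 * mean_ratio - 1)
    for _ in range(num_responses):
        needed -= 1
        reinforced.append(needed == 0)
        if needed == 0:
            needed = random.randint(1, 2 * mean_ratio - 1)
    return reinforced


def fixed_interval(response_times, interval=60.0):
    """Reinforce the first response after `interval` seconds have elapsed."""
    reinforced, last = [], 0.0
    for t in response_times:
        hit = (t - last) >= interval
        reinforced.append(hit)
        if hit:
            last = t
    return reinforced


def variable_interval(response_times, mean_interval=60.0):
    """Reinforce the first response after a varying amount of elapsed time."""
    reinforced, last = [], 0.0
    wait = random.uniform(0, 2 * mean_interval)
    for t in response_times:
        hit = (t - last) >= wait
        reinforced.append(hit)
        if hit:
            last = t
            wait = random.uniform(0, 2 * mean_interval)
    return reinforced


if __name__ == "__main__":
    print(fixed_ratio(9))                         # every 3rd response reinforced
    print(fixed_interval([10, 30, 65, 70, 130]))  # reinforced at t=65 and t=130
```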
57
How is **superstitious behavior** formed?
When reinforcement occurs during an **idiosyncratic** behavior, the organism is likely to repeat that behavior, even though it doesn't cause the reinforcement.
58
What did John B. Watson and B.F. Skinner study?
They studied only behaviors and **disregarded thought processes**, because thought processes were not observable.
59
What do cognitive theorists believe humans and other animals are capable of, beyond classical and operant conditioning?
Forming **expectations** and being consciously **motivated** by rewards.
60
What is the **contingency** model?
**Robert Rescorla's theory** that the key to classical conditioning is how well the CS predicts the appearance of the UCS.
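One common way to put Rescorla's point in numbers (the probabilities below are invented for illustration, not from the deck) is the contingency, or delta-P, between the CS and the UCS:

```latex
% Contingency between CS and UCS -- illustrative numbers only
\Delta P = P(\text{UCS} \mid \text{CS}) - P(\text{UCS} \mid \text{no CS})
% e.g., 0.9 - 0.1 = 0.8  => the CS is a strong predictor; conditioning is robust
% e.g., 0.5 - 0.5 = 0.0  => the CS predicts nothing; little or no conditioning
```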
61
What model did the contingency model counter?
**Pavlov's contiguity model** that classical conditioning is based on the association in time of the CS prior to the UCS.
62
What is the **blocking** effect?
**Leon Kamin's finding** that a neutral stimulus fails to become conditioned when it is presented alongside a stimulus that is already conditioned to the UCS.
63
Name an example of **delayed gratification**.
Saving money for college or a car, rather than spending it immediately.
64
Who was Edward Tolman?
* He confirmed the presence of latent learning.
* He found that unrewarded rats form a cognitive map of a maze, so when a reward is introduced, they are motivated to improve.
65
# Define: latent learning
Learning in the **absence** of rewards.
66
Define **insight** as it relates to learning.
A sudden appearance of an answer or solution to a problem.
67
Who observed insight in chimpanzees?
Wolfgang Köhler
68
# Define: observational learning
* Learning that occurs by watching the behavior of a model
* A.k.a. social learning or vicarious learning
69
What are the four steps of **observational learning**, according to Albert Bandura?
1. Attention
2. Retention
3. Reproduction
4. Motivation
70
What were the **results** of the bobo dolls experiment?
* Offering rewards to imitate violent behavior did not always lead to a response.
* Demonstrated modeling: those who watched violent models imitated them.
71
What provides the **biological basis** for observational learning?
Mirror neurons are activated when you perform an action and when you observe someone else perform a similar action.
72
# Define: conditioned taste aversion
* Intense dislike and avoidance of a food because of its association with an unpleasant or painful stimulus, even when the illness occurs long after eating.
* An adaptive response of organisms to foods that could sicken or kill them.
* A.k.a. the Garcia effect.
73
Define **preparedness** as it relates to learning.
Through evolution, animals are biologically predisposed to easily learn behaviors related to their **survival** as a species.
74
Who experimented on conditioned taste aversions and biological preparedness in rats?
John Garcia and Robert Koelling
75
What is instinctive drift?
CR that **drifts back** toward the natural, instinctive behavior of the organism.
76
What is the **evidence** of biological factors of learning?
Rats raised in enriched environments had thicker cortices, higher brain weight, and greater neural connectivity than rats raised in deprived environments.
77
What is long-term potentiation?
* Physiological change that correlates with a stable change in behavior due to experience
* "Neurons that fire together, wire together."
* Studied by Donald Hebb and Eric Kandel
78
What psychological school was founded by John Watson?
The school of **behaviorism**.
Watson believed we learn through conditioning of stimulus-response chains.
79
What is another term for **"classical conditioning"**?
Pavlovian conditioning
80
What does backward conditioning cause?
Inhibitory conditioning
In inhibitory conditioning, the CS comes to signal the absence of the UCS, which works against (inhibits) forward conditioning.
81
What is an alternate term for **"shaping"**?
Differential reinforcement of successive approximations.
82
What are some **primary drives** for learning?
Also known as **instinctual drives**, primary drives include basic functions like hunger and thirst.
83
What are some **secondary drives** for learning?
Secondary drives, or **acquired drives**, may include fame, money, or other motivators that are not instinctual.
84
What is an **exploratory** drive?
It is **neither** a primary nor secondary drive, and appears to be motivated simply by the desire to do or learn something novel.
85
Fritz Heider, Charles Osgood, Percy Tannenbaum, and Leon Festinger believe that humans' thoughts and behaviors are motivated by what?
The desire for **homeostasis**.
However, homeostasis-related theories, as well as drive-reduction theories, are criticized because people often do destructive things and seek stimulation, which do not provide balance.
86
What does **Hull's belief** that performance = drive x habit mean?
People will have a drive for something, then use previous behaviors that have accomplished that goal to inform their future actions, or performance.
87
Explain Edward Tolman's **expectancy-value** theory.
performance = expectation x value
Combining the importance, or value, of a goal with the likelihood of actually getting it, or expectation, informs future performance.
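Cards 86 and 87 can be compared with a quick worked example; the numbers below are invented purely for illustration and carry arbitrary units.

```latex
% Hull (drive x habit), with illustrative values:
\text{performance} = \text{drive} \times \text{habit} = 0.8 \times 0.5 = 0.4
% Tolman (expectancy-value), with illustrative values:
\text{performance} = \text{expectation} \times \text{value} = 0.5 \times 10 = 5
```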
88
What did **Henry Murray and David McClelland** believe motivated people's behavior?
They believed people wanted to feel like they were successful, so they would modify their behaviors to either achieve success or avoid failure. This is known as the **need for achievement**.
89
According to John Atkinson, do we **desire success** more or **fear failure** more?
desire success
90
What was Neil Miller's **approach-avoidance** conflict?
It is the conflict one feels when a particular goal has **both** positive and negative valence, like going on a beach vacation (positive valence) when one is afraid to fly (negative valence).
91
# Fill in the blank. Hedonism is the belief that behaviors are motivated by the desire to feel __________ and avoid __________.
pleasure; pain
92
# Fill in the blank. Hebb suggested that a moderate amount of __________ is required for motivation and performance.
arousal
Too much or too little arousal will prevent optimal performance on a task. This is known as the **Yerkes-Dodson effect**.
93
Stopping at red lights is an example of what type of **learning**?
Response learning
94
What is the **opposite** of incidental learning?
Intentional learning
Incidental learning happens accidentally, not on purpose.
95
Decreased response to a **familiar stimulus** is called what?
habituation
96
Experiments in which an animal presses a bar to get a **reward** are examples of what?
autoshaping
The animal is changing its own behavior by responding to the reward.
97
What is the term for one strong stimulus **preventing** conditioning to a weaker stimulus?
overshadowing
98
What is the **opposite** of habituation?
sensitization
Instead of decreasing responsiveness to a stimulus, one becomes more sensitive to a stimulus after repeated exposure.
99
How did M.E. Olds help provide evidence against drive-reduction theory?
He used electrical stimulation of the **brain's pleasure centers** as a form of positive reinforcement, and showed that animals would alter their behavior to receive the stimulation.
100
Is it easier to learn **continuous motor** tasks or **discrete motor** tasks?
continuous tasks
When one motor task flows into the next, as in riding a bicycle, it is easier to learn than a series of individual motor tasks.
101
At what **age** is it easiest and hardest to learn new things?
People are able to learn new things most easily between the ages of 3 and 20. After age 50, people are least able to learn.
102
What is **Hermann Ebbinghaus** famous for?
He pioneered the study of **memory** and developed the **forgetting curve** and the **spacing** effect.
He posited that people learn at different rates. Sometimes people learn very quickly early on in a subject or task, and then plateau and learn at a slower rate than before.
103
Who wrote the **first** psychology textbook?
Wilhelm Wundt, in 1874
104
Who wrote the first **educational** psychology textbook?
Thorndike, in 1903
105
The measure of one's capacity to perform a task or learn something new is called what?
aptitude
106
What is scaffolding (or scaffolding learning)?
It is the process of providing a learner with support as needed, then giving **less and less support** until no assistance is needed.
107
What is the primary **goal** of systematic desensitization in therapy?
To **reduce** anxiety responses to specific stimuli through gradual exposure.
Systematic desensitization is based on classical conditioning principles and involves gradually exposing a client to anxiety-inducing stimuli while teaching relaxation techniques. This method helps to replace the anxiety response with a relaxation response.
108
# True or false: Counterconditioning involves replacing an undesirable response with a desirable one.
True
Counterconditioning seeks to change the client's reaction to a stimulus by introducing a new response that is incompatible with the undesirable response. This technique is often used in behavior therapy to alter emotions or behaviors associated with specific stimuli.
109
What is a **discriminative stimulus** (Sᴰ) in operant conditioning?
A **cue** that signals the availability of reinforcement or punishment following a particular behavior.
A discriminative stimulus indicates that a specific response will be reinforced, serving as a guide for behavior. It's crucial in signaling when a behavior should occur to gain reinforcement, thus influencing the learning process.
110
Explain the **partial-reinforcement extinction** effect.
Behaviors that are reinforced intermittently are more **resistant** to extinction than those reinforced continuously.
The partial-reinforcement extinction effect is important in understanding how different reinforcement schedules affect behavior durability. Intermittent reinforcement makes it harder for the individual to discern when reinforcement has stopped, leading to greater persistence in the behavior.
111
What is self-efficacy?
It refers to an individual's belief in their **ability to succeed** in specific situations or accomplish a task.
Self-efficacy affects how people think, behave, and feel. High self-efficacy can enhance motivation and resilience, while low self-efficacy might lead to avoidance and self-doubt.
112
**Differentiate** between vicarious reinforcement and vicarious punishment.
* **Vicarious Reinforcement**: Observing someone else receive a reward for a behavior, increasing the likelihood of the observer performing that behavior.
* **Vicarious Punishment**: Observing someone else receive a punishment for a behavior, decreasing the likelihood of the observer performing that behavior.
These concepts are part of Bandura's social learning theory, illustrating how people can learn by observing the actions and consequences of others' behaviors.
113
# True or false: Reciprocal determinism implies that behavior is solely influenced by the environment.
False
Reciprocal determinism, proposed by Albert Bandura, suggests that behavior is influenced by the interaction between personal factors, behavior, and the environment. This means that each component can influence and be influenced by the others.
114
What does the **'S-R'** model stand for in psychology?
Stimulus-Response
The S-R model, also known as behaviorism, posits that behavior is a direct response to external stimuli, with no consideration for internal mental states.
115
# True or false: The S-O-R model includes the organism's internal states.
True
The S-O-R model stands for Stimulus-Organism-Response, emphasizing the role of internal processes and cognitive factors in influencing behavior between stimulus and response.
116
List two key components that **differentiate** the S-O-R model from the S-R model.
* Inclusion of **internal cognitive** processes
* Consideration of the organism's **interpretation** of stimuli
The S-O-R model accounts for the mental processes within the organism that mediate the stimulus-response relationship, allowing for a more complex understanding of behavior.
117
How does exposure to **media violence** potentially lead to **increased aggression** according to psychological theories?
* Desensitization to violence
* Modeling of aggressive behaviors
* Increased arousal levels
Theories suggest that repeated exposure to violent media can decrease sensitivity to violence, provide aggressive behavior models to imitate, and heighten physiological arousal, increasing the likelihood of aggressive responses.