Learning Flashcards

1
Q

Instinctual behaviours

A

Examples: imprinting, homing, migratory behaviours, etc.

2
Q

Pavlov’s experiment 1890s

A

NS –> no response
US –> UR
Repeatedly pair NS and US –> UR
Results: CS –> CR

3
Q

Reflexive behaviours

A

Examples: eye-blinking, ‘sucking’ and ‘gripping’ in babies; some reflexive behaviours may disappear as you grow older

4
Q

Habituation

A

Decline in the tendency to respond to stimuli that have become familiar due to repeated exposure; ensures that benign stimuli do not interrupt our activity or cause us to expend unnecessary energy

5
Q

Classical conditioning

A

A neutral stimulus is repeatedly paired with a stimulus that automatically elicits a particular response, so the previously neutral stimulus becomes a conditioned stimulus that also elicits a similar response. Classical conditioning is not so much the replacement of the US by the CS as a learning mechanism in which the CS (and the CR) prepare the animal for the onset of the US and UR

6
Q

Watson & Rayner 1920: Little Albert experiment

A

Conditioned fear (fear associated with certain stimuli)

7
Q

Fetishes

A

A person has heightened sexual arousal in the presence of certain inanimate objects, with the object becoming a conditioned stimulus that can elicit arousal on its own; there is evidence that such fetishes arise through classical conditioning

8
Q

Is classical conditioning the replacement of the US by the CS?

A

No, classical conditioning is a learning mechanism where the CS (and the CR) prepare the animal for the onset of the US and UR

9
Q

Edwards & Acker 1972

A

Found that WWII veterans reacted to sounds of battle even 15 years after the war

10
Q

Compensatory reaction hypothesis

A

Sometimes the UR and the CR can be opposites: insulin injections (US) deplete blood sugar levels (UR), and after a number of such injections the body reacts to the CS in the opposite way to how it reacts to the US (blood sugar levels rise as the body ‘prepares’ itself for the injection). This can lead to drug overdose if the CS is not present when the drug is taken/ administered

11
Q

Siegel 1989

A

Tested the tolerance of rats for ‘overdoses’ of heroin in novel versus usual environments; rats were more likely to overdose when given the drug in a new environment rather than the usual environment in which they had previously received it (evidence for the compensatory reaction hypothesis)

12
Q

Acquisition

A

Process by which a conditioned stimulus comes to produce a conditioned response

13
Q

Trace/ forward conditioning

A

CS comes before US, but there is a gap between them; not as effective as delayed forward conditioning

14
Q

Simultaneous conditioning

A

CS and US start and end together; often fails to produce a CR

15
Q

Backwards conditioning

A

CS begins after US; least effective form of classical conditioning

16
Q

Delayed forward conditioning

A

Conditioned stimulus comes just before/ overlaps with the unconditioned stimulus; most effective form of classical conditioning

17
Q

Contingency

A

How good a predictor the conditioned stimulus is of the unconditioned stimulus

18
Q

Contiguity

A

Learning occurs due to temporal proximity of CS and US

19
Q

Extinction

A

If the conditioned stimulus is repeatedly presented without the unconditioned stimulus, the CR gradually decreases; the rate of decrease depends on factors such as initial response strength

20
Q

Spontaneous recovery

A

A CS-CR relation is extinguished; however, after a period with no CS presentations, the CS may elicit the CR again

21
Q

Flooding therapy

A

Fear elicited by a CS (as in certain phobias) is eliminated through extinction: the person is exposed to the feared CS without the US until the fear response subsides

22
Q

Stimulus generalisation

A

A conditioned response formed to one conditioned stimulus will occur to other, similar stimuli

23
Q

Generalisation gradients

A

Stimuli closer to the CS produce greater CRs

24
Q

Stimulus discrimination

A

Occurs when an organism does not respond to stimuli that are similar to the stimulus used in training

25
Q

Discrimination training

A

The organism is reinforced for responses to one stimulus but not the other; if it learns to discriminate, it will respond more to the reinforced stimulus

26
Q

Systematic desensitisation

A

The subject is repeatedly exposed to a similar stimulus that produces a weaker reaction, so that extinction generalises to the original CS; the mechanism is the generalisation of extinction rather than the acquisition of a new response

27
Q

Blocking

A

Conditioning does not occur if a good predictor of the US already exists
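
Blocking is commonly illustrated with an error-correction account of conditioning (the Rescorla-Wagner model, which these cards do not name). The sketch below is only an illustration under that assumption; the function name, the salience value of 0.3, and the cue labels A and X are mine, not from the deck.

```python
# Illustrative sketch only: a Rescorla-Wagner-style error-correction update
# showing why a new cue gains little associative strength when a good
# predictor of the US is already present (blocking).

def error_correction_update(trials, salience=0.3, lambda_us=1.0):
    """Update associative strengths V for the cues present on each trial.

    trials: list of (cues_present, us_present) pairs.
    """
    V = {}  # associative strength per cue, starting at 0
    for cues, us in trials:
        prediction = sum(V.get(c, 0.0) for c in cues)    # combined prediction from all cues
        error = (lambda_us if us else 0.0) - prediction  # prediction error on this trial
        for c in cues:
            V[c] = V.get(c, 0.0) + salience * error      # error-driven update shared by present cues
    return V

# Phase 1: cue A alone is paired with the US (A becomes a good predictor).
# Phase 2: A and a new cue X are paired together with the US.
phase1 = [({"A"}, True)] * 20
phase2 = [({"A", "X"}, True)] * 20
V = error_correction_update(phase1 + phase2)
print(V)  # V["A"] ends near 1.0, V["X"] stays near 0: learning about X is "blocked"
```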

28
Q

Higher-order conditioning

A

Once a stimulus has become an effective CS for a certain CR, then that stimulus can be used to condition other stimuli; effect begins to diminish after a number of trials

29
Q

Sensory preconditioning

A

Two neutral stimuli are associated before any US is involved, so learning occurs in the absence of a US/ UR; subsequent classical conditioning of one stimulus reveals the association already learned between the two events

30
Q

Taste-aversion learning

A

When an individual avoids a certain food or drink due to a prior illness/ bad experience with it (the substance made them nauseous)

31
Q

Thorndike 1874-1949: Law of Effect

A

Law of Effect: responses followed by positive consequences are ‘stamped in’ (their likelihood or probability increases); responses followed by negative consequences are ‘stamped out’

32
Q

Instrumental conditioning

A

Concerns how the probability/ likelihood of a response changes as a result of its consequences; the subject emits the response in order to produce a reward

33
Q

Skinner 1904-1990

A

Skinner’s version of the Law of Effect:
When a response is followed by a reinforcer, the strength of the response increases, and when a response is followed by a punisher, the strength of the response decreases

34
Q

Positive reinforcement

A

Adding a stimulus or event contingent upon a response increases that behaviour

35
Q

Negative reinforcement

A

Removing a stimulus or event contingent upon a response increases that behaviour

36
Q

Positive punishment

A

Adding a stimulus or event contingent upon a response decreases that behaviour

37
Q

Negative punishment

A

Removing a stimulus or event contingent upon a response decreases that behaviour
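
The four cards above form a 2x2 scheme (stimulus added or removed × behaviour increases or decreases). The dictionary below is just my own compact restatement of that scheme, not anything from the deck.

```python
# Minimal lookup for the four operant consequences defined in the cards above:
# (what happens to the stimulus, what happens to the behaviour) -> name.
consequences = {
    ("add", "increase"): "positive reinforcement",
    ("remove", "increase"): "negative reinforcement",
    ("add", "decrease"): "positive punishment",
    ("remove", "decrease"): "negative punishment",
}

print(consequences[("remove", "increase")])  # negative reinforcement
```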

38
Q

Cumulative record

A

A graph in which responses from a conditioning experiment accumulate over time; the slope indicates the rate of responding

39
Q

Continuous reinforcement

A

Every instance of a response is reinforced; useful for shaping behaviour

40
Q

Partial or intermittent reinforcement

A

A designated response is reinforced only some of the time (useful for maintaining behaviours); less likely to undergo extinction because individuals are used to the unreliable nature of the reinforcement

41
Q

Ratio schedule

A

Reinforcement depends on the number of responses made; can be fixed or variable

42
Q

Interval schedule

A

A response is still required, but whether it is reinforced depends on the passage of time; can be fixed or variable

43
Q

Fixed-ratio schedule

A

Reinforcer is given after a fixed number of non-reinforced responses

44
Q

Variable-ratio schedule

A

Reinforcer is given after a variable number of non-reinforced responses; the number of non-reinforced responses varies around a predetermined average

45
Q

Fixed-interval schedule

A

Reinforcer is given for the first response after a fixed period of time has elapsed

46
Q

Variable-interval schedule

A

Reinforcer is given for the first response after a variable time interval has elapsed; interval lengths vary around a predetermined average
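
As a rough illustration of how the four partial-reinforcement schedules above differ, the sketch below simulates when a reinforcer would be delivered. The function names, the parameter values, and the assumption of one response per second are mine, not from the cards.

```python
# Illustrative sketch only: which responses earn a reinforcer under each schedule.
import random

def fixed_ratio(n_resp, ratio=5):
    # Reinforce every `ratio`-th response.
    return [r % ratio == 0 for r in range(1, n_resp + 1)]

def variable_ratio(n_resp, mean_ratio=5):
    # Reinforce after a variable number of responses averaging `mean_ratio`.
    reinforced, next_at = [], random.randint(1, 2 * mean_ratio - 1)
    for r in range(1, n_resp + 1):
        hit = r >= next_at
        reinforced.append(hit)
        if hit:
            next_at = r + random.randint(1, 2 * mean_ratio - 1)
    return reinforced

def fixed_interval(times, interval=10.0):
    # Reinforce the first response after `interval` seconds have elapsed.
    reinforced, available_at = [], interval
    for t in times:
        hit = t >= available_at
        reinforced.append(hit)
        if hit:
            available_at = t + interval
    return reinforced

def variable_interval(times, mean_interval=10.0):
    # Reinforce the first response after a variable interval averaging `mean_interval`.
    reinforced, available_at = [], random.expovariate(1 / mean_interval)
    for t in times:
        hit = t >= available_at
        reinforced.append(hit)
        if hit:
            available_at = t + random.expovariate(1 / mean_interval)
    return reinforced

# Assume one response per second for 30 seconds; count reinforcers under each schedule.
times = [float(s) for s in range(1, 31)]
print(sum(fixed_ratio(30)), sum(variable_ratio(30)),
      sum(fixed_interval(times)), sum(variable_interval(times)))
```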

47
Q

Gaetani et al. 1986: Engineering compensation systems (effects of commission-based vs. wage payment)

A

Found that in this particular case, ratio reinforcement (commission-based pay) was more effective than interval reinforcement (hourly pay)

48
Q

Partial-reinforcement extinction effect

A

Partial reinforcement schedules provide greater resistance to extinction

49
Q

Side-effects of extinction:

A

  1. Increase in response rate before it goes down (extinction burst)
  2. Increase in the variability of response topography (the participant takes a different approach in the hope of being reinforced); example: extinction-induced aggression

50
Q

What led to the development of Premack’s Principle?

A

Traditional views of reinforcement didn’t take into account how the significance of stimuli can change depending on context (Premack took issue with the idea of trans-situational reinforcers: reinforcers that have the same impact regardless of the situation)

51
Q

Premack’s Principle

A

Behaviours can be ranked as higher or lower probability; a lower-probability behaviour is reinforced when it is followed by the opportunity to engage in a higher-probability behaviour

52
Q

Mitchell & Stoffelmayr 1973

A

Demonstrated Premack’s Principle with schizophrenic individuals

53
Q

Honig & Slivka 1964

A

Found that the effects of punishment are easily generalised

54
Q

Reynolds 1969

A

Found that one stimulus dimension had overshadowed learning of the other stimulus dimension

55
Q

Herrnstein & DeVilliers 1980

A

Found that non-human animals could form categories or concepts from complex stimuli

56
Q

The Kelloggs 1933

A

Raised the chimpanzee Gua alongside their son; Gua learned to understand some commands, but never produced any English words

57
Q

The Hayes 1951

A

Raised Vicki; she learned to make three ‘recognisable’ words (papa, mama, and cup)

58
Q

1960s attempts at teaching sign language to chimpanzees

A

Communication possible, but little evidence of syntax

59
Q

Communication between humans and non-human primates (symbols)

A

Rumbaugh developed the ‘Yerkish’ symbol language for non-human primates; a pygmy chimpanzee (bonobo) was tested and its ability to understand sentences was compared to that of human children, performing at roughly the level of a two- to 2.5-year-old child

60
Q

Breland & Breland 1961

A

Demonstrated biological constraints on instrumental conditioning (pig and raccoon)

61
Q

Tolman & Honzik 1930

A

Found that rats actively process information rather than operating on a stimulus-response relationship; latent learning

62
Q

Observational learning

A

Occurs when an organism’s response is influenced by the observation of others’ behaviour (models)

63
Q

Palameta & Lefebvre 1985

A

Found that the group exposed to observational learning was quicker to display the same behaviour (in this case, eating the seed)

64
Q

Cook & Mineka 1987

A

Fear of snakes was learnt by observation, but biological constraints were also evident: no fear of flowers was learned

65
Q

Bandura et al. (1963, 1965)

A

Aggression learned through modelling (observational learning)

66
Q

Bandura et al. 1967

A

Showing a boy playing fearlessly with a dog helped to reduce fear in observing children (observational learning)

67
Q

Bandura (four key processes for observational learning):

A

  1. Attention: extent to which we focus on others’ behaviour
  2. Retention: retaining a representation of others’ behaviour
  3. Production: ability to actually perform actions we observe
  4. Motivation: need to perform actions we witness (usefulness)

68
Q

Poche et al. 1988

A

Observational learning was effective in teaching abduction-prevention skills to children, but not as effective as observational learning paired with rehearsal